
Big Data Developer

This role is for a Big Data Developer in Alpharetta, GA, on a long-term contract. Candidates must have strong Python, Java, and SQL skills, expertise in Hadoop and Snowflake, and knowledge of data engineering concepts. Visa-independent candidates only.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 20, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Alpharetta, GA
🧠 - Skills detailed
#Data Manipulation #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Hadoop #Java #Data Security #Cloud #Python #Data Ingestion #Snowflake #Programming #HDFS (Hadoop Distributed File System) #Data Engineering #SQL (Structured Query Language) #Data Modeling #Security #Big Data #Data Framework #Data Quality
Role description

Job Title: Big Data Hadoop-Spark Developer

Location: Alpharetta, GA (hybrid: 3 days in office, 2 days remote)

Duration: Long-Term

Only visa-independent candidates are encouraged to apply.

Required Skills:
• Programming Languages: Strong proficiency in Python, Java, and SQL.
• Big Data Frameworks: Deep understanding of the Hadoop ecosystem (HDFS, MapReduce, Hive, Spark).
• Cloud Data Warehousing: Expertise in Snowflake architecture, data manipulation, and query optimization (a brief Spark-to-Snowflake sketch follows this list).
• Data Engineering Concepts: Knowledge of data ingestion, transformation, data quality checks, and data security practices.
• Data Modeling: Ability to design efficient data models for data warehousing environments.
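
The Hadoop/Spark and Snowflake requirements above describe a typical pipeline: ingest raw files from HDFS, clean them in Spark, and land the result in Snowflake. The minimal PySpark sketch below illustrates that flow; the HDFS path, Snowflake connection options, and table name are placeholder assumptions, and the Spark-Snowflake connector is assumed to be available on the cluster.

```python
# Minimal illustrative sketch (placeholders throughout): read raw events from HDFS,
# apply a simple data-quality transformation, and write the result to Snowflake.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("hdfs-to-snowflake").getOrCreate()

# Ingest raw data from HDFS (placeholder path).
raw = spark.read.parquet("hdfs:///data/raw/events/")

# Basic transformation and quality check: drop rows missing the key, then deduplicate.
clean = (
    raw.filter(F.col("event_id").isNotNull())
       .dropDuplicates(["event_id"])
)

# Snowflake connection options (placeholders; requires the spark-snowflake connector).
sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "COMPUTE_WH",
}

# Load the cleaned data into a Snowflake table.
(
    clean.write.format("net.snowflake.spark.snowflake")
         .options(**sf_options)
         .option("dbtable", "EVENTS_CLEAN")
         .mode("overwrite")
         .save()
)
```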