
ETL Developer

This role is for an ETL Developer III, 12 months in Chicago, IL (onsite 2 days). Requires 6+ years ETL experience, SQL proficiency, 3-4 years Python/PySpark, and AWS services knowledge. Pay rate is $55-$60/hr on W2.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$480
🗓️ - Date discovered
February 14, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Chicago, IL
🧠 - Skills detailed
#Data Profiling #Dynatrace #SQL (Structured Query Language) #Lambda (AWS Lambda) #Datadog #AWS (Amazon Web Services) #PySpark #Data Processing #ETL (Extract, Transform, Load) #Redshift #RDS (Amazon Relational Database Service) #SQS (Simple Queue Service) #Programming #Apache Iceberg #MySQL #Python #S3 (Amazon Simple Storage Service) #Observability #SNS (Simple Notification Service) #Aurora #Data Cleansing #Debugging #Cloud #Spark (Apache Spark) #HTTP & HTTPS (Hypertext Transfer Protocol & Hypertext Transfer Protocol Secure) #Big Data
Role description

Summary:

Seeking an experienced and dynamic ETL Developer to design, build, test, and maintain systems that extract, transform, and load data from multiple source systems.

Here are some of the specific details:

Job Title: ETL Developer III

Job Duration: 12 Months

Job Location: Chicago, IL (onsite 2 days)

SKILLS:
• 6+ years of experience using ETL tools to perform data cleansing, data profiling, data transformation, and the scheduling of various workflows.
• Expert level proficiency with writing, debugging and optimizing SQL.
• 3-4 years of programming experience using Python or PySpark/Glue required.
• Knowledge of common design patterns, models and architecture used in Big Data processing.
• 3-4 years of experience with AWS services such as Glue, S3, Redshift, Lambda, Step Functions, RDS Aurora/MySQL, Apache Iceberg, CloudWatch, SNS, SQS, EventBridge.
• Capable of troubleshooting common database issues; familiarity with observability tools such as Dynatrace or Datadog.
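To give a concrete sense of the kind of work the skills above describe, here is a minimal extract/cleanse/transform/load sketch in plain Python. It is illustrative only: a real pipeline for this role would use PySpark on AWS Glue, reading from and writing to services like S3, Redshift, or Iceberg tables. The record fields and cleansing rules here are hypothetical examples, not part of the job description.

```python
def extract(rows):
    """Extract: in practice this would read from S3, RDS, etc."""
    return list(rows)

def cleanse(rows):
    """Data cleansing: drop records missing an id, normalize whitespace."""
    cleaned = []
    for row in rows:
        if row.get("id") is None:
            continue  # drop incomplete records
        cleaned.append({k: v.strip() if isinstance(v, str) else v
                        for k, v in row.items()})
    return cleaned

def transform(rows):
    """Transform: uppercase the name field (stand-in for business logic)."""
    return [{**row, "name": row["name"].upper()} for row in rows]

def load(rows, sink):
    """Load: in practice this would write to Redshift or an Iceberg table."""
    sink.extend(rows)
    return len(rows)

raw = [{"id": 1, "name": " alice "}, {"id": None, "name": "bob"}]
warehouse = []
loaded = load(transform(cleanse(extract(raw))), warehouse)
# One record survives cleansing; its name is stripped and uppercased.
```

In a Glue job, the same stages would operate on Spark DataFrames rather than Python lists, with CloudWatch, SNS, and EventBridge handling the monitoring and scheduling concerns the posting mentions.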

A reasonable, good-faith estimate of the minimum and maximum pay for this position is $55/hr to $60/hr on W2.

Benefits will also be available, and details are available at the following link: Harvey Nash Benefits https://rb.gy/foel75