
Data Engineer

This Data Engineer role is a long-term contract paying $79.16 to $91.66 hourly. It requires 7+ years of experience; expertise in Snowflake, dbt, Apache Kafka, and AWS; and strong skills in ETL processes and data visualization.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 22, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Chicago, IL
🧠 - Skills detailed
#Apache Kafka #Hadoop #Snowflake #dbt (data build tool) #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Apache Spark #Fivetran #Airflow #Data Warehouse #Visualization #API (Application Programming Interface) #Apache Pig #Cloud #Data Ingestion #Data Engineering #Data Processing
Role description

Salary: $79.16 to $91.66 hourly

Description

We are seeking a Data Engineer to join our team. As a Data Engineer, you will be responsible for designing and implementing data engineering solutions, leveraging tools like Snowflake, dbt, and Fivetran. This role offers a long-term contract employment opportunity.

Responsibilities
• Design, develop, and maintain data engineering solutions using Snowflake
• Implement ETL processes using DBT and other tools
• Utilize Fivetran for data ingestion tasks
• Ensure the efficient scheduling of data warehouse tasks with tools like Airflow or Dagster
• Develop and maintain APIs for data interaction
• Implement algorithms and analytics for data processing
• Leverage Apache Kafka, Apache Pig, and Apache Spark in data processing tasks
• Use AWS technologies for cloud-based data engineering solutions
• Implement data visualization techniques for data presentation
• Mentor entry level team members on data warehouse concepts and practices
• Ensure clear and effective communication with team members and stakeholders
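For illustration only (this posting names the tools but does not document them), the extract-transform-load pattern referenced in the responsibilities above can be sketched in plain Python with a hypothetical event schema; a real pipeline for this role would use Fivetran for ingestion, dbt for transformations, and Snowflake as the warehouse:

```python
# Minimal ETL sketch with an invented schema; the field names
# (user_id, amount_cents) are assumptions for illustration.

def extract(rows):
    """Extract: keep only raw events that have a user attached."""
    return [r for r in rows if r.get("user_id") is not None]

def transform(rows):
    """Transform: normalize amounts from cents to dollars."""
    return [
        {"user_id": r["user_id"], "amount_usd": r["amount_cents"] / 100}
        for r in rows
    ]

def load(rows, warehouse):
    """Load: append to an in-memory stand-in for a warehouse table."""
    warehouse.extend(rows)
    return len(rows)

warehouse = []
raw = [
    {"user_id": 1, "amount_cents": 1250},
    {"user_id": None, "amount_cents": 99},  # dropped: no user
    {"user_id": 2, "amount_cents": 500},
]
loaded = load(transform(extract(raw)), warehouse)
```

In a production setting each stage would be a separately scheduled, idempotent task (e.g. an Airflow or Dagster job per stage) rather than a single in-process call chain.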

Requirements
• Possess a minimum of 7 years of experience as a Data Engineer or in a related field.
• Proficient in Apache Kafka, Apache Pig, and Apache Spark.
• Skilled in various Cloud Technologies.
• Demonstrated ability in Data Visualization.
• Expertise in implementing Algorithms.
• Strong knowledge of Analytics.
• Experience with Apache Hadoop is mandatory.
• Proficiency in API Development.
• Familiarity with AWS Technologies.
• Experience working with Snowflake is highly desirable.
• Knowledge of Data Build Tool (DBT) is essential.
• Proven experience with ETL (Extract, Transform, Load) processes.

Technology Doesn't Change the World, People Do.®

Robert Half is the world’s first and largest specialized talent solutions firm that connects highly qualified job seekers to opportunities at great companies. We offer contract, temporary and permanent placement solutions for finance and accounting, technology, marketing and creative, legal, and administrative and customer support roles.

Robert Half works to put you in the best position to succeed. We provide access to top jobs, competitive compensation and benefits, and free online training. Stay on top of every opportunity - whenever you choose - even on the go.

All applicants applying for U.S. job openings must be legally authorized to work in the United States. Benefits are available to contract/temporary professionals, including medical, vision, dental, and life and disability insurance. Hired contract/temporary professionals are also eligible to enroll in our company 401(k) plan.

© 2025 Robert Half. An Equal Opportunity Employer. M/F/Disability/Veterans.