
Data Engineer

This role is for a Data Engineer with 5+ years in SQL and Python, experienced in Apache Airflow, ETL, and data warehousing. It offers a fully remote contract until the end of 2025 at $70/hour, requiring strong communication skills.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$560
🗓️ - Date discovered
February 14, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Houston, TX
🧠 - Skills detailed
#Data Pipeline #Airflow #Scala #Data Engineering #Data Quality #Data Security #GIT #SQL (Structured Query Language) #Datasets #Hadoop #Security #Spark (Apache Spark) #Data Architecture #Apache Airflow #Python #BigQuery #ETL (Extract, Transform, Load) #Data Integrity
Role description

Data Engineer - Full Remote Working from anywhere in the US

SQL+Python+ETL+Airflow

Our client is a leading global high-tech company:

-Over 6,500 employees across 20+ offices;

-Fast-growing;

-Cutting edge AR and VR technologies, 3D printing, etc.

They are looking for a Data Engineer to join their FinTech team!

You will develop scalable data pipelines, ensure data quality, and support business decision-making with high-quality datasets.

-Technologies: SQL, Python, ETL, BigQuery, Spark, Hadoop, version control systems such as Git, workflow management tools such as Airflow, Data Architecture, Data Warehousing

-Design and develop scalable ETL pipelines to automate data processes, optimize delivery, and comply with privacy and governance standards.

-Implement and manage data warehousing solutions, ensuring data integrity through rigorous testing and validation.

-Develop tools and systems to address limitations in data consumption portals.

-Implement and maintain robust data security practices
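To give candidates a concrete sense of the responsibilities above, here is a minimal extract-transform-load sketch in plain Python using the stdlib `sqlite3` module. All table and column names are hypothetical; in the actual role, pipelines like this would typically run as Airflow tasks against a warehouse such as BigQuery.

```python
import sqlite3

def run_etl(db_path=":memory:"):
    """Minimal ETL sketch: extract raw rows, transform with a
    data-quality filter, and load into a reporting table.

    Table/column names are illustrative only.
    """
    conn = sqlite3.connect(db_path)
    cur = conn.cursor()

    # Extract: a raw events table standing in for a source system.
    cur.execute("CREATE TABLE raw_events (user_id INTEGER, amount REAL)")
    cur.executemany(
        "INSERT INTO raw_events VALUES (?, ?)",
        [(1, 10.0), (1, 5.0), (2, 7.5), (2, None)],
    )

    # Transform + load: aggregate per user, dropping NULL amounts
    # (a simple data-quality/validation step), into a reporting table.
    cur.execute("""
        CREATE TABLE user_totals AS
        SELECT user_id, SUM(amount) AS total
        FROM raw_events
        WHERE amount IS NOT NULL
        GROUP BY user_id
    """)
    conn.commit()

    rows = cur.execute(
        "SELECT user_id, total FROM user_totals ORDER BY user_id"
    ).fetchall()
    conn.close()
    return rows

print(run_etl())  # → [(1, 15.0), (2, 7.5)]
```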

Requirements:

-5+ years of experience in SQL

-5+ years of development in Python

-MUST have experience in Apache Airflow

-Experience in Google BigQuery, Spark, and Hadoop is a big plus

-Experience with ETL tools, data architecture, and data warehousing solutions

-Strong communication skills

W2 full-time contract, initially until the end of 2025 with the option to extend yearly. Offers up to $70 an hour (40 hours per week, Mon-Fri), plus a 50% contribution to the medical plan, sick leave, and federal public holidays as PTO. Fully remote, working from anywhere in the US.