
Data Engineer

This role is for a Data Engineer with 5 years of experience, offering a long-term remote contract. Key skills include SQL, ETL tools, and cloud platforms. A Bachelor’s degree in a related field is required, along with strong analytical skills.
🌎 - Country
United States
💱 - Currency
Unknown
💰 - Day rate
Unknown
🗓️ - Date discovered
February 18, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Indianapolis, IN 46202
🧠 - Skills detailed
#Azure #Data Architecture #Data Modeling #Data Processing #Data Governance #SQL (Structured Query Language) #Cloud #Data Lake #Data Science #Data Engineering #BI (Business Intelligence) #Computer Science #ETL (Extract, Transform, Load) #Talend #Airflow #Data Quality #Data Pipeline #AWS (Amazon Web Services) #Data Warehouse #NoSQL #Scala #Apache Airflow #Databases #Security
Role description

Job Title: Data Engineer
Location: Remote
Duration: Long term
Visa: Any visa
Candidates must be willing to work on our W2 or C2C.
Job description: We are seeking a skilled Data Engineer with 5 years of experience to join our dynamic team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and architectures to support our analytical and business intelligence needs.
Responsibilities:
Develop, construct, test, and maintain data architectures (data warehouses, data lakes, etc.).
Design and implement robust ETL processes to ensure data quality and integrity (see the sketch after this list).
Collaborate with data scientists and analysts to meet data needs for analytics and reporting.
Optimize database and data processing performance.
Implement data governance and security best practices.
Troubleshoot and resolve data-related issues as they arise.
Document data flows and architecture for future reference and training.
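The responsibilities above revolve around building ETL pipelines with data quality safeguards. As a rough, non-authoritative sketch of what that can look like with Apache Airflow (one of the ETL tools named under Qualifications), here is a minimal daily DAG; the task logic, the sample rows, and the orders_etl name are illustrative assumptions, not details taken from the posting.

from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def orders_etl():
    @task
    def extract() -> list[dict]:
        # Placeholder extract step; a real pipeline would pull from a source
        # system (API, OLTP database, files landed in a data lake, etc.).
        return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": None}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Basic cleansing: drop rows with missing amounts.
        return [r for r in rows if r["amount"] is not None]

    @task
    def quality_check(rows: list[dict]) -> list[dict]:
        # Fail the run if cleansing removed everything; row counts, null
        # ratios, and schema checks are typical rules at this stage.
        if not rows:
            raise ValueError("Quality check failed: no valid rows after transform")
        return rows

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder load step; a real pipeline would write to a warehouse
        # table through a database hook or connection.
        print(f"Loading {len(rows)} rows into the warehouse")

    load(quality_check(transform(extract())))


orders_etl()

Chaining the TaskFlow functions this way lets Airflow infer the dependency order (extract, then transform, then quality_check, then load) without explicit upstream/downstream declarations.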
Qualifications:
Bachelor’s degree in Computer Science, Engineering, or a related field.
Proficient in SQL and experience with NoSQL databases.
Familiarity with data modeling, data warehousing, and ETL tools (e.g., Apache Airflow, Talend).
Experience with cloud platforms (AWS, Google Cloud, Azure) is a plus.
Strong analytical and problem-solving skills.
Job Type: Contract
Schedule: 8-hour shift
Work Location: In person