
Data Integration Engineer

This role is for a Data Integration Engineer on a W2 contract from 2/24/2025 to 8/31/2025, paying $70-$90/hour. Requires 4-7 years in data integration, ETL, SQL, cloud platforms, and data governance. Remote work within the U.S. only.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
720
🗓️ - Date discovered
February 19, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Automation #Code Reviews #Data Accuracy #Spark (Apache Spark) #Hadoop #Monitoring #Visualization #Python #Data Engineering #Scala #Azure #SQL (Structured Query Language) #Datasets #GCP (Google Cloud Platform) #Data Management #Consulting #Data Quality #Tableau #Data Framework #Data Governance #Data Integration #Data Integrity #Looker #AWS (Amazon Web Services) #Data Processing #Scripting #Microsoft Power BI #BI (Business Intelligence) #Documentation #Big Data #Cloud #Data Pipeline #ETL (Extract, Transform, Load) #Compliance #Security
Role description

Job Title: Data Integration Engineer

Type: Contract (W2 only)

Start Date: 2/24/2025

End Date: 8/31/2025 (with potential for extension)

Location: Remote (United States only)

Schedule: Monday-Friday, 40 hours per week

Compensation: $70 to $90 per hour

Job Summary:

Our global business & IT consulting client is seeking a highly skilled and detail-oriented Data Integration Engineer to assist in the design, development, and maintenance of scalable data pipelines and infrastructure. The ideal candidate will have 4-7 years of experience in data integration, ETL processes, data governance, and platform development. This role will involve working closely with cross-functional teams to enhance data accuracy, optimize data models, and implement security measures.

Responsibilities:

Data Integration and Management:
• Assist in the design, development, and maintenance of scalable data pipelines to ingest, process, and store large volumes of customer data from various sources.
• Help implement ETL (Extract, Transform, Load) processes to ensure data accuracy, consistency, and reliability.
• Identify opportunities for process improvements and automation to enhance data operations efficiency.
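The ETL responsibilities above can be sketched minimally in Python. This is an illustrative example only; the table names (`raw_customers`, `customers`), column names, and cleaning rule are assumptions, not details from the posting.

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Minimal ETL pass: extract raw rows, transform (normalize emails,
    drop malformed records), and load into a curated table."""
    # Extract: pull raw customer records from the staging table
    rows = conn.execute("SELECT id, email FROM raw_customers").fetchall()
    # Transform: trim/lowercase emails and discard records without a valid address
    cleaned = [(i, e.strip().lower()) for i, e in rows if e and "@" in e]
    # Load: upsert the cleaned records into the target table
    conn.executemany(
        "INSERT OR REPLACE INTO customers (id, email) VALUES (?, ?)", cleaned
    )
    conn.commit()
    return len(cleaned)  # number of records loaded
```

In practice each stage would also log row counts and rejects so failures are visible, which is where the automation and process-improvement duties come in.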

Data Platform Development:
• Consult on building robust data infrastructure to support the customer data platform, ensuring high availability and performance.
• Recommend optimization strategies for source/target data models and schemas to support analytics and reporting needs.
• Assist in identifying data governance and security measures to protect sensitive customer information.

Data Quality and Monitoring:
• Help establish and enforce data quality standards and best practices.
• Collaborate on the development and maintenance of data validation and monitoring frameworks to detect and resolve data issues proactively.
• Perform data audits and design corrective actions to maintain data integrity.
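A data validation framework of the kind described above typically runs rule-based checks over each batch. The sketch below is a minimal, hypothetical example; the rule names and record shape are assumptions for illustration.

```python
def validate_batch(records: list[dict]) -> dict:
    """Run simple completeness and uniqueness checks on a batch of records,
    returning a count of issues per rule (all rule names are illustrative)."""
    issues = {"missing_id": 0, "duplicate_id": 0}
    seen = set()
    for rec in records:
        rid = rec.get("id")
        if rid is None:
            # Completeness check: every record must carry an id
            issues["missing_id"] += 1
            continue
        if rid in seen:
            # Uniqueness check: ids must not repeat within a batch
            issues["duplicate_id"] += 1
        seen.add(rid)
    return issues
```

Feeding these issue counts into a monitoring dashboard is what makes problems detectable proactively rather than after downstream reports break.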

Collaboration and Support:
• Participate in code reviews, knowledge-sharing sessions, and continuous improvement initiatives.
• Work closely with team members to support business initiatives and data onboarding.

Documentation and Reporting:
• Create and maintain comprehensive documentation for data processes, workflows, and systems.
• Develop and deliver regular reports and dashboards to track key performance indicators (KPIs) and data platform metrics.

Required Qualifications & Experience:
• 4-7 years of experience in data integration, ETL processes, and data management.
• Proficiency in SQL, Python, or other scripting languages for data processing.
• Experience with cloud-based data platforms (AWS, Azure, GCP) and data warehousing solutions.
• Strong understanding of data governance, security, and compliance best practices.
• Ability to analyze complex datasets and implement data validation methodologies.
• Excellent problem-solving skills and ability to work collaboratively in a team environment.
• Strong communication and documentation skills.

Preferred Qualifications & Experience:
• A college degree in Information Technology, Data Analytics, or a related field.
• Experience with data visualization tools such as Tableau, Power BI, or Looker.
• Knowledge of big data technologies (Spark, Hadoop) and streaming data frameworks.
• Familiarity with CI/CD pipelines for data engineering workflows.