Data Engineer for Global Hospitality Company

This role is for a Data Engineer with 2-3 years of experience, proficient in Python and SQL, who will optimize data pipelines and ETL processes in an on-premise (non-cloud) environment. The position itself is remote; the contract runs 6 months with possible extension, at a pay rate of $55-$65/hr.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$440-$520 (equivalent to the $55-$65/hr rate over an 8-hour day)
🗓️ - Date discovered
February 8, 2025
🕒 - Project duration
6 months (possible extension)
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
California, United States
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #REST (Representational State Transfer) #Schema Design #Data Engineering #Data Modeling #Data Ingestion #Airflow #Scala #Visualization #Tableau #Data Pipeline #Trino #Pandas #Kafka (Apache Kafka) #Data Quality #Automation #MySQL #AWS (Amazon Web Services) #Cloud #SQL (Structured Query Language) #Data Science #SQL Queries #Python #NumPy #Databases #Agile #Grafana #REST API #NiFi (Apache NiFi) #Redshift
Role description

Summary:

Our client is seeking a Data Engineer to design, build, and optimize data pipelines, ETL processes, and reporting solutions in an on-premise, non-cloud environment. The ideal candidate will have strong Python and SQL skills, and the ability to work with large-scale data ingestion and visualization tools like Tableau or Superset.
Responsibilities:
• Develop and maintain scalable data pipelines for analytics and reporting.
• Optimize SQL queries and ETL processes to improve data ingestion performance.
• Create dashboards and visualizations using Tableau, Superset, or Grafana.
• Work closely with data scientists and analysts to ensure data quality.
• Troubleshoot data infrastructure issues and implement solutions.
• Support schema design, data modeling, and integration across sources.
• Ensure efficient data ingestion into StarRocks, avoiding slow row-by-row insert methods (see the ingestion sketch after this list).
• Work in an agile development environment, collaborating across teams.
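
For context on the StarRocks ingestion bullet above: one-row-at-a-time INSERT statements are the classic slow path, and StarRocks' Stream Load HTTP interface is the usual bulk alternative. The sketch below is a minimal illustration only, not part of the posting; the host, port, database, table, and credentials are hypothetical placeholders.

```python
import json
import uuid

import requests

# Hypothetical connection details -- replace with your cluster's values.
FE_HOST = "starrocks-fe.internal"
FE_HTTP_PORT = 8030              # default FE HTTP port
DB, TABLE = "analytics", "bookings"
USER, PASSWORD = "etl_user", "secret"


def stream_load(rows: list[dict]) -> dict:
    """Load a whole batch of rows in one Stream Load request
    instead of issuing one INSERT statement per row."""
    url = f"http://{FE_HOST}:{FE_HTTP_PORT}/api/{DB}/{TABLE}/_stream_load"
    headers = {
        "label": f"bookings-{uuid.uuid4()}",  # unique label makes retries idempotent
        "format": "json",
        "strip_outer_array": "true",          # payload is a JSON array of objects
        "Expect": "100-continue",
    }
    resp = requests.put(url, headers=headers, data=json.dumps(rows),
                        auth=(USER, PASSWORD))
    resp.raise_for_status()
    result = resp.json()
    if result.get("Status") != "Success":
        raise RuntimeError(f"Stream Load failed: {result}")
    return result
```

Note that in many deployments the frontend node answers this request with a 307 redirect to a backend node; depending on your HTTP client's redirect handling, you may need to send the request to a backend's HTTP port directly.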
Requirements:
• 2-3 years of experience in data engineering, ETL development, and data warehousing.
• Proficiency in Python (Pandas, NumPy) and experience extracting data from REST APIs (see the extraction sketch after this list).
• Strong SQL skills with experience in Trino, MySQL, or similar databases.
• Experience with Airflow or NiFi for workflow automation (see the DAG sketch after this list).
• Familiarity with Kafka for real-time data streaming (preferred).
• Hands-on experience with data visualization tools like Tableau, Superset, or Grafana.
• Ability to work independently in an on-premise environment (no AWS, Redshift, or Glue).
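
As a small illustration of the Python/REST requirement above, here is a minimal sketch of pulling a paginated REST endpoint into a Pandas DataFrame. The URL, parameter names, and pagination scheme are invented for the example.

```python
import pandas as pd
import requests

# Hypothetical endpoint and pagination scheme, for illustration only.
BASE_URL = "https://api.example.com/v1/reservations"


def fetch_all(page_size: int = 500) -> pd.DataFrame:
    """Walk every page of a paginated REST endpoint and return one
    flat DataFrame built from the JSON records."""
    frames, page = [], 1
    with requests.Session() as session:
        while True:
            resp = session.get(
                BASE_URL,
                params={"page": page, "per_page": page_size},
                timeout=30,
            )
            resp.raise_for_status()
            batch = resp.json()
            if not batch:          # empty page -> no more data
                break
            frames.append(pd.json_normalize(batch))
            page += 1
    return pd.concat(frames, ignore_index=True) if frames else pd.DataFrame()
```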
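
Similarly, for the Airflow requirement: a minimal daily DAG skeleton wiring an extract task to a load task, assuming Airflow 2.4 or later. The DAG id, task names, and callables are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Placeholder: pull a batch of records from the source API."""
    ...


def load():
    """Placeholder: bulk-load the extracted batch into the warehouse."""
    ...


with DAG(
    dag_id="bookings_etl",           # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task        # load runs only after extract succeeds
```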

Type: Contract

Duration: 6 months (Possible extension)

Location: Remote

Hourly Rate: $55-$65/hr DOE