
Data Engineer

This role is for a Data Engineer with a 12-18 month contract, paying "competitive rates" in a hybrid setting in Greenwood Village, CO. Key skills include SQL, Bash, and Python, with AWS experience preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
560
🗓️ - Date discovered
February 15, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Greenwood Village, CO
🧠 - Skills detailed
#Big Data #Automation #Python #S3 (Amazon Simple Storage Service) #SQL (Structured Query Language) #Jira #DevOps #SQL Queries #Spark (Apache Spark) #Spark SQL #ETL (Extract, Transform, Load) #Version Control #Tableau #GIT #AWS (Amazon Web Services) #Data Engineering #Lambda (AWS Lambda) #Bash
Role description

Data Engineer

Our Denver-based client is seeking a Data Engineer to join their growing team. The ideal candidate will have extensive experience writing SQL queries and working with big data.

This is a hybrid (4 days onsite) position in the Greenwood Village, CO area. Candidates must be able to be in the office during core hours of 10am – 3pm MT.

Due to our client’s requirements, we can only consider W2 or salaried employees; no C2C.

This is a long-term 12-18 month contract with extension.

Key Responsibilities:
• Develop, maintain, and optimize ETL processes using SQL, HQL, Python, and Bash.
• Identify and address shortcomings in ETL processes, suggesting and implementing potential upgrades to the tech stack.
• Develop automated solutions to streamline redundant tasks.
• Collaborate with DevOps engineers to update necessary templates, test code, and manage template changes.
• Analyze and propose solutions, providing detailed analysis on pros and cons, level of effort (LOE), and success probabilities.
• Communicate complex technical concepts simply and effectively to both technical and non-technical stakeholders.
• Document processes, solutions, and changes thoroughly and clearly.
• Learn and integrate AWS applications to enhance our data infrastructure.
• Able to participate in weekend, and possibly holiday, on-call responsibilities.
• Validate solutions and provide thorough readouts.
• Self-motivated and able to self-manage projects through Jira.
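To illustrate the kind of ETL work the responsibilities above describe (this sketch is not part of the posting, and the table and column names are hypothetical), a minimal extract-transform-load pipeline in Python using the standard-library sqlite3 module might look like:

```python
import sqlite3

def extract(conn):
    """Pull raw order rows from the source table (hypothetical schema)."""
    return conn.execute("SELECT id, amount FROM raw_orders").fetchall()

def transform(rows):
    """Keep only positive amounts and convert cents to dollars."""
    return [(rid, amount / 100.0) for rid, amount in rows if amount > 0]

def load(conn, rows):
    """Write cleaned rows into the reporting table."""
    conn.executemany(
        "INSERT INTO clean_orders (id, amount_usd) VALUES (?, ?)", rows
    )
    conn.commit()

# In-memory demo database standing in for a real warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount INTEGER)")
conn.execute("CREATE TABLE clean_orders (id INTEGER, amount_usd REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)", [(1, 250), (2, -50), (3, 1000)]
)

load(conn, transform(extract(conn)))
result = conn.execute(
    "SELECT id, amount_usd FROM clean_orders ORDER BY id"
).fetchall()
# result == [(1, 2.5), (3, 10.0)]
```

In a production setting the same extract/transform/load split would typically sit on top of a real warehouse connection, with orchestration and scheduling handled elsewhere.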

Qualifications:
• Proven experience as a Data Engineer or in a similar role.
• Expert knowledge in SQL and Bash
• Junior-level knowledge in Python and Spark SQL
• Strong problem-solving skills and a curious, innovative mindset.
• Experience with ETL processes and the ability to identify areas for improvement.
• Ability to develop and implement automation processes.
• Excellent verbal and written communication skills.
• Comfortable working with DevOps engineers and managing code changes.
• Willingness to learn and adapt to new technologies, particularly AWS applications.
• Knowledge of data warehousing concepts and best practices.

Additional Experience:
• Experience with AWS services such as S3, Step Functions, Lambda, Secrets Manager, and Glue.
• Familiarity with CI/CD pipelines and version control systems like Git.
• Functional knowledge of Tableau