

Data Engineer
Our Denver-based client is seeking a Data Engineer to join their growing team. The ideal candidate will have extensive experience writing SQL queries and working with big data.
This is a hybrid (4 days onsite) position in the Greenwood Village, CO area. Candidates must be able to be in the office during core hours of 10am – 3pm MT.
Due to our client’s requirements, we can only consider W2 or salaried employees; no C2C.
This is a long-term, 12-18 month contract with the possibility of extension.
Key Responsibilities:
• Develop, maintain, and optimize ETL processes using SQL, HQL, Python, and Bash.
• Identify and address shortcomings in ETL processes, suggesting and implementing potential upgrades to the tech stack.
• Develop automated solutions to streamline redundant tasks.
• Collaborate with DevOps engineers to update necessary templates, test code, and manage template changes.
• Analyze and propose solutions, providing detailed analysis on pros and cons, level of effort (LOE), and success probabilities.
• Communicate complex technical concepts simply and effectively to both technical and non-technical stakeholders.
• Document processes, solutions, and changes thoroughly and clearly.
• Learn and integrate AWS applications to enhance our data infrastructure.
• Participate in weekend on-call responsibilities and, as needed, holiday on-call coverage.
• Validate solutions and provide thorough readouts.
• Self-motivated and able to self-manage projects through Jira.
Qualifications:
• Proven experience as a Data Engineer or in a similar role.
• Expert knowledge of SQL and Bash.
• Junior-level knowledge of Python and Spark SQL.
• Strong problem-solving skills and a curious, innovative mindset.
• Experience with ETL processes and the ability to identify areas for improvement.
• Ability to develop and implement automation processes.
• Excellent verbal and written communication skills.
• Comfortable working with DevOps engineers and managing code changes.
• Willingness to learn and adapt to new technologies, particularly AWS applications.
• Knowledge of data warehousing concepts and best practices.
Additional Experience:
• Experience with AWS services such as S3, Step Functions, Lambda, Secrets Manager, and Glue.
• Familiarity with CI/CD pipelines and version control systems like Git.
• Functional knowledge of Tableau.