Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in St. Louis, MO, for 12 months+ on a W2 basis. Key skills include Python, ETL, Azure Data Factory, and cloud data engineering. A bachelor's degree in a related field is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown (estimated $704)
🗓️ - Date discovered
April 11, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
St. Louis, MO
🧠 - Skills detailed
#Microsoft Azure #Mathematics #Azure Databricks #Azure Data Factory #Data Lake #Spark (Apache Spark) #Scripting #Data Engineering #Cloud #DAX #Python #Business Analysis #SQL (Structured Query Language) #Agile #Storage #MIS Systems (Management Information Systems) #ADF (Azure Data Factory) #ETL (Extract, Transform, Load) #DevOps #Statistics #Computer Science #Azure #Jupyter #Data Processing #ADLS (Azure Data Lake Storage) #Deployment #Batch #Databricks #C# #Data Management #Compliance
Role description

Job Title: Data Engineer

Location: St. Louis, MO

Duration: 12+ Months

ONLY W2, NO C2C

Onsite in the St. Louis office Monday through Thursday, with work from home on Fridays.

The primary responsibility of this role is to work as part of the data and analytics team. The ideal candidate has experience using ETL and other data engineering tools in a cloud environment, along with coding and scripting experience moving and manipulating data with tools such as Azure Data Factory, Python, Databricks, or Spark. Microsoft Azure experience is a plus, but relevant experience with other cloud environments will be considered.
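
By way of illustration, the sketch below shows the kind of Python/Spark code this description points to: a small batch job that reads raw data from cloud storage, reshapes it, and writes a curated output. It assumes a Databricks- or Spark-style runtime, and the storage paths and column names are hypothetical placeholders rather than details of this role.

```python
# Minimal PySpark sketch of a batch "move and manipulate" job.
# Storage account, container, and column names are hypothetical
# placeholders, not details taken from this posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

# Hypothetical ADLS Gen2 paths; real values would come from pipeline config.
raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/orders/"
curated_path = "abfss://curated@examplestorage.dfs.core.windows.net/orders_daily/"

orders = spark.read.parquet(raw_path)

# Example transformation: drop bad rows and aggregate to a daily grain.
daily = (
    orders
    .filter(F.col("order_total") > 0)
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(
        F.sum("order_total").alias("total_sales"),
        F.countDistinct("customer_id").alias("distinct_customers"),
    )
)

daily.write.mode("overwrite").parquet(curated_path)
```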

This team uses Microsoft Azure to build and productionize data sets and processes for product owners and product users, and this role will be expected to contribute to those objectives. As a member of this team, you will work with other data engineers, platform engineers, solution developers, and others to deliver working solutions to end users.

Technical Skills

   • High focus on Python

   • A strong programming/coding background is more important than deep cloud experience.

   • Intermediate or higher level of proficiency with Python

   • Must have the proven ability to learn quickly and apply cloud data engineering principles and skills to new tools and use cases

   • Ability to define and take technical ownership of detailed requirements and to collaborate effectively with team members such as business analysts, project managers, and end users.

   • Excellent communication (verbal and written) and problem-solving skills are essential for this position.

   • Initiative, proactive cooperation, and curiosity are required behavioral skills for this position.

Experience with several of the following technologies is preferred:

   • Experience with the Microsoft Azure platform, in particular its data processing toolset, including Azure Data Factory pipelines, Azure Databricks, and Azure Data Lake Storage.

   • Experience using notebooks (such as Jupyter or Databricks) is also desired.

   • Experience with programming languages such as Python and C#.

   • Experience with query languages such as SQL, MDX, and DAX (see the Python-and-SQL sketch after this list).

   • Experience with scripting languages such as PowerShell, Power Query M, and Windows batch commands.

   • Understanding of Agile and DevOps principles for software development and ownership.

   • Familiarity with compliance and controls for software development and deployment.
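
As a rough sketch of the query-language items above, the example below runs a SQL aggregation from Python using pyodbc. The server, database, authentication settings, and table are hypothetical placeholders; an actual project would use whatever connection details and schema the team provides.

```python
# Minimal sketch: run a SQL aggregation from Python via pyodbc.
# Connection details and table names are hypothetical placeholders.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:example-server.database.windows.net,1433;"
    "Database=exampledb;"
    "Authentication=ActiveDirectoryInteractive;"
)

query = """
    SELECT CAST(order_ts AS date) AS order_date,
           SUM(order_total)       AS total_sales
    FROM dbo.orders
    GROUP BY CAST(order_ts AS date)
    ORDER BY order_date
"""

conn = pyodbc.connect(conn_str)
try:
    cursor = conn.cursor()
    # pyodbc cursors are iterable, yielding one row per result record.
    for order_date, total_sales in cursor.execute(query):
        print(order_date, total_sales)
finally:
    conn.close()
```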

Education Required

A bachelor's degree in computer science, statistics, applied mathematics, data management, information systems, information science, or a related quantitative field (or equivalent work experience) is preferred but not required.