Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on an 11+ month contract in St. Louis, MO (Hybrid), offering $80-84/hr. Key skills include Python, Azure Data Factory, and Databricks. A bachelor's in a related field is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$672
🗓️ - Date discovered
April 11, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
St. Louis County, MO
🧠 - Skills detailed
#Microsoft Azure #Mathematics #Azure Databricks #Azure Data Factory #Data Bricks #Data Lake #Spark (Apache Spark) #Scripting #Data Integration #Data Engineering #Cloud #BI (Business Intelligence) #DAX #Python #Business Analysis #Microsoft Power BI #SQL (Structured Query Language) #Agile #Datasets #Data Analysis #Storage #MIS Systems (Management Information Systems) #ADF (Azure Data Factory) #ETL (Extract, Transform, Load) #DevOps #Statistics #Dataflow #Computer Science #Azure #Jupyter #Data Processing #Azure ADLS (Azure Data Lake Storage) #ADLS (Azure Data Lake Storage) #Deployment #Batch #Databricks #C# #Data Management #Compliance
Role description

Akkodis is seeking a Data Engineer for an 11+ month contract with a client in St. Louis, MO.

Position: Data Engineer

Location: St. Louis, MO (Hybrid)

Duration: 11+ Months

Pay Range: $80-84/hr. (The rate may be negotiable based on experience, education, geographic location, and other factors.)

Job Description:

You will work as part of the data and analytics team.

The ideal candidate would have experience using ETL and other data engineering tools in a cloud environment.

Additionally, this candidate would have coding and scripting experience moving and manipulating data with tools such as Azure Data Factory, Python, Databricks, or Spark.

Microsoft Azure experience is a plus, but relevant experience with other cloud-based environments will be considered.

This team uses Microsoft Azure to build and productionize datasets and processes for product owners and product users, and this role will be expected to contribute to those objectives.

As a member of this team, you will work in conjunction with other data engineers, platform engineers, solution developers and others to execute and deliver working solutions to the end users.

Primary Responsibilities:

   • Design, construct, deploy, test, and support solutions and applications to transform data for end-user consumption using Python, Azure Data Factory, Databricks, and other approved languages and platforms (see the sketch after this list).

   • Embrace Agile principles for development and analytics.

   • Actively collaborate in an open and respectful team environment.

   • Contribute to team objectives through iterative and agile development practices.

   • Provide data analysis of internal and external data sources for transformation and delivery to end user solutions.

   • Develop, deploy, and support data integration using APIs, cloud storage, SFTP, and other data transfer technologies.

   • Enable Power BI users and solutions by delivering data via dataflows, datasets, and other methods.

   • Communicate all issues, risks, concerns, and status to management in a timely manner.

   • Ensure developed applications adhere to all applicable Nestle compliance standards.

   • Seek to automate and improve efficiency as a way of working.
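
As a rough illustration of the first responsibility above, the sketch below shows the general shape of a PySpark transformation job on Databricks. It is a minimal sketch only; the mount path, column names, and target table are hypothetical placeholders, not details taken from this posting.

```python
# Minimal sketch of a curation job on Databricks/Spark.
# Paths, columns, and table names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

# Databricks notebooks provide `spark` automatically; this line keeps the
# sketch runnable outside that environment as well.
spark = SparkSession.builder.appName("orders_transform").getOrCreate()

# Read raw data landed in cloud storage (placeholder mount point).
raw = (
    spark.read.format("csv")
    .option("header", "true")
    .option("inferSchema", "true")
    .load("/mnt/raw/orders/")
)

# Light transformation for end-user consumption: typed date, derived total,
# and a filter to drop invalid rows.
curated = (
    raw.withColumn("order_date", F.to_date("order_date"))
    .withColumn("line_total", F.col("quantity") * F.col("unit_price"))
    .filter(F.col("quantity") > 0)
)

# Publish as a table that downstream Power BI datasets/dataflows can query.
curated.write.mode("overwrite").saveAsTable("analytics.orders_curated")
```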

Technical Skills:

   • High focus on Python

   • Programming/coding experience is weighted more heavily than cloud experience.

Intermediate or higher level of proficiency with Python.

Must have the proven ability to learn quickly and apply cloud data engineering principles and skills to new tools and use cases.

Ability to define and take technical ownership of detailed requirements and collaborate effectively with various team members such as business analysts, project managers, and end users.

Excellent communication (verbal and written) and problem-solving skills are essential for this position.

Initiative, proactive cooperation, and curiosity are required behavioral skills for this position.

Experience with several of the following technologies is preferred:

   • Experience with the Microsoft Azure platform, in particular the data processing toolset: Azure Data Factory pipelines, Azure Databricks, and Azure Data Lake Storage (a minimal access sketch follows this list).

   • Experience using notebooks (such as Jupyter or Databricks) is also desired.

   • Experience using programming languages such as Python and C#.

   • Experience using query languages such as SQL, MDX, and DAX.

   • Experience using scripting languages such as PowerShell, M-Query (Power Query), and Windows batch commands.

   • Understanding of Agile and DevOps principles for software development and ownership.

   • Familiarity with compliance and controls for software development and deployment.
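
As a rough illustration of the Azure Data Lake Storage item above, the sketch below reads a file from ADLS Gen2 using the azure-storage-file-datalake SDK. The storage account URL, container, and file path are hypothetical placeholders, and authentication is assumed to go through DefaultAzureCredential; this is a sketch, not a prescribed setup for this role.

```python
# Minimal sketch of reading a file from Azure Data Lake Storage Gen2.
# Requires: pip install azure-identity azure-storage-file-datalake
# The account URL, container, and path below are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Pick up whatever identity is available (CLI login, managed identity,
# environment variables, etc.).
credential = DefaultAzureCredential()

service = DataLakeServiceClient(
    account_url="https://exampleaccount.dfs.core.windows.net",  # placeholder
    credential=credential,
)

# "raw" is a hypothetical container (file system) name.
file_system = service.get_file_system_client(file_system="raw")
file_client = file_system.get_file_client("orders/2025/04/orders.csv")

# Download the file contents as bytes.
data = file_client.download_file().readall()
print(f"Downloaded {len(data)} bytes")
```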

Education Required:

A bachelor's degree in computer science, statistics, applied mathematics, data management, information systems, information science, or a related quantitative field (or equivalent work experience) is preferred but not required.

If this is not something you are currently interested in but you know someone who might be, please share the details with them or send me their details so I can reach out!