
Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in Charlotte, NC, on a long-term contract with an unspecified pay rate. Key skills include strong experience with Databricks, Azure, ETL pipeline development, and SQL. Excellent communication is essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 1, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Charlotte, NC
🧠 - Skills detailed
#Data Lake #Python #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #GCP (Google Cloud Platform) #Azure SQL #Azure #Cloud #PySpark #Synapse #R #ML (Machine Learning) #REST (Representational State Transfer) #JavaScript #Spark (Apache Spark) #ADF (Azure Data Factory) #Java #Data Engineering #Scripting #Databricks
Role description

Sr. Data Engineer

Location: Candidates must be local to Charlotte, NC, and come into the office 3-4 days a week, or be open to relocating on day 1.

Long term Contract

Please let me know your availability for a video screening call with my business partner.

This is a backfill role; candidates are needed within the hour.

Must Haves:

   • Strong experience with Databricks (must-have)

   • Azure, ETL pipeline development, ADF, Databricks, etc. (must-have)

   • Experience being technology-agnostic

   • Exposure to APIs: REST, PySpark

   • Cloud data engineering

   • Scripting: Shell, Python, R

   • Strong SQL skills (must-have)

   • Excellent communication and collaboration skills

   • Ability to work with stakeholders to understand the problem and provide insights

Nice to have: Data Lake, Cosmos DB, Synapse, machine-learning frameworks.

Job Description:

Project:

Product Iris (Sell-through): events push items to stores without individual store management having to create orders for those products. All are event-based (Christmas, Easter, Mother's Day, etc.). The goal is to forecast this data: how products sell, sell-through percentages, and related metrics.

Must Haves:

   • Strong reporting skills: Power BI/Fabric

   • MS Azure, ETL pipeline development, ADF, etc.; technology-agnostic

   • Databricks (absolute must-have)

   • Exposure to APIs: REST, PySpark

   • Cloud data engineering (Azure)

   • Scripting: Shell, Python, R

   • Strong SQL skills

   • Excellent communication and collaboration skills; ability to work with stakeholders/product managers to understand the problem and provide insights

Preferred Skills:

   • Azure databases: Data Lake, Cosmos DB, Synapse, Azure SQL

   • Machine-learning frameworks

   • ETL pipeline development

   • Data engineering tools in GCP

   • JavaScript

   • Java, Android