

Senior Data Engineer
Location: Charlotte, NC. The candidate MUST be local to Charlotte and come into the office 3-4 days a week, or be open to relocating on day 1.
Long-term contract.
Please let me know your availability to meet with my business partner for a video screening.
This is a backfill; candidates are needed ASAP (within 1 hour).
Must Haves:
• Strong experience with Databricks
• Azure, ETL pipeline development, ADF, etc.
• Experience being technology agnostic
• Exposure to APIs (REST) and PySpark
• Cloud data engineering
• Scripting: Shell, Python, R
• Strong SQL skills
• Excellent communication and collaboration skills
• Ability to work with stakeholders to understand the problem and provide insights
Nice to have: Data Lake, Cosmos DB, Synapse, machine learning frameworks.
Job Description:
Project:
Product Iris (Sell-through): pushes event-based items to stores without individual store management having to create orders for those products. All events are seasonal – Christmas, Easter, Mother's Day, etc. The team is looking to forecast this data and the sell-through (sales %, etc.).
Must Haves:
• Strong reporting skills: Power BI/Fabric
• Microsoft Azure, ETL pipeline development, ADF, etc.; technology agnostic
• Databricks (absolute must have)
• Exposure to APIs (REST) and PySpark
• Cloud data engineering (Azure)
• Scripting: Shell, Python, R
• Strong SQL skills
• Excellent communication and collaboration skills; ability to work with stakeholders/Product Managers to understand the problem and provide insights
Preferred Skills:
• Azure data stores: Data Lake, Cosmos DB, Synapse, Azure SQL
• Machine learning frameworks
• ETL pipeline development
• Data engineering tools in GCP
• JavaScript
• Java, Android