Microsoft Fabric Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a contract-to-hire Microsoft Fabric Data Engineer position requiring US citizens or GC holders. Key skills include 3+ years in data engineering, proficiency in SQL, Python, and Microsoft Fabric. Hybrid work with monthly travel to client offices is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 19, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#"ETL (Extract, Transform, Load)" #Semantic Models #Azure #Data Accuracy #Dataflow #DAX #PySpark #Code Reviews #BI (Business Intelligence) #Azure DevOps #Data Science #Documentation #Synapse #Computer Science #Cloud #Delta Lake #Microsoft Power BI #Storage #Compliance #Scala #Azure cloud #Data Pipeline #Data Warehouse #Data Governance #Agile #Python #KQL (Kusto Query Language) #Data Integration #Data Engineering #Data Storage #Spark (Apache Spark) #GitHub #SQL (Structured Query Language) #Security #DevOps
Role description

Our client, an Agritech company with offices in the US, Canada, and Mexico, is hiring a talented Microsoft Fabric Data Engineer. The client operates an on-prem data center alongside Azure cloud services.

The company is unable to sponsor and is considering US citizens or GC holders ONLY.

This is a contract-to-hire role.

Responsibilities

   • Design, build, and maintain scalable data pipelines and solutions using Microsoft Fabric components.

   • Develop and manage data integration workflows using Pipelines, Notebooks, Dataflows, and Synapse.

   • Optimize data storage and retrieval using OneLake, Delta Lake, and Lakehouse architecture.

   • Collaborate with data scientists, analysts, and BI developers to ensure data accuracy and accessibility.

   • Implement robust data governance, security, and compliance practices using Fabric’s built-in tools.

   • Monitor and troubleshoot data workflows and performance issues.

   • Participate in code reviews, solution architecture discussions, and agile ceremonies.

   • Create and maintain documentation for data models, processes, and configurations.

Qualifications

Required:

   • Bachelor’s degree in Computer Science, Information Systems, or a related field.

   • 3+ years of experience in data engineering or a similar role.

   • Hands-on experience with the Microsoft Fabric ecosystem, including Synapse, Dataflows, Power BI, Semantic Models, Data Warehouse, and OneLake.

   • Proficiency in SQL, Python or PySpark, KQL, and DAX.

   • Strong SQL query optimization and troubleshooting skills.

   • Experience working with lakehouse architectures, Delta Lake tables, and real-time intelligence from streaming data.

   • Strong understanding of data warehousing design and best practices, ETL/ELT pipelines, and cloud data platforms (preferably Azure).

   • Familiarity with CI/CD practices in data engineering.

Preferred:

   • Microsoft Certified: Fabric Analytics Engineer Associate or similar.

   • Experience with GitHub, Azure DevOps, and other development lifecycle tools.

   • Knowledge of data governance frameworks and tools (e.g., Microsoft Purview).

   • Excellent communication and collaboration skills.

Travel and Work Requirements

   • Willingness to travel to the client's US offices once a month; the client has 18 offices in the US.

   • Ability to lift up to 50 lbs. if necessary.

   • Ability to work extended periods or hours as needed.