Azure Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Engineer with a contract of 3-6 months, offering remote work with potential travel to Washington State. Requires 5+ years of Azure experience, strong SQL/Python skills, and familiarity with data governance and integration. US citizenship is mandatory.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 3, 2025
🕒 - Project duration
3 to 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Azure Data Factory #ETL (Extract, Transform, Load) #Azure SQL Database #Compliance #NoSQL #Azure SQL #Vulnerability Management #Computer Science #ADLS (Azure Data Lake Storage) #Data Security #Data Lake #Delta Lake #Databricks #Data Modeling #EC2 #Azure Synapse Analytics #AWS (Amazon Web Services) #Azure #REST (Representational State Transfer) #Maven #Python #Synapse #Security #Documentation #RDS (Amazon Relational Database Service) #Data Warehouse #Scala #Azure ADLS (Azure Data Lake Storage) #Jenkins #Programming #Data Analysis #Databases #AWS EC2 (Amazon Elastic Compute Cloud) #DevOps #Data Pipeline #API (Application Programming Interface) #Docker #Azure Blob Storage #GIT #Migration #Version Control #Data Science #REST API #SQL (Structured Query Language) #Data Integration #Data Storage #Data Governance #ADF (Azure Data Factory) #Kubernetes #Data Migration #Data Engineering #Storage
Role description

Azure Data Engineer

Work Authorization: US Citizenship REQUIRED (No exceptions and No Corp-to-Corp/C2C)

Contract: Expected to last 3-6 months, with the possibility of converting to full-time.

Remote Options: Can be remote but may require travel to the office in central Washington State. Candidates must reside in the United States.

Client’s Time Zone: Pacific Standard Time

Summary:

We are seeking a highly motivated and skilled Azure Data Engineer to join our team. You will be responsible for designing, developing, and maintaining scalable, reliable, and secure data pipelines and solutions on the Azure platform. You will collaborate with data scientists, data analysts, and other stakeholders to deliver data solutions that meet business needs.

Responsibilities:

   • Design, develop, and implement ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) pipelines using Azure services like Azure Data Factory, Azure Synapse Analytics, and Azure Data Lake Storage (a minimal pipeline sketch follows this list).

   • Develop and optimize data storage solutions, including Azure Blob Storage, Azure Data Lake Storage, and Azure SQL Database.

   • Ensure data security and compliance with industry standards and company policies.

   • Monitor and troubleshoot data pipeline performance issues, ensuring optimal performance and efficiency.

   • Implement data integration solutions to connect various data sources and systems.

   • Plan and execute data migration projects to Azure.

   • Maintain up-to-date documentation for data processes and pipelines.

   • Collaborate with stakeholders to gather requirements, understand business needs, and deliver solutions.
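
To illustrate the kind of pipeline work described in the first responsibility above, here is a minimal, hedged PySpark sketch: it reads raw CSV files from Azure Data Lake Storage, applies a simple cleanup, and writes a Delta table. The storage account, container names, and column names (examplelake, raw, curated, order_id, order_date, amount) are placeholders invented for illustration, not details from this posting, and the snippet assumes a Databricks-style environment where the Delta format and ADLS credentials are already configured.

    from pyspark.sql import SparkSession, functions as F

    # Placeholder paths: swap in the real storage account and containers.
    RAW_PATH = "abfss://raw@examplelake.dfs.core.windows.net/orders/"
    CURATED_PATH = "abfss://curated@examplelake.dfs.core.windows.net/orders_clean/"

    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # Extract: read raw CSV files from the data lake.
    raw = spark.read.option("header", "true").csv(RAW_PATH)

    # Transform: deduplicate, parse dates, and drop rows with missing amounts.
    cleaned = (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_date", F.to_date("order_date"))
           .filter(F.col("amount").isNotNull())
    )

    # Load: write the curated data as a Delta table.
    cleaned.write.format("delta").mode("overwrite").save(CURATED_PATH)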

Required Skills and Qualifications:

   • Education: Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.

   • Azure Expertise: 5+ years of experience with a strong understanding of the following:

   • Azure services, including Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage, Azure Blob Storage, and Azure SQL Database.

   • Experience designing, developing, and implementing data pipelines and data warehouses.

   • Proficiency in programming languages such as SQL, Python, or Scala.

   • Experience with data modeling and data warehousing techniques.

   • Strong problem-solving, communication, and collaboration skills.

   • Knowledge of data governance principles and best practices.

   • Familiarity with data versioning tools (Delta Lake, DVC, LakeFS, etc.)

   • Security and vulnerability management (package scans, remediation)

   • Working knowledge of Software Development tools and practices including DevOps and CI/CD tools (e.g., Git, Jenkins, Docker, Kubernetes, etc.)

   • Expertise in Distributed SQL and NoSQL Databases.

   • Proficient in Performance Tuning.

   • Knowledge of REST APIs: the ability to read API documentation and make calls to read or write data is a plus (see the sketch after this list).

   • Azure qualifications are a must; experience with the AWS stack (EC2, CFT, Route 53, RDS) is preferred.

   • Clear understanding of query planning

   • Adept with version control tools such as Git and Stash, and build tools such as Maven.

   • Expertise with Databricks, Data Factory, Profisee, and Purview.
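
For the REST API point above, the following is a hypothetical Python sketch of reading and writing data through a JSON API with the requests library. The base URL, bearer token, endpoint, and payload fields are placeholders invented for illustration; they are not part of this role.

    import requests

    BASE_URL = "https://api.example.com/v1"        # placeholder endpoint
    HEADERS = {"Authorization": "Bearer <token>"}  # placeholder credentials

    # Read: fetch a page of records and decode the JSON body.
    resp = requests.get(f"{BASE_URL}/customers", headers=HEADERS,
                        params={"page": 1}, timeout=30)
    resp.raise_for_status()
    customers = resp.json()

    # Write: create a new record by POSTing a JSON payload.
    payload = {"name": "Example Co", "region": "US-West"}
    resp = requests.post(f"{BASE_URL}/customers", headers=HEADERS,
                         json=payload, timeout=30)
    resp.raise_for_status()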