

Azure Data Engineer
Work Authorization: US Citizenship REQUIRED (No exceptions and No Corp-to-Corp/C2C)
Contract: Expected to last 3-6 months, with the possibility of converting to full-time.
Remote Options: Can be remote but may require travel to the office in central Washington State. Candidates must reside in the United States.
Client’s Time Zone: Pacific Standard Time
Summary:
We are seeking a highly motivated and skilled Azure Data Engineer to join our team. You will be responsible for designing, developing, and maintaining scalable, reliable, and secure data pipelines and solutions on the Azure platform. You will collaborate with data scientists, data analysts, and other stakeholders to deliver data solutions that meet business needs.
Responsibilities:
• Design, develop, and implement ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) pipelines using Azure services like Azure Data Factory, Azure Synapse Analytics, and Azure Data Lake Storage.
• Develop and optimize data storage solutions, including Azure Blob Storage, Azure Data Lake Storage, and Azure SQL Database.
• Ensure data security and compliance with industry standards and company policies.
• Monitor and troubleshoot data pipeline performance issues, ensuring optimal performance and efficiency.
• Implement data integration solutions to connect various data sources and systems.
• Plan and execute data migration projects to Azure.
• Maintain up-to-date documentation for data processes and pipelines.
• Collaborate with stakeholders to gather requirements, understand business needs, and deliver solutions.
Required Skills and Qualifications:
• Education: Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.
• Azure Expertise: 5+ years of experience and a strong understanding of Azure services, including Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage, Azure Blob Storage, and Azure SQL Database.
• Experience designing, developing, and implementing data pipelines and data warehouses.
• Proficiency in programming languages such as SQL, Python, or Scala.
• Experience with data modeling and data warehousing techniques.
• Strong problem-solving, communication, and collaboration skills.
• Knowledge of data governance principles and best practices.
• Familiarity with data versioning tools (Delta Lake, DVC, LakeFS, etc.)
• Security and vulnerability management (package scans, remediation)
• Working knowledge of Software Development tools and practices including DevOps and CI/CD tools (e.g., Git, Jenkins, Docker, Kubernetes, etc.)
• Expertise in Distributed SQL and NoSQL Databases.
• Proficient in Performance Tuning.
• Working knowledge of REST APIs; the ability to read API documentation and make calls to read or write data is a plus.
• Azure qualifications are a must; experience with the AWS stack (EC2, CloudFormation/CFT, Route 53, RDS) is preferred.
• Clear understanding of query planning.
• Adept with version control tools such as Git and Bitbucket (formerly Stash), and build tools such as Maven.
• Expertise with Databricks, Azure Data Factory, Profisee, and Microsoft Purview.