
Azure Data Engineer- W2 Only

This role is for an Azure Data Engineer on a W2 contract of unknown length. Candidates must have strong Azure Synapse Analytics and Python skills, with experience in data manipulation, ETL processes, and Azure services. The position is hybrid.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 14, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Montvale, NJ
🧠 - Skills detailed
#Data Manipulation #Azure Synapse Analytics #ADF (Azure Data Factory) #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Data Security #Synapse #AI (Artificial Intelligence) #Azure SQL Database #Python #Metadata #Scripting #Data Management #Storage #Compliance #Data Engineering #Data Quality #API (Application Programming Interface) #Automation #Data Modeling #Azure Data Factory #Azure SQL #Libraries #Azure #Security #Data Extraction
Role description

Data Engineer III

Job ID # 81627

Rate type: W2 only

Please note the selected candidate will be expected to report to the Montvale NJ office 2x per week.
• At least ONE interview will be conducted IN PERSON in the Montvale office
• Proven experience as a Data Engineer, with a strong focus on Azure Synapse Analytics
• Proficiency in Python for data manipulation, scripting, and automation
• Expertise in Azure services, including Azure Data Factory, Azure SQL Database, and Azure Storage
• Hands-on experience with Azure AI Document Intelligence for document processing and data extraction
• Create, manage, and optimize ETL processes to ensure efficient data flow and high data quality
• Strong understanding of data warehousing concepts, ETL processes, and data modeling techniques
• Monitor and troubleshoot data systems, ensuring optimal performance and reliability
• Implement best practices for data security, metadata management, and compliance with organizational policies
• Excellent problem-solving skills and ability to work in a fast-paced, collaborative environment
• Effective communication and interpersonal skills

Python: more than 3 years' experience, including libraries for extracting data from PDF and MS Office files. Azure Synapse: experience creating Python pipeline jobs and clusters. Good knowledge of API integration.
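As a rough illustration of the MS Office data-extraction skill the posting asks for, here is a minimal standard-library sketch: a .docx file is a ZIP archive whose body text lives in `word/document.xml`, so text runs (`<w:t>` elements) can be pulled out without third-party libraries. The function name and approach are illustrative assumptions, not part of the job description; production work would more likely use a dedicated library or Azure AI Document Intelligence.

```python
import io
import zipfile
import xml.etree.ElementTree as ET

# WordprocessingML namespace used by <w:t> text runs in document.xml.
W_NS = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

def extract_docx_text(docx_bytes: bytes) -> str:
    """Return the concatenated text of all <w:t> runs in a .docx payload."""
    # A .docx is a ZIP container; the main body is word/document.xml.
    with zipfile.ZipFile(io.BytesIO(docx_bytes)) as zf:
        xml_data = zf.read("word/document.xml")
    root = ET.fromstring(xml_data)
    # Walk every text run and join their contents in document order.
    return "".join(node.text or "" for node in root.iter(f"{W_NS}t"))
```

PDF extraction follows the same pattern conceptually but needs a third-party parser (e.g. pypdf), since the PDF format is not a ZIP-of-XML container.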