
Data Engineer

This role is for a Data Engineer in New York, NY, on a 6-month W2 contract. Key skills include Python ETL, Azure services, data governance, and financial services experience. Hybrid work requires 3 days onsite. Expected duration exceeds 6 months.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 15, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
New York, NY
🧠 - Skills detailed
#Python #Azure Databricks #GDPR (General Data Protection Regulation) #Data Quality #Databricks #Data Governance #Quality Assurance #Data Engineering #Azure cloud #Data Integrity #Scala #Cloud #Azure Data Factory #Storage #Agile #Programming #Data Storage #Azure #Batch #Monitoring #Data Transformations #ETL (Extract, Transform, Load) #Microsoft Azure #Data Manipulation #Compliance #Azure Virtual Machines #ADF (Azure Data Factory) #Data Pipeline
Role description

Dice is the leading career destination for tech experts at every stage of their careers. Our client, Vedic Staffing, is seeking the following. Apply via Dice today!

Job Title: Data Engineer
Job Location: New York, NY
Work Mode: Hybrid (3 days onsite)
Duration: 6-month contract on W2 tax terms

Job Description:
• Python ETL: The ideal candidate should have strong proficiency in Python programming language and experience in Extract, Transform, Load (ETL) processes. They should be able to design and develop efficient ETL workflows to extract data from various sources, transform it as per business requirements, and load it into target systems.
• Azure Experience: The candidate should have hands-on experience working with Microsoft Azure cloud platform. They should be familiar with Azure services and tools, and have a good understanding of Azure architecture and best practices.
• Azure Data Factory, Databricks, Azure Storage, and Azure VM: The candidate should have practical experience with Azure Data Factory, Databricks, Azure Storage, and Azure Virtual Machines. They should be able to design and implement data pipelines using Azure Data Factory, perform data transformations and analytics using Databricks, and manage data storage and virtual machines in Azure.
• Data Governance, Data Quality, and Controls: The candidate should have a strong understanding of data governance principles, data quality management, and data controls. They should be able to implement data governance frameworks, establish data quality standards, and ensure compliance with data regulations and policies.
• Implementing alerts and notifications for batch jobs: The candidate should have experience in setting up alerts and notifications for batch jobs. They should be able to configure monitoring and alerting mechanisms to ensure timely identification and resolution of issues in batch job execution.
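The Python ETL responsibility above can be sketched in a few lines. This is a minimal, illustrative example only: the `extract`/`transform`/`load` functions and the sample records are hypothetical stand-ins for whatever sources and target systems the role actually uses, not anything specified in the posting.

```python
# Minimal ETL sketch (illustrative only; function and field names are assumptions).

def extract(source):
    """Pull raw records from a source; a list here stands in for an API or file."""
    return list(source)

def transform(rows):
    """Normalize fields and drop records that fail a basic validity check."""
    cleaned = []
    for row in rows:
        if row.get("amount") is None:
            continue  # skip incomplete records
        cleaned.append({"ticker": row["ticker"].upper(), "amount": float(row["amount"])})
    return cleaned

def load(rows, target):
    """Append transformed rows to the target; a list stands in for a database table."""
    target.extend(rows)
    return len(rows)

raw = [{"ticker": "msft", "amount": "120.5"}, {"ticker": "aapl", "amount": None}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse)  # 1 valid record survives the validity check
```

In a real pipeline each stage would typically be an Azure Data Factory activity or a Databricks job, but the extract-transform-load shape stays the same.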

Must have:
• Proficiency in Python: Strong experience in Python programming, particularly in developing ETL processes and data manipulation.
• Experience with Azure Cloud: Hands-on knowledge of Microsoft Azure services, including Azure Data Factory, Azure Databricks, Azure Storage, and Azure Virtual Machines.
• ETL Development: Ability to design and implement efficient ETL workflows to extract, transform, and load data from various sources.
• Data Governance and Compliance: Understanding of data governance principles, data quality management, and experience ensuring compliance with data privacy regulations (e.g., GDPR, CCPA).
• Data Pipeline Design: Experience in architecting scalable and flexible data pipelines, with a focus on performance optimization.
• Data Quality Assurance: Skills in implementing data quality checks and monitoring to maintain data integrity throughout the pipeline.
• Financial Services Experience: Previous experience working in the banking or financial services industry, understanding specific data handling and compliance challenges.
• Agile Methodologies: Ability to work effectively in a cross-functional Agile team, adapting to fast-paced project requirements.
• Communication Skills: Strong collaboration skills to work with stakeholders and teams, translating business requirements into technical solutions.
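The "Data Quality Assurance" requirement above amounts to running checks over pipeline output and reporting failures. A minimal sketch, assuming hypothetical helpers (`check_not_null`, `check_range`) and sample data that are not part of the posting:

```python
# Minimal data-quality-check sketch (helper names and thresholds are assumptions).

def check_not_null(rows, field):
    """Return indices of rows where a required field is missing."""
    return [i for i, row in enumerate(rows) if row.get(field) is None]

def check_range(rows, field, lo, hi):
    """Return indices of rows where a numeric field falls outside [lo, hi]."""
    return [i for i, row in enumerate(rows)
            if row.get(field) is not None and not (lo <= row[field] <= hi)]

rows = [{"amount": 50.0}, {"amount": None}, {"amount": -3.0}]
failures = {
    "amount_not_null": check_not_null(rows, "amount"),
    "amount_in_range": check_range(rows, "amount", 0.0, 1_000_000.0),
}
print(failures)  # {'amount_not_null': [1], 'amount_in_range': [2]}
```

In production these checks would feed the batch-job alerting described earlier, e.g. raising a notification whenever any check returns a non-empty failure list.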