
ETL Lead Developer

This role is for an ETL Lead Developer in Hartford, CT, on a 6-month W2 contract. It requires 10+ years of ETL experience and proficiency in AbInitio, SQL, and data warehousing. Experience with Talend, Databricks, and Python is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 16, 2025
🕒 - Project duration
6 months
🏝️ - Location type
On-site
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Hartford, CT
🧠 - Skills detailed
#Data Pipeline #Talend #ETL (Extract, Transform, Load) #Data Analysis #Data Engineering #Spark (Apache Spark) #Data Integration #SQL (Structured Query Language) #PySpark #Migration #Databricks #Python #Data Processing
Role description

Job: Application Developer (ETL Lead Developer)

Position: W2

Location: Hartford, CT 06103

Duration: 6 Months

Visa: H4 EAD

JD:

The candidate should be a seasoned ETL lead developer with at least 10+ years of diverse experience implementing efficient ETL data solutions that meet non-functional requirements (NFRs).

They should be self-driven in proposing and implementing fit-for-purpose ETL solutions that can meet future application demands, designing and implementing ETL processes, and contributing to the development of data warehousing solutions. They should also have experience leading a team in an onshore-offshore model.

Prior experience driving or executing ETL platform migrations would also be beneficial.

Good hands-on experience with AbInitio and with implementing data warehousing/ETL solutions is required.

Expert-level proficiency in SQL is recommended. Prior ETL development experience with other ETL tools and a clear grasp of data warehousing (DW) concepts are also must-have skills.

Any experience with Talend, Databricks, or Python would be good to have.

Develop and implement data processing solutions using PySpark (see the sketch below)

Collaborate with data engineers to optimize data pipelines

Utilize Databricks Platform for data analysis and processing

Design and implement ETL processes for data integration

Contribute to the development of data warehousing solutions
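
To make the PySpark and Databricks responsibilities above concrete, here is a minimal sketch of the kind of pipeline involved: extract raw files from a landing zone, apply basic cleansing, aggregate, and load a curated table. The file paths, column names (order_id, order_ts, amount), and the daily-revenue aggregation are illustrative assumptions, not details from this posting; on Databricks the same logic would typically run in a notebook or job where the SparkSession is already provided.

```python
# Minimal PySpark ETL sketch (illustrative only; paths and columns are assumed).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("etl-sketch")  # hypothetical application name
    .getOrCreate()
)

# Extract: read raw orders from a CSV landing zone (hypothetical path).
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("/landing/orders/*.csv")
)

# Transform: deduplicate, drop null amounts, derive an order date.
cleaned = (
    raw.dropDuplicates(["order_id"])                  # assumed key column
    .filter(F.col("amount").isNotNull())
    .withColumn("order_date", F.to_date("order_ts"))  # assumed timestamp column
)

# Aggregate into a daily revenue summary.
daily_revenue = (
    cleaned.groupBy("order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.countDistinct("order_id").alias("order_count"),
    )
)

# Load: write the curated table as Parquet, partitioned by date.
(
    daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("/warehouse/daily_revenue")
)

spark.stop()
```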

Must-have skills and years of experience:

  1. AbInitio

  2. Data warehousing

  3. SQL

Nice-to-have skills and years of experience:

  1. Talend

  2. ETL Experience

  3. Databricks or Python

Thanks & Regards,

Mounika.