ETL Developer

This role is for an ETL Developer; the contract length and pay rate are unspecified. Key skills include DataStage, Unix, SQL, and Control-M, with preferred experience in SAS, AWS, Python, and Java. A bachelor's degree and 2-4 years of relevant experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 15, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
McLean, VA
🧠 - Skills detailed
#Python #S3 (Amazon Simple Storage Service) #Spark (Apache Spark) #Integration Testing #AWS S3 (Amazon Simple Storage Service) #Snowflake #Data Lake #Data Engineering #Java #Athena #Scala #Cloud #Computer Science #Agile #DataStage #PySpark #Unit Testing #SAS #UAT (User Acceptance Testing) #Batch #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Unix #Automation #Databases #AWS (Amazon Web Services) #Informatica
Role description

• This is an Agile team (two-week sprints) working on data research and data changes related to moving data from on-premises to the cloud.
• The developer must have solid experience and knowledge in all the must-have skills listed below, along with unit testing, System Integration Testing (SIT), and User Acceptance Testing (UAT). Testing is completed before code goes into production.
• Experience in SAS, AWS, Python, and Java is preferred, with Python preferred over Java.
• Some AWS knowledge is preferred, as data will be moved in and out of Snowflake during the project (a rough sketch of this kind of movement follows this list).
• Experience in Informatica is not required for this role.
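
For illustration only, a batch hop of the kind described above might look like the PySpark sketch below, using the Spark-Snowflake connector. Every path, credential, and table name is a placeholder assumption; none of these details come from the posting itself.

```python
# Illustrative sketch only -- all paths, credentials, and table names are
# placeholders, not details from the posting.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("onprem-to-snowflake-batch")  # hypothetical job name
    .getOrCreate()
)

# Read a flat-file extract produced upstream (e.g., by a DataStage/Unix job).
df = spark.read.csv("/data/extracts/daily_load.csv", header=True, inferSchema=True)

# Connection options for the Spark-Snowflake connector; every value is a placeholder.
sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "<database>",
    "sfSchema": "<schema>",
    "sfWarehouse": "<warehouse>",
}

# Append the batch into a Snowflake staging table.
(
    df.write
    .format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "DAILY_LOAD_STG")
    .mode("append")
    .save()
)
```

In a setup like this, a scheduler such as Control-M would typically trigger the job, with the connector handling the staged copy into Snowflake.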

Must-haves: DataStage, Unix, SQL, and Control-M. Preferred: SAS, AWS, Python, and Java.

Responsibilities:
• Develop scalable and reliable data solutions to move data across systems from multiple sources, in both real-time and batch modes.
• Develop innovative automation solutions in data engineering, including technology implementations within an Agile framework.
• Support applications, including fixing production defects and making enhancements.
• Quickly evaluate defects, investigate root causes, and develop solutions to address them.
• Provide production support and communicate clearly on the resolution of incidents.

Qualifications:
• Bachelor’s degree in Computer Science or Engineering, or equivalent working experience.
• 2-4 years of experience delivering solutions using the Software Development Life Cycle (SDLC).
• 4 years of ETL experience using DataStage, SQL, Unix, and Control-M.
• Strong data analytical skills using complex data sources including databases and text data files.

Optional Skills:
• Data Engineering solution delivery using SAS, AWS (S3, Glue, Athena, EMR, PySpark, Data Lake), Python, Snowflake, DevOps, and CI/CD tools (see the Athena sketch after this list).
• Knowledge and experience working with Informatica.
• Knowledge and experience working with Java.
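
As a hedged example of the AWS side of that optional stack, the snippet below runs an Athena query over an S3-backed data lake with boto3. The region, database, table, and bucket names are all hypothetical, not taken from the posting.

```python
# Illustrative sketch only -- region, database, table, and bucket are hypothetical.
import boto3

athena = boto3.client("athena", region_name="us-east-1")  # assumed region

# Kick off an asynchronous query against an S3-backed data lake table.
response = athena.start_query_execution(
    QueryString="SELECT COUNT(*) FROM daily_load",    # hypothetical table
    QueryExecutionContext={"Database": "data_lake"},  # hypothetical Glue database
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)

# Athena runs queries asynchronously; poll with get_query_execution to check status.
print(response["QueryExecutionId"])
```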