
ETL Developer

This role is for an ETL Developer with a contract length of "unknown," offering a pay rate of "$X/hour." Key skills include AWS services, Python, Snowflake, and data integration. A bachelor's degree and 5 years of relevant experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 20, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Arizona, United States
🧠 - Skills detailed
#Data Architecture #Oracle #Agile #ETL (Extract, Transform, Load) #Spark (Apache Spark) #XML (eXtensible Markup Language) #Java #SQL Server #Lambda (AWS Lambda) #Python #Data Warehouse #Data Ingestion #NoSQL #S3 (Amazon Simple Storage Service) #Snowflake #Informatica #Statistics #ML (Machine Learning) #Microservices #Data Engineering #Database Architecture #AWS Glue #Web Services #Databases #SQL (Structured Query Language) #AWS (Amazon Web Services) #Data Modeling #JSON (JavaScript Object Notation) #Athena #Computer Science #Data Pipeline #BI (Business Intelligence) #JavaScript #GitHub #Aurora #AWS S3 (Amazon Simple Storage Service) #Big Data #Data Lake #dbt (data build tool) #GIT #Data Extraction #AI (Artificial Intelligence) #Data Integration
Role description

Summary:

The ETL Developer/Data Engineer is a hands-on technical role focused on full-stack software development within the Enterprise Data organization. The role will play a crucial part in shaping future big data and analytics initiatives.

Responsibilities:
• Designs and develops code and data pipelines to ingest data from relational databases (Oracle, SQL Server, DB2, Aurora), file shares, and web services.
• Builds a data lake on AWS S3, optimizing performance by partitioning and compressing data.
• Performs data engineering and analytics using AWS Glue, Informatica, EMR, Spark, Athena, and Python.
• Performs data modeling and builds a data warehouse using Snowflake.
• Participates in requirements definition, system architecture design, and data architecture design.
• Participates in all aspects of the software life cycle using Agile development methodologies.
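The data-lake responsibility above centers on partitioning and compressing data landed in S3. As an illustration only (the bucket name, record fields, and JSON Lines format here are hypothetical; a production pipeline would more likely use Spark or AWS Glue to write partitioned Parquet), the Hive-style `key=value` prefix layout that S3 data lakes and Athena rely on can be sketched in plain Python:

```python
import gzip
import json
from collections import defaultdict

def partition_records(records, partition_keys):
    """Group records into Hive-style partition prefixes (e.g. year=2025/month=2)."""
    partitions = defaultdict(list)
    for rec in records:
        prefix = "/".join(f"{k}={rec[k]}" for k in partition_keys)
        partitions[prefix].append(rec)
    return dict(partitions)

def write_partition(base_prefix, partition_prefix, records):
    """Serialize one partition as gzip-compressed JSON Lines; return (object key, payload)."""
    key = f"{base_prefix}/{partition_prefix}/part-00000.json.gz"
    payload = gzip.compress(
        "\n".join(json.dumps(r) for r in records).encode("utf-8")
    )
    return key, payload

# Hypothetical example: partition order events by year and month before upload.
events = [
    {"order_id": 1, "year": 2025, "month": 2, "amount": 42.0},
    {"order_id": 2, "year": 2025, "month": 2, "amount": 13.5},
    {"order_id": 3, "year": 2025, "month": 1, "amount": 99.9},
]
for prefix, recs in partition_records(events, ["year", "month"]).items():
    key, payload = write_partition("s3://my-data-lake/orders", prefix, recs)
    # In a real pipeline, `payload` would be uploaded to S3 under `key`.
    print(key, len(recs))
```

Laying objects out this way lets query engines such as Athena prune partitions by prefix, and compressing each object reduces both storage and scan costs.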

Minimum Qualifications:
• Bachelor’s degree in Computer Science, Computer Information Systems, Engineering, Statistics, or a closely related field (foreign education equivalent accepted) (required).
• Experience in AWS services for data and analytics (required).
• 5 years of experience in Data Ingestion, Data Extraction, and Data Integration (required).

Preferred Qualifications:
• 7+ years of experience in Enterprise Information Solution architecture, design, and development.
• 7+ years of experience with integration architectures such as SOA, microservices, ETL, or other integration technologies.
• 7+ years of experience working with content or knowledge management systems, search engines, relational databases, NoSQL databases, ETL tools, geospatial systems, or semantic technology.
• 5+ years of hands-on experience with AWS services (S3, Kinesis, Lambda, Athena, Glue, EMR).
• 5+ years of experience with Snowflake, dbt, and Denodo.
• Experience with JSON or XML data modeling.
• Experience with Git/GitHub, branching, and other modern source code management practices.
• Domain knowledge of NoSQL or relational databases.
• Understanding of database architecture and its performance implications.
• Experience integrating Business Intelligence applications like PowerBI.
• Experience with Machine Learning and Artificial Intelligence.
• Ability to multi-task effectively.
• Ability to work collaboratively as part of an Agile Team.
• Extensive knowledge of and experience with Python, JavaScript, and Java.
• Excellent written and verbal communication skills, sense of ownership, urgency and drive.