

ETL Developer
Summary:
The ETL Developer/Data Engineer is a hands-on technical role focused on full-stack software development within the Enterprise Data organization. The role will play a crucial part in shaping future big data and analytics initiatives.
Responsibilities:
• Designs and develops code and data pipelines to ingest from relational databases (Oracle, SQL Server, DB2, Aurora), file shares, and web services.
• Builds a Data Lake on AWS S3 with optimal performance by partitioning and compressing data.
• Performs data engineering and analytics using AWS Glue, Informatica, EMR, Spark, Athena, and Python.
• Performs data modeling and builds the Data Warehouse using Snowflake.
• Participates in requirements definition, system architecture design, and data architecture design.
• Participates in all aspects of the software life cycle using Agile development methodologies.
Minimum Qualifications:
• Bachelor’s degree in Computer Science, Computer Information Systems, Engineering, Statistics, or a closely related field (willing to accept foreign education equivalent) (required).
• Experience with AWS services for data and analytics (required).
• 5 years of experience in Data Ingestion, Data Extraction, and Data Integration (required).
Preferred Qualifications:
• 7+ years of experience in Enterprise Information Solution Architecture, Design, and Development.
• 7+ years of experience with integration architectures such as SOA, Microservices, ETL or other integration technologies.
• 7+ years of experience working with content or knowledge management systems, search engines, relational databases, NoSQL databases, ETL tools, geospatial systems, or semantic technology.
• 5+ years of hands-on experience with AWS services (S3, Kinesis, Lambda, Athena, Glue, EMR).
• 5+ years of experience with Snowflake, DBT, and Denodo.
• Experience with JSON or XML data modeling.
• Experience with Git/GitHub, branching, and other modern source code management methodologies.
• Domain knowledge of NoSQL or relational databases.
• Understanding of database architecture and its performance implications.
• Experience integrating Business Intelligence applications such as Power BI.
• Experience with Machine Learning and Artificial Intelligence.
• Ability to multi-task effectively.
• Ability to work collaboratively as part of an Agile Team.
• Extensive knowledge of and experience with Python, JavaScript, and Java.
• Excellent written and verbal communication skills, sense of ownership, urgency and drive.