

Data Engineer
Must-haves:
• Advanced SQL/PLSQL development experience to implement business logic that fetches data from Oracle to generate reports and data files.
• Python for data movement and automation.
• Experience with Unix scripting and scheduling tools.
• Experience with AWS.
Nice-to-have skills:
• AWS Batch.
• CI/CD tools such as Jenkins, Git, Artifactory.
• ETL tools (data transformation is currently done with Informatica but is moving to a Python-based approach).
The Expertise and Skills You Bring
• Bachelor’s or Master’s Degree in a technology-related field (e.g., Engineering, Computer Science) required, with 5+ years of experience.
• Advanced SQL/PLSQL knowledge.
• 2+ years of Python development.
• Experience with Unix scripting and scheduling tools.
• Solid experience working with AWS or similar cloud technologies.
• Strong data modeling skills with either Dimensional or Data Vault models.
• Proven data analysis skills.
• Experience working with the Snowflake database is a plus.
• Hands-on experience with SQL query optimization and tuning to improve performance is desirable.
• Hands-on experience with an ETL tool such as Informatica is desirable.
• Working experience with some or all of the following: AWS, containerization, and associated build and deployment CI/CD pipelines.
• Experience with DevOps, Continuous Integration and Continuous Delivery (Jenkins, Stash, Concourse, Artifactory).
• Experience with Agile methodologies (Kanban and Scrum) is a plus.
• Proven track record of handling ambiguity and working in a fast-paced environment.
• Good interpersonal skills to work with multiple teams in the organization.