Data Engineer

This role is for a Data Engineer with a contract length of "unknown," offering a pay rate of "$XX/hour." Required skills include advanced SQL/PLSQL, Python, Unix scripting, and AWS experience. A Bachelor's or Master's degree in a technology field and 5+ years of relevant experience are mandatory.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 8, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Smithfield, RI
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Data Analysis #GIT #Scrum #DevOps #Data Engineering #Data Modeling #Batch #Data Vault #Scripting #Artifactory #Kanban #Informatica #Computer Science #Automation #AWS (Amazon Web Services) #Cloud #SQL (Structured Query Language) #Oracle #Snowflake #Vault #Deployment #Python #Unix #Jenkins #Agile
Role description

Must haves:
• Advanced SQL/PLSQL development experience to implement business logic that fetches data from Oracle to generate reports and datafiles.
• Python used for data movement and automation (see the sketch after this list).
• Experience with Unix scripting and scheduling tools
• AWS
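
To illustrate the kind of work the must-haves describe, here is a minimal Python sketch that fetches rows from Oracle and writes a delimited datafile. It assumes the python-oracledb driver is available; the connection details, table, and column names are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch: pull rows from an Oracle table and write a pipe-delimited
# datafile, the kind of report/datafile generation described above.
# Connection details, table, and column names are hypothetical placeholders.
import csv
import oracledb  # python-oracledb driver (assumed available)

def export_daily_positions(out_path: str) -> int:
    """Fetch yesterday's rows and write them to a delimited datafile."""
    with oracledb.connect(user="report_user", password="***",
                          dsn="dbhost/ORCLPDB1") as conn:
        with conn.cursor() as cur:
            cur.execute(
                """
                SELECT account_id, position_date, market_value
                FROM positions
                WHERE position_date = TRUNC(SYSDATE) - 1
                """
            )
            rows = cur.fetchall()

    with open(out_path, "w", newline="") as fh:
        writer = csv.writer(fh, delimiter="|")
        writer.writerow(["account_id", "position_date", "market_value"])
        writer.writerows(rows)
    return len(rows)

if __name__ == "__main__":
    print(f"wrote {export_daily_positions('positions.dat')} rows")
```

In practice a script like this would be driven by the Unix scheduling tools mentioned above (e.g. cron or a job scheduler) rather than run by hand.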

Nice to haves:
• AWS batch
• CI/CD tools such as Jenkins, Git, and Artifactory
• ETL tools (data transformation has been done with Informatica but is moving to a Python-based approach; see the sketch after this list)
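
To illustrate the Informatica-to-Python migration noted above, here is a minimal sketch of a mapping-style transform re-expressed in pandas. The file layout, column names, and business rules are hypothetical examples, not taken from this posting.

```python
# Minimal sketch of an Informatica-style mapping re-expressed as a Python
# transform. File names, columns, and rules are hypothetical placeholders.
import pandas as pd

def transform_trades(src_path: str, dst_path: str) -> None:
    """Read a delimited source file, apply mapping logic, write the target file."""
    df = pd.read_csv(src_path, delimiter="|")
    # Expression and filter logic an Informatica mapping might have held:
    df["trade_date"] = pd.to_datetime(df["trade_date"]).dt.date
    df["net_amount"] = df["quantity"] * df["price"] - df["fees"]
    df = df[df["status"] == "SETTLED"]
    df.to_csv(dst_path, sep="|", index=False)
```

A transform like this would typically be packaged and promoted through the CI/CD tools listed above (Jenkins, Git, Artifactory) rather than deployed manually.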

The Expertise and Skills You Bring
• Bachelor’s or Master’s degree in a technology-related field (e.g., Engineering, Computer Science) required, with 5+ years of experience.
• Advanced SQL/PLSQL knowledge.
• 2+ years of Python development.
• Experience with Unix scripting and scheduling tools.
• Solid experience working with AWS or similar cloud technologies.
• Strong data modeling skills with either Dimensional or Data Vault models.
• Proven data analysis skills.
• Experience working with the Snowflake database is a plus.
• Hands-on experience with SQL query optimization and tuning to improve performance is desirable (see the sketch after this list).
• Hands-on experience with an ETL tool like Informatica is desirable.
• Working experience with some or all of the following: AWS, containerization, and associated build and deployment CI/CD pipelines.
• Experience with DevOps, Continuous Integration and Continuous Delivery (Jenkins, Stash, Concourse, Artifactory).
• Experience in Agile methodologies (Kanban and Scrum) is a plus.
• Proven ability to handle ambiguity and work in a fast-paced environment.
• Good interpersonal skills to work with multiple teams in the organization.
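
As a sketch of the SQL query optimization and tuning point above, the following captures an Oracle execution plan for a candidate query via DBMS_XPLAN, so a before/after rewrite or index change can be compared. The query text and connection details are hypothetical placeholders.

```python
# Minimal sketch: print the Oracle execution plan for a candidate query.
# Query, table, and connection details are hypothetical placeholders.
import oracledb

CANDIDATE_SQL = """
SELECT account_id, SUM(market_value)
FROM positions
WHERE position_date >= TRUNC(SYSDATE) - 30
GROUP BY account_id
"""

def show_plan(conn: oracledb.Connection, sql: str) -> None:
    """Run EXPLAIN PLAN for the statement and print the formatted plan."""
    with conn.cursor() as cur:
        cur.execute("EXPLAIN PLAN FOR " + sql)
        cur.execute(
            "SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())"
        )
        for (line,) in cur:
            print(line)

if __name__ == "__main__":
    with oracledb.connect(user="report_user", password="***",
                          dsn="dbhost/ORCLPDB1") as conn:
        show_plan(conn, CANDIDATE_SQL)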