Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with 5 years of experience, specializing in healthcare services. Key skills include Databricks and Python, with nice-to-haves like GitHub and AWS. The position is remote, requiring a professional degree.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 5, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Nashville, TN
🧠 - Skills detailed
#Python #AWS (Amazon Web Services) #Data Engineering #GitHub #Terraform #Data Pipeline #ETL (Extract, Transform, Load) #Quality Assurance #Databricks #Data Quality #Cloud
Role description

Job Title: Data Engineer

Location: Remote

Must-have skills for this role: Databricks, Python

Years of experience: 5

Education for this position: Professional Degree

Work location: Remote

Resource has to be Local: No

What are some nice-to-have skills: GitHub, Terraform, AWS Services

Job Description

Client & Project: We are seeking new talent to join the Data & Integration team, where you will have the opportunity to collaborate on a project specializing in healthcare services.

Responsibilities: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing.

Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.

You are expected to act as a subject matter expert (SME) and collaborate with the team to deliver results.

Engage with multiple teams and contribute to key decisions.

Expert proficiency in the Databricks platform and Python is required. Advanced proficiency in GitHub and Terraform, and intermediate proficiency in AWS cloud computing, are suggested.

Develop and optimize data pipelines to enhance efficiency

Collaborate with cross-functional teams to integrate data solutions

Implement best practices for data quality assurance

Contribute to the continuous improvement of data processes