Sr. Data Engineer

This role is for a Sr. Data Engineer in Cincinnati, OH, lasting more than 6 months on a W2 contract. It requires 8+ years of experience, proficiency in SQL and Python, and expertise in AWS, Azure, or GCP, along with big data technologies such as Hadoop, Spark, and Kafka.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 20, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Cincinnati, OH
🧠 - Skills detailed
#Data Accuracy #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Hadoop #Kafka (Apache Kafka) #Data Security #Cloud #Python #Documentation #Scala #GCP (Google Cloud Platform) #Data Engineering #Data Science #SQL (Structured Query Language) #Azure #AWS (Amazon Web Services) #Data Modeling #Computer Science #Security #Data Pipeline #Big Data #Data Governance #Data Management
Role description
Job Description

Job Title: Sr. Data Engineer (10 Yrs)

Location: Cincinnati, OH

Job Type: Full-time, W2 Only

Visa Requirement: Any visa is accepted

Responsibilities:

Design, develop, and maintain scalable data pipelines and ETL processes.

Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.

Optimize and enhance data systems to improve performance and reliability.

Ensure data accuracy, integrity, and security across all data platforms.

Implement best practices for data management, including data governance and documentation.

Troubleshoot and resolve data-related issues in a timely manner.

Stay current with industry trends and technologies to continually improve data engineering practices.

Qualifications:

Bachelor's degree in Computer Science, Information Technology, or a related field.

Minimum of 8 years of experience in data engineering or a similar role.

Proficiency in SQL, Python, and data modeling.

Experience with cloud platforms such as AWS, Azure, or GCP.

Strong knowledge of big data technologies such as Hadoop, Spark, and Kafka.

Excellent problem-solving skills and attention to detail.

Ability to work independently and as part of a collaborative team.

Strong communication and documentation skills.

Skills:

Data Pipeline Development

ETL Processes

SQL and Python

Cloud Platforms (AWS, Azure, GCP)

Big Data Technologies (Hadoop, Spark, Kafka)

Data Governance

Data Security