Senior GCP Data Engineer

This role is for a Senior GCP Data Engineer in Bentonville, AR, for 12 months at a competitive pay rate. Requires 11 years of IT experience, 5+ years in GCP, and retail domain expertise. Key skills include Java, Kafka, and Kubernetes.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 18, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Bentonville, AR
🧠 - Skills detailed
#Jenkins #Big Data #C++ #Jira #Kafka (Apache Kafka) #Spring Boot #Kubernetes #Spark (Apache Spark) #Java #Python #Data Engineering #Computer Science #Airflow #BigQuery #Data Pipeline #GCP (Google Cloud Platform) #BitBucket #Scala #Apache Airflow #Docker #Physical Data Model
Role description

Job Description: Senior GCP Data Engineer

Location: Bentonville, AR (Onsite from Day 1)

Contract Duration: 12 months

Experience Required: 11 years

Must-Have: Retail domain experience

Key Responsibilities:

Design and develop big data applications using Java, Spring Boot, and C++.

Build and optimize data pipelines using Kafka, Hive, and Spark.

Develop logical and physical data models for big data platforms.

Automate workflows using Apache Airflow.

Manage CI/CD using Jenkins, Bamboo, or TFS.

Utilize Gitflow for source code management.

Deploy applications using Kubernetes and Docker.

Work with Atlassian tools (JIRA, BitBucket, Confluence).

Provide on-call support and system enhancements.

Mentor junior engineers and lead technical discussions.

Required Skills & Experience:

11 years in IT with 5+ years in GCP.

Expertise in Java, Spring Boot, C++, Python, and Scala.

Strong knowledge of Kafka, Kubernetes, Jenkins, and Docker.

Hands-on experience with GCP Dataproc, GCS, and BigQuery.

Bachelor's degree in Computer Science or equivalent.