
Sr. GCP Data Engineer

This role is for a Sr. GCP Data Engineer on a 12-month contract in Bentonville, AR / Sunnyvale, CA, offering $55.00 - $61.00 per hour. Requires 11+ years of experience, 4+ years in GCP, and expertise in data pipelines, Java, and Python.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 11, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Sunnyvale, CA 94086
🧠 - Skills detailed
#BigQuery #Jira #Spark (Apache Spark) #Agile #Data Engineering #Apache Kafka #Computer Science #Physical Data Model #Apache Spark #Data Pipeline #BitBucket #Programming #GCP (Google Cloud Platform) #Perl #Apache Airflow #Jenkins #Python #Data Lake #Apache Hive #Automated Testing #Data Processing #Kubernetes #Scrum #Java #Big Data #Scripting #Airflow #Kafka (Apache Kafka) #Scala #RDBMS (Relational Database Management System) #Hadoop
Role description

Title: GCP Data Engineer
Location: Bentonville, AR / Sunnyvale, CA
Terms: 12 Months contract
Must-Have Skills:

GCP data engineering, Java, Python, Scala, Jenkins, Kubernetes

Requirements:
GCP Experience
11+ years of hands-on development experience

4+ years of recent GCP experience
Experience building data pipelines in GCP
Experience with GCP Dataproc, GCS, and BigQuery
6+ years of hands-on experience developing distributed data processing platforms with Hadoop, Hive, or Spark, plus Airflow or another workflow orchestration solution
5+ years of hands-on experience modeling and designing schemas for data lakes or RDBMS platforms.
Experience with programming languages: Python, Java, Scala, etc.
Experience with scripting languages: Perl, Shell, etc.
Experience working with, processing, and managing large data sets (multi-TB/PB scale).
Exposure to test-driven development and automated testing frameworks.
Background in Scrum/Agile development methodologies.
Capable of delivering on multiple competing priorities with little supervision.
Excellent verbal and written communication skills.
Bachelor's degree in Computer Science, or equivalent experience.
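To give a flavor of the pipeline-building requirement above, here is a minimal, illustrative extract-transform-load sketch in pure Python. It is not this team's code: a production GCP pipeline would run as a Dataproc (Spark) job reading from GCS and loading into BigQuery, and every name below is hypothetical.

```python
# Illustrative ETL sketch only; stage names and fields are hypothetical.
# In a real GCP pipeline, extract would read Parquet/Avro from GCS and
# load would write to BigQuery via a load job.

def extract(rows):
    """Yield raw records (stand-in for reading source files from GCS)."""
    yield from rows

def transform(records):
    """Drop records with missing amounts and normalize the amount field."""
    for r in records:
        if r.get("amount") is not None:
            yield {"id": r["id"], "amount": round(float(r["amount"]), 2)}

def load(records):
    """Materialize results (stand-in for a BigQuery load job)."""
    return list(records)

def run_pipeline(rows):
    # Chaining generators keeps memory flat, which is the same streaming
    # discipline used when processing multi-TB data sets stage by stage.
    return load(transform(extract(rows)))

if __name__ == "__main__":
    sample = [{"id": 1, "amount": "19.99"}, {"id": 2, "amount": None}]
    print(run_pipeline(sample))  # [{'id': 1, 'amount': 19.99}]
```

The generator chaining is the point of the sketch: each stage consumes the previous one lazily, mirroring how distributed engines stream records between stages rather than buffering whole data sets.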

The most successful candidates will also have experience in the following:

Gitflow
Atlassian products – BitBucket, JIRA, Confluence etc.
Continuous Integration tools such as Bamboo, Jenkins, or TFS

Responsibilities:

As a Senior Data Engineer, you will:
Design and develop big data applications using the latest open source technologies.
Work within an offshore delivery model and managed-outcome engagements (desired).
Develop logical and physical data models for big data platforms.
Automate workflows using Apache Airflow.
Create data pipelines using Apache Hive, Apache Spark, and Apache Kafka.
Provide ongoing maintenance and enhancements to existing systems and participate in rotational on-call support.
Learn our business domain and technology infrastructure quickly and share your knowledge freely and actively with others in the team.
Mentor junior engineers on the team
Lead daily standups and design reviews
Groom and prioritize backlog using JIRA
Act as the point of contact for your assigned business domain
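The "automate workflows using Apache Airflow" responsibility boils down to declaring tasks and their dependencies, then executing them in dependency order. The toy sketch below illustrates that pattern with only the standard library; a real Airflow DAG would use airflow.DAG and operators instead, and the task names here are made up.

```python
# Toy workflow runner illustrating the DAG pattern behind Airflow.
# Not Airflow itself: a real DAG uses airflow.DAG, operators, and a
# scheduler. Task names and dependencies here are hypothetical.
from graphlib import TopologicalSorter

def run_dag(tasks, deps):
    """Run callables in dependency order.

    tasks: mapping of task name -> zero-arg callable
    deps:  mapping of task name -> set of upstream task names
    Returns the execution order.
    """
    order = list(TopologicalSorter(deps).static_order())
    for name in order:
        tasks[name]()  # each task runs only after its upstreams
    return order

if __name__ == "__main__":
    log = []
    tasks = {
        "extract": lambda: log.append("extract"),
        "transform": lambda: log.append("transform"),
        "load": lambda: log.append("load"),
    }
    deps = {"transform": {"extract"}, "load": {"transform"}}
    run_dag(tasks, deps)
    print(log)  # ['extract', 'transform', 'load']
```

Airflow adds scheduling, retries, backfills, and monitoring on top of this core idea, but the dependency-ordered execution is the same.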

Job Type: Contract
Pay: $55.00 - $61.00 per hour
Schedule:

8-hour shift
Monday to Friday

Experience:

Data Engineer: 10 years (Preferred)
Java: 10 years (Preferred)
Python: 8 years (Preferred)
GCP: 4 years (Preferred)

Ability to Commute:

Sunnyvale, CA 94086 (Required)

Ability to Relocate:

Sunnyvale, CA 94086: Relocate before starting work (Required)

Work Location: In person