
Sr Data Engineer with GCP - Sunnyvale, CA [Local Only]

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr Data Engineer with GCP in Sunnyvale, CA, requiring 4+ years of GCP experience and 12+ years in data warehousing. The contract is long-term; expertise in Hadoop, Spark, and programming languages such as Python and Java is essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 3, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Sunnyvale, CA
🧠 - Skills detailed
#RDBMS (Relational Database Management System) #Computer Science #Data Lake #Jira #Airflow #Data Processing #Spark (Apache Spark) #Scrum #Agile #BitBucket #Hadoop #Python #BigQuery #Data Warehouse #Scala #Automated Testing #Jenkins #Programming #GCP (Google Cloud Platform) #Consul #Java #Data Pipeline #Perl #Scripting #Data Engineering
Role description

Title: Sr Data Engineer with GCP

Location: Sunnyvale, CA - Onsite [Local consultant only]

Duration: Long term

GCP Experience:

   • 4+ years of recent GCP experience

   • Experience building data pipelines in GCP

   • Experience with GCP Dataproc, GCS, and BigQuery

Required Skills:

   • 12+ years of hands-on experience with developing data warehouse solutions and data products.

   • 6+ years of hands-on experience developing distributed data processing platforms with Hadoop, Hive, or Spark, plus Airflow or another workflow orchestration solution.

   • 5+ years of hands-on experience modeling and designing schemas for data lakes or RDBMS platforms.

   • Experience with programming languages: Python, Java, Scala, etc.

   • Experience with scripting languages: Perl, Shell, etc.

   • Practical experience working with, processing, and managing large data sets (multi-TB/PB scale).

   • Exposure to test-driven development and automated testing frameworks.

   • Background in Scrum/Agile development methodologies.

   • Capable of delivering on multiple competing priorities with little supervision.

   • Excellent verbal and written communication skills.

   • Bachelor's degree in Computer Science or equivalent experience.

The most successful candidates will also have experience in the following:

   • Gitflow

   • Atlassian products: Bitbucket, JIRA, Confluence, etc.

   • Continuous Integration tools such as Bamboo, Jenkins, or TFS

Share resume at rakesh(@)chabeztech(dot)com