
GCP Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer on a remote contract basis, requiring 4-6 years of experience. Key skills include GCP services, data pipelines, ETL, and DevOps tools like Jenkins and GitLab. Pay rate is unspecified.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 3, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Remote
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Jenkins #GCP (Google Cloud Platform) #Data Storage #GitHub #GitLab #Consulting #Consul #DevOps #Data Ingestion #Data Pipeline #Cloud #Data Engineering #Storage
Role description

Position: GCP Data Engineer

 Location: Remote

 Type: Contract

About Kaizen Analytix: Kaizen Analytix LLC is an analytics consulting services and product firm dedicated to helping clients unlock actionable business insights through fast, high-quality analytics solutions. We are looking for a dynamic GCP Data Engineer to join our growing team. This is an excellent opportunity for a data professional with 4 to 6 years of experience who is eager to apply their data engineering expertise in a fast-paced, collaborative environment.

Key responsibilities:

Configuring Google Cloud Platform services

Managing data storage and processing

Designing and deploying data pipelines using GCP services

Developing data ingestion and transformation processes

Establishing and managing data storage solutions using GCP services

DevOps work – building CI/CD pipelines with Jenkins and GitLab; GitHub experience is a plus