
GCP Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer (PySpark) on a remote contract for 8 hours per week, paying $55.00 - $60.00 per hour. Requires 3+ years of GCP experience, Dataproc, BigQuery, Cloud Composer, and ETL skills.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$440 - $480 (derived from $55.00 - $60.00 per hour at 8 hours)
🗓️ - Date discovered
April 3, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Remote
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #GCP (Google Cloud Platform) #Python #BigQuery #Spark (Apache Spark) #PySpark #Cloud #DataEngineering
Role description

Hello,

Hope you're doing well. This is Shakthi from Inscope Global.

Job Title: GCP Data Engineer (PySpark)

Location: Remote (EST timezone; note: PST candidates cannot be processed)

Job Description:

GCP Data Engineers with PySpark experience. The project itself is the same, but they want these candidates to have a "big picture" mindset, as the initiative is a bigger effort than initially expected.

Must haves:

3+ years of GCP Engineer work

Dataproc

BigQuery

Cloud Composer

ETL experience - working with large data sets

PySpark

Using Python to call APIs

Nice to haves:

Healthcare/Insurance industry experience

Thanks and regards,
Sakthi V
+1 (734) 992-5216

Job Type: Contract

Pay: $55.00 - $60.00 per hour

Expected hours: 8 per week

Compensation Package:

Quarterly bonus

Schedule:

8 hour shift

Work Location: Remote