
Google Cloud Platform Data Engineer - Contract - Hartford, CT (Hybrid)

This role is for a Google Cloud Platform Data Engineer on a long-term contract in Hartford, CT (Hybrid). It requires 12+ years of experience, proficiency in Python, Apache Spark, SQL, and BigQuery, and knowledge of the GCP ecosystem and BigQuery architecture.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 21, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Hartford, CT
🧠 - Skills detailed
#Hadoop #Big Data #Apache Beam #DevOps #PySpark #Spark SQL #Spark (Apache Spark) #Unix #Python #Debugging #Kafka (Apache Kafka) #BigQuery #Data Engineering #Cloud #SQL (Structured Query Language) #Apache Spark #Agile #GCP (Google Cloud Platform)
Role description

Dice is the leading career destination for tech experts at every stage of their careers. Our client, Lorven Technologies, Inc., is seeking the following. Apply via Dice today!

Hi,

Our client is looking for a Google Cloud Platform Data Engineer for a long-term contract project in Hartford, CT (Hybrid). Below are the detailed requirements.

Job Title : Google Cloud Platform Data Engineer

Location : Hartford, CT (Hybrid)

Duration : Long-term Contract

Years of EXP : 12+ years

Required Skills : Python, Apache Spark, SQL, BigQuery, PySpark, Google Cloud Platform

Job Description:
• Good understanding of the Google Cloud Platform ecosystem and BigQuery architecture, with streaming experience (Kafka, Flume, Apache Beam, and Pub/Sub)
• Hands-on experience in Spark with Python.
• Hands-on experience loading and manipulating large data sets into Hadoop and Google Cloud Platform using Spark and Hive.
• Knowledge of debugging and troubleshooting Dataproc jobs.
• Good communication and client-interfacing skills.
• Prepare implementation plans as needed and build in-scope applications using Big Data technologies.
• Responsible for all technical deliverables of the project.
• Good understanding of Agile and DevOps methodologies.
• Good communication and soft skills.
• Prior experience with US customers is nice to have.
• Should have worked in an offshore delivery model.
• Should be strong in Python, Unix, and SQL.