
GCP Data Engineer (Automotive/Manufacturing)

This role is for a GCP Data Engineer (Automotive/Manufacturing) with an open-ended contract, offering $55-$70 per hour. Located in Dearborn, MI, it requires expertise in Python/PySpark, Google Cloud technologies, and Big Data analytics, along with 6 years of relevant experience.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
February 14, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Dearborn, MI
🧠 - Skills detailed
#Java #GCP (Google Cloud Platform) #Data Transformations #Data Engineering #Deployment #PySpark #Cloud #Spark (Apache Spark) #Agile #Python #Big Data #BigQuery #Public Cloud #ETL (Extract, Transform, Load)
Role description

Job Title: GCP Data Engineer (Automotive/Manufacturing)

Duration: Open-ended

Location: 22001 Michigan Avenue, Dearborn, MI 48124

Pay: $55 – $70 per hour

Hybrid Position

Description: The Product Engineering and Management (PEM) division within Global Data Insight and Analytics (GDIA) plays a pivotal role in harnessing advanced analytics technologies within the Big Data realm. Our mission is to enhance the quality, design, marketing, and development of our current and forthcoming products through innovative solutions. A key initiative includes developing a state-of-the-art connected vehicle analytics platform, delivering actionable insights to boost product quality and value.

We are looking for a Data Analytics Software Engineer with a keen focus on Data Transformation. This role demands a candidate adept at crafting complex analytic solutions using Python and PySpark within an Agile framework. Essential to this position is a solid track record in managing intricate data transformations and a profound understanding of Big Data methodologies and tools.

Role Overview: As a Data Analytics Software Engineer, you will be part of a compact, multidisciplinary team that operates in close collaboration. The role involves direct and ongoing engagement with business stakeholders, product managers, and designers, with an emphasis on rapid and frequent deployment.

Skills Required:

· Advanced Software Engineering Expertise: Profound knowledge and application of Object-Oriented Design principles are essential.

· Python/PySpark Development Proficiency: Demonstrated capability in developing Python/PySpark applications from conceptualization to deployment.

· Familiarity with Google Cloud Technologies: A solid understanding of Google Cloud solutions, including DataProc, BigQuery, Cloud Run, and Astronomer, is required.

· Big Data & Analytics Acumen: A robust grasp of Big Data and Analytics, including the ability to navigate and address the challenges unique to these fields.

· Continuous Learning: A strong inclination towards acquiring new skills and leveraging them to address business challenges and seize opportunities.

Experience Required:

· Overall 6 years of work experience in delivering customer-facing products.

· Minimum 2 years of strong development experience in at least one of the following technologies: Python (preferred) and/or Java, on a public cloud platform.