GCP Data Engineer

This role is for a GCP Data Engineer in Dallas, TX, on a long-term on-site contract with an unspecified pay rate. Key skills include Python, Apache Spark, BigQuery, and Dataflow. Industry experience in cloud data engineering is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
January 17, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Dallas, TX
🧠 - Skills detailed
#Monitoring #PySpark #Security #ETL (Extract, Transform, Load) #Apache Airflow #Deployment #Datasets #Cloud #Python #Scripting #Scrum #SQL (Structured Query Language) #Dataflow #Stories #IAM (Identity and Access Management) #Bash #Data Pipeline #BigQuery #Data Security #GCP (Google Cloud Platform) #Terraform #Data Storage #Code Reviews #Storage #Data Engineering #Compliance #Data Analysis #Apache Spark #Airflow #Data Ingestion #Infrastructure as Code (IaC) #Spark (Apache Spark) #Data Processing
Role description

Hi All,

Role: GCP Data Engineer

Location: Dallas, TX (onsite)

Long Term Contract

Required Skills:
• Python, Apache Spark, BigQuery, PySpark, Cloud Composer, Cloud Dataflow, Cloud Dataproc, Cloud SQL, Data Fusion

Responsibilities:
• The GCP Data Engineer will create, deliver, and support custom data products, and enhance and expand team capabilities.
• Analyze and manipulate large datasets supporting the enterprise, activating data assets to support Enabling Platforms and analytics.
• Design the transformation and modernization of data workloads on Google Cloud Platform using GCP services.
• Build data systems and pipelines on GCP using Dataproc, Dataflow, Data Fusion, BigQuery, and Pub/Sub.
• Implement schedules, workflows, and tasks for Cloud Composer/Apache Airflow (see the DAG sketch after this list).
• Create and manage data storage solutions using GCP services such as BigQuery, Cloud Storage, and Cloud SQL (see the load-job sketch after this list).
• Monitor and troubleshoot data pipelines and storage solutions using GCP's Cloud Monitoring (formerly Stackdriver).
• Develop efficient ETL/ELT pipelines and orchestration using Cloud Dataprep and Cloud Composer.
• Develop and maintain data ingestion and transformation processes using Apache Spark (PySpark) and Dataflow (see the PySpark sketch after this list).
• Automate data processing tasks using scripting languages such as Python or Bash (a small archiving example follows this list).
• Ensure data security and compliance with industry standards by configuring IAM roles, service accounts, and access policies (see the IAM sketch after this list).
• Automate cloud deployments and infrastructure management using Infrastructure as Code (IaC) tools such as Terraform or Google Cloud Deployment Manager.
• Participate in code reviews, contribute to development best practices, and use developer-assist tools to create robust, fail-safe data pipelines.
• Collaborate with Product Owners, Scrum Masters, and Data Analysts to deliver User Stories and Tasks, and ensure deployment of pipelines.
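
For the Cloud Composer/Apache Airflow scheduling work above, here is a minimal DAG sketch, assuming Airflow 2.x with the Google provider package installed; every project, dataset, and table name is a hypothetical placeholder:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical daily rollup; all project/dataset/table names are placeholders.
with DAG(
    dag_id="daily_orders_rollup",
    schedule_interval="0 6 * * *",  # run every day at 06:00 UTC
    start_date=datetime(2025, 1, 1),
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_orders",
        configuration={
            "query": {
                "query": """
                    SELECT order_date, SUM(amount) AS total_amount
                    FROM `my-project.analytics.orders`
                    GROUP BY order_date
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "orders_daily",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )
```

In Cloud Composer, dropping a file like this into the environment's dags/ bucket is enough for the scheduler to pick it up.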
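
For the storage-management bullet, a minimal sketch of a Cloud Storage-to-BigQuery load using the google-cloud-bigquery client; the bucket, project, and table names are assumptions:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Load curated Parquet files from a (hypothetical) GCS prefix into BigQuery.
load_job = client.load_table_from_uri(
    "gs://my-bucket/curated/orders/*.parquet",
    "my-project.analytics.orders",
    job_config=job_config,
)
load_job.result()  # block until the load finishes; raises on failure

table = client.get_table("my-project.analytics.orders")
print(f"Table now holds {table.num_rows} rows")
```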
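
The ingestion/transformation bullet could look like the following PySpark job submitted to Dataproc, assuming the spark-bigquery connector is available on the cluster; paths, buckets, and table names are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

# Read raw CSVs landed in Cloud Storage (hypothetical path).
raw = spark.read.option("header", True).csv("gs://my-bucket/landing/orders/")

# Light transformation: de-duplicate, cast types, stamp the load time.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("load_ts", F.current_timestamp())
)

# Write to BigQuery via the spark-bigquery connector (staging bucket is hypothetical).
(clean.write.format("bigquery")
      .option("table", "my-project.analytics.orders")
      .option("temporaryGcsBucket", "my-staging-bucket")
      .mode("append")
      .save())
```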
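
For the scripting bullet, a small automation sketch with the google-cloud-storage client that archives landing files after a pipeline run; the bucket and prefixes are assumptions:

```python
from google.cloud import storage

client = storage.Client(project="my-project")  # hypothetical project
bucket = client.bucket("my-bucket")            # hypothetical bucket

# GCS has no rename, so "moving" an object is copy-then-delete.
for blob in client.list_blobs("my-bucket", prefix="landing/"):
    bucket.copy_blob(blob, bucket, new_name=blob.name.replace("landing/", "archive/", 1))
    blob.delete()
```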
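
And for the security bullet, one common pattern is granting a service account read-only access to a single BigQuery dataset rather than a project-wide role; a minimal sketch, with a hypothetical service-account email and dataset:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project
dataset = client.get_dataset("my-project.analytics")

# Append a read-only grant for the pipeline's service account.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="userByEmail",
        entity_id="pipeline-sa@my-project.iam.gserviceaccount.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])
```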