GCP Lead Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead GCP Data Engineer in Dallas, Texas, on a contract basis, offering a competitive pay rate. Candidates should have 12+ years of experience, including 3+ years in GCP, with expertise in data engineering, ETL, and cloud services.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 17, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Dallas, TX
🧠 - Skills detailed
#Microservices #Python #Data Lake #SQL (Structured Query Language) #Java #Data Modeling #Airflow #DevOps #Kubernetes #Programming #GCP (Google Cloud Platform) #Infrastructure as Code (IaC) #Data Architecture #AWS (Amazon Web Services) #Docker #Security #Storage #Data Science #Data Quality #Data Warehouse #Data Pipeline #Data Engineering #Deployment #Data Ingestion #ML (Machine Learning) #Automated Testing #Computer Science #Compliance #Terraform #Cloud #ETL (Extract, Transform, Load) #Dataflow #Data Security #SaaS (Software as a Service) #Scala #Data Governance #Code Reviews #BigQuery #Monitoring #Apache Beam #Data Processing
Role description

Job Title: Lead GCP Data Engineer
Location: Dallas, Texas
Employment Type: Contract
Experience: 12+ years (with 3+ years in GCP)

Job Summary:

We are seeking a Lead GCP Data Engineer to architect, lead, and manage scalable data engineering initiatives on the Google Cloud Platform. As a technical leader, you will guide a team of data engineers, collaborate with cross-functional stakeholders, and drive the development of robust, secure, and efficient data platforms and pipelines to support business-critical analytics and machine learning use cases.

Key Responsibilities:

   • Lead the design, development, and deployment of end-to-end data pipelines and platforms using GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Composer, Cloud Storage, etc.); a minimal pipeline sketch follows this list.

   • Architect robust and scalable data lake/data warehouse solutions aligned with business needs.

   • Mentor, guide, and manage a team of data engineers, conducting code reviews and promoting engineering best practices.

   • Own and optimize data ingestion, transformation (ETL/ELT), and real-time streaming pipelines.

   • Collaborate with enterprise architects, data scientists, analysts, and business stakeholders to understand data needs and translate them into scalable technical solutions.

   • Establish and enforce data quality, data governance, and data security best practices.

   • Drive DevOps and CI/CD practices for data engineering, including infrastructure as code (Terraform/Deployment Manager), automated testing, and monitoring.

   • Evaluate new technologies, tools, and frameworks to continuously improve platform capabilities.

   • Ensure cost optimization, performance tuning, and system reliability across GCP resources.
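
For illustration only, here is a minimal sketch of the kind of streaming pipeline these responsibilities describe: an Apache Beam (Python SDK) job that reads from Pub/Sub and writes to BigQuery on Dataflow. The project, topic, bucket, table, and field names are hypothetical placeholders, not resources defined by this role.

```python
# Minimal sketch only: a streaming Pub/Sub -> BigQuery pipeline on Dataflow.
# All project, topic, bucket, table, and field names are hypothetical placeholders,
# and messages are assumed to be JSON objects carrying the fields in the schema.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(
        streaming=True,
        runner="DataflowRunner",
        project="my-gcp-project",            # placeholder project ID
        region="us-central1",
        temp_location="gs://my-bucket/tmp",  # placeholder staging bucket
    )

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-gcp-project/topics/events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-gcp-project:analytics.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

The same pipeline can be run locally with the DirectRunner for testing before it is deployed to Dataflow.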

Required Qualifications:

   • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

   • 8+ years of experience in data engineering, with 3+ years leading teams and delivering solutions in GCP.

   • Expertise in GCP core services: BigQuery, Dataflow (Apache Beam), Pub/Sub, Cloud Composer (Airflow), Cloud Functions, Cloud Storage.

   • Strong programming skills in Python, SQL, and optionally Java/Scala.

   • Proven experience designing and building large-scale ETL/ELT systems and data lakes/warehouses.

   • Experience with real-time data processing, streaming pipelines, and event-driven architectures.

   • Proficiency in infrastructure as code tools like Terraform or GCP Deployment Manager.

   • Solid understanding of data modeling, partitioning, performance tuning, and cost control in BigQuery (see the sketch after this list).

   • Experience implementing data security, access control, encryption, and compliance standards.
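
For illustration only, here is a minimal sketch of the BigQuery partitioning and cost-control practices referenced above, using the google-cloud-bigquery Python client. The project, dataset, table, and column names are hypothetical placeholders.

```python
# Minimal sketch only: creating a day-partitioned, clustered BigQuery table so
# that date-filtered queries scan (and bill for) a bounded set of partitions.
# Project, dataset, table, and column names are hypothetical placeholders.
from google.cloud import bigquery


def create_partitioned_events_table():
    client = bigquery.Client(project="my-gcp-project")  # placeholder project ID

    table = bigquery.Table(
        "my-gcp-project.analytics.events",  # placeholder table ID
        schema=[
            bigquery.SchemaField("event_id", "STRING"),
            bigquery.SchemaField("event_ts", "TIMESTAMP"),
            bigquery.SchemaField("customer_id", "STRING"),
            bigquery.SchemaField("payload", "STRING"),
        ],
    )
    # Partition by day on the event timestamp; queries that filter on event_ts
    # only read the matching partitions.
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY,
        field="event_ts",
    )
    # Cluster on a frequently filtered column to further prune scanned bytes.
    table.clustering_fields = ["customer_id"]
    # Reject queries without a partition filter to prevent accidental full scans.
    table.require_partition_filter = True

    return client.create_table(table, exists_ok=True)


if __name__ == "__main__":
    create_partitioned_events_table()
```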

Preferred Qualifications:

   • GCP Professional Data Engineer or Architect Certification.

   • Experience with Kubernetes, Docker, and microservices-based data architectures.

   • Familiarity with machine learning workflows and supporting data pipelines for ML models.

   • Experience integrating GCP with other platforms (e.g., on-premise, AWS, or SaaS tools).