

GCP Lead Data Engineer
Job Title: Lead GCP Data Engineer
Location: Dallas, Texas
Employment Type: Contract
Experience: 12+ years (with 3+ years in GCP)
Job Summary:
We are seeking a Lead GCP Data Engineer to architect, lead, and manage scalable data engineering initiatives on the Google Cloud Platform. As a technical leader, you will guide a team of data engineers, collaborate with cross-functional stakeholders, and drive the development of robust, secure, and efficient data platforms and pipelines to support business-critical analytics and machine learning use cases.
Key Responsibilities:
• Lead the design, development, and deployment of end-to-end data pipelines and platforms using GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Composer, Cloud Storage, etc.); a representative pipeline sketch follows this list.
• Architect robust and scalable data lake/data warehouse solutions aligned with business needs.
• Mentor, guide, and manage a team of data engineers, conducting code reviews and promoting engineering best practices.
• Own and optimize data ingestion, transformation (ETL/ELT), and real-time streaming pipelines.
• Collaborate with enterprise architects, data scientists, analysts, and business stakeholders to understand data needs and translate them into scalable technical solutions.
• Establish and enforce data quality, data governance, and data security best practices.
• Drive DevOps and CI/CD practices for data engineering, including infrastructure as code (Terraform/Deployment Manager), automated testing, and monitoring.
• Evaluate new technologies, tools, and frameworks to continuously improve platform capabilities.
• Ensure cost optimization, performance tuning, and system reliability across GCP resources.
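For candidates gauging fit, the sketch below shows the shape of the Pub/Sub-to-BigQuery streaming work described above, using the Apache Beam Python SDK. It is a minimal, hypothetical example: the project, topic, table, and field names are invented for illustration, not taken from any actual system.

# Illustrative sketch only: stream JSON events from Pub/Sub into BigQuery.
# All resource names (my-project, events, analytics.events) are hypothetical.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    # streaming=True marks this as an unbounded (streaming) pipeline.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")
            | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )

if __name__ == "__main__":
    run()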
Required Qualifications:
• Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
• 8+ years of experience in data engineering, with 3+ years leading teams and delivering solutions in GCP.
• Expertise in GCP core services: BigQuery, Dataflow (Apache Beam), Pub/Sub, Cloud Composer (Airflow), Cloud Functions, Cloud Storage.
• Strong programming skills in Python, SQL, and optionally Java/Scala.
• Proven experience designing and building large-scale ETL/ELT systems and data lakes/warehouses.
• Experience with real-time data processing, streaming pipelines, and event-driven architectures.
• Proficiency in infrastructure as code tools like Terraform or GCP Deployment Manager.
• Solid understanding of data modeling, partitioning, performance tuning, and cost control in BigQuery (see the partitioned-table sketch after this list).
• Experience implementing data security, access control, encryption, and compliance standards.
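As one concrete illustration of the BigQuery modeling and cost-control skills listed above, here is a minimal sketch of creating a day-partitioned, clustered table with the google-cloud-bigquery Python client. All project, dataset, table, and field names are hypothetical.

# Illustrative sketch only: a day-partitioned, clustered BigQuery table.
# Resource names (my-project.analytics.orders, event_ts, customer_id) are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

schema = [
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("my-project.analytics.orders", schema=schema)
# Partition by event date so queries prune to only the days they touch (cost control).
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
# Cluster on the most common filter key to further reduce bytes scanned.
table.clustering_fields = ["customer_id"]

client.create_table(table)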
Preferred Qualifications:
• GCP Professional Data Engineer or Architect Certification.
• Experience with Kubernetes, Docker, and microservices-based data architectures.
• Familiarity with machine learning workflows and supporting data pipelines for ML models.
• Experience integrating GCP with other platforms (e.g., on-premise, AWS, or SaaS tools).