

GCP Dataflow Engineer
Role: GCP Dataflow Engineer
Remote
Note: Must be GCP Certified
A GCP Dataflow Engineer is responsible for designing, developing, and managing data pipelines using Google Cloud Platform's Dataflow service. The role focuses on building scalable, efficient processing solutions for both batch and streaming data with the Apache Beam programming model, and typically integrates with other GCP services such as BigQuery, Cloud Storage, and Pub/Sub to handle complex data ingestion, transformation, and analysis needs.
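As an illustration of the kind of pipeline this role builds (not a requirement of the posting), the sketch below shows a minimal Apache Beam streaming pipeline in Python that reads messages from a Pub/Sub subscription, applies simple parsing and filtering transforms, and writes rows to BigQuery. The project, subscription, table, and field names are placeholder values chosen for the example.

    # Minimal Apache Beam streaming pipeline sketch (illustrative only;
    # all resource names below are placeholders).
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_event(message: bytes) -> dict:
        """Decode a Pub/Sub message payload and keep only the fields we need."""
        event = json.loads(message.decode("utf-8"))
        return {"user_id": event["user_id"], "amount": float(event["amount"])}


    def run():
        options = PipelineOptions(streaming=True)

        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    subscription="projects/my-project/subscriptions/orders-sub")
                | "ParseJson" >> beam.Map(parse_event)
                | "DropSmallOrders" >> beam.Filter(lambda row: row["amount"] >= 10.0)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "my-project:analytics.orders",
                    schema="user_id:STRING,amount:FLOAT",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                )
            )


    if __name__ == "__main__":
        run()

The same Beam code can typically run in batch mode against Cloud Storage inputs by swapping the source transform and disabling streaming.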
Key Responsibilities:
• Architect and implement data pipelines using Apache Beam and the Dataflow runner on GCP.
• Translate business requirements into data processing workflows.
• Optimize pipeline performance for speed, cost-efficiency, and scalability.
• Write complex data processing logic using Beam's data manipulation primitives (e.g., PTransforms).
• Implement data cleaning, filtering, aggregation, and enrichment within pipelines.
• Connect pipelines to data sources and sinks such as Cloud Storage, BigQuery, Pub/Sub, and other external systems.
• Leverage BigQuery for data warehousing and analysis.
• Monitor pipeline health and performance metrics.
• Troubleshoot issues and implement error-handling mechanisms.
• Optimize pipeline execution based on monitoring data.
• Deploy Dataflow pipelines to production environments (see the deployment sketch after this list).
• Manage pipeline versions and updates.
• Implement Dataflow job scheduling and automation.
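To illustrate the deployment items above, the sketch below shows one common way to target the Dataflow service from a Beam pipeline by configuring the runner and Google Cloud options in code. The project, region, bucket, and job name are placeholder values, not taken from this posting.

    # Submitting a Beam pipeline to Dataflow (illustrative sketch; names are placeholders).
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions,
        PipelineOptions,
        StandardOptions,
    )

    options = PipelineOptions()

    # Target the managed Dataflow service instead of the local DirectRunner.
    options.view_as(StandardOptions).runner = "DataflowRunner"
    options.view_as(StandardOptions).streaming = True

    gcp = options.view_as(GoogleCloudOptions)
    gcp.project = "my-project"                       # placeholder project ID
    gcp.region = "us-central1"                       # placeholder region
    gcp.job_name = "orders-streaming-v1"             # job name carrying a version suffix
    gcp.temp_location = "gs://my-bucket/tmp"         # placeholder temp bucket
    gcp.staging_location = "gs://my-bucket/staging"  # placeholder staging bucket

    # Passing these options to beam.Pipeline(options=options) submits the job to Dataflow.

For scheduling and automation, batch jobs are often packaged as Dataflow templates and triggered on a schedule by Cloud Scheduler or an orchestrator such as Cloud Composer.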