

Data Engineer
Job Title: Data Engineer (Google Cloud Specialist)
Rate: DOE (outside IR35)
Location: Remote
Contract Length: 6 months
A growing London consultancy is seeking a specialist Google Cloud Platform Data Engineer to join them on an upcoming project on a contract basis. This is an exciting opportunity to work on cutting-edge data projects, building scalable data pipelines and cloud-based systems that deliver real impact.
Data Engineer (GCP Specialist) - Key Responsibilities:
• Design, develop, and maintain scalable and high-performance data pipelines on Google Cloud.
• Utilise Google Cloud services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage to manage large datasets and provide actionable insights.
• Optimise data storage and retrieval processes to enhance performance and reduce costs.
• Collaborate with data scientists, analysts, and product teams to deliver data-driven solutions that support business objectives.
• Work with both structured and unstructured data to assist with business intelligence, analytics, and machine learning initiatives.
• Ensure data security, governance, and compliance within the cloud environment.
• Troubleshoot and optimise existing cloud-based data infrastructure to improve efficiency and cost-effectiveness.
Data Engineer (GCP Specialist) - Experience and Qualifications Required:
• Proven experience as a Data Engineer, Cloud Engineer, or in a similar role with hands-on expertise in Google Cloud Platform.
• Strong proficiency with GCP tools such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
• Solid experience with data processing frameworks (e.g., Apache Beam, Spark).
• Proficiency in SQL, Python, or other programming languages used in data engineering.
• Strong understanding of ETL processes, data modelling, and optimisation techniques.
• Experience working with data pipelines, orchestration tools, and automation frameworks (e.g., Apache Airflow).
• Knowledge of data security best practices and experience with IAM (Identity and Access Management) within GCP.
• A collaborative mindset and the ability to work with cross-functional teams in a fast-paced environment.
Please only apply for this role if you have commercial experience using Google Cloud Platform; preference will be given to candidates holding a Google Cloud certification.