
Senior Data Engineer

This role is for a Senior Data Engineer on a 6-12 month contract (outside IR35), paying up to £525 p/d. Key skills include Python, Apache Spark, GCP, BigQuery, and ETL workflows. UK-based candidates with relevant experience are required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
525
🗓️ - Date discovered
February 20, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Outside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
Greater London, England, United Kingdom
🧠 - Skills detailed
#Kubernetes #Spark (Apache Spark) #Datasets #BigQuery #Terraform #GCP (Google Cloud Platform) #PostgreSQL #Data Engineering #Apache Beam #Consul #Data Processing #FastAPI #Scala #Python #Data Pipeline #ETL (Extract, Transform, Load) #Apache Spark #Elasticsearch #Cloud #Dataflow
Role description

📍 Location: Remote-first (UK-based)

💰 Rate: Up to £525 p/d

📆 Contract: 6-12 months (Outside IR35)

🛠 Tech Stack: Python, FastAPI, GCP, BigQuery, Apache Spark, Apache Beam, Google Cloud Dataflow

We're working with a forward-thinking consultancy that helps top companies build and scale high-performance data platforms. They take an engineering-first approach, and more than half of their team consists of hands-on engineers. If you love working with large-scale data processing and cutting-edge cloud technologies, this one’s for you.

What You’ll Be Doing:

🔹 Building data pipelines and ETL workflows that process huge datasets

🔹 Designing, optimizing, and maintaining high-throughput reporting solutions

🔹 Working with Apache Spark for large-scale data processing

🔹 Using Apache Beam and Google Cloud Dataflow to manage complex data workflows

🔹 Developing and improving backend APIs to support data-heavy applications
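The pipeline work above follows the classic extract-transform-load shape. As a rough plain-Python sketch (in the real role the extract would read from BigQuery and the pipeline would run on Apache Beam/Dataflow; all names here are illustrative, not from the job spec):

```python
# Minimal ETL sketch: extract raw records, transform (clean and normalise),
# and load into a destination. The raw_events list stands in for a BigQuery
# read, and the plain list sink stands in for a warehouse table.

def extract(raw_events):
    """Yield records from the raw source (stand-in for a BigQuery read)."""
    yield from raw_events

def transform(records):
    """Drop malformed rows and normalise field names and types."""
    for rec in records:
        if "user_id" not in rec or rec.get("amount") is None:
            continue  # skip malformed input rather than failing the pipeline
        yield {"user_id": rec["user_id"], "amount_gbp": float(rec["amount"])}

def load(records, sink):
    """Append cleaned records to the destination and return it."""
    sink.extend(records)
    return sink

raw_events = [
    {"user_id": "a1", "amount": "12.50"},
    {"amount": "3.00"},                  # malformed: missing user_id
    {"user_id": "b2", "amount": None},   # malformed: missing amount
]
warehouse = load(transform(extract(raw_events)), [])
print(warehouse)  # only the clean row survives
```

In a Beam/Dataflow version, each of these stages would become a `PTransform` in a `Pipeline`, with the same skip-bad-rows logic expressed as a filtering step.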

What You Need:

✔ Strong Python skills – writing clean, efficient, and scalable code

✔ Experience with BigQuery, PostgreSQL, and Elasticsearch

✔ Hands-on experience with Google Cloud, Kubernetes, and Terraform

✔ Deep understanding of Apache Spark for large-scale data processing

✔ Knowledge of Apache Beam & Google Cloud Dataflow for data pipeline orchestration

✔ A team-first mindset with strong communication skills

This is a contract role outside IR35, so you must be UK-based and have a registered company in the UK. Interested? Click Apply or reach out to Ionut Roghina for more details! 🚀