
Principal Data Engineer

This role is for a Principal Data Engineer on a 3-month initial contract, paying £600-£650 per day, fully remote. It requires 7+ years in Data Engineering and expertise in Python, SQL, and GCP/AWS/Azure; previous retail experience is preferred.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
£600-£650
🗓️ - Date discovered
February 19, 2025
🕒 - Project duration
3 to 6 months
🏝️ - Location type
Remote
📄 - Contract type
Outside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
London, England, United Kingdom
🧠 - Skills detailed
#Infrastructure as Code (IaC) #Scrum #DevOps #Leadership #Data Mapping #SQL (Structured Query Language) #Scala #Data Lake #Python #ETL (Extract, Transform, Load) #Databricks #Docker #Data Engineering #Kanban #Airflow #AWS (Amazon Web Services) #Kubernetes #Azure DevOps #Agile #Cloud #Terraform #Data Warehouse #Azure #GCP (Google Cloud Platform) #Spark (Apache Spark)
Role description

About

Principal Data Engineer | 3-Month Initial Contract | Remote | £600-£650pd (Outside IR35)

We’re looking for a Principal Data Engineer to take ownership of data models across domains, data mapping, transformation, and tooling, ensuring best practices and consistency.

Your Knowledge & Experience

• 7+ years’ experience in Data Engineering/Architecture (or similar) with extensive knowledge in designing and managing complex data models and solutions.
• Proven leadership and mentoring abilities within technical teams.
• Strong communication skills, with the ability to engage effectively with clients and stakeholders at all levels.
• Hands-on expertise in modern data platforms, tools, and technologies, including:
• Advanced data modelling (operational and analytical)
• Python, SQL
• Databricks, Spark
• Orchestration frameworks such as Dataform, Airflow, and GCP Workflows
• Modern architecture and cloud platforms (GCP, AWS, Azure)
• DevOps practices
• Data warehouse and data lake design and implementation
• Familiarity with containerization and IaC tools (e.g., Docker, Kubernetes, Terraform).
• Experience in agile environments (Scrum, Kanban, etc.).
• Previous experience in the retail domain is a significant advantage.
• Experience in developing training materials and fostering knowledge sharing.
• A strategic mindset focused on scalability, performance, and innovation.

Nice-to-have skills
• Python
• SQL
• Spark
• GCP
• AWS
• Azure
• DevOps
• Docker
• Kubernetes
• Terraform
• Agile
• Scrum
• Kanban
• Retail

Work experience
• Data Engineer
• Data Infrastructure
• Software Architect

Languages
• English