
Databricks Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Data Engineer on a 6-month contract in London, paying £600 per day. Key skills include Databricks, Spark (Scala, Python, SQL), Delta Lake, Azure Data Lake, and Data Governance. Capital Markets experience is preferred.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
£600
🗓️ - Date discovered
April 2, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Outside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Data Lake #MDM (Master Data Management) #Databricks #Python #Informatica IDQ (Informatica Data Quality) #Metadata #Axon #Programming #Azure #Data Governance #Data Quality #SQL (Structured Query Language) #Data Architecture #Spark (Apache Spark) #Informatica #Data Ingestion #Datasets #Data Management #Scala #Data Engineering #Collibra #Data Access #ETL (Extract, Transform, Load) #Delta Lake
Role description

Databricks Data Engineer - Contract Opportunity at Leading Banking and Financial Services Firm

Position Overview

   • Type: Contract

   • Rate: £600 per day (Outside IR35)

   • Duration: 6 months (rolling contract)

   • Location: London office (hybrid working arrangement)

   • Industry: Banking and financial services

The Role

We are seeking an experienced Databricks Data Engineer to join a prestigious banking and financial services firm in London. In this role, you will leverage your technical expertise to design, develop, and implement data solutions using Databricks technology, working with complex financial datasets in a dynamic trading environment.

Key Requirements

   • Strong data engineering experience, specifically on the Databricks platform

   • Proficient in Spark programming using Scala, Python, and SQL

   • Experience with Delta Lake technology and Databricks workflows/jobs

   • Solid understanding of Azure Data Lake, with demonstrated experience in data ingestion and ETL/ELT frameworks

   • Strong background in Data Governance including metadata management, data quality control, lineage tracking, and data access models

   • Understanding of Data Modelling concepts, Data Products, and Data Domains

   • Experience with Unity Catalog (this is a key differentiator)
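For candidates weighing the Delta Lake and Unity Catalog requirements above, a minimal Spark SQL sketch of the kind of governed table this stack involves (all catalog, schema, table, and column names here are illustrative placeholders, not details of this role):

```sql
-- Unity Catalog uses a three-level namespace: catalog.schema.table
-- (names below are hypothetical examples)
CREATE TABLE IF NOT EXISTS markets_catalog.trading.daily_positions (
  trade_id   STRING,
  instrument STRING,
  notional   DECIMAL(18, 2),
  trade_date DATE
)
USING DELTA
PARTITIONED BY (trade_date)
COMMENT 'Illustrative Delta table; names are placeholders';

-- Delta Lake keeps a versioned transaction log, so prior table
-- states can be queried ("time travel") for audit and lineage work
SELECT *
FROM markets_catalog.trading.daily_positions
VERSION AS OF 1;
```

In Unity Catalog, access to such a table is then governed centrally with standard GRANT statements rather than per-workspace permissions, which is what makes it relevant to the data governance requirements listed above.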

Desired Experience

   • Experience with MS Purview or similar metadata and data quality tools (Collibra, Informatica Data Quality/MDM/Axon)

   • Capital Markets or Banking industry knowledge

   • Data Architecture experience

   • Previous work in financial services environments

What We Offer

   • Competitive day rate (£600/day, Outside IR35)

   • Opportunity to work with cutting-edge data technologies

   • Hybrid working arrangement at a prestigious London location

   • Potential for contract extension beyond initial 6-month term

   • Collaboration with industry experts in financial data management