SC Cleared Azure Data Engineer - Government Client

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an SC Cleared Azure Data Engineer, offering up to £455/day for a 3-month contract, fully remote in the UK. Key skills include Azure services, Databricks, PySpark, SQL, and data modeling. Active Security Clearance is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Up to £455
🗓️ - Date discovered
April 15, 2025
🕒 - Project duration
3 to 6 months
🏝️ - Location type
Remote
📄 - Contract type
Inside IR35
🔒 - Security clearance
Yes
📍 - Location detailed
England, United Kingdom
🧠 - Skills detailed
#Monitoring #Data Science #Synapse #BI (Business Intelligence) #Spark (Apache Spark) #JSON (JavaScript Object Notation) #PySpark #Data Orchestration #Data Mapping #Security #Azure #Deployment #Business Analysis #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Data Engineering #Microsoft Power BI #Dimensional Data Models #Data Layers #Databricks #Data Ingestion #ADF (Azure Data Factory)
Role description

Job Title: SC Cleared Azure Data Engineer - Government client - Fully Remote

Location: Fully Remote - UK Based

Salary/Rate: Up to £455 a day Inside IR35

Start Date: April / May

Job Type: 3 Month Contract (with scope to extend)

Company Introduction

We are looking for an SC Cleared Data Engineer to join our client in the Government Administration sector.

   • Candidates applying for this role must hold active Security Clearance.

   • As a senior data engineer, you will engage with data leads, data scientists, analysts and users across the data space on the team's data analytics and data insights development and implementation. You will work with business analysts, data scientists, and project and delivery leads to analyse backlogs, define and refine metric tickets, implementation logic and data mapping, and to create and estimate related tasks. You will champion data standards for ETL, data modelling and best practices, and drive their implementation.

Required Skills/Experience

   • Should be strong in Azure data services such as ADF, Synapse, SQL and ADB (Azure Databricks).

   • Should be strong in Databricks notebook development for data ingestion, validation, transformation and metric builds.

   • Should be strong in PySpark and SQL.

   • Should be strong in ADF pipeline development, data orchestration techniques, monitoring and troubleshooting.

   • Should be strong in stored procedure development.

   • Good knowledge of dimensional data modelling and Power BI reporting.

Job Responsibilities/Objectives

   • Analyse raw data (mostly in JSON format) for data parsing, schema evolution and data transformation in support of metric development.

   • Analyse reporting/metric requirements from a data engineering perspective for refinement, estimation, development and deployment.

   • Work closely with analysts and data scientists to understand the business requirements, data sources and logic for metric development.

   • Create normalised/dimensional data models based on the requirements.

   • Translate and refine the notebooks and logic developed as part of the prototype.

   • Transform data from the landing/staging/transformed layers into the Synapse dimensional model.

   • Create notebooks in Databricks for incremental data load and transformation.

   • Create stored procedures for data load and transformation in Azure Synapse dedicated SQL pools.

   • Create ADF pipelines for data orchestration across the different data layers of Databricks and Synapse.

If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word/PDF format.

Disclaimer

Notwithstanding any guidelines given to level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies.

Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.