

SC Cleared Azure Data Engineer - Government Client
Job Title: SC Cleared Azure Data Engineer - Government client - Fully Remote
Location: Fully Remote - UK Based
Salary/Rate: Up to £455 a day Inside IR35
Start Date: April / May
Job Type: 3 Month Contract (with scope to extend)
Company Introduction
We are looking for an SC Cleared Data Engineer to join our client in the Government Administration sector.
• Candidates applying for this role must hold active Security Clearance.
• As a senior data engineer, you will engage with data leads, data scientists, analysts and users to develop and implement the team's data analytics and data insights. You will work with business analysts, data scientists, and project and delivery leads to analyse backlogs, define and refine metric tickets, implementation logic and data mappings, and create and estimate related tasks. You will also champion data standards for ETL, data modelling and best practice, and drive their implementation.
Required Skills/Experience
• Strong in Azure data services such as ADF, Synapse, SQL and Azure Databricks (ADB).
• Strong in Databricks notebook development for data ingestion, validation, transformation and metric builds.
• Strong in PySpark and SQL.
• Strong in ADF pipeline development, data orchestration techniques, monitoring and troubleshooting.
• Strong in stored procedure development.
• Good knowledge of dimensional data modelling and Power BI reporting.
Job Responsibilities/Objectives
• Analyse raw data (mostly in JSON format) for parsing, schema evolution and transformation in support of metric development.
• Analyse reporting/metric requirements from a data engineering perspective for refinement, estimation, development and deployment.
• Work closely with analysts and data scientists to understand the business requirements, data sources and logic for metric development.
• Create normalised/dimensional data models based on requirements.
• Translate and refine the notebooks and logic developed as part of the prototype.
• Transform data through the landing, staging and transformed layers into the Synapse dimensional model.
• Create notebooks in Databricks for incremental data load and transformation.
• Create stored procedures for data load and transformation in Azure Synapse dedicated SQL pools.
• Create ADF pipelines for data orchestration across the Databricks and Synapse data layers.
If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word/PDF format.
Disclaimer
Notwithstanding any guidelines given to level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies.
Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.