
Azure Data Engineer

This role is for an Azure Data Engineer, initially 12 months, paying £350/day, located in London (NW1) with hybrid work. Essential skills include Azure services, Snowflake, Python, and SQL. Azure certifications and public sector experience are preferred.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
350
🗓️ - Date discovered
February 15, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Outside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Spatial Data #Azure Data Factory #Data Processing #Apache Kafka #Automation #Data Engineering #Synapse #Scala #Data Integration #Microsoft Power BI #Data Lake #Data Pipeline #Tableau #Databases #Data Quality #Snowflake #SQL (Structured Query Language) #Data Governance #ETL (Extract, Transform, Load) #Security #ADF (Azure Data Factory) #Azure cloud #BI (Business Intelligence) #Datasets #Kafka (Apache Kafka) #Computer Science #Cloud #Python #IoT (Internet of Things) #Visualization #Azure Synapse Analytics #Schema Design #Storage #Azure #Azure Blob Storage #Compliance #Data Security
Role description

Greenrich is looking to fill the following role. Interested candidates, please send CVs to resourcing@greenrichit.com

• Role: Azure Data Engineer
• Client: Public Sector services
• Duration: Initially 12 months
• Location: London (NW1) / Hybrid (mostly 2 days onsite)

• Day Rate: £350
• IR35 Determination: Outside (TBC)

ILR, PSW, British Citizen and Dependant visa holders are eligible.
Skills and Experience

Essential:
• Proven experience as a Data Engineer, working on large-scale data integration and analytics projects.
• Strong expertise with Azure Cloud Services, including Azure Data Factory (ADF), Azure Synapse Analytics, and Azure Blob Storage.
• Hands-on experience with Snowflake for data warehousing, modeling, and performance optimization.
• Proficiency in Python for data engineering, automation, and ETL development.
• Solid understanding of SQL for querying and managing relational databases.
• Familiarity with data lake architectures, data pipelines, and schema design principles.
• Experience with data governance, security, and compliance frameworks.
Desirable:
• Experience with IoT data processing and working on smart city or similar projects.
• Knowledge of real-time data streaming technologies like Azure Event Hub or Apache Kafka.
• Proficiency in visualization tools like Power BI or Tableau.
• Understanding of geospatial data processing and analytics.
Key Responsibilities
• Design, develop, and optimize scalable ETL/ELT pipelines using Azure Data Factory (ADF) and Azure Synapse Analytics.
• Build and maintain robust data integration workflows to connect various sources, ensuring high data quality and performance.
• Implement and manage Snowflake-based data warehousing solutions for advanced analytics and reporting.
• Collaborate with cross-functional teams to support the development of data-driven smart city applications.
• Write efficient, reusable, and scalable Python scripts for data processing, transformation, and automation tasks.
• Develop and enforce data governance standards, ensuring data security and compliance.
• Monitor, troubleshoot, and optimize data pipelines and workflows to ensure reliability and efficiency.
• Work with large, complex datasets to deliver insights that support smart city initiatives, such as IoT data, public transport, and environmental analytics.
Qualifications
• Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
• Azure certifications (e.g., Azure Data Engineer Associate) are a strong plus.