Data Engineer

This is a 6-month Data Engineer contract paying up to £300.00 per day. Key skills include Python, PySpark, JSON, and data security principles. SC Clearance is required and UK citizenship is mandatory. Remote work is available.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Up to £300
🗓️ - Date discovered
January 16, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Temporary
🔒 - Security clearance
Yes
📍 - Location detailed
Remote
🧠 - Skills detailed
#Azure Databricks #YAML (YAML Ain't Markup Language) #CLI (Command-Line Interface) #SQL Server #Delta Lake #SQL (Structured Query Language) #Bash #Azure #Databricks #Data Pipeline #Scripting #Data Security #Jira #Apache Spark #Datasets #Data Processing #Data Engineering #Python #Azure SQL #Spark (Apache Spark) #PySpark #Scala #pydantic #Programming #JSON (JavaScript Object Notation) #GIT #Security #Data Ingestion #GDPR (General Data Protection Regulation)
Role description

UK Citizen
SC Clearance Required
Data Engineer Skills and Technologies
Data Engineering:
- Strong understanding of data concepts: data types, data structures, schemas (both JSON and Spark), schema management, etc.
- Strong understanding of complex JSON manipulation
- Experience working with data pipelines built on custom Python/PySpark frameworks
- Strong understanding of the four core data categories (Reference, Master, Transactional, Freeform) and the implications of each, particularly managing/handling Reference Data
- Strong understanding of data security principles: data owners, access controls (row and column level), GDPR, etc., including experience of handling sensitive datasets
- Strong problem-solving and analytical skills, with the ability to demonstrate these intuitively
Skills
Languages / Frameworks:
- JSON
- YAML
- Python (as a programming language, not just basic scripting; Pydantic experience would be a bonus)
- SQL
- PySpark
- Delta Lake
- Bash (both CLI usage and scripting)
- Git
- Markdown
- Scala (bonus, not compulsory)
- Azure SQL Server as a Hive metastore (bonus)
Technologies:
- Azure Databricks
- Apache Spark
- Delta Tables
- Data processing with Python
- Power BI (integration / data ingestion)
- JIRA
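The emphasis above on complex JSON manipulation and schema handling typically means working with deeply nested records. As a minimal, standard-library-only sketch of that kind of task (the record shape and field names here are hypothetical examples, not taken from the role), flattening a nested JSON document into dot-delimited keys:

```python
import json

def flatten(obj, prefix=""):
    """Recursively flatten a nested dict/list parsed from JSON
    into a flat dict with dot-delimited keys."""
    flat = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            flat.update(flatten(value, f"{prefix}{key}."))
    elif isinstance(obj, list):
        for i, value in enumerate(obj):
            flat.update(flatten(value, f"{prefix}{i}."))
    else:
        # Leaf value: strip the trailing dot from the accumulated path
        flat[prefix.rstrip(".")] = obj
    return flat

# Hypothetical nested record
record = json.loads('{"id": 1, "owner": {"name": "A", "roles": ["eng", "sec"]}}')
print(flatten(record))
# {'id': 1, 'owner.name': 'A', 'owner.roles.0': 'eng', 'owner.roles.1': 'sec'}
```

In a PySpark pipeline the same idea would usually be expressed with an explicit `StructType` schema and column expressions rather than pure-Python recursion, but the flattening logic is the same.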
Job Type: Temporary
Contract length: 6 months
Pay: Up to £300.00 per day
Benefits:

Company pension
Private medical insurance
Work from home