
Data Engineer - Python/PySpark - Contract - Outside IR35

This is a contract role for a Data Engineer (Python/PySpark), requiring 4-5 days on-site in the West Midlands. The pay rate is not disclosed. Key skills include PySpark, Python, Azure, Docker, and Apache Airflow. Open to sole UK nationals only.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date discovered
February 22, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Outside IR35
🔒 - Security clearance
Yes
📍 - Location detailed
West Midlands, England, United Kingdom
🧠 - Skills detailed
#Apache NiFi #Spark (Apache Spark) #Kubernetes #Batch #Storage #Airflow #Scala #Deployment #Data Storage #Data Pipeline #Azure #Azure cloud #Data Processing #Automation #Security #Python #Docker #Data Engineering #Cloud #PySpark #NiFi (Apache NiFi) #Apache Airflow #Data Integration
Role description

Please Read Full Summary Before Applying

Methods are looking for 3-4 Data Engineers with experience using PySpark in Python for a new contract engagement. You will need to be able to commit to 4 or 5 days on a client site in the West Midlands. This role requires SC clearance, or a willingness to go through the SC clearance process.

This role is also open to sole UK nationals only.

We are seeking a seasoned Senior Data Engineer (Infrastructure) to join our team. This role is essential for designing, building, and maintaining sophisticated data infrastructure systems that operate across both on-premises and Azure cloud environments. The position involves deploying and managing scalable data operations that support advanced analytics and data-driven decision-making, crucial for our organisational growth and innovation.

Requirements
• Develop and Manage Data Pipelines: You will design, construct, and maintain efficient and reliable data pipelines using Python, capable of supporting both streaming and batch data processing across structured, semi-structured, and unstructured data in on-premises and Azure environments.
• Hybrid Cloud and Data Storage Solutions: Implement and manage data storage solutions leveraging both on-premises infrastructure and Azure, ensuring seamless data integration and accessibility across platforms.
• Containerisation and Orchestration: Utilise Docker for containerisation and Kubernetes for orchestration, ensuring scalable and efficient deployment of applications across both cloud-based and on-premises environments.
• Workflow Automation: Employ tools such as Apache NiFi and Apache Airflow to automate data flows and manage complex workflows within hybrid environments.
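To illustrate the first duty above, here is a minimal sketch of a batch pipeline step: parsing semi-structured (JSON-lines) input, applying a transformation, and writing structured (CSV) output. In the role itself this would be built with PySpark and orchestrated by Airflow or NiFi; this stdlib-only sketch is purely illustrative, and all names (`load_records`, `transform`, the `amount` field, the threshold) are hypothetical.

```python
import csv
import io
import json

def load_records(jsonl_text):
    """Parse one JSON object per line (semi-structured input)."""
    return [json.loads(line) for line in jsonl_text.splitlines() if line.strip()]

def transform(records, min_amount=0.0):
    """Keep records above a threshold and normalise field types."""
    return [
        {"id": r["id"], "amount": float(r["amount"])}
        for r in records
        if float(r["amount"]) > min_amount
    ]

def write_csv(rows):
    """Serialise the structured rows to CSV (structured output)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "amount"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Tiny example run: two raw records, one filtered out by the threshold.
raw = '{"id": 1, "amount": "10.5"}\n{"id": 2, "amount": "0"}'
output = write_csv(transform(load_records(raw), min_amount=1.0))
```

In a PySpark version, `load_records`/`transform`/`write_csv` would become `spark.read.json`, `DataFrame` operations, and `DataFrame.write`, with Airflow scheduling the whole step as a task in a DAG.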

This role requires you to hold, or be willing to go through, Security Clearance. As part of the onboarding process, candidates will be asked to complete a Baseline Personnel Security Standard check; details of the evidence required to apply can be found on the government website, Gov.UK. If you are unable to meet this and any associated criteria, your onboarding may be delayed or your application rejected.

Details of this will be discussed with you at interview.