DevOps Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a DevOps Engineer specializing in Cloud & Data Platforms, with an initial 12-month contract in London (3 days onsite, 2 days remote). Key skills include Terraform, Kubernetes, CI/CD, Azure DevOps, and big data technologies.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date discovered
April 16, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Inside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#DevOps #Azure DevOps #Cloud #Kubernetes #Cloudera #Jenkins #Deployment #Big Data #Azure Data Factory #ADF (Azure Data Factory) #Terraform #GitLab #Azure cloud #Spark (Apache Spark) #Vault #Infrastructure as Code (IaC) #Shell Scripting #Data Management #Scripting #MDM (Master Data Management) #Azure #HDFS (Hadoop Distributed File System) #Data Processing #Databricks #Microsoft Azure #Storage #TeamCity
Role description

DevOps Engineer (Cloud & Data Platforms)

Duration: initial 12 months

Location: London (3 days onsite, 2 days remote)

Contractor setup: Umbrella (Inside IR35)

We are seeking a skilled DevOps Engineer to join our team for a high-profile global marketing data platform project. The project aims to centralize client profiles and marketing preferences, ensuring seamless integration of multiple data sources and delivery to marketing platforms.

Key skills needed for this job include expertise in Terraform, Kubernetes, Shell/PowerShell scripting, CI/CD pipelines (GitLab, Jenkins), Azure DevOps, IaC, and experience with big data platforms such as Cloudera, Spark, and Azure Data Factory/Databricks.

Key Responsibilities:

   • Implement and maintain Infrastructure as Code (IaC) using Terraform, Shell/PowerShell scripting, and CI/CD pipelines (GitLab, TeamCity, Jenkins).

   • Manage data flows between multiple systems using Azure Data Factory and Databricks (desirable).

   • Architect and deploy cloud-based solutions with Microsoft Azure, leveraging platform services such as Azure DevOps and Vault.

   • Work with the Technical and Solution Architect teams to design the overall solution architecture for end-to-end data flows.

   • Utilize big data technologies such as Cloudera, Hue, Hive, HDFS, and Spark for data processing and storage.

   • Ensure smooth data management for marketing consent and master data management (MDM) systems.

Key Skills and Technologies:

   • Terraform: Essential for defining infrastructure as code.

   • Kubernetes: Orchestrate containerized applications and services.

   • Shell/PowerShell scripting: Automate processes and deployment pipelines.

   • IaC: Deploy and manage infrastructure through code.

   • CI/CD (GitLab, Jenkins, TeamCity): Continuous integration and delivery for streamlined development workflows.

   • Azure Data Factory/Databricks: Experience with these services is a plus for handling complex data processes.

   • Cloudera (Hue, Hive, HDFS, Spark): Experience with these big data tools is highly desirable for data processing.

   • Azure DevOps, Vault: Core skills for working in Azure cloud environments.

   • Strong problem-solving skills, communication, and team collaboration.