
Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Kafka Engineer/Data Engineer on a contract basis, located in Leeds, UK, hybrid (3 days in-office). Key skills include extensive Apache Kafka experience, real-time architecture, and knowledge of cloud platforms like GCP.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date discovered
April 3, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Inside IR35
🔒 - Security clearance
Unknown
📍 - Location detailed
Leeds, England, United Kingdom
🧠 - Skills detailed
#NoSQL #S3 (Amazon Simple Storage Service) #Data Processing #Spark (Apache Spark) #Security #JDBC (Java Database Connectivity) #Data Ingestion #Scala #Apache Kafka #GCP (Google Cloud Platform) #MongoDB #Databases #Data Pipeline #Monitoring #Kafka (Apache Kafka) #Cloud #Data Engineering #Storage #Deployment
Role description

Here are the job details:

Role: Kafka Engineer / Data Engineer

Location: Leeds, UK

Mode of work: Hybrid (3 days in office)

Job type: Contract, Inside IR35

Job Description:

A Kafka Real-Time Architect is responsible for designing and implementing scalable, real-time data processing systems on Kafka. This role involves architecting Kafka clusters, ensuring high availability, and integrating with other data processing tools and platforms.

As part of the CTO Data Ingestion Service, the incumbent will be responsible for:

   • Designing and architecting scalable, real-time systems in Kafka.

   • Configuring, deploying, and maintaining Kafka clusters to ensure high availability and scalability.

   • Integrating Kafka with other data processing tools and platforms such as Kafka Streams, Kafka Connect, Spark Streaming, Schema Registry, Flink, and Beam.

   • Collaborating with cross-functional teams to understand data requirements and design solutions that meet business needs.

   • Implementing security measures to protect Kafka clusters and data streams.

   • Monitoring Kafka performance and troubleshooting issues to ensure optimal performance.

   • Providing technical guidance and support to development and operations teams.

   • Staying updated with the latest Kafka features, updates, and industry practices.
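To make the cluster-availability responsibility above concrete: high availability in Kafka typically comes down to replication settings on the brokers. The snippet below is an illustrative sketch of broker defaults for a three-broker cluster, not a configuration taken from this role; the values shown are common starting points, not requirements of the posting.

```properties
# Illustrative server.properties defaults for a 3-broker cluster
# (values are example choices, not from the job description)
default.replication.factor=3
min.insync.replicas=2
unclean.leader.election.enable=false
```

With these settings, producers using `acks=all` will have writes acknowledged only once two in-sync replicas hold the record, so the cluster tolerates the loss of a single broker without data loss.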

Required Skills & Experience

   • Extensive experience with Apache Kafka and real-time architecture, including event-driven frameworks.

   • Strong knowledge of Kafka Streams, Kafka Connect, Spark Streaming, Schema Registry, Flink, and Beam.

   • Experience with cloud platforms such as GCP Pub/Sub.

   • Excellent problem-solving skills.

Knowledge & Experience / Qualifications:

   • Knowledge of Kafka data pipelines and messaging solutions to support critical business operations and enable real-time data processing.

   • Monitoring Kafka performance to enhance decision-making and operational efficiency.

   • Collaborating with development teams to integrate Kafka applications and services.

   • Maintaining an architectural library of Kafka deployment models and patterns.

   • Helping developers to maintain Kafka connectors such as JDBC, MongoDB, and S3 connectors, along with topic schemas, to streamline data ingestion from databases, NoSQL data stores, and cloud storage, enabling faster data insights.
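As a sketch of the connector work described above, a Kafka Connect JDBC source connector that streams a database table into a topic might be configured along these lines. The connector name, database URL, credentials, and table are placeholders invented for illustration, and the exact property set depends on the connector version in use:

```json
{
  "name": "jdbc-source-orders",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://db-host:5432/sales",
    "connection.user": "connect_user",
    "connection.password": "********",
    "table.whitelist": "orders",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "db-",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://schema-registry:8081"
  }
}
```

Pairing the connector with an Avro converter backed by Schema Registry, as here, is what keeps topic schemas versioned and compatible as the source tables evolve.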

Thanks & Regards,