
Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer; the contract length and pay rate are not specified. Key skills include extensive experience with Apache Kafka, real-time architecture, and cloud platforms, specifically GCP Pub/Sub.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date discovered
April 3, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Leeds, England, United Kingdom
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #Apache Kafka #GCP (Google Cloud Platform) #MongoDB #Monitoring #Data Processing #Security #Spark (Apache Spark) #JDBC (Java Database Connectivity) #Documentation #Kafka (Apache Kafka) #Cloud #Data Engineering #Scala
Role description

Job Description:

We are seeking a Kafka Real-Time Architect to design and implement scalable, real-time data processing systems. This role involves architecting Kafka clusters, ensuring high availability, and integrating with other data platforms.

Responsibilities:

   • Design and deploy scalable Kafka-based real-time systems.

   • Configure, maintain, and optimize Kafka clusters for high availability.

   • Integrate Kafka with tools like Kafka Streams, Kafka Connect, Spark Streaming, Flink, and Beam.

   • Implement security and performance monitoring for Kafka environments.

   • Support development teams with Kafka connectors (JDBC, MongoDB, S3) and schema management.

   • Maintain Kafka architecture documentation and best practices.
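For context on the connector work mentioned above, a Kafka Connect JDBC source connector is typically declared as a small JSON configuration submitted to the Connect REST API. The sketch below is illustrative only; the connection URL, table name, and topic prefix are placeholder values, not details from this role:

```json
{
  "name": "orders-jdbc-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://db-host:5432/ordersdb",
    "connection.user": "connect_user",
    "connection.password": "********",
    "table.whitelist": "orders",
    "mode": "incrementing",
    "incrementing.column.name": "order_id",
    "topic.prefix": "jdbc-",
    "tasks.max": "1"
  }
}
```

In a setup like this, the connector polls the `orders` table for rows with a new `order_id` and publishes them to the `jdbc-orders` topic; schema management of the resulting records is usually handled alongside a schema registry.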

Required Skills:

   • Extensive experience with Apache Kafka, event-driven frameworks, and real-time architecture.

   • Strong knowledge of Kafka Streams, Kafka Connect, Spark Streaming, Flink, and Beam.

   • Experience with cloud platforms (preferably GCP Pub/Sub).

   • Excellent problem-solving and troubleshooting skills.