
Hadoop Big Data Developer

This role is for a "Hadoop Big Data Developer" in Windsor, UK, on a 6-month contract at £400 per day. It requires 10+ years of IT experience and 5+ years of Hadoop development, with expertise in PySpark and Spark SQL; a utilities background is preferred.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
400
🗓️ - Date discovered
February 21, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Wokingham, England, United Kingdom
🧠 - Skills detailed
#Databases #Python #Data Processing #Hadoop #"ETL (Extract, Transform, Load)" #Data Storage #Security #MySQL #Spark SQL #PostgreSQL #Spark (Apache Spark) #MongoDB #PySpark #Big Data #Data Privacy #Data Pipeline #Storage #SQL (Structured Query Language) #Documentation #Kafka (Apache Kafka)
Role description

Job Title: Hadoop Big Data Developer

Location: Windsor, UK

Work Model: Hybrid (office visit once per week)

Duration: 6 months

Start Date: 17-03-2025

Rate Payable: £400 per day

Number of Contractors Required: 1

Job Purpose & Primary Objectives:
• Role: Hadoop Big Data Developer

Key Responsibilities:

• Work closely with the development team to assess existing Big Data infrastructure
• Design and code Hadoop applications to analyze data compilations
• Create data processing frameworks
• Extract and isolate data clusters
• Test scripts to analyze results and troubleshoot bugs
• Create data tracking programs and documentation
• Maintain security and data privacy

Key Skills & Knowledge:
• Build, schedule, and maintain data pipelines
• Expertise in PySpark, Spark SQL, Hive, Python, and Kafka
• Strong experience in Data Collection and Integration, Scheduling, Data Storage, ETL (Extract, Transform, Load)
• Knowledge of relational and non-relational databases (MySQL, PostgreSQL, MongoDB)
• Good written and verbal communication skills
• Experience in managing business stakeholders for requirement clarification

Experience Required:
• 10+ years of total IT experience
• 5+ years of Hadoop development experience
• Candidates from a utilities background will be preferred