Data Scientist - SPACE AGENCY!

This role is for a Data Scientist on a 12-month contract in Guildford, paying competitively. Key skills include strong SQL, cloud platform experience, Python or Scala proficiency, and ETL knowledge. Previous data engineering experience in a fast-paced environment is required.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
425
🗓️ - Date discovered
February 20, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid (on-site with one remote day per week)
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Guildford
🧠 - Skills detailed
#Data Architecture #Data Accuracy #AWS (Amazon Web Services) #Data Governance #Kafka (Apache Kafka) #Spark (Apache Spark) #Business Analysis #SAP #SQL (Structured Query Language) #Big Data #Data Science #DBA (Database Administrator) #GCP (Google Cloud Platform) #DevOps #Hadoop #Data Engineering #Data Processing #SAP Analytics #Data Warehouse #Scala #Python #Data Pipeline #ETL (Extract, Transform, Load) #Cloud #Security #Azure
Role description

Data Scientist - 12-Month Contract - Guildford
IT Talent is working with our space agency client to find a Data Scientist for a 12-month contract in Guildford. This is an on-site role with one day of remote work per week.
Key Responsibilities:
• Develop insightful reports and dashboards to support critical business decisions.
• Ensure data accuracy by collaborating with the database administrator.
• Maintain a single source of truth for all data-related decisions.
• Optimise existing reports for performance and usability.
• Transfer current Data Warehouse reports into SAP Analytics Cloud (SAC).
• Integrate data from multiple sources using ETL processes.
• Troubleshoot reporting issues and document processes.
• Work closely with business analysts, data engineers, and stakeholders.
Skills Required:
• Strong SQL skills for data querying and manipulation
• Experience with cloud platforms (AWS, Azure, or GCP)
• Proficiency in Python or Scala for data processing
• Knowledge of ETL processes and data pipelines
• Familiarity with big data technologies (Spark, Hadoop, Kafka)
• Previous work in data engineering, ideally in a fast-paced environment
• Hands-on experience designing and optimising data architectures
• Experience working with structured and unstructured data sources
Bonus Points:
• Experience with DevOps & CI/CD for data pipelines
• Familiarity with data governance and security best practices

This is an exciting opportunity to work in the space sector on high-impact data projects. Apply now to find out more!