
Senior Data Engineer

This role is for a Senior Data Engineer; the contract length and pay rate are not specified. Key skills include Databricks, AWS services, Python, and SQL. Requires 5+ years of Data Engineering experience focused on cloud platforms.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date discovered
February 21, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#Data Modeling #BI (Business Intelligence) #Python #AWS (Amazon Web Services) #Redshift #Data Processing #Databricks #ETL (Extract, Transform, Load) #Data Science #Security #Spark SQL #Delta Lake #Data Governance #Data Integration #S3 (Amazon Simple Storage Service) #Data Engineering #Spark (Apache Spark) #Athena #PySpark #Cloud #Migration #SQL (Structured Query Language) #Data Migration #Data Quality #Scala #Data Architecture #Lambda (AWS Lambda)
Role description

Job Specific Duties & Responsibilities:
• Develop & optimize ETL/ELT pipelines using Databricks, PySpark, and SQL
• Migrate on-premises data to AWS cloud, ensuring integrity and performance
• Design and implement scalable data models to support analytics and business intelligence
• Manage and develop stored procedures for efficient data processing
• Perform data integration & transformation across various cloud and on-prem data sources
• Enhance data analytics capabilities by processing and structuring data for reporting
• Optimize AWS-based data architecture (e.g., S3, Redshift, Glue, Lambda, Athena)
• Ensure data quality, governance, and security across the pipeline
• Collaborate with data scientists, analysts, and business teams to drive insights

Required Skills, Experience & Qualifications:
• 5+ years of experience in Data Engineering with a focus on cloud platforms
• Strong expertise in Databricks (PySpark, SQL, Delta Lake)
• Proficiency in AWS services (S3, Redshift, Glue, Lambda, Athena, Step Functions)
• Advanced skills in Python & SQL, including writing stored procedures
• Experience in large-scale data migration to the cloud
• Knowledge of data modeling, warehousing, and analytics
• Strong experience in data integration, transformation, and ETL pipelines
• Understanding of data governance, security, and performance optimization