
Data Engineer with SQL and Snowflake

This role is for a Data Engineer with SQL and Snowflake, lasting 6 months in Edinburgh (Hybrid). Requires strong SQL Server and Snowflake experience, ETL proficiency, AWS knowledge, and data warehousing expertise. Familiarity with GitLab and large datasets is essential.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date discovered
February 22, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Edinburgh, Scotland, United Kingdom
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Data Governance #Athena #Data Modeling #IAM (Identity and Access Management) #RDS (Amazon Relational Database Service) #Datasets #SQL (Structured Query Language) #Airflow #Scala #Informatica #Redshift #SQL Server #Data Pipeline #Schema Design #S3 (Amazon Simple Storage Service) #Snowflake #Data Integration #Security #Data Manipulation #Version Control #Data Engineering #Cloud #GitLab #Informatica PowerCenter #AWS (Amazon Web Services) #Indexing
Role description

Role: Data Engineer with SQL and Snowflake

Duration: 6 months

Location: Edinburgh (Hybrid)

Required Skills and Qualifications
• Proven experience working with SQL Server (e.g., T-SQL, Stored Procedures, Indexing, Query Optimization, System Catalog Views).
• Strong experience in Snowflake architecture, including data loading, transformation, and performance tuning.
• Proficient in ETL processes using tools such as Informatica PowerCenter/BDM, AutoSys, Airflow, and SQL Server Agent.
• Experience with cloud platforms, preferably AWS.
• Strong knowledge of AWS cloud services, including EMR, RDS Postgres, Redshift, Athena, S3, and IAM.
• Solid understanding of data warehousing principles and best practices.
• Strong proficiency in SQL for data manipulation, reporting, and optimization.
• Knowledge of data modeling and schema design.
• Experience working with large, complex datasets and implementing scalable data pipelines.
• Familiarity with version control tools such as GitLab.
• Experience with data integration, data governance, and security best practices.