Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a contract basis, focusing on data migration and infrastructure reengineering over 6 months at a pay rate of "X". Key requirements include Data Vault 2.0, Snowflake, dbt, and 5+ years of data engineering experience.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
Unknown
🗓️ - Date discovered
April 19, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United Kingdom
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #SAP #Data Modeling #Data Cleansing #Scripting #Data Integrity #Snowflake #Data Lifecycle #Storage #Compliance #Scala #Databases #Data Pipeline #Data Governance #Data Vault #Python #Data Integration #Data Engineering #Database Management #Data Processing #Data Storage #Data Transformations #SQL (Structured Query Language) #Vault #Security #Programming #dbt (data build tool)
Role description

Data Engineer

Project Overview:

After a recent merger of two large business units, we are embarking on a project to reengineer and migrate their end-to-end reporting requirements (direct involvement with pipelines and systems) and operational systems (indirect involvement with data). This transition includes a shift from existing ETL processes to a modern data infrastructure, leveraging Data Vault 2.0 modeling, Snowflake for database management, and dbt for data transformation to establish robust data pipelines. In addition, it requires comprehensive cleansing and alignment of existing data sets and data structures according to the new design of the data models. This work specifically targets Customer as a data element across Snowflake, IFS, and SAP ECC 6.0.
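For context, a minimal sketch of what Data Vault 2.0 modeling of the Customer element could look like in Snowflake. All schema, table, and column names here are illustrative assumptions, not the project's actual design:

```sql
-- Illustrative only: hub for the Customer business key, shared across
-- source systems. The raw-vault schema name 'rdv' is an assumption.
create table if not exists rdv.hub_customer (
    hub_customer_hk  varchar(32)   not null,  -- md5 hash of the business key
    customer_bk      varchar(50)   not null,  -- business key, e.g. customer number
    load_dts         timestamp_ntz not null,  -- when the key was first loaded
    record_source    varchar(50)   not null,  -- e.g. 'IFS' or 'SAP_ECC'
    constraint pk_hub_customer primary key (hub_customer_hk)
);

-- Illustrative satellite: descriptive attributes tracked over time, per source.
create table if not exists rdv.sat_customer_ifs (
    hub_customer_hk  varchar(32)   not null,
    load_dts         timestamp_ntz not null,
    hash_diff        varchar(32)   not null,  -- change-detection hash of attributes
    customer_name    varchar(200),
    country_code     varchar(2),
    record_source    varchar(50)   not null,
    constraint pk_sat_customer_ifs primary key (hub_customer_hk, load_dts)
);
```

In this pattern, the same customer key arriving from IFS and SAP ECC resolves to a single hub row, while source-specific satellites preserve each system's attribute history, which is what makes post-merger alignment tractable.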

Key Responsibilities:

  1. Data Modeling and Architecture: Design and implement scalable and robust data pipelines and platforms using Data Vault 2.0 methodology to support high-level reporting and operational requirements.

  2. Data Integration and Pipeline Development: Develop, construct, test, and maintain architectures such as databases and large-scale processing systems using Snowflake and dbt for data transformations (see the dbt sketch after this list).

  3. ETL to ELT Transition: Migrate existing ETL processes to modern ELT processes, ensuring seamless data flow and integration across platforms (the same dbt sketch below illustrates this pattern).

  4. Data Cleansing and Alignment: Conduct comprehensive data cleansing to unify, correct, and standardize large data sets, ensuring data integrity across Snowflake, IFS, and SAP ECC 6.0 according to designs set by the Enterprise Architecture teams (see the cleansing sketch after this list).

  5. Data Governance and Compliance: Recommend data governance policies and procedures for managing the data lifecycle, ensuring compliance with data protection regulations and best practices.
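As a rough illustration of points 2 and 3, this is one way a dbt model might load the hub described above in ELT style, where raw extracts are landed in Snowflake first and all transformation happens in SQL afterwards. The source and model names are assumptions for the sketch (KNA1/KUNNR is SAP's customer master; the IFS names are hypothetical):

```sql
-- models/raw_vault/hub_customer.sql (hypothetical dbt model)
{{ config(materialized='incremental', unique_key='hub_customer_hk') }}

with source_keys as (

    -- Union customer business keys already landed from both source systems
    select customer_no as customer_bk, 'IFS' as record_source
    from {{ source('ifs', 'customer') }}

    union

    select kunnr as customer_bk, 'SAP_ECC' as record_source
    from {{ source('sap_ecc', 'kna1') }}

),

deduped as (

    -- One hub row per business key; keep the first source seen
    select
        upper(trim(customer_bk)) as customer_bk,
        min(record_source)       as record_source
    from source_keys
    group by 1

)

select
    md5(customer_bk)    as hub_customer_hk,
    customer_bk,
    current_timestamp() as load_dts,
    record_source
from deduped

{% if is_incremental() %}
-- ELT: on incremental runs, only insert keys not already in the hub
where md5(customer_bk) not in (select hub_customer_hk from {{ this }})
{% endif %}
```

Because dbt executes these transformations inside Snowflake itself, the ETL-era "transform before load" step disappears: raw data lands first, and all shaping is versioned, testable SQL.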
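And for point 4, cleansing in this setup typically starts with profiling the landed data to find records that disagree across systems. A hypothetical pass to flag customer keys whose names conflict between the IFS and SAP ECC extracts might look like (the raw table names are assumptions):

```sql
-- Hypothetical profiling query: customer keys whose names disagree
-- between the IFS and SAP ECC extracts landed in Snowflake.
with unified as (
    select upper(trim(customer_no)) as customer_bk,
           upper(trim(name))        as customer_name,
           'IFS'                    as record_source
    from raw.ifs_customer
    union all
    select upper(trim(kunnr)),
           upper(trim(name1)),
           'SAP_ECC'
    from raw.sap_kna1
)
select customer_bk,
       count(distinct customer_name) as name_variants,
       listagg(distinct customer_name || ' (' || record_source || ')', '; ')
           as conflicting_values
from unified
group by customer_bk
having count(distinct customer_name) > 1   -- same key, inconsistent names
order by name_variants desc;
```

The output of a query like this would feed the standardization rules agreed with the Enterprise Architecture teams before the cleaned records enter the vault.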

Required Skills & Experience:

  1. Expertise in Data Vault 2.0: Strong experience with Data Vault 2.0 modeling techniques; certification in the Data Vault methodology is ideal.

  2. Proficiency with Snowflake: In-depth knowledge of Snowflake’s data warehousing solutions, including architecture, security, and data storage optimizations.

  3. Experience with dbt (data build tool): Demonstrated capability in using dbt to perform complex data transformations within data pipelines.

  4. Strong Background in Data Engineering: Minimum of 5 years of experience in data engineering, with a focus on building scalable, high-performance data infrastructures.

  5. Programming Skills: Proficiency in SQL and experience with scripting languages such as Python for data processing.