Senior Data Engineer

This role is for a "Senior Data Engineer" on a 9-month contract paying £500 - £520 per day. It requires expertise in Data Vault 2.0, Snowflake, and dbt, plus at least 5 years' experience in data engineering, with a focus on data modeling and pipeline development.
🌎 - Country
United Kingdom
💱 - Currency
£ GBP
💰 - Day rate
£500 - £520
🗓️ - Date discovered
January 18, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
London Area, United Kingdom
🧠 - Skills detailed
#Data Cleansing #Programming #Data Integrity #ETL (Extract, Transform, Load) #Data Integration #Scala #Security #Database Management #Data Pipeline #Data Transformations #Data Vault #Snowflake #Scripting #Data Engineering #dbt (data build tool) #Data Modeling #Data Lifecycle #Vault #Data Processing #Databases #Storage #Compliance #Data Storage #SQL (Structured Query Language) #SAP #Data Governance #Python
Role description

Data Engineer

£500 - £520 per day

9 Month contract

Fully Remote

After a recent merger of two large business units, my client is embarking on a project to reengineer and migrate its end-to-end reporting (direct involvement with pipelines and systems) and operational systems (indirect involvement with data). This transition includes a shift from existing ETL processes to a modern data infrastructure, leveraging Data Vault 2.0 modeling, Snowflake for database management, and dbt for data transformation to establish robust data pipelines. In addition, it requires comprehensive cleansing and alignment of existing data sets and data structures to match the new design of the data models, specifically targeting Customer as a data element across Snowflake, IFS, and SAP ECC 6.0.
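
For context, Data Vault 2.0 keys its hubs on deterministic hashes of business keys, which is what lets a Customer record arriving from IFS or SAP ECC 6.0 be matched to the same entity in Snowflake. A minimal Python sketch of that idea; the function name and the choice of MD5 are illustrative assumptions, not mandated by the role:

```python
import hashlib

def hub_hash_key(business_key: str) -> str:
    """Deterministic surrogate key in the Data Vault 2.0 style:
    the same business key always hashes to the same hub key."""
    # Common DV2.0 convention: trim and uppercase before hashing so
    # formatting noise from different source systems cancels out.
    normalised = business_key.strip().upper()
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

# The same customer number from two source systems lands on one key.
assert hub_hash_key("  cust-10042 ") == hub_hash_key("CUST-10042")
```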

We need someone with dbt, Snowflake and Data Warehousing skills.

Key Responsibilities:

  1. Data Modeling and Architecture: Design and implement scalable and robust data pipelines and platforms using Data Vault 2.0 methodology to support high-level reporting and operational requirements.

  2. Data Integration and Pipeline Development: Develop, construct, test, and maintain architectures such as databases and large-scale processing systems using Snowflake and dbt for data transformations.

  3. ETL to ELT Transition: Transition existing ETL processes to modern ELT processes, ensuring seamless data flow and integration across platforms; a minimal ELT sketch follows this list.

  4. Data Cleansing and Alignment: Conduct comprehensive data cleansing to unify, correct, and standardize large data sets, ensuring data integrity across Snowflake, IFS, and SAP ECC 6.0 systems according to designs set by Enterprise Architecture teams.

  5. Data Governance and Compliance: Recommend data governance policies and procedures to manage the data lifecycle, ensuring compliance with data protection regulations and best practices.

  6. Performance Optimization: Optimize data retrieval and processing speeds to enhance user interactions with data-driven applications and reports.
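
The ETL-to-ELT shift in item 3 means landing raw extracts in Snowflake first and transforming them inside the warehouse afterwards, the step dbt owns in the target architecture. A minimal sketch using the snowflake-connector-python package; all connection details and object names below are placeholders:

```python
import snowflake.connector  # pip install snowflake-connector-python

# Connection details are placeholders, not real credentials.
conn = snowflake.connector.connect(
    account="my_account",
    user="elt_user",
    password="...",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
)
cur = conn.cursor()

# "L" before "T": land the raw Customer extract untransformed.
cur.execute(
    "COPY INTO raw.customer FROM @raw_stage/customer/ "
    "FILE_FORMAT = (TYPE = CSV)"
)

# Then transform inside the warehouse with SQL -- the step a dbt
# model would own, shown here as a plain statement for illustration.
cur.execute("""
    CREATE OR REPLACE TABLE staging.customer AS
    SELECT DISTINCT
        TRIM(UPPER(customer_id)) AS customer_key,
        INITCAP(customer_name)   AS customer_name
    FROM raw.customer
""")

cur.close()
conn.close()
```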

Required Skills & Experience:

  1. Data Vault 2.0 Expertise: Knowledge of, or experience with, Data Vault 2.0 modeling techniques; certification in the Data Vault methodology is ideal.

  2. Proficiency with Snowflake: In-depth knowledge of Snowflake’s data warehousing solutions, including architecture, security, and data storage optimizations.

  3. Experience with dbt (data build tool): Demonstrated capability in using dbt for performing complex data transformations within data pipelines.

  4. Strong Background in Data Engineering: Minimum of 5 years of experience in data engineering, with a focus on building scalable and high-performance data infrastructures.

  5. Programming Skills: Proficiency in SQL and experience with scripting languages such as Python for data processing.