

Data Engineer
Insight Global is seeking multiple Data Engineers to join a prestigious energy client based in London. The successful candidates will be responsible for designing, implementing, and managing live data streaming pipelines for the client’s Energy Trading team.
Key Responsibilities:
• Pipeline Management: Design, implement, and manage live data streaming pipelines using Azure Databricks to ensure seamless data flow and real-time processing.
• Process Evaluation: Assess and optimize on-premises-to-cloud data exchange processes for accuracy, efficiency, and scalability.
• Code Review and Debugging: Conduct thorough code reviews and debugging sessions, providing guidance and mentorship to junior data engineers to ensure high-quality code and best practices.
• Problem Solving: Develop innovative solutions to address computing and cost challenges, leveraging advanced technologies and methodologies.
• Remote Collaboration: Work fully remotely within the UK, maintaining effective communication and collaboration with US teams during overlapping working hours.
Must Haves:
- Deep expertise with Azure Databricks (Delta Live Tables (DLT), data streaming, Unity Catalog, etc.).
- Proven experience designing high-volume, live data streaming solutions using Azure DLT (Delta Live Tables).
- Expertise with Apache Spark and PySpark (able to review code quality and debug issues).
- Experience with Qlik Replicate to move data from on-prem to the cloud.
- Background in data warehousing (SAP HANA, SAP BI/BW, Oracle, etc.).
- Proficient with SQL.