

Data Engineer – HIRING ASAP
Start date: ASAP
Duration: 6 Months
Location: Remote
Rate: £450 - £500 per day, outside IR35
Summary
We are currently working with a new-generation consultancy, based across the UK and EU, founded on the principles of engineering excellence and empowering people to make an impact. All of their consultants have equity in the company, genuinely love what they do, and are very good at it.
They work with modern tech stacks and typically run Agile Scrum on all their projects.
Responsibilities
• Work closely with the business to understand their current data problems, and analyse and cleanse their data. Design data storage solutions and communicate effectively with your team.
Key Skills
• Python at a software engineering level, including unit and integration testing experience.
• Distributed computing knowledge with PySpark or Scala, including the ability to debug jobs in the Spark UI and optimise accordingly.
• AWS experience.
• Good understanding of data modelling, change data capture, and/or ACID-compliant table structures.
• Good understanding of data lakes/lakehouses and a very good understanding of traditional data platforms.
• Experience with different aspects of ingestion: API calls, batch, and/or streaming, pulling and/or pushing data.
• Good understanding of SQL and NoSQL databases.
• Git version control and CI/CD experience.
• Experience with at least one cloud provider, including solid and extensive data platform builds (AWS preferred).
• Container experience with Docker or Kubernetes.
• Consulting experience – working on projects with other consultancies and multiple stakeholders.