
Senior Data Engineer

Total OTE package: £85K-£100K p.a.


We are looking for a highly skilled Senior Data Engineer to join our UK business and help develop and deliver a new data strategy for our global e-commerce client.

Our client is embarking on a significant business transformation project as they migrate from on-premises Hadoop and set up a new Google Cloud / Databricks platform. We need top-notch Data Engineers to join our team and help guide the client's product & technology teams through this journey.


Your main goal is to work with the client's third parties to help deliver a platform that provides insights based on reliable and secure data, so everyone can spend more time delivering value. In executing this mission, you will work closely with engineers, architects, and product owners.


Main responsibilities:

You will help support the migration from Hadoop to Google Cloud. This will involve:

  • Build data ingestion and consumption services for batch and streaming to democratize data access.

  • Build tools for managing end-to-end ML workflows for experimentation, training, evaluation and serving, reducing the time from idea to impact globally.

  • Build shared feature stores and serving systems to make curated data features reusable across domains.

  • Contribute to the architecture of the client's system and help make it more scalable and resilient. The client wants you to spot opportunities to improve development work in all areas, from coding and processes to tools and testing.

  • Define, plan and deliver data capabilities.

  • Communicate and align with peers and cross-functional stakeholders.

  • Drive technical excellence and strike the right balance between quality and speed of delivery.

  • Consistently support the target architecture by identifying areas of critical need based on future growth.

  • Be the go-to expert within the organisation, and an influencer within the team, across several technologies and technical areas.

  • Actively participate in department-wide cross-functional tech initiatives. 


Role requirements:

  • You have experience designing and productionizing large-scale distributed systems built around machine-learned models and big data, including experience with MLOps and associated best practices.

  • You have strong expertise in Java/Scala and/or Python programming languages.

  • You have experience with batch and streaming technologies, e.g. Apache Flink, Apache Spark, Apache Beam, Google Dataflow.

  • You have expertise with distributed data stores (Cassandra, Google Bigtable, Redis, ClickHouse, Elasticsearch) and messaging systems (Kafka, Google Pub/Sub) at scale.

  • You have experience with Linux, Docker, and private and/or public cloud (OpenStack and GCP preferred).

  • You have experience with CI/CD technologies.

  • You have experience with infrastructure-as-code development, e.g. Terraform.

  • You have good knowledge of GCP services (storage and processing).

  • You have experience working in a lead/senior role.

  • You have 8+ years of relevant industry experience, with at least 3 years of data and machine learning related work or research.

  • You have worked in teams using agile methodologies, e.g. Scrum.


To be considered for this role, please send your CV to applications@dynamicfutures.co.uk, quoting reference DF-SDE-2022.


Closing date for applications: 28th February 2022.

