Senior Data Engineer
Job description
The Finance Platform - AI & Agentic team is looking for a passionate and experienced Senior Data Engineer to join our growing team! If you thrive on enabling high-performing software, streamlining data pipelines and automating ETL processes, you're in the right place.
You'll play a key role in crafting and delivering our Finance AI & Agentic data strategy, applying modern cloud-native technologies to accelerate data delivery, improve operational resilience, and ensure compliance with security and engineering standards. You'll work with Data Analytics managers and business stakeholders to build finance-focused AI solutions: your role will be to source and understand data, build data pipelines, and identify how to automate them.
You'll be comfortable with the architecture and use of all our data products (CDPs, ODPs and FDPs), creating strong reuse and sourcing of data for our AI & ML models. You'll work with solution architects to create software and data solutions that support new AI and ML applications, with security, control and lineage incorporated into your detailed designs. You'll work closely with data scientists, ML & AI engineers, and software and DevOps engineers to collaborate on solutions that best serve our internal finance customers.
- Lead the design, implementation, and maintenance of scalable, secure, and performant data engineering pipelines and tooling on Google Cloud Platform (GCP).
- Collaborate with engineers, architects, and product teams to enable data pipelines and data transformations, and to script data loads.
- Build BigQuery datasets and materialised views to support the Data Analytics teams, working to keep our data costs low and queries streamlined (see the sketch after this list).
- Implement and use tools such as dbt, Dataflow and EasyIngest to manage data effectively.
- Champion the use of endorsed technologies and common build patterns to minimise technical debt.
- Mentor junior engineers and support recruitment to grow Data Engineering capability across the team, lab and platform.
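To give a flavour of the BigQuery work above, here is a minimal sketch of creating a materialised view with the google-cloud-bigquery Python client. The project, dataset, table and column names are hypothetical, chosen purely for illustration, and not part of the actual stack.

```python
# Minimal sketch: create a BigQuery materialised view so repeated analytics
# queries hit precomputed results instead of rescanning the base table.
# All names below (example-project, finance_dataset, etc.) are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

view = bigquery.Table("example-project.finance_dataset.daily_spend_mv")
view.mview_query = """
    SELECT currency, DATE(created_at) AS day, SUM(amount) AS total_spend
    FROM `example-project.finance_dataset.transactions`
    GROUP BY currency, day
"""

# BigQuery refreshes the view incrementally, which keeps query costs down
# relative to repeatedly aggregating the raw transactions table.
client.create_table(view)
```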
Requirements
- Understanding of DevOps practices applied to data engineering with cloud-native tooling, including YAML configuration, CI/CD, source code management, and orchestration.
- Proficiency with automation and scripting in Python (including frameworks such as Apache Beam), Bash, or your preferred language (see the pipeline sketch after this list).
- A passion for process improvement, operational excellence, and platform reliability.
- Ability to lead initiatives independently and influence engineering best practices.
- Excellent stakeholder engagement, communication, and teamwork skills.
- Experience using gcloud commands for provisioning resources in containerized applications
- Data integration with multiple internal platforms: AI tools (Vertex, Cortex), Machine Learning as a Service, and on-prem databases.
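For the pipeline scripting mentioned above, a minimal Apache Beam sketch in Python might look like the following. The bucket, project, dataset and schema are hypothetical placeholders, and on GCP the pipeline options would typically select the Dataflow runner.

```python
# Minimal sketch of a batch pipeline: read CSV rows from Cloud Storage,
# parse them, and append them to a BigQuery table. All resource names
# (example-bucket, example-project, finance_dataset) are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_row(line: str) -> dict:
    """Split one CSV line into the columns the BigQuery schema expects."""
    txn_id, amount, currency = line.split(",")
    return {"txn_id": txn_id, "amount": float(amount), "currency": currency}


# In production this would carry flags such as --runner=DataflowRunner.
options = PipelineOptions()

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText(
            "gs://example-bucket/transactions.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_row)
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:finance_dataset.transactions",
            schema="txn_id:STRING,amount:FLOAT,currency:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
    )
```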
And any experience of these would be useful...
- Understanding of cost optimisation in cloud environments.
- Understanding of how to use API and MCP connectivity to support our applications and data pipelines.
Benefits & conditions
We also offer a wide-ranging benefits package, which includes…
- A generous pension contribution of up to 15%
- An annual bonus award, subject to Group performance
- Share schemes including free shares
- Benefits you can adapt to your lifestyle, such as discounted shopping
- 30 days' holiday, with bank holidays on top
- A range of wellbeing initiatives and generous parental leave policies