Software Engineer
Job description
We are looking for a Software Engineer to join our Prudential & Analytics Platform to help transform our commercial model execution, analytics engine, and enterprise data warehouse. You'll play a key role in modernising our cloud architecture and enabling scalable, data-driven capabilities that support critical pricing, risk, and finance functions across the organisation.

This is a great opportunity to shape the next generation of our analytics and data processing platforms using Google Cloud technologies, modern DevOps tooling, and best-in-class engineering practices. Your work will contribute directly to delivering reliable, high-quality analytical services to the wider Group.

Working as part of a feature team, you'll design, build, and deliver cloud-native solutions that modernise our commercial analytics ecosystem. You'll collaborate closely with data scientists, analysts, architects, and engineering peers to build scalable platforms using Google Cloud Platform, Python, SQL, and modern DevOps pipelines. You'll transform legacy capabilities into modernised, automated, and observable cloud services; build high-quality data pipelines; optimise BigQuery workloads; and support containerised deployments across our environments. You'll also play an active role in shaping engineering standards, driving automation, and improving the resilience and performance of our platforms.
Requirements
- Strong software engineering experience (typically 3-5+ years).
- Proficiency in Python and SQL.
- Hands-on experience with key Google Cloud Platform services, ideally including BigQuery, Cloud Composer (Airflow), Cloud Storage, Logs Explorer, and Data Catalog.
- Experience building or migrating capabilities into cloud environments.
- Experience with DevOps and CI/CD pipelines, ideally using Harness.
- Understanding of containerisation and orchestration technologies, including Docker and Kubernetes.
- Experience building scalable data pipelines, analytics engines, or data warehouse solutions.
- Experience working in an agile development team.

Any of the following would be great to see:
- Broader GCP skills or certification.
- Experience working with Airflow DAGs, distributed compute workloads, or orchestration patterns.
- Infrastructure-as-code experience (Terraform, Deployment Manager).
- Exposure to data modelling, optimisation of analytical workloads, or ELT/ETL patterns.
- Familiarity with observability tooling, logging, and cost-optimisation techniques.
- Experience working with distributed teams and cross-functional delivery groups.
- Experience with additional programming languages (e.g., Java, Go, TypeScript).