Senior Data Engineer

Phoenix Group
Birmingham, United Kingdom
17 days ago

Role details

Contract type
Permanent contract
Employment type
Part-time (≤ 32 hours)
Working hours
Shift work
Languages
English
Experience level
Senior
Compensation
£60K

Job location

Birmingham, United Kingdom

Tech stack

API
Azure
Computer Programming
Data Architecture
Information Engineering
Data Integration
ETL
Data Systems
Data Warehousing
Interoperability
Python
Microsoft SQL Server
Operational Data Store
Performance Tuning
Salesforce
Scala
SQL Databases
Data Streaming
Data Storage Technologies
Data Ingestion
Data Lake
Data Pipelines
Databricks

Job description

We have an incredible opportunity here at Phoenix Group for a Senior Data Engineer to join our Engineering and Delivery team within Group IT. This is a pivotal role for candidates with a strong background in data and engineering who want to shape how data drives every aspect of a modern pensions business. From operational efficiency and digital transformation to regulatory compliance and customer engagement, your work will influence decisions and enable change across the organisation.

As a Senior Data Engineer, you will be responsible for designing, implementing, and optimizing data solutions on cloud platforms, with a strong emphasis on Databricks. Beyond analytics, you will help embed data capabilities into core business processes, supporting areas such as operations, digital services, risk management, accounting, and actuarial work. You will collaborate with cross-functional teams, including data scientists, analysts, product owners, and operational leaders, to ensure data is a trusted, integrated asset powering innovation and business outcomes.

Key responsibilities:

  • Design and implement end-to-end data engineering solutions across multiple platforms, including Azure, Databricks, SQL Server, and Salesforce, enabling seamless data integration and interoperability.

  • Architect and optimize Delta Lake environments within Databricks to support scalable, reliable, and high-performance data pipelines for both batch and streaming workloads.
  • Develop and manage robust data pipelines for operational, analytical, and digital use cases, leveraging best practices for data ingestion, transformation, and delivery.
  • Integrate diverse data sources (cloud, on-premises, and third-party systems) using connectors, APIs, and ETL frameworks to ensure consistent and accurate data flow across the enterprise.
  • Implement advanced data storage and retrieval strategies that support operational data stores (ODS), transactional systems, and analytical platforms.
  • Collaborate with cross-functional teams (data scientists, analysts, product owners, and operational leaders) to embed data capabilities into business processes and digital services.
  • Optimize workflows for performance and scalability, addressing bottlenecks and ensuring efficient processing of large-scale datasets.
  • Apply security and compliance best practices, safeguarding sensitive data and ensuring adherence to governance and regulatory standards.
  • Create and maintain comprehensive documentation for data architecture, pipelines, and integration processes to support transparency and knowledge sharing.

We are committed to ensuring that everyone feels accepted, and we welcome applicants from all backgrounds. If your experience looks different from what we've advertised and you believe you can bring value to the role, we'd love to hear from you.

Requirements

  • Proven experience in enterprise-scale data engineering, with a strong focus on cloud platforms (Azure preferred) and cross-platform integration (e.g., Azure, Salesforce, SQL Server).
  • Deep expertise in Databricks and Delta Lake architecture, including designing and optimizing data pipelines for batch and streaming workloads.
  • Strong proficiency in building and managing data pipelines using modern ETL/ELT frameworks and connectors for diverse data sources.
  • Hands-on experience with operational and analytical data solutions, including ODS, data warehousing, and real-time processing.
  • Solid programming skills in Python, Scala, and SQL, with experience in performance tuning and workflow optimization.
  • Experience with cloud-native services (Azure Data Factory, Synapse, Event Hub, etc.) and integration patterns for hybrid environments.

Benefits & conditions

Salary and benefits: £45,000 - £60,000, plus a 16% bonus (up to 32%), private medical cover, 38 days annual leave, excellent pension, 12x salary life assurance, career breaks, income protection, 3x volunteering days, and much more.

About the company

We want to be the best place that any of our 6,600 colleagues has ever worked. We're Phoenix Group, a long-term savings and retirement business. We offer a range of products across our market-leading brands: Standard Life, SunLife, Phoenix Life and ReAssure. Around 1 in 5 people in the UK has a pension with us. We're a FTSE 100 organisation tackling key issues such as transitioning our portfolio to net zero by 2050, and we're not done yet.

Apply for this position