Senior Data Engineer
Job description
This is a senior, hands-on technical role with real ownership. As our Senior Data Engineer, you'll lead the design, development, and optimisation of our cloud-based data platform, ensuring it is scalable, secure, and fit for the future.
You'll act as a technical authority within the data engineering function, mentoring others, shaping best practice, and working closely with engineering, analytics, security, and business stakeholders to enable high-quality, trusted data across TCFG.
While the role is home-based, regular travel to our national sites is required to build strong relationships and understand how data supports our operations on the ground.
Key Responsibilities
Data Architecture & Engineering Leadership
- Lead the design and delivery of robust ELT/ETL pipelines using modern, scalable patterns
- Define and promote best practices in pipeline design, orchestration, and transformation
- Provide technical leadership, mentoring junior engineers and contributing to architectural decisions
Advanced Pipeline Design & Optimisation
- Architect real-time, batch, and micro-batch data workflows
- Use orchestration tools such as Airflow, Azure Data Factory, and dbt Cloud for scheduling, observability, and lineage
- Monitor and optimise performance, reliability, and cost efficiency
Cloud Platform Ownership
- Take ownership of data infrastructure across Azure (ADLS, Synapse, Event Hubs) and hybrid environments
- Implement CI/CD pipelines, DevOps practices, and infrastructure-as-code (Terraform/Bicep)
- Design for high availability, disaster recovery, and fault tolerance
Data Governance, Security & Privacy
- Embed security- and privacy-by-design across all data solutions
- Work closely with Information Security to ensure GDPR and internal policy compliance
- Lead data lineage, classification, audit logging, and access control initiatives
Integration & Interoperability
- Design and maintain integrations with ERP platforms (e.g. Infor M3, ION), planning tools, and external data sources
- Build reusable ingestion frameworks for structured and semi-structured data (JSON, XML, CSV, Avro, Parquet)
Data Quality & Observability
- Implement testing and validation frameworks (dbt tests, Great Expectations, or similar)
- Ensure clear documentation and lineage from source to consumption
- Build observability dashboards and alerts to ensure reliability and transparency
Collaboration & Delivery
- Translate business needs into scalable, well-engineered data solutions
- Contribute to agile planning, estimation, and roadmap delivery
- Maintain clear documentation including architecture decisions and data standards
Requirements
- 7+ years' experience in data engineering, including senior or lead-level technical delivery
- Expert knowledge of SQL and Python
- Strong experience with Azure data services (ADF, Synapse, Event Hubs, ADLS)
- Proven experience building and optimising lakehouse architectures (Databricks, Delta Lake, Snowflake)
- Hands-on experience with orchestration tools and data modelling approaches
- Experience integrating APIs and event-driven platforms
- Strong understanding of data security, GDPR, encryption, and IAM
- Experience with CI/CD pipelines and DevOps practices
Desirable
- Experience in manufacturing or FMCG environments (ERP, MES, TPM systems)
- Familiarity with Microsoft Purview or data cataloguing tools
- Exposure to Power BI, Tableau, or similar BI platforms
- Relevant certifications (e.g. Azure DP-203, dbt, cloud architecture)