Data Engineer

Birchgrove
Cobham, United Kingdom
4 days ago

Role details

Contract type
Temporary contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English
Compensation
£13K

Job location

Cobham, United Kingdom

Tech stack

API
Data Integration
ETL
Data Warehousing
Python
Marketing Automation
Operational Data Store
Power BI
Systems Integration
Data Logging
Snowflake
Build Management
Operational Systems
Webhooks

Job description

We're looking for an experienced Data Engineer to join Birchgrove on a 3-month contract to deliver several clearly defined, high-impact data integration projects.

This is a hands-on, delivery-focused role. You'll design, build and document reliable, production-grade ETL/ELT pipelines that integrate operational systems into our cloud data warehouse, enabling improved reporting and analytics across the business.

You'll be joining at an exciting stage in our data journey, helping us move from early foundations to a more connected, scalable and dependable data platform.

Key Project Deliverables

During the contract, you will deliver the following priority projects:

  1. Fall detection system integration
  • Ingest data from a fall detection platform using APIs and webhooks (a webhook receiver sketch follows this list)
  • Land and model the data in Snowflake
  • Implement reliability best practices: monitoring, alerting, logging, retries, and clear documentation
  2. Resident management system integration
  • Extract and ingest data from our resident management system
  • Design robust data models to support reporting on neighbour wellbeing and operations
  • Ensure maintainable transformations and clear data definitions
  3. Facilities management systems integration
  • Design and build an API-based integration between two facilities management systems
  • Enable joined-up reporting across maintenance, safety and operational data
  • Deliver clean, consistent datasets suitable for analytics and dashboards
  4. Marketing automation platform integration
  • Ingest data from our marketing platform using APIs
  • Land and model the data in Snowflake
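
To give a flavour of the first deliverable, here is a minimal webhook receiver sketch for fall events, assuming an HMAC-signed payload. The endpoint path, signature header, and the FALLS_WEBHOOK_SECRET variable are illustrative assumptions, not the vendor's actual API.

```python
# Minimal webhook receiver sketch for the fall-detection integration.
# The endpoint path, signature header, and FALLS_WEBHOOK_SECRET are
# illustrative assumptions, not the vendor's actual API.
import hashlib
import hmac
import json
import logging
import os

from fastapi import FastAPI, Header, HTTPException, Request

app = FastAPI()
log = logging.getLogger("falls_webhook")
SECRET = os.environ.get("FALLS_WEBHOOK_SECRET", "").encode()

@app.post("/webhooks/falls")
async def receive_fall_event(request: Request, x_signature: str = Header(default="")):
    raw = await request.body()
    # Verify the HMAC signature so only the vendor can post events.
    expected = hmac.new(SECRET, raw, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, x_signature):
        raise HTTPException(status_code=401, detail="invalid signature")
    event = json.loads(raw)
    # Log every event before processing so failures are traceable
    # (the "monitoring, alerting, logging" requirement above).
    log.info("fall event received: id=%s", event.get("event_id"))
    # In production the raw payload would be landed in Snowflake (e.g. via
    # a stage or queue) for downstream modelling; here we just acknowledge.
    return {"status": "accepted"}
```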

These projects will directly support improved insight, faster decision-making and better outcomes for our neighbours and team.

Tools & Technology Stack

You'll work with and help establish best practice around the following tools:

  • Snowflake (cloud data warehouse)
  • Fivetran (managed ingestion)
  • Airbyte (custom & API-based integrations)
  • dbt (transformations, testing and documentation)
  • Power BI (analytics and dashboards)

Requirements

  • Python
  • API-driven pipeline design (authentication, pagination, rate limiting, incremental loads; a paginated-extract sketch follows this list)
  • Webhook ingestion patterns and event-driven data capture
  • Building reliable, well-monitored pipelines with clear documentation and ownership
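
To make the API bullet concrete, below is a sketch of an incremental, paginated extract with basic rate-limit handling. The endpoint URL, bearer-token auth, and cursor/timestamp field names are hypothetical placeholders, not a specific vendor's API.

```python
# Sketch of an incremental, paginated API extract with basic rate-limit
# handling. The endpoint, bearer auth, and cursor/timestamp fields are
# hypothetical placeholders.
import time

import requests

BASE_URL = "https://api.example.com/v1/events"  # placeholder endpoint

def fetch_incremental(token: str, since: str) -> list[dict]:
    """Pull all records updated after `since`, following page cursors."""
    headers = {"Authorization": f"Bearer {token}"}
    params = {"updated_after": since, "page_size": 100}
    records: list[dict] = []
    while True:
        resp = requests.get(BASE_URL, headers=headers, params=params, timeout=30)
        if resp.status_code == 429:
            # Back off for as long as the server asks, then retry the page.
            time.sleep(int(resp.headers.get("Retry-After", "5")))
            continue
        resp.raise_for_status()
        payload = resp.json()
        records.extend(payload["data"])
        cursor = payload.get("next_cursor")
        if not cursor:
            break
        params["cursor"] = cursor  # advance to the next page
    return records
```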

About You

  • Proven experience as a Data Engineer, delivering pipelines end-to-end in modern cloud stacks
  • Strong hands-on skills with APIs, webhooks, and pipeline-based ETL/ELT
  • Confident using Python for data integration and automation
  • Comfortable implementing practical reliability patterns (e.g., idempotency, retries, monitoring, alerting), as sketched below
  • Strong data modelling and transformation experience (ideally with dbt)
  • Able to work independently while collaborating closely with non-technical stakeholders
  • Motivated by purpose-driven work and using data to improve real lives
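
The reliability bullet above can be shown in a few lines: this sketch wraps a batch loader in retries with exponential backoff and jitter. The load_batch function is a hypothetical example, assumed to be idempotent so that retries are safe.

```python
# Retry-with-backoff wrapper, one of the reliability patterns mentioned
# above. The load_batch loader below is a hypothetical example.
import functools
import logging
import random
import time

log = logging.getLogger("pipeline")

def with_retries(max_attempts: int = 5, base_delay: float = 1.0):
    """Retry a flaky step with exponential backoff plus jitter."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception as exc:
                    if attempt == max_attempts:
                        raise  # surface the failure for alerting
                    delay = base_delay * 2 ** (attempt - 1) + random.uniform(0, 1)
                    log.warning("%s failed (attempt %d/%d): %s; retrying in %.1fs",
                                fn.__name__, attempt, max_attempts, exc, delay)
                    time.sleep(delay)
        return wrapper
    return decorator

@with_retries()
def load_batch(batch_id: str) -> None:
    # Idempotent by design: re-running the same batch_id should overwrite
    # rather than duplicate rows (e.g. a MERGE into the Snowflake target).
    ...
```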
