Data Engineer

NatWest Group
Edinburgh, United Kingdom
7 days ago

Role details

Contract type
Permanent contract
Employment type
Full-time (> 32 hours)
Working hours
Regular working hours
Languages
English

Job location

Edinburgh, United Kingdom

Tech stack

Airflow
Amazon Web Services (AWS)
Data Architecture
Information Engineering
ETL
Data Systems
Data Warehousing
Python
Performance Tuning
SQL Databases
Data Ingestion
Snowflake
Git
Kafka
Software Version Control
Data Pipelines

Job description

  • You'll be the voice of our customers, using data to tell their stories and put them at the heart of all decision-making
  • We'll look to you to drive the build of effortless, digital first customer experiences
  • If you're ready for a new challenge and want to make a far-reaching impact through your work, this could be the opportunity you're looking for

What you'll do

As a Data Engineer, you'll simplify our organisation by developing innovative, data-driven solutions through data pipelines, modelling and ETL design, aspiring to be commercially successful while keeping our customers and the bank's data safe and secure.

You'll drive customer value by understanding complex business problems and requirements, then applying the most appropriate, reusable tools to gather data and build solutions. You'll support our strategic direction by engaging with the data engineering community to deliver opportunities, and by carrying out complex data engineering tasks to build a scalable data architecture.

Your responsibilities will also include:

  • Building advanced automation into data engineering pipelines by removing manual stages
  • Embedding new data techniques into our business through role modelling, training, and experiment design oversight
  • Delivering a clear understanding of data platform costs to meet your department's cost-saving and income targets
  • Sourcing new data using the most appropriate tooling for the situation
  • Developing solutions for streaming data ingestion and transformations in line with our streaming strategy

Requirements

To thrive in this role, you'll need strong experience of Snowflake for data warehousing, along with writing efficient SQL and managing schemas. You'll also bring proficiency in Airflow for orchestration and workflow management, as well as hands-on experience with AWS services, particularly S3 and Lambda.
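That combination typically comes together in pipelines like the minimal sketch below: an Airflow DAG (2.4+ style) that pulls a daily extract from S3 with boto3. The bucket, key and DAG names are hypothetical placeholders rather than NatWest systems, and a production pipeline would stage the file and COPY it into Snowflake instead of reading it inside the task.

    import json
    from datetime import datetime

    import boto3
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def ingest_daily_extract(ds: str, **_) -> None:
        # Fetch the day's extract from S3. A real pipeline would stage the
        # file and COPY it into Snowflake rather than process it here.
        s3 = boto3.client("s3")
        obj = s3.get_object(
            Bucket="example-landing-bucket",      # hypothetical bucket
            Key=f"extracts/{ds}/customers.json",  # hypothetical key layout
        )
        records = json.loads(obj["Body"].read())
        print(f"{ds}: fetched {len(records)} records")

    with DAG(
        dag_id="daily_customer_ingest",           # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ):
        PythonOperator(
            task_id="ingest_daily_extract",
            python_callable=ingest_daily_extract,
        )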

You'll have excellent communication skills with the ability to proactively engage and manage a wide range of stakeholders.

Additionally, you'll need:

  • Expert-level knowledge of ETL/ELT processes, along with in-depth knowledge of data warehousing and data modelling
  • Experience with Kafka concepts such as producers, consumers and topics, and the ability to integrate them into streaming pipelines (see the sketch after this list)
  • Proficiency in Python for data engineering and version control systems such as Git
  • The ability to lead technical initiatives along with experience of mentoring junior colleagues
  • Knowledge of Snowflake performance tuning would be hugely beneficial
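For the Kafka point above, here is a minimal sketch using the open-source kafka-python client. The broker address, topic and consumer group names are placeholders, not real systems.

    from kafka import KafkaConsumer, KafkaProducer

    BROKER = "localhost:9092"      # placeholder broker address
    TOPIC = "customer-events"      # placeholder topic name

    # Producer: publish one JSON-encoded event to the topic.
    producer = KafkaProducer(bootstrap_servers=BROKER)
    producer.send(TOPIC, b'{"customer_id": 42, "event": "login"}')
    producer.flush()

    # Consumer: read the events back as part of a (hypothetical) group.
    consumer = KafkaConsumer(
        TOPIC,
        bootstrap_servers=BROKER,
        group_id="ingest-demo",
        auto_offset_reset="earliest",
        consumer_timeout_ms=5000,  # stop iterating after 5s of silence
    )
    for message in consumer:
        print(message.topic, message.offset, message.value)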

Apply for this position