Data Engineer
Job description
Roles at Senior Associate level apply technical knowledge to design, build, and improve software and infrastructure. Scope includes working more independently, contributing to team design decisions, and taking responsibility for the quality of delivered work. The impact is increasing delivery speed, system reliability, and shared team knowledge.
Competencies
Technical: I apply designs, deploy and test software, infrastructure or networking solutions using growing domain knowledge, including knowledge of the relevant financial markets. I can debug, solve problems and take ownership of features, systems or configurations.
Customer focus: I consider customer needs in my work, contribute ideas to improve our product, respond quickly to operational issues that affect or risk affecting customers, and seek feedback to improve how we work.
Risk Management: I apply secure engineering practices to reduce delivery risk across software and infrastructure. I use testing, observability, automation and safe configuration to identify and resolve risks. I flag blockers and contribute to safe, resilient delivery.
Leadership: I take accountability for delivering reliable work. I support and coach peers and share knowledge openly. I contribute to team outcomes by collaborating effectively and helping others succeed through trust and shared ownership.
Analytical: I apply structured thinking and data to solve problems, test assumptions, and learn from outcomes. I support team decisions through evidence, insight, and continuous improvement.
Adaptability: I continuously learn and adapt to new priorities, try different approaches, and share what works with my team. I make learning my craft a regular habit, not just something I pick up through daily work.
Communication: I contribute to team discussions and communicate clearly with peers. I share context, ask questions and help the team align around shared work.
Role profile
We are looking for a talented and passionate Data Engineer to join the SwapClear Data Services development team, part of the SwapClear Technology function supporting the LCH SwapClear Clearing business. The team is responsible for designing, building, and supporting critical interfaces and systems that facilitate SwapClear Clearing operations. This Data Engineer role will focus on designing, building, and maintaining the Data Services technology stack and related systems that are critical to SwapClear Clearing operations. You will be assigned to a scrum team and work collaboratively within an Agile environment to ensure the smooth execution of the software development lifecycle, from planning and development to deployment and support. You will report to the SwapClear Interface Development Team Lead while actively contributing to the delivery of high-quality solutions. The successful candidate needs to be hands-on, with a strong and deep understanding of data engineering and data warehousing, knowledge of front-end and cloud development, and a proactive approach to all aspects of development. You will also be expected to contribute to testing, ensuring that solutions meet high standards of quality and reliability.
What you'll be doing
- Take ownership of the analysis, design, development and delivery of software solutions
- Take responsibility for identifying, estimating and reporting on the progress of tasks, liaising with business analysts, developers and continuous integration teams
- Contribute to unit testing, system integration testing, and participate in test case design and execution
- Support production systems, troubleshoot issues and provide timely fixes, including participation in the out-of-hours support rota
- Document technical specifications, workflows and system designs
- Work within an Agile/Scrum team, actively participating in sprint planning, daily stand-ups and retrospectives
Requirements
- 3+ years of practical experience in system design, application development, testing, and operational stability using SQL, Python, ETL/ELT, Informatica PowerCenter, Business Objects, Snowflake, and AWS
- Expertise in Data Warehousing, Data Science & Analytics
- Expertise in Python coding, Spark and distributed computing
- Demonstrable working knowledge of relational databases, including Oracle and Snowflake
- Experience with performance tuning and system optimization
- Experience with Test Driven Development
- Experience working with the Git source control tool
- Exposure to agile engineering practices such as CI/CD, application resiliency, and security
- Adaptability to changing project requirements and technologies
Desirable skills
- Experience with the finance domain and knowledge of FpML
- Exposure to containerization and orchestration tools like Docker and Kubernetes
- Exposure to cloud technologies