Snowflake Senior Developer
Resourgenix Ltd
7 days ago
Role details
Contract type: Permanent contract
Employment type: Full-time (> 32 hours)
Working hours: Regular working hours
Languages: English
Experience level: Senior
Job location:
Tech stack
Airflow
Amazon Web Services (AWS)
Azure
Cluster Analysis
Databases
Data Governance
ETL
Data Systems
Data Vault Modeling
Data Warehousing
Python
Performance Tuning
Role-Based Access Control
Power BI
SQL Databases
Tableau
Data Processing
Snowflake
Caching
Debezium
Kafka
Looker Analytics
Data Pipelines
Job description
We are seeking a Snowflake Senior Developer to design, develop, and optimise data solutions on our cloud data platform. You will work closely with data engineers, analysts, and architects to deliver high-quality, scalable data pipelines and models. Strong expertise in Snowflake, ETL/ELT, data modelling, and data warehousing is essential.
- Snowflake Development: Build and optimise Snowflake objects (databases, schemas, tables, views, tasks, streams, resource monitors); an illustrative sketch follows this list.
- ETL/ELT Pipelines: Develop and maintain robust data pipelines using tools like dbt, Airflow, Azure Data Factory, or similar.
- Data Modelling: Implement dimensional models (star/snowflake schemas), handle SCDs, and design efficient structures for analytics.
- Performance Tuning: Optimise queries, manage clustering, caching, and warehouse sizing for cost and speed.
- Data Quality: Implement testing frameworks (dbt tests, Great Expectations) and ensure data accuracy and freshness.
- Security & Governance: Apply RBAC, masking policies, and comply with data governance standards.
- Collaboration: Work with BI teams to ensure semantic alignment and support self-service analytics.
- Documentation: Maintain clear technical documentation for pipelines, models, and processes.
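For context on the kinds of Snowflake objects and policies referenced above, here is a minimal sketch using the snowflake-connector-python package. All connection details, object names (orders_stream, load_orders_task, email_mask), and roles are hypothetical placeholders, not a description of this role's actual environment.
```python
# Illustrative sketch only: create a stream, a scheduled task, and a column
# masking policy via snowflake-connector-python. All names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",   # placeholder account identifier
    user="DEV_USER",               # placeholder credentials
    password="********",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# Capture change data on a source table with a stream.
cur.execute("CREATE OR REPLACE STREAM orders_stream ON TABLE RAW.ORDERS")

# Schedule a task that loads new rows whenever the stream has data.
cur.execute("""
    CREATE OR REPLACE TASK load_orders_task
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE  = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('orders_stream')
    AS
      INSERT INTO ORDERS_FCT SELECT * FROM orders_stream
""")
cur.execute("ALTER TASK load_orders_task RESUME")  # tasks start suspended

# Mask PII for every role except an authorised one.
cur.execute("""
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('ANALYST_ROLE') THEN val ELSE '***MASKED***' END
""")
cur.execute("ALTER TABLE CUSTOMERS MODIFY COLUMN EMAIL SET MASKING POLICY email_mask")

cur.close()
conn.close()
```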
Requirements
- Matric and a Degree in IT
- Strong SQL skills (complex queries, performance tuning) and proficiency in Python for data processing.
- Experience with ETL/ELT tools (dbt, Airflow, ADF, Informatica, Matillion).
- Solid understanding of data warehousing concepts (Kimball, Data Vault, normalization).
- Familiarity with cloud platforms (Azure preferred; AWS/GCP acceptable).
- Knowledge of data governance, security, and compliance (GDPR).
- Excellent problem-solving and communication skills.
- Experience with Snowpark, UDFs, dynamic tables, and external tables.
- Exposure to streaming/CDC (Kafka, Fivetran, Debezium).
- BI tool integration (Power BI, Tableau, Looker).
- Certifications: SnowPro Core or Advanced.