We're looking for a Senior Data Engineer to join a major data and analytics initiative, delivering high-quality, scalable solutions across the organisation. You'll play a key role in designing and building robust ETL pipelines and working with modern cloud-based data technologies to support data-driven decision-making.

Tech Stack:
- Snowflake - Cloud data warehouse
- Python - Scripting and data engineering
- Apache Airflow - Workflow orchestration

Key Responsibilities:
- Develop and maintain ETL pipelines to support data ingestion, transformation, and loading from a variety of sources
- Collaborate with analysts, data scientists, and stakeholders to understand data requirements
- Build well-structured, reusable, and performant data models
- Schedule and monitor workflows using Apache Airflow
- Optimise Snowflake performance and manage data pipelines in a secure, efficient manner

About You:
- Strong experience as a Data Engineer working on large-scale data and analytics projects
- Proven technical expertise in Snowflake, Python, and Apache Airflow
- Deep understanding of ETL best practices and data pipeline architecture
- Ability to work independently and communicate effectively with cross-functional teams
- Familiarity with data governance and quality standards

Desirable:
- Experience with CI/CD in a data environment
- Exposure to dbt, Terraform, or other modern data tools