Job Description:
">
* Design and build scalable data solutions that support business growth.
* Collaborate closely with product managers, analysts, and engineers to develop tools that democratise access to data.
">
Main Responsibilities:
">
* Develop robust ETL/ELT pipelines using dbt, Kafka, and Change Data Capture (CDC); a brief sketch of one such pipeline step follows this list.
* Design and optimise a Snowflake-based data warehouse with dimensional modelling.
* Automate infrastructure using Terraform and manage workflows in a cloud-native environment.
* Integrate data from multiple sources with best practices for scalability and security.
* Write clean, maintainable Python code and implement software development best practices.
* Leverage AWS services such as S3, Lambda, API Gateway, and SageMaker.
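
To give a concrete flavour of the pipeline work described above, here is a minimal sketch of a CDC-style landing step: it reads change events from a Kafka topic and writes the raw payloads to S3 for downstream Snowflake/dbt modelling. It assumes the confluent-kafka and boto3 libraries; the broker address, topic name, and bucket are hypothetical placeholders, not details from this posting.

```python
import json
import time

import boto3
from confluent_kafka import Consumer

# Hypothetical connection details; real values would come from config/secrets.
BROKER = "localhost:9092"
TOPIC = "orders.cdc"          # hypothetical CDC topic
BUCKET = "raw-landing-zone"   # hypothetical S3 bucket

consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "cdc-s3-landing",
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])
s3 = boto3.client("s3")

try:
    while True:
        msg = consumer.poll(1.0)          # wait up to 1s for the next change event
        if msg is None:
            continue
        if msg.error():
            print(f"Kafka error: {msg.error()}")
            continue
        event = json.loads(msg.value())   # raw CDC payload as JSON
        key = f"{TOPIC}/{int(time.time() * 1000)}-{msg.offset()}.json"
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(event).encode("utf-8"))
except KeyboardInterrupt:
    pass
finally:
    consumer.close()
```

In practice the landed files would then be loaded into Snowflake (for example via Snowpipe or COPY INTO) and modelled with dbt; those steps are omitted here.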
">
Requirements:
">
* 2–3 years' experience in data engineering or backend development.
* Advanced SQL skills with experience in dimensional modelling and performance tuning.
* Proficiency with Snowflake, dbt, and data streaming tools such as Kafka or Flink (see the example query after this list).
* Familiarity with Infrastructure as Code (Terraform) and AWS cloud services.
* Strong Python coding skills and understanding of modern data engineering practices.
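
As a small illustration of the SQL, Snowflake, and Python skills listed above, the sketch below runs a star-schema style aggregation through the snowflake-connector-python package. The connection parameters and the fact/dimension table names (fact_orders, dim_customer) are hypothetical examples, not details from this posting.

```python
import snowflake.connector

# Hypothetical connection details; in practice these come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)

# A typical dimensional-model query: aggregate a fact table by a dimension attribute.
QUERY = """
    SELECT d.customer_segment,
           SUM(f.order_amount) AS total_revenue
    FROM fact_orders AS f
    JOIN dim_customer AS d
      ON f.customer_key = d.customer_key
    GROUP BY d.customer_segment
    ORDER BY total_revenue DESC
"""

try:
    cur = conn.cursor()
    cur.execute(QUERY)
    for segment, revenue in cur.fetchall():
        print(f"{segment}: {revenue}")
finally:
    conn.close()
```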