We are seeking a skilled Data Engineer to join our team.
About the Job
We are expanding our data ecosystem and need someone who can develop and maintain robust ETL/ELT pipelines. You will design and evolve our Snowflake-based data warehouse, with a focus on dimensional modeling and performance optimization.
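To give candidates a concrete sense of this work, the sketch below shows a toy version of dimensional modeling in Python with pandas (our choice of library for the example; the table and column names are invented and do not reflect our actual schema):

```python
import pandas as pd

# Toy star-schema example: split raw order events into a customer
# dimension (unique customers with generated surrogate keys) and a
# fact table that references the dimension by key.
raw_orders = pd.DataFrame({
    "order_id": [1001, 1002, 1003],
    "customer_email": ["a@example.com", "b@example.com", "a@example.com"],
    "amount": [25.00, 40.00, 15.50],
})

# Dimension: one row per customer, surrogate key from the row index.
dim_customer = (
    raw_orders[["customer_email"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .rename_axis("customer_key")
    .reset_index()
)

# Fact: measures plus the foreign key into the dimension.
fact_orders = raw_orders.merge(dim_customer, on="customer_email")[
    ["order_id", "customer_key", "amount"]
]
```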
You will also automate infrastructure and deployments using Terraform, manage workflows in a cloud-native environment, and integrate data from multiple sources. Hands-on experience with Snowflake and data transformation tools such as dbt is essential.
This mid-senior level position requires advanced SQL skills, hands-on experience building and operating data pipelines, and experience with Apache Kafka and Flink. You should be proficient in Python, ideally with experience building internal utilities or data-specific libraries.
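As an example of what we mean by a data-specific utility, here is a minimal, self-contained sketch (the function name and use case are illustrative, not an actual module of ours):

```python
from typing import Iterable, Iterator


def batched(rows: Iterable[dict], size: int) -> Iterator[list[dict]]:
    """Yield fixed-size batches of rows, e.g. for chunked bulk loads
    into a warehouse. The final batch may be smaller than `size`."""
    batch: list[dict] = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch
```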
A problem-solving mindset with a strong focus on delivering business value through data is essential. A bachelor's degree in Computer Science, Mathematics, or Statistics is required.
Key Responsibilities:
* Develop and maintain robust ETL/ELT pipelines.
* Design and evolve our Snowflake-based data warehouse.
* Automate infrastructure and deployments using Terraform.
* Integrate data from multiple sources.
Requirements:
* Advanced SQL skills.
* Hands-on experience building and operating data pipelines.
* Experience with Snowflake and dbt.
* Proficiency in Python.
* Bachelor's degree in Computer Science, Mathematics, or Statistics.
Nice to Have:
* Experience working with event-driven architectures and CDC tools (see the sketch after this list).
* Familiarity with orchestration tools.
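As a sketch of the event-driven work mentioned above, here is a minimal Kafka consumer in Python using the kafka-python library (one of several client options; the topic name, broker address, and payload shape are placeholders):

```python
import json

from kafka import KafkaConsumer  # kafka-python client

# Minimal sketch: consume change-data-capture events from a Kafka topic.
# "cdc.orders" and the broker address are placeholders, and the payload
# is assumed to carry Debezium-style "op" / "after" fields.
consumer = KafkaConsumer(
    "cdc.orders",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # A real pipeline would apply the change to a downstream table here.
    print(event.get("op"), event.get("after"))
```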