We are seeking a Data Engineering professional to help us design and develop scalable, efficient data pipelines using Kafka.
Key Responsibilities
* Work closely with key stakeholders to understand and fulfill complex data integration needs
* Architect and implement robust and efficient real-time data pipelines using Kafka
* Develop and manage ETL/ELT processes and workflows in Python to ensure data quality and pipeline efficiency
* Write and maintain SQL queries, stored procedures, and views within our Snowflake data lake/warehouse
* Drive the design, development, and maintenance of our AWS architecture
Requirements
* Bachelor's degree in Computer Science, Mathematics, or an equivalent field, or demonstrated experience in a similar role
* 5+ years of experience in a data-centric role
* Proven experience with Kafka, Terraform, Buildkite, and AWS or similar services
* Experience in building and maintaining data pipelines
* Understanding of data architecture and infrastructure-as-code (IaC) principles
This is an excellent opportunity for a skilled Data Engineer to work in a dynamic environment and expand our self-service capabilities by building user-friendly tools and platforms.