About the Role
We're seeking an experienced software engineer to join a specialist team focused on building a scalable data platform.
The successful candidate will be instrumental in transforming a legacy architecture into a containerized, cloud-ready system.
Responsibilities include:
* Designing and implementing robust components for ingesting, processing, and persisting high-frequency telemetry data
* Collaborating with data scientists to host, orchestrate, and optimize workloads written in a variety of languages
* Designing and building components using technologies like Apache Spark, Delta Lake, Redis, MQTT, and PostgreSQL
* Driving modernization through containerization, deployment on Kubernetes, and integration with S3-compatible object stores
Required Skills and Qualifications
Extensive experience in backend development with languages such as Java, Scala, and Python is required.
Additionally, the ideal candidate should have:
* A proven track record working in teams to develop large, complex applications
* Deep understanding of streaming and batch data processing, ideally with Apache Spark or similar
* Experience with containerization (Docker) and orchestration (Kubernetes)
* Familiarity with data lake/lakehouse architectures, especially Delta Lake
* Strong knowledge of message brokers (MQTT, Kafka) and caching systems (Redis)
Benefits
This role offers a competitive remuneration package with an attractive bonus and share options.
Our company is committed to safety and wellbeing, and maintains a strong commitment to diversity, inclusion, and equal opportunity.
Why Work with Us?
As part of our forward-thinking team, you'll help shape the future of our data-centric systems.
With opportunities for career development and international prospects, this role offers genuinely meaningful work.