Transforming Data for Better Outcomes
The Data Engineer plays a pivotal role in shaping the Commission's ability to provide accurate, timely, and trusted insights, and in designing, developing, and delivering robust data pipelines, models, and processes that support the Commission's mission.
The successful candidate will be responsible for building, maintaining, and optimizing modern cloud-based data assets that enable effective decision-making, strengthen risk management, and improve the quality of care for older Australians. Key responsibilities include:
* Develop, maintain, and optimize data pipelines and workflows in Snowflake and related cloud platforms.
* Build, test, and maintain dbt models and transformations aligned with Medallion architecture principles (Bronze, Silver, Gold).
* Apply data modeling techniques to structure and optimize data assets and ensure their accessibility.
* Implement dbt tests and quality frameworks to maintain accuracy, integrity, and reliability of data.
* Contribute to data governance activities, including tagging, masking, and compliance with security policies.
* Support CI/CD pipelines and deployment workflows for dbt projects using Azure DevOps or similar tools.
* Prepare and maintain dbt documentation, including lineage and model descriptions, to improve transparency and governance.
* Troubleshoot and resolve technical issues to ensure reliable and trusted data products.
* Produce and maintain clear technical documentation to support data engineering practices and business use.
* Collaborate with technical and non-technical stakeholders to deliver business-aligned data solutions.
To succeed in this role, you will need to demonstrate strong experience in data engineering within cloud environments. You should possess a solid understanding of data architecture, ELT/ETL pipelines, and the Medallion framework.
Additionally, you should have practical experience with data quality frameworks, metadata management, and governance controls.
You must be able to work in agile, fast-paced environments delivering iterative improvements, and be a strong problem-solver and analytical thinker with high attention to detail.
Effective communication and collaboration skills are essential, as you will work closely with senior data engineers, analysts, and business teams.
Demonstrated aptitude and willingness to learn and adopt new tools, technologies, and best practices are also required.
Benefits:
As a valued member of our team, you will receive a competitive salary package and the opportunity to work on challenging projects that make a real difference in people's lives.
We are committed to diversity and inclusion, and foster a supportive workplace culture where everyone feels valued and respected.
We recognize the importance of work-life balance and offer flexible working arrangements to support your needs.
Position Eligibility Requirements:
* Strong experience in data engineering within cloud environments.
* Strong skills across Snowflake, dbt, SQL, Python, and Azure.
* Experience with databases and relational data modeling.
* Solid understanding of data architecture, ELT/ETL pipelines, and the Medallion framework.
* Practical experience with data quality frameworks, metadata management, and governance controls.
* Ability to work in agile, fast-paced environments delivering iterative improvements.
* Strong problem-solving and analytical skills with high attention to detail.
* Effective communicator with the ability to collaborate across technical and business teams.
* Demonstrated aptitude and willingness to learn and adopt new tools, technologies, and best practices.
How to Apply:
If you are a motivated and experienced data engineer looking for a new challenge, please submit your application by 11:59 pm AEDT, Monday 5 January 2026.
We look forward to receiving your application and discussing this exciting opportunity further.