We are seeking an experienced Senior Data Engineer with deep expertise in the Microsoft Azure Data Platform to deliver high-impact data solutions that empower the business to make data-driven decisions. This contract role will initially run until the 30th of June.
This role will sit within the NSW state government and calls for strong problem-solving skills, a solid architectural understanding, and proven experience building scalable, governed data and BI solutions that deliver reliable data models and meaningful business insights.
Responsibilities
Design and build scalable data models, including well-structured facts, dimensions and semantic layers, using PySpark, Synapse, Fabric Lakehouse/Warehouse and Power BI, ensuring alignment with business and analytical requirements.
Design and implement end-to-end data ingestion frameworks in Synapse Pipelines and Data Factory to source data from APIs, databases and files. Build efficient ELT frameworks in Synapse and Fabric to integrate and transform large, complex datasets.
Partner with architects, analysts, and business stakeholders to design end-to-end data products, while mentoring engineers and promoting best practices in data design and delivery.
Required Skills
Expert in data warehousing, data modelling and end-to-end business intelligence solutions using Power BI, SQL & DAX.
Expert level knowledge in PySpark & SQL.
Hands-on experience with Synapse and Fabric (Spark + Pipelines).
Strong background in Lakehouses, data platform frameworks and data mesh architecture.
Experience with Azure DevOps, branching strategies and pipelines.
Experience delivering data solutions in projects involving large data volumes.
Experience applying AI/LLMs to data engineering (nice to have).
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV. If this job isn't quite right for you but you are looking for a new position, please contact us for a confidential discussion about your career.