Senior Data Engineer – Microsoft Fabric (Snowflake or Databricks)
We're growing our data engineering practice and need a Senior Data Engineer who can own delivery as well as build. This person will lead project squads – running stand-ups, managing scope, keeping the RAID log up to date, and reporting to stakeholders – while also being hands‑on in Microsoft Fabric, with strong experience in at least one of Snowflake or Databricks. This role is for someone who can keep a data project on track, on budget, and visible to the business.
Note: the role requires the candidate to be Sydney-based, able to work a minimum of 4 days per week in the office, and to have full working rights in Australia.
Core Responsibilities:
* Lead delivery for a small data squad: plan iterations, run ceremonies, manage scope and risks, maintain the RAID log, and provide regular status updates to stakeholders.
* Work with architects/consultants on solution design, estimation, and statements of work (SoWs); demo increments to business and executive users.
* Design and build scalable data models and pipelines on Microsoft Fabric (OneLake, Lakehouse, Pipelines, Dataflows Gen2, Semantic Models) and at least one of Snowflake or Databricks.
* Implement notebook/dbt transformations (tests, docs, exposures) and enforce best practices (modularity, lineage, CI checks).
* Set up CI/CD for data projects (Git, Azure DevOps or GitHub Actions).
* Optimise performance and cost (Fabric capacities, pipeline runtimes; Snowflake warehouses or Databricks clusters).
* Ensure security and governance (RLS/column security, PII handling, Purview/lineage).
* Keep documentation, patterns, and reusable components current for the practice.
Required Qualifications:
* 5+ years of data engineering experience in consulting or complex enterprise environments.
* Sydney-based and able to work a minimum of 4 days per week in the office.
* Must have full working rights in Australia.
* Proven experience leading project workstreams/small teams – sprint planning, stakeholder comms, status reporting, risk management.
* Strong hands‑on experience with Microsoft Fabric and at least one of Snowflake or Databricks.
* Solid CI/CD for data, strong SQL and Python, familiarity with Parquet/Delta/Iceberg.
* Excellent communication skills – able to engage both engineers and business stakeholders.
Nice to have:
* AWS data stack, Terraform/Bicep, AWS DMS.
* Data governance & lineage (Purview), RLS/CLS, masking.
* Testing/observability tools (Great Expectations, Soda, Monte Carlo/Revefi).
* Power BI/Tableau, DAX/Tabular.
* Formal PM experience or having acted as a project lead on data deliveries.
What you'll get in return:
Work with marquee Australian clients on modern stacks (Snowflake, Fabric, Databricks). Real ownership with support from senior architects and practice leads. Flexible hybrid work and a supportive culture.