Seeking a Senior Data & Analytics Engineer with recent Databricks experience for this rapidly expanding and successful business in Western Sydney!

We are looking for a high-performing, hands-on Data Engineer to design and implement robust data pipelines using Databricks, with a focus on data modelling for sales, operations, marketing and financial workstreams.

Key responsibilities
- Delivering solutions: Define problem statements, conduct business analysis, build requirements, and design and deliver solutions
- Pipeline Architecture: Design, build, and maintain scalable end-to-end data pipelines in Databricks (PySpark/SQL), including orchestrating Medallion-architecture pipelines (Unity Catalog, Delta Live Tables)
- Advanced Data Modelling: Develop star/snowflake schemas and vault models that unify disparate data from various ERP sources
- Domain Expertise: Build complex logic for sales, operations and financial reporting, including analytics (pipeline health, churn, conversion)
- BI Delivery: Develop and optimise high-performance semantic layers to ensure "one version of the truth"
- User Training: Provide training to business users
- Performance Tuning: Optimise large-scale Spark jobs to ensure lightning-fast reporting for stakeholders, including query optimisation, indexing strategies, and partition management to reduce latency in BI reports
- Security & Governance: Implement Row-Level Security (RLS) and Object-Level Security (OLS) across Databricks and SQL environments to ensure access to sensitive financial data is restricted

Success metrics
- Business Value: User satisfaction and uptake of the data services by our users in running day-to-day operations
- Data Reliability: Zero "data mismatch" incidents reported by the Finance team during month-end closing
- Operational Excellence: Implementation of automated alerts for pipeline failures and database performance bottlenecks
- Cost Efficiency: Reduction in monthly Databricks/cloud spend through optimised SQL and cluster management
- Reporting Speed: Reduced load times for the primary Finance & Sales dashboards

Required skills
- Core Tech: Solid data engineering experience with a focus on Databricks (Medallion architecture)
- Data Modelling: Proven expertise in dimensional modelling and translating complex business processes into logical data structures
- BI Tools: Expert-level proficiency in a major BI tool, including complex calculated fields
- Programming: Strong proficiency in Python (PySpark) and SQL

What's in it for you?
- A friendly, professional environment that has been recognised as a 'Great Place to Work' for five consecutive years
- A greenfield opportunity to make your mark on a true Business & Technology Transformation
- Outstanding career development and learning
- Work on multiple CRM and ERP platforms, including SAP S/4HANA Public Cloud
- Hybrid working

If this sounds like you, click on the Apply tab or send your resume directly to me at