Looking for a high-performing, hands-on Data Engineer to design and implement robust data pipelines in Databricks, with a focus on data modelling for sales, operations, marketing and financial workstreams.
Key responsibilities
* Delivering Solutions: Define problem statements, conduct business analysis, gather requirements, and design and deliver solutions.
* Pipeline Architecture: Design, build, and maintain scalable end-to-end data pipelines in Databricks (PySpark/SQL), including orchestrating Medallion-architecture pipelines with Unity Catalog and Delta Live Tables.
* Advanced Data Modelling: Develop star/snowflake schemas and Data Vault models that unify disparate data from multiple ERP sources.
* Domain Expertise: Build complex logic for sales, operations and financial reporting and analytics (pipeline health, churn, conversion).
* BI Delivery: Develop and optimise high-performance semantic layers to ensure "one version of the truth."
* User Enablement: Provide training and support to business users.
* Performance Tuning: Optimise large-scale Spark jobs to ensure fast reporting for stakeholders, including query optimisation, indexing strategies, and partition management to reduce latency in BI reports.
* Security & Governance: Implement Row-Level Security (RLS) and Object-Level Security (OLS) across Databricks and SQL environments to ensure sensitive financial data is restricted.
Success metrics
* Business Value: User satisfaction with, and adoption of, the data services in day-to-day operations.
* Data Reliability: Zero "Data Mismatch" incidents reported by the Finance team during month-end closing.
* Operational Excellence: Implementation of automated alerts for pipeline failures and database performance bottlenecks
* Cost Efficiency: Reduction in monthly Databricks/Cloud spend through optimised SQL and cluster management.
* Reporting Speed: Reduced load times for the primary Finance and Sales dashboards.
Required skills
* Core Tech: Solid Data Engineering experience with a focus on Databricks (Medallion architecture)
* Data Modelling: Proven expertise in dimensional modelling and translating complex business processes into logical data structures.
* BI Tools: Expert-level proficiency in a major BI tool, including complex calculated fields.
* Programming: Strong proficiency in Python (PySpark) and SQL.
What's in it for you?
* An outstanding environment that has been recognised as a 'Great Place to Work' for five consecutive years
* A greenfield opportunity to make your mark in a true Business & Technology Transformation
* Outstanding career development and learning
* Hybrid working
If this sounds like you, click on the Apply Tab or send your resume directly to me at
Balance Recruitment is committed to equal opportunity employment. We celebrate diversity and encourage people from all sections of the community to apply.