* Enterprise Data Architect
* Full-Time, Permanent Role
* Adelaide-Based Position with Hybrid Working
The Enterprise Data Architect plays a key role in shaping, implementing, and governing the organisation's enterprise data architecture and long-term data strategy. The position is responsible for driving the adoption and maturity of a unified data platform, using Databricks and related technologies to deliver scalable, governed, and cost-effective data solutions.
Through advanced data modelling, governance frameworks, and secure analytics delivery, this role supports improved decision-making across business functions including engineering, finance, operations, and project management. It also underpins the shift toward AI-driven insights and data-centric practices across the organisation.
The position requires strong technical expertise, the ability to bridge business and technology needs, and adaptability to evolving priorities and emerging opportunities.
Responsibilities:
* Define and maintain enterprise data architecture, frameworks, and roadmaps.
* Lead the design and delivery of scalable, secure, and cost-effective data solutions on Databricks and cloud platforms.
* Oversee enterprise-level data governance, metadata management, and master data practices.
* Establish and enforce standards for data modelling, warehousing, and medallion architectures (Bronze/Silver/Gold); a brief sketch of this layering pattern follows the list below.
* Design and maintain high-quality data pipelines to support analytics, BI, and AI/ML use cases.
* Provide expert guidance in database and data warehouse design to meet both transactional and analytical needs.
* Collaborate with business stakeholders, data scientists, analysts, and engineers to align data solutions with organisational priorities.
* Evaluate and recommend tools, methods, and technologies to enhance data engineering and analytics capabilities.
* Act as a subject matter expert on data governance, security, privacy, and compliance.
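For candidates less familiar with the medallion layering referenced above, the following is a minimal, illustrative PySpark sketch of a Bronze/Silver/Gold flow on Delta Lake. All paths, column names, and the orders dataset are hypothetical, and the snippet assumes a Delta-enabled Spark runtime (as on Databricks); it sketches the pattern rather than prescribing an implementation.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical lake paths, for illustration only.
BRONZE = "/lake/bronze/orders"
SILVER = "/lake/silver/orders"
GOLD = "/lake/gold/orders_daily"

spark = SparkSession.builder.getOrCreate()

# Bronze: land raw source data as-is, adding an ingestion timestamp for lineage.
raw = (spark.read.json("/landing/orders/*.json")
       .withColumn("_ingested_at", F.current_timestamp()))
raw.write.format("delta").mode("append").save(BRONZE)

# Silver: cleanse and conform; drop malformed rows, enforce types, deduplicate.
silver = (spark.read.format("delta").load(BRONZE)
          .filter(F.col("order_id").isNotNull())
          .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
          .dropDuplicates(["order_id"]))
silver.write.format("delta").mode("overwrite").save(SILVER)

# Gold: business-level aggregate ready for BI and ML consumption.
gold = (silver.groupBy(F.to_date("order_ts").alias("order_date"))
        .agg(F.sum("amount").alias("daily_revenue")))
gold.write.format("delta").mode("overwrite").save(GOLD)
```

On Databricks the Delta format is available out of the box; on open-source Spark it additionally requires the delta-spark package and the corresponding session configuration.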
Qualifications:
* Significant background in architecting and implementing enterprise-scale data platforms using Databricks alongside major cloud environments (Azure, AWS, or GCP).
* Track record of delivering robust, secure, and cost-efficient data solutions that scale with business needs.
* Hands-on expertise with Databricks Lakehouse, Unity Catalog, Delta Live Tables, dbt, Python, SQL, CI/CD pipelines (Azure DevOps or GitHub), and infrastructure automation (Terraform or Bicep); an illustrative Delta Live Tables sketch follows this list.
* Deep understanding of data modelling techniques, modern warehousing practices, and layered architectures such as Bronze/Silver/Gold.
* Strong history of working with cross-disciplinary teams to deliver integrated enterprise data outcomes.
* Advanced knowledge of governance, security, privacy, and compliance standards for enterprise data environments.
* Degree qualifications in Computer Science, IT, Engineering, or an equivalent field.
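As a pointer to the Delta Live Tables experience listed above, here is a minimal, hypothetical pipeline definition in Python. Table names, the landing path, and the expectation rule are invented for illustration; the `dlt` module and the implicit `spark` session are supplied by the Databricks Delta Live Tables runtime, so this code runs only inside a DLT pipeline.

```python
import dlt
from pyspark.sql import functions as F

# Bronze: incrementally ingest raw JSON files with Auto Loader.
# The landing path is hypothetical.
@dlt.table(comment="Raw orders landed as-is from the source system.")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/landing/orders/")
    )

# Silver: enforce a data-quality expectation and conform types.
@dlt.table(comment="Cleansed, typed orders.")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")
def orders_silver():
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    )

# Gold: business-level daily aggregate for reporting.
@dlt.table(comment="Daily revenue for BI consumption.")
def orders_daily_gold():
    return (
        dlt.read("orders_silver")
        .groupBy(F.to_date("order_ts").alias("order_date"))
        .agg(F.sum("amount").alias("daily_revenue"))
    )
```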
Desirable:
* Industry certifications in Databricks, Azure, AWS, or GCP (e.g., Certified Data Engineer, Solutions Architect).
* Familiarity with data visualisation and semantic modelling tools such as Power BI or Tableau.
* Experience gained in consulting or project-driven settings, with an appreciation for aligning technology with business outcomes.
If the above role sounds of interest, please click on "Apply Now", or get in touch with Ivan.