* Drive high-impact, organisation-wide AI and analytics initiatives
* Work with a modern, diverse cloud ecosystem and complex integrations
About Our Client
The client is a Brisbane-based Registered Training Organisation (RTO) offering nationally recognised vocational courses across health, education, business, and more.
Job Description
* Design, develop, and maintain scalable data architectures using Microsoft Azure technologies, including Azure Data Warehouse, Cosmos DB, Data Factory, Databricks, Power BI, and Logic Apps.
* Build and optimise data pipelines and ETL workflows using Azure Data Factory and Databricks.
* Develop and maintain interactive dashboards and reporting solutions using Power BI and Google Analytics 4.
* Implement and manage data solutions on GCP, including BigQuery and related Google Cloud services.
* Design and deploy automation workflows across cloud environments to streamline and enhance data operations.
* Optimise data storage, retrieval, and processing for high performance and scalability.
* Enforce robust data governance practices covering data quality, metadata management, and data security.
* Integrate data from systems such as Salesforce, HubSpot, JobReady, and aXcelerate using middleware tools like Azure Logic Apps.
* Collaborate with cross-functional teams to gather requirements and deliver AI-driven data solutions.
* Provide technical leadership, guidance, and mentorship to junior data engineers.
The Successful Applicant
Technical Skills
* Extensive experience in data engineering and architecture, with strong exposure to AI engineering.
* Advanced proficiency in SQL and Python.
* Deep expertise across Microsoft Azure technologies including Azure Data Warehouse, Cosmos DB, Data Factory, Databricks, Power BI, and Logic Apps.
* Strong hands‑on experience working with cloud platforms (primarily Azure and GCP) and their associated data services.
* Solid background in integrating data from CRM and LMS platforms such as Salesforce and HubSpot, among others.
* Strong understanding of data governance, security, privacy, and compliance best practices.
* Working knowledge of CI/CD tooling and DevOps approaches for data engineering workflows.
* Practical experience with machine learning frameworks and deploying AI models.
* Familiarity with big data ecosystems (Spark, Hadoop, Kafka). (Desirable)
* Experience with distributed computing and real‑time data processing. (Desirable)
* Exposure to containerisation technologies such as Docker and Kubernetes. (Desirable)
Soft Skills
* Excellent problem‑solving skills and a strong aptitude for resolving complex technical issues.
* Clear and confident communicator, able to translate technical concepts for both technical and non‑technical stakeholders.
* Demonstrated experience coaching, mentoring, or leading small teams.
Qualifications
* 5+ years of experience in data engineering or similar technical roles.
* Microsoft Azure or other relevant cloud certifications.
* Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
What's on Offer
* Competitive salary
* Permanent role based in Bowen Hills
* Supportive and collaborative work culture
* Opportunity to work with cutting‑edge technology on innovative projects