About insightfactory.ai
insightfactory.ai is a South Australian Data & AI company transforming how enterprises harness data, build intelligence, and operationalise AI at scale. Through our Insight Factory platform and professional services team, we help clients modernise their data estates, establish governed lakehouse environments, and deploy AI-driven capabilities into production.
About the Role
We are looking for a highly capable Data Engineer to join our Professional Services team within the Insight Factory. In this role, you’ll design, build, and maintain modern data pipelines and integration frameworks across our client environments, helping them move, transform, and operationalise data efficiently and reliably.
You will work primarily within Azure Databricks, leveraging strong SQL and data engineering skills to develop scalable solutions across ingest, enrichment, modelling, and consumption layers.
Key Responsibilities
Design, develop, and maintain robust data pipelines using Azure Databricks, Delta Lake, and related Azure Data Services.
Ingest and integrate structured and unstructured data from a range of enterprise systems and external sources.
Build and optimise ETL/ELT workflows, ensuring performance, reliability, and data quality.
Implement and maintain data models that support analytics, AI, and downstream reporting use cases.
Apply data enrichment, transformation, and validation logic aligned to business requirements.
Collaborate with analytics translators, forward-deployed engineers, and platform teams to deliver end-to-end data solutions.
Contribute to insightfactory.ai’s internal accelerators and reusable data engineering frameworks.
Support data governance, security, and quality practices consistent with Unity Catalog and Lakehouse architecture standards.
Skills & Experience
Strong proficiency in SQL and data transformation.
Proven experience developing data pipelines in Azure Databricks, Azure Data Factory, or similar cloud environments.
Understanding of data modelling concepts (e.g., dimensional, relational, and semantic layers).
Familiarity with version control (Git), CI/CD pipelines, and Agile delivery practices.
Strong problem-solving and communication skills, with the ability to work effectively in collaborative, client-facing environments.
Preferred Qualifications
Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or related field.
Microsoft or Databricks certifications (e.g., DP-203, Databricks Data Engineer Associate/Professional).
Prior consulting or professional services experience in data-driven projects.
Why Join Us
Work with one of Australia’s fastest-growing Data & AI companies.
Collaborate with experts across data engineering, AI, and agentic automation.
Contribute to meaningful, enterprise-scale projects across energy, transport, infrastructure, education, and community services.
Be part of a culture that values curiosity, delivery excellence, and innovation.