Job Title: Data Analytics Specialist
We are seeking a skilled Data Analytics Specialist to join our team. You will design and implement data pipelines in Microsoft Fabric (Data Factory, Dataflows Gen2, Notebooks, Spark), and your expertise will help shape our lakehouse architecture for scalability and reliability.
About the Role:
This is an exciting opportunity for an experienced Data Analytics Specialist who thrives on solving real-world problems with technology. You will work closely with stakeholders to deliver end-to-end data products, from building robust data transformation pipelines and semantic models to developing polished Power BI reporting apps.
Your Key Responsibilities:
* Design and maintain data pipelines in Microsoft Fabric (Data Factory, Dataflows Gen2, Notebooks, Spark)
* Optimize lakehouse architecture (bronze/silver/gold) and manage Delta/Parquet storage formats
* Build scalable Power BI semantic models and datasets with advanced DAX
* Implement automated data quality checks and observability across pipelines
* Develop modular notebooks for data wrangling using Python/PySpark (a brief sketch follows this list)
* Apply performance tuning for pipelines, tables, and Power BI datasets
* Translate business needs into trusted data products and dashboards
* Collaborate with IT on security, governance, and access control
* Establish CI/CD for analytics using Git and deployment pipelines
* Champion data governance, cataloguing, and documentation standards
* Monitor costs and optimize pipelines for efficiency and SLAs
* Stay ahead of new Fabric features and lead adoption initiatives
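To give a concrete sense of the day-to-day work, here is a minimal PySpark sketch of a bronze-to-silver transformation with a simple automated quality check, of the kind this role would build in a Fabric notebook. The table names (`bronze_sensors`, `silver_sensors`) and columns are hypothetical, and a real pipeline would likely merge incrementally rather than overwrite:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In a Fabric notebook a session is provided; created here for self-containment.
spark = SparkSession.builder.getOrCreate()

# Bronze: raw ingested records, possibly with duplicates and nulls.
# "bronze_sensors" is a hypothetical Delta table in the lakehouse.
bronze = spark.read.table("bronze_sensors")

# Silver: deduplicated, filtered, and consistently typed.
silver = (
    bronze
    .dropDuplicates(["sensor_id", "reading_ts"])
    .filter(F.col("reading_ts").isNotNull())
    .withColumn("reading_value", F.col("reading_value").cast("double"))
)

# A simple automated quality check: abort the run rather than
# publish rows with missing keys downstream.
null_keys = silver.filter(F.col("sensor_id").isNull()).count()
if null_keys > 0:
    raise ValueError(f"{null_keys} rows with null sensor_id; aborting write")

# Persist the cleaned data as the silver Delta table.
silver.write.format("delta").mode("overwrite").saveAsTable("silver_sensors")
```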
About You:
We are looking for someone with:
* A relevant tertiary qualification in Business Analytics, Data Science, Statistics, or Mathematics
* A minimum of 5 years' experience in both data engineering and analytics within a medium–large organization
* Strong SQL development and optimization skills
* Proficiency in Power BI, including semantic modeling, DAX, and report development
* Notebook-based coding in Python/PySpark
* A solid understanding of data governance practices, including lineage, cataloguing, security, and role-based access
What We Offer:
Our company offers a competitive salary and daily site allowance, and we support work-life balance with autonomy and mutual respect. You will also have access to counseling services for you and your family, mentorship from experienced mining professionals, and opportunities to collaborate with our software development team.