Responsibilities
* Develop and maintain ETL pipelines to integrate data from multiple sources
* Build and optimize data models for analytics and reporting platforms
* Collaborate with analysts, data scientists, and business stakeholders to deliver high-quality datasets
* Implement best practices in data governance, security, and performance tuning
* Support migration and integration projects across cloud platforms (AWS/Azure/GCP)
Qualifications
* Proven experience as a Data Engineer (3+ years in contract or permanent roles)
* Strong proficiency in SQL, Python, and ETL frameworks
* Hands‑on experience with cloud data services (AWS Glue, Azure Data Factory, GCP BigQuery)
* Familiarity with data warehousing (Snowflake, Redshift, Synapse)
* Excellent problem‑solving skills and ability to work in agile teams
About the Client
Our client is a major Australian financial services organisation, headquartered in Sydney, with a strong focus on digital transformation and customer‑centric innovation. They manage millions of customer accounts nationwide and are investing heavily in cloud‑based data platforms to enhance risk management, compliance, and personalised services.
Benefits
* Work on high‑impact projects with leading Australian enterprises
* Exposure to modern cloud‑based data platforms
* Flexible hybrid working arrangements
* Opportunity to extend beyond the initial contract
If you're ready to bring your expertise as a Data Engineer to a challenging and rewarding role, we encourage you to apply today.