About Our Client
Our client is a major Australian financial services organisation, headquartered in Sydney, with a strong focus on digital transformation and customer-centric innovation.
They manage millions of customer accounts nationwide and are investing heavily in cloud-based data platforms to enhance risk management, compliance, and personalised services.
Job Description
- Develop and maintain ETL pipelines to integrate data from multiple sources
- Build and optimise data models for analytics and reporting platforms
- Collaborate with analysts, data scientists, and business stakeholders to deliver high-quality datasets
- Implement best practices in data governance, security, and performance tuning
- Support migration and integration projects across cloud platforms (AWS, Azure, or GCP)

The Successful Applicant
- Proven experience as a Data Engineer (3+ years in contract or permanent roles)
- Strong proficiency in SQL, Python, and ETL frameworks
- Hands-on experience with cloud data services (AWS Glue, Azure Data Factory, GCP BigQuery)
- Familiarity with data warehousing platforms (Snowflake, Redshift, Synapse)
- Excellent problem-solving skills and the ability to work in agile teams