Permanent role offering long-term career stability
Financial Services sector with excellent career opportunities
In this role, you'll design and optimize systems that ensure data flows seamlessly, enabling advanced AI products and innovation at scale.
Requirements
5+ years in data engineering with production pipeline experience
Degree in Computer Science, Engineering, or related field
Expertise in GCP data platforms (BigQuery, Dataflow, Pub/Sub, Cloud Storage)
Strong SQL and Python skills for data pipelines
Experience in data modelling, ETL/ELT, and warehousing
Familiarity with Docker and Kubernetes
Knowledge of data governance, quality frameworks, and regulated environments
Experience with real-time streaming and event-driven architectures (preferred)
Strong organization and time management
For further details, please click "Apply Now" or email your CV to du***********@*****om.au. Alternatively, call Dustin on ************ for a confidential chat.