Base pay range
A$180,000.00/yr - A$200,000.00/yr
- Hybrid - 2 Days in the Office
- Collaborative and Purposeful Organisation
- Salary will depend on the level of experience across GCP
Work with this leading organisation to help design, develop, and maintain secure, scalable, and high-performing data pipelines within Google Cloud Platform. You will transform raw data into trusted, well-structured datasets that empower analytics, reporting, and evidence-based decision-making across the organisation. Working alongside cross-functional teams, you will optimise architecture, streamline automation, and uphold governance and compliance standards to ensure the platform delivers maximum value.
Responsibilities of this role will include:
- Build and maintain data pipelines and architectures using GCP services such as BigQuery, Dataflow, Cloud Composer, and Cloud Storage.
- Convert raw data into structured, high-quality datasets to support advanced reporting and analytics.
- Ensure all data solutions meet privacy, security, and compliance standards in line with organisational and GCP best practices.
- Partner with stakeholders to define, analyse, and document data requirements, including metadata and lineage tracking.
- Lead workshops and stakeholder discussions to shape clear, actionable technical specifications.
- Develop logical and physical data models aligned to enterprise data standards.
- Create and manage ETL/ELT workflows using GCP-native tools and SQL in BigQuery.
Seeking experience that will include:
- At least 3 years’ experience delivering large-scale data engineering or analytics solutions in GCP.
- Hands-on expertise with BigQuery, Dataflow, Cloud Storage, Cloud Composer, and Cloud Functions.
- Strong proficiency in SQL development and performance tuning in BigQuery.
- Proven experience in building and optimising ETL/ELT pipelines in GCP.
- Solid understanding of data modelling, governance, compliance, and security in cloud environments.
- Familiarity with CI/CD pipelines, DevOps practices, and metadata-driven frameworks.
- Problem-solving capability with experience working both independently and in agile teams.
- A degree in Computer Science, Information Technology, or a related discipline.
To be considered for this role, you must be an Australian Permanent Resident or an Australian Citizen.
If this position is of interest to you, please click ‘Apply’ or send your details to the job poster directly.
Konnexus specialise in the recruitment of permanent and contract professionals within Data Analytics | AI/ML | Data & Platform Engineering. If this role is not right but you are looking for a new position or a conversation about the market, please call us and we will pass you on to the right consultant.
Seniority level
- Mid-Senior level
Employment type
- Full-time
Job function
- Information Technology
Industries
- Staffing and Recruiting
📌 Senior Data Engineer
🏢 Konnexus
📍 Melbourne