Transformation leader

Sydney
beBeeDataEngineer
Posted: 14 September
Offer description

Senior Data Engineer Role

We're looking for an experienced data engineer to join our team and help us build robust data infrastructure. As a senior data engineer, you will be responsible for designing, developing, and optimizing highly scalable, reliable ETL/ELT data pipelines using both batch and streaming technologies.

You will architect, implement, and maintain robust data solutions within cloud environments, primarily leveraging Google Cloud Platform (GCP) and Amazon Web Services (AWS). Additionally, you will lead the design and implementation of data warehousing solutions utilizing Google BigQuery and Snowflake, optimizing for performance, cost-efficiency, and analytical needs.

The ideal candidate will have extensive experience designing, building, and managing data pipelines on GCP and AWS, as well as solid experience with Apache Airflow for workflow orchestration and scheduling. They should also be proficient in Python for data manipulation and automation.
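To give a flavour of the pipeline work described above, here is a minimal, self-contained Python sketch of a batch transform step. It is illustrative only: the dataset shape, field names (`order_id`, `amount`), and cleaning rules are invented for this example, not taken from the role.

```python
import csv
import io


def transform_orders(raw_csv: str) -> list[dict]:
    """Hypothetical pipeline step: parse raw order rows, drop records
    missing an order_id, and normalise dollar amounts to integer cents.
    Field names are assumptions made for illustration."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    cleaned = []
    for row in rows:
        if not row.get("order_id"):
            continue  # skip malformed records
        cleaned.append({
            "order_id": row["order_id"],
            "amount_cents": int(round(float(row["amount"]) * 100)),
        })
    return cleaned


raw = "order_id,amount\nA1,19.99\n,5.00\nB2,3.50\n"
print(transform_orders(raw))
```

In a production pipeline a step like this would typically run as an Airflow task and write to a warehouse table rather than print; the stdlib-only version here keeps the sketch runnable on its own.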

This is a unique opportunity to shape the future of data engineering and drive business growth through data-driven insights. If you're passionate about working with large-scale data systems and driving innovation, we encourage you to apply.

-----------------------------------


Required Skills and Qualifications

* Bachelor's degree in Computer Science, Software Engineering, or a related quantitative field.
* 5+ years of hands-on experience in data engineering roles, with a proven track record of delivering large-scale data solutions.
* Expert-level proficiency in SQL and strong experience with relational and analytical databases.
* Demonstrated experience with Google BigQuery and Snowflake as primary data warehousing solutions.
* Extensive experience designing, building, and managing data pipelines on Google Cloud Platform (GCP) and Amazon Web Services (AWS).
* Solid experience with Apache Airflow for workflow orchestration and scheduling.
* Experience with Metabase for semantic modelling, dashboarding, and self-service analytics.
* Strong understanding of data modelling principles.
* Proficiency in Python for data manipulation and automation.
* Experience with version control systems.
* Excellent problem-solving, analytical, and communication skills.
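As a small illustration of the analytical SQL proficiency the list above calls for, the sketch below runs a window-function query against an in-memory SQLite database. This is an assumption-laden stand-in: the real warehouse would be BigQuery or Snowflake, and the table and column names are invented.

```python
import sqlite3

# Build a throwaway in-memory table (hypothetical schema for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("NSW", 100.0), ("NSW", 150.0), ("VIC", 80.0)],
)

# Window function: each row's share of its region's total.
query = """
SELECT region, amount,
       amount / SUM(amount) OVER (PARTITION BY region) AS share
FROM sales
ORDER BY region, amount
"""
for region, amount, share in conn.execute(query):
    print(region, amount, round(share, 2))
```

The same `SUM(...) OVER (PARTITION BY ...)` pattern carries over to BigQuery and Snowflake largely unchanged, which is why window functions are a common interview topic for roles like this one.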

Send an application


© 2025 Jobstralia - All Rights Reserved
