AWS Cloud Data Specialist

Sydney
beBeeDataEngineer
Posted: 14 September
Offer description

The ideal candidate for this role will possess a strong understanding of data engineering principles and have hands-on experience in developing scalable data pipelines.

They will be proficient in cloud technologies, specifically Amazon Web Services (AWS), have a solid grasp of data architecture, including data modeling techniques, and have commercial experience in a programming language such as Python, Java, or Scala.

A strong background in Git and CI/CD tools is also required, along with practical experience in developing and supporting ETL frameworks and tooling.

Additionally, they will have experience with Big Data technologies, focusing on Spark, dbt, and Presto; big data storage formats such as Parquet, ORC, and Avro; and data pipeline orchestration tools such as Airflow.
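The ETL work described above often reduces, at its simplest, to an extract-transform-load step like the following plain-Python sketch. All names and fields here (`clean_record`, `run_pipeline`, the sample rows) are hypothetical and framework-free, shown only to illustrate the shape of the work:

```python
from typing import Dict, Iterable, List


def clean_record(raw: Dict[str, str]) -> Dict[str, object]:
    """Transform: normalise field names and types for a single record."""
    return {
        "user_id": int(raw["id"]),
        "email": raw["email"].strip().lower(),
    }


def run_pipeline(source: Iterable[Dict[str, str]]) -> List[Dict[str, object]]:
    """Extract records from `source`, drop rows with no email address,
    transform the rest, and return the loaded batch."""
    return [clean_record(r) for r in source if r.get("email", "").strip()]


raw_rows = [
    {"id": "1", "email": " Alice@Example.com "},
    {"id": "2", "email": ""},  # dropped: no email address
]
print(run_pipeline(raw_rows))  # [{'user_id': 1, 'email': 'alice@example.com'}]
```

In production, steps of this kind typically run as Spark jobs or Airflow tasks over formats such as Parquet, rather than as a standalone script.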

Key Responsibilities

* Develop and maintain applications such as data ingestion and transformation pipelines, cloud infrastructure, APIs, query engines, and orchestration platforms utilizing the AWS technology stack.

* Design and implement data architectures that meet business requirements, ensuring scalability, reliability, and performance.

* Collaborate with cross-functional teams to ensure seamless integration of data solutions across the organization.

* Stay up-to-date with industry trends and emerging technologies, applying knowledge to improve data engineering practices and processes.

Requirements

* Experience with one or more of the following AWS data services: Redshift, Glue, RDS, EMR.

* Data Architecture experience, including different data modeling techniques.

* Commercial experience in at least one programming language: Python, Java, or Scala.

* Practical experience using Git and at least one CI/CD tool: Bamboo, CloudBees, TeamCity, or GitLab.

* Experience with Big Data technologies: Spark, dbt, Presto, Parquet, ORC, Avro, Airflow.

Desirable Skills

* Proficient in Spark SQL, with proven ability in tuning and optimization.

* Proven hands-on experience in developing and supporting ETL frameworks and tooling.

* Experience with containerization using Docker and Kubernetes.

This is an exciting opportunity for a talented data engineer to join our team and contribute to the design and implementation of innovative data solutions.

© 2025 Jobstralia - All Rights Reserved
