Data Engineer

Sydney
CareCone Group
Posted: 13 February
Offer description

Role: Data Engineer (AWS SageMaker)
Type: Permanent
Location: Sydney

Description
● In-depth knowledge of core AWS services: Lambda, S3, DynamoDB, CloudWatch, SQS, SNS, and API Gateway.
● Strong understanding of AWS networking (VPC, security groups, private endpoints) and IAM for secure, fine-grained access control.
● Databases (Relational and NoSQL):
■ Expertise with Amazon RDS (Relational Database Service) for both real-time data ingestion and efficient batch export, including optimising database performance, connection pooling, and transaction management for high-throughput, low-latency operations.
■ Proficiency with Amazon Redshift for large-scale data warehousing, including data loading strategies (e.g., the COPY command), query optimisation, and managing Redshift clusters for analytical workloads and batch export (see the loading sketch after this sub-list).
■ Experience with NoSQL databases such as Amazon DynamoDB and MongoDB for high-performance, low-latency data storage and retrieval, particularly for real-time applications and feature serving.
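To illustrate the Redshift loading strategy mentioned above, here is a minimal sketch using the Redshift Data API via boto3. The table, bucket, cluster, and IAM role names are placeholders invented for the example, not details from this posting:

```python
import boto3

# Redshift Data API client; the region is an illustrative choice.
client = boto3.client("redshift-data", region_name="ap-southeast-2")

# COPY bulk-loads staged S3 files into a table far more efficiently
# than row-by-row INSERTs.
copy_sql = """
    COPY raw.events
    FROM 's3://example-staging-bucket/events/2024-01-01/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy-role'
    FORMAT AS PARQUET;
"""

response = client.execute_statement(
    ClusterIdentifier="example-cluster",  # hypothetical cluster
    Database="analytics",                 # hypothetical database
    DbUser="etl_user",                    # hypothetical DB user
    Sql=copy_sql,
)
print("statement id:", response["Id"])
```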
● Orchestration and Workflow Management:
■ Experience with Amazon Managed Workflows for Apache Airflow (MWAA) for orchestrating complex data pipelines, scheduling batch jobs, and managing dependencies between ingestion, processing, and export tasks.
■ Ability to write, deploy, and manage Airflow DAGs (Directed Acyclic Graphs) for robust workflow automation (see the DAG sketch after this sub-list).
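As a sketch of the DAG skills described above, the following minimal Airflow 2.4+ pipeline chains a batch ingestion task into an export task; the DAG id and task bodies are hypothetical stand-ins:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest(**context):
    # Placeholder ingestion step; a real task would pull from source systems.
    print("ingesting batch for", context["ds"])

def export(**context):
    # Placeholder export step; a real task would write to the warehouse.
    print("exporting batch for", context["ds"])

with DAG(
    dag_id="example_ingest_export",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
):
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    export_task = PythonOperator(task_id="export", python_callable=export)

    # The export only runs once ingestion has completed.
    ingest_task >> export_task
```

Deploying on MWAA then amounts to dropping this file into the environment's configured DAGs folder in S3.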
● Monitoring and Observability (Real-time Heartbeat Export):
■ Ability to design and implement comprehensive real-time monitoring solutions, including custom metrics, detailed logging, and tracing.
■ Experience with AWS CloudWatch for collecting, analysing, and acting on operational data, specifically for generating and exporting "heartbeat" signals to external systems or dashboards (a heartbeat sketch follows this sub-list).
■ Knowledge of setting up proactive alerts and automated notifications for system health, performance degradation, and data pipeline anomalies.
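One common way to implement the heartbeat pattern described above is a custom CloudWatch metric: each pipeline run publishes a data point, and an alarm that treats missing data as breaching flags a stalled pipeline. A minimal sketch, with the namespace and pipeline name as invented placeholders:

```python
import boto3

# CloudWatch client; the region is an illustrative choice.
cloudwatch = boto3.client("cloudwatch", region_name="ap-southeast-2")

def emit_heartbeat(pipeline_name: str) -> None:
    """Publish a one-count heartbeat data point for the given pipeline."""
    cloudwatch.put_metric_data(
        Namespace="DataPipelines",  # hypothetical custom namespace
        MetricData=[{
            "MetricName": "Heartbeat",
            "Dimensions": [{"Name": "Pipeline", "Value": pipeline_name}],
            "Value": 1.0,
            "Unit": "Count",
        }],
    )

# Typically called at the tail of each pipeline run or from a scheduled task.
emit_heartbeat("example-ingest-pipeline")
```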
● Software Development Practices & Architecture:
■ Strong understanding of software engineering principles, design patterns, and architectural best practices for building scalable, maintainable, and reusable data frameworks.
■ Proficiency with version control systems (Git) and collaborative development workflows.
■ Experience with CI/CD pipelines for automated testing, deployment, and release management of data ingestion and export solutions.
■ Familiarity with Infrastructure as Code (e.g., AWS CloudFormation, Terraform) for managing and provisioning AWS resources (see the IaC sketch after this list).
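The posting names CloudFormation and Terraform; to keep the examples here in Python, this sketch uses the AWS CDK instead, which synthesises CloudFormation templates. It provisions a single versioned S3 staging bucket; the stack and bucket names are hypothetical:

```python
from aws_cdk import App, Stack, aws_s3 as s3
from constructs import Construct

class IngestionStack(Stack):
    """Hypothetical stack provisioning a staging bucket for batch exports."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # Versioning ensures re-runs of an export never silently overwrite data.
        s3.Bucket(self, "StagingBucket", versioned=True)

app = App()
IngestionStack(app, "IngestionStack")
app.synth()  # emits a CloudFormation template under cdk.out/
```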
Key Skillset:
1. Previous experience in developing frameworks for batch and real-time data ingestion into relational and NoSQL databases or filesystems.
2. Previous experience in low-latency real-time data ingestion, with experience in Amazon Kinesis Data Streams, Apache Kafka, or similar streaming technologies (see the producer sketch after this list).
3. Proven experience with various database technologies (e.g., Oracle, Teradata, MongoDB, Snowflake).
4. Sound knowledge and experience in building Data Warehouses and Data Lakehouses.
5. Data modelling experience is a plus.
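As an illustration of the streaming ingestion experience listed above, here is a minimal Kinesis producer sketch using boto3; the stream name and event shape are invented for the example:

```python
import json
import time
import boto3

# Kinesis client; the region is an illustrative choice.
kinesis = boto3.client("kinesis", region_name="ap-southeast-2")

def publish_event(event: dict) -> None:
    """Push one record onto the stream. The partition key routes related
    records to the same shard, preserving per-key ordering."""
    kinesis.put_record(
        StreamName="example-events",  # hypothetical stream
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event["device_id"]),
    )

for i in range(10):
    publish_event({"device_id": i % 3, "reading": 20.5 + i, "ts": time.time()})
```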

Interested candidates are invited to share their resume.
