Principal Data Engineer (City of Ipswich)

Ipswich
Sleek
Posted: 4 October
Offer description

Overview

We are looking for a Principal Data Engineer who is excited about the mission and outcomes set out below for the next 6-12 months.

Mission

Work closely with cross-functional teams to translate our business vision into impactful data solutions. Drive the alignment of data architecture requirements with strategic goals, ensuring each solution meets analytical needs and advances core objectives. Bridge the gap between business insight and technical execution by tackling complex challenges in data integration, modeling, and security, and by setting the stage for exceptional data performance. Shape the data roadmap, influence design decisions, and empower our team to deliver innovative, scalable, high-quality data solutions every day.

Outcomes

- Architecture & Design
  - Define the overall greenfield data architecture (batch and streaming) on GCP, centred on BigQuery
  - Establish best practices for ingestion, transformation, data quality, and governance
- Data Ingestion & Processing
  - Lead the design and implementation of ETL/ELT pipelines
    - Ingestion: Datastream, Pub/Sub, Dataflow, Airbyte, Fivetran, Rivery
    - Storage & Compute: BigQuery, GCS
    - Transformations: dbt, Cloud Composer (Airflow), Dagster
  - Ensure data quality and reliability with dbt tests, Great Expectations/Soda, and monitoring
- Governance & Security
  - Implement Dataplex and Data Catalog for metadata, lineage, and discoverability
  - Define IAM policies, row/column-level security, DLP strategies, and compliance controls
- Monitoring, Observability & Reliability
  - Define and enforce SLAs, SLOs, and SLIs for pipelines and data products (a minimal freshness check follows this list)
  - Implement observability tooling:
    - Cloud-native: Cloud Monitoring, Logging, Error Reporting, Cloud Trace
    - Third-party (nice-to-have): Monte Carlo, Datafold, Databand, Bigeye
  - Build alerting and incident response playbooks for data failures and anomalies
  - Ensure pipeline resilience (idempotency, retries, backfills, incremental loads); see the pipeline sketch after this list
  - Establish disaster recovery and high availability strategies (multi-region storage, backup/restore policies)
- Analytics Enablement
  - Partner with BI/analytics teams to deliver governed self-service through Looker, Looker Studio, and other tools
  - Support squad-level data product ownership with clear contracts and SLAs
- Team Leadership
  - Mentor a small data engineering team; set coding, CI/CD, and operational standards
  - Collaborate with squads, product managers, and leadership to deliver trusted data
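
To make the pipeline and resilience bullets concrete, here is a minimal sketch of an idempotent, backfill-friendly ELT DAG of the kind described above, assuming a Cloud Composer (Airflow 2.x) environment with the Google provider and dbt available on the workers; every project, dataset, and table name is a hypothetical placeholder, not anything Sleek prescribes:

```python
# Minimal sketch of an idempotent, backfill-friendly ELT DAG.
# Assumes Cloud Composer (Airflow 2.x) + apache-airflow-providers-google.
# All project/dataset/table names below are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

default_args = {
    "retries": 3,                        # retry transient failures automatically
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_elt",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=True,                        # lets `airflow dags backfill` replay history
    max_active_runs=1,
    default_args=default_args,
) as dag:
    # MERGE keyed on event_id makes the load idempotent: re-running the
    # same logical date ({{ ds }}) cannot produce duplicate rows.
    load_raw = BigQueryInsertJobOperator(
        task_id="load_raw",
        configuration={
            "query": {
                "query": """
                    MERGE `my-project.raw.events` AS t
                    USING (
                      SELECT * FROM `my-project.staging.events`
                      WHERE event_date = '{{ ds }}'
                    ) AS s
                    ON t.event_id = s.event_id
                    WHEN NOT MATCHED THEN INSERT ROW
                """,
                "useLegacySql": False,
            }
        },
    )

    # Run dbt models and tests for the same interval; incremental models
    # can pick the run date up from the variable.
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --vars '{\"run_date\": \"{{ ds }}\"}'",
    )

    load_raw >> dbt_build
```

Because the MERGE is keyed on event_id and scoped to the run's logical date, re-running or backfilling any interval cannot duplicate rows, which is exactly the idempotency property the resilience bullet asks for.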
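
And for the SLA/SLO/SLI bullet, the simplest possible freshness check, the kind of SLI that alerting and incident playbooks can be built around. The google-cloud-bigquery client calls are real; the table name and the 60-minute threshold are assumptions for illustration:

```python
# Hypothetical freshness SLI check using the google-cloud-bigquery client.
# The table name and the 60-minute SLO threshold are illustrative only.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(event_ts), MINUTE) AS lag_min
    FROM `my-project.analytics.events`
"""

row = next(iter(client.query(query).result()))  # single-row result

# Treat freshness as the SLI; fail loudly (or page) when the SLO is breached.
if row.lag_min is None or row.lag_min > 60:
    raise RuntimeError(f"Freshness SLO breached: table is {row.lag_min} min stale")
print(f"Freshness OK: {row.lag_min} min behind real time")
```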

Requirements

- 10+ years' experience in data engineering, architecture, or platform roles
- Strong expertise in GCP data stack: BigQuery, GCS, Dataplex, Data Catalog, Pub/Sub, Dataflow
- Hands-on experience building ETL/ELT pipelines with dbt + orchestration (Composer/Airflow/Dagster)
- Deep knowledge of data modeling, warehousing, and partitioning/clustering strategies (see the sketch after this list)
- Experience with monitoring, reliability engineering, and observability for data systems
- Familiarity with data governance, lineage, and security policies (IAM, DLP, encryption)
- Strong SQL skills and solid knowledge of Python for data engineering
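
As one concrete reading of the partitioning/clustering requirement, a sketch (with hypothetical project, dataset, and column names) of a date-partitioned, user-clustered BigQuery table created through the Python client:

```python
# Hypothetical illustration of BigQuery partitioning/clustering via the
# google-cloud-bigquery client. All names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

ddl = """
    CREATE TABLE IF NOT EXISTS `my-project.analytics.events` (
      event_id STRING,
      user_id  STRING,
      event_ts TIMESTAMP
    )
    PARTITION BY DATE(event_ts)   -- date-filtered queries scan only matching partitions
    CLUSTER BY user_id            -- co-locates rows to cut bytes scanned on user filters
    OPTIONS (partition_expiration_days = 365)
"""

client.query(ddl).result()  # submit the DDL job and wait for completion
```

Partitioning on the event date bounds scan costs for time-windowed queries, clustering on user_id keeps frequent per-user filters cheap, and the expiration option is one way to enforce a retention policy.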

Nice-to-Have

- Experience with Snowflake, Databricks, AWS (Redshift, Glue, Athena), or Azure Synapse
- Knowledge of open-source catalogs (DataHub, Amundsen, OpenMetadata)
- Background in streaming systems (Kafka, Kinesis, Flink, Beam)
- Exposure to data observability tools (Monte Carlo, Bigeye, Datafold, Databand)
- Prior work with Looker, Hex, or other BI/analytics tools
- Startup or scale-up experience (fast-moving, resource-constrained environments)

Behavioural fit

Ownership; humility; structured thinking; attention to detail; excellent listening and clear communication.

Interview Process

The successful candidate will participate in the interview stages described below. The order might vary. Expect the process to last no more than 3 weeks from start to finish. Interviews may be conducted via video or in person depending on location and role.

- Screening call - 30-minute chat with Talent Acquisition to learn about you and your goals.
- Technical screening - 30-minute call with a Senior Data Engineer on core concepts.
- Technical competency panel - 60-minute panel focusing on Python and SQL.
- Behavioural panel interview - 60-minute conversation with business leaders.
- Final interview - Closing conversation with the CTO.
- Offer + references - Non-binding offer and reference checks.

Background checks & Compliance

Sleek is a regulated entity and performs background checks appropriate to the role; consent will be obtained first. Checks may include education verification, criminal history, political exposure, and bankruptcy/adverse credit history. Depending on the role, an adverse result may affect your probation. By applying, you confirm you have read our Data Privacy Statement for Candidates at sleek.com.

Benefits

- Humility and kindness
- Flexibility to work from home 5 days per week (fully remote from anywhere in the world for 1 month per year)
- Financial benefits, including market-competitive salaries, generous time off, and potential employee share ownership
- Personal growth through responsibility, autonomy, and training programs

Sleek is a certified B Corp and aims for carbon neutrality by 2030.
