Confluent Consultant (Kafka & Data Streaming Specialist) - Contract - Melbourne

Melbourne
Hastha Solutions
Posted: 7 May
Offer description

Contract | Hastha Solutions | Australia

Posted On 28/04/2026

Job Information

Job Opening ID: ZR_6213_JOB
Work Experience: 5+ years
Industry: IT Services
City: Melbourne
State/Province: Victoria
Postal Code: 3000

Job Description

Urgent requirement for a Confluent Consultant (Kafka & Data Streaming Specialist) - Contract - Melbourne

Requirements

Must Have:

* Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
* 5+ years of hands-on experience with Apache Kafka and the Confluent Platform.
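One recurring theme in this role is idempotent, ordered producer delivery. As a minimal sketch, the settings involved look like the following; the keys are standard Kafka producer configuration names, but the chosen values are illustrative assumptions, not the client's standards:

```python
# Illustrative producer settings for idempotent, ordered delivery.
# The values below are assumptions for this sketch.

def idempotent_producer_config(bootstrap_servers: str) -> dict:
    """Return a producer config aimed at exactly-once-per-partition writes."""
    return {
        "bootstrap.servers": bootstrap_servers,
        # Idempotence deduplicates retried sends per partition.
        "enable.idempotence": True,
        # acks=all is required for idempotence: wait for all in-sync replicas.
        "acks": "all",
        # At most 5 in-flight requests preserves ordering under retries.
        "max.in.flight.requests.per.connection": 5,
        # Retry transient broker errors instead of surfacing them.
        "retries": 2147483647,
    }

cfg = idempotent_producer_config("broker1:9092")
print(cfg["enable.idempotence"], cfg["acks"])
```

In practice this dict would be passed to a producer client (e.g. confluent-kafka-python's `Producer`); it is shown standalone here so it can be read without a running broker.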
Good to Have:
* Confluent Certified Developer/Administrator
* Experience with stream governance, RBAC, and tiered storage
* Background in data engineering, ETL, or microservices architecture
* Familiarity with traditional messaging systems and data warehouses
* Experience in Agile methodologies and DevOps practices
Responsibilities:
* Architecture & Design: Define topic strategy (naming, partitions, replication factor, retention/compaction) and idempotency/ordering guarantees; size and configure Confluent clusters; plan DR using Cluster Linking; leverage Tiered Storage as needed.
* Build & Integration: Implement and tune producers and consumers; design Kafka Connect integrations (managed/self-managed connectors, REST/File/JDBC/CDC); use Schema Registry for contracts and compatibility; optionally use ksqlDB for stream transformations.
* Security & Compliance: Enforce RBAC, API keys/service accounts, and mTLS/OAuth/OIDC; manage secrets; implement encryption in transit and at rest; align retention, PII handling, and audit requirements; support private networking (VPC peering/PrivateLink) in Confluent Cloud.
* Observability & Operations: Set up Control Center dashboards and alerts (lag, throughput, error rates, connector health); create runbooks for DLQ triage, replay, offset resets, and connector/broker incidents; perform capacity planning and performance testing.
* Delivery & Governance: Produce HLDs/LLDs; participate in design reviews, release/cutover plans, DR drills, and knowledge transfer.
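The topic-strategy responsibility (naming, partitions, replication factor, retention/compaction) can be sketched as a small helper that builds and validates topic specifications. The `<domain>.<entity>.<event>` naming convention and the default values here are assumptions for illustration, not conventions stated in the role:

```python
import re

# Assumed naming convention: lowercase dot-separated segments,
# e.g. "payments.orders.created". An illustrative choice only.
TOPIC_NAME = re.compile(r"^[a-z][a-z0-9]*(\.[a-z][a-z0-9]*)+$")

def topic_spec(name: str, partitions: int = 6, replication_factor: int = 3,
               compacted: bool = False,
               retention_ms: int = 7 * 24 * 3600 * 1000) -> dict:
    """Build a topic spec covering naming, sizing, and retention/compaction."""
    if not TOPIC_NAME.match(name):
        raise ValueError(f"topic name {name!r} violates the naming convention")
    # Compacted topics keep the latest value per key; otherwise
    # records are deleted after the retention window.
    config = {"cleanup.policy": "compact"} if compacted else {
        "cleanup.policy": "delete",
        "retention.ms": str(retention_ms),
    }
    return {
        "name": name,
        "num_partitions": partitions,
        "replication_factor": replication_factor,
        "config": config,
    }

spec = topic_spec("payments.orders.created", partitions=12)
print(spec["config"]["cleanup.policy"])  # delete
```

A spec in this shape maps directly onto a `NewTopic` request via the Kafka admin API when a cluster is available; validating the name and sizing up front keeps topic creation consistent across environments.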

Duration: 6 months, with possible extension

Eligibility: Australian/NZ Citizens/PR Holders only


