Job Opportunity
We are seeking an experienced professional to fill a key role in our organization.
This is a full-time position based in Wollongong. As part of our data team, the Data Engineer plays a crucial role in our data-centric business environment.
The primary objective is to work closely with stakeholders and be responsible for architecting, designing, developing, testing, and implementing scalable, robust, and efficient real-time data pipelines using Kafka.
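For candidates less familiar with the stack, a minimal sketch of the kind of pipeline work this involves might look like the following (assuming the kafka-python client, a placeholder broker address, and a hypothetical "orders" topic):

```python
import json
from kafka import KafkaProducer  # kafka-python client (assumed; other Kafka clients are equally valid)

# Serialise events as JSON and publish them to a hypothetical "orders" topic.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # placeholder broker address
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {"order_id": 1234, "status": "created"}
producer.send("orders", value=event)  # asynchronous send
producer.flush()                      # block until buffered records are delivered
```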
A critical component of this role is to develop and manage ETL/ELT processes and workflows using Python to ensure optimal data quality and efficiency.
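As a rough illustration of that kind of Python ETL/ELT work, here is a minimal sketch using only the standard library, with hypothetical file names and fields:

```python
import csv
import json

# Extract: read raw rows from a hypothetical source extract.
with open("raw_customers.csv", newline="") as src:
    rows = list(csv.DictReader(src))

# Transform: normalise emails and drop rows without an id (a simple data-quality rule).
clean = [
    {**row, "email": row["email"].strip().lower()}
    for row in rows
    if row.get("customer_id")
]

# Load: write newline-delimited JSON ready for staging into the warehouse.
with open("customers_clean.jsonl", "w") as dst:
    for row in clean:
        dst.write(json.dumps(row) + "\n")
```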
The role involves crafting and executing SQL queries, stored procedures, and views within the Snowflake data lake/warehouse. Additionally, you will be responsible for driving the design, development, and maintenance of the organization's AWS architecture.
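A small sketch of querying Snowflake from Python (using the snowflake-connector-python package, with placeholder credentials and a hypothetical reporting view):

```python
import snowflake.connector  # official Snowflake connector for Python (assumed available)

# Placeholder connection details; in practice these would come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="********",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Query a hypothetical daily sales view.
    cur.execute("SELECT order_date, SUM(amount) FROM daily_sales_v GROUP BY order_date")
    for order_date, total in cur.fetchall():
        print(order_date, total)
finally:
    conn.close()
```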
The successful candidate will foster a data-driven culture and advocate for robust, scalable, and reliable data infrastructure. This role is also responsible for monitoring and maintaining existing integrations and implementations while ensuring compliance with relevant security governance, policies, and procedures.
Key Responsibilities include:
* Assist in designing and managing our evolving data infrastructure environment.
* Leverage deep understanding of data principles to guide the business on large-scale projects.
* Design, develop, and implement integration and monitoring solutions adhering to best practices.
* Monitor and maintain existing integrations and implementations.
* Adhere to industry-related security governance, policies, and procedures.
* Increase self-service capabilities by developing user-friendly tools and platforms.
To Be Successful You Will Have:
* Bachelor's Degree in Computer Science, Mathematics, or an equivalent field, and/or demonstrated experience in a similar role.
* 5+ years' experience in a data-centric role.
* Proven experience using Kafka, Terraform, Buildkite, and AWS or similar services.
* Experience with iPaaS products (e.g., Workato) is desirable but not essential.
* Strong Python coding skills.
* Experience in building and maintaining data pipelines.
* Understanding of data architecture and infrastructure-as-code (IaC) principles.
* Ability to prioritize, multitask, and work to deadlines.
* Excellent attention to detail and accuracy.
* Proven problem-solving skills.
* DevOps experience (GitHub, CI/CD).
* Demonstrated willingness to work as part of a multidisciplinary team.
Benefits for You
* Competitive pay, with salary packaging that means less tax and more cash in your pocket.
* Flexible working conditions.
* Birthday leave - relax and take a day off.
* Professional and career development opportunities.
* Multiple career pathways.
* Discounted gym memberships.
* Free counselling via Employee Assistance Program (EAP) and staff wellness program.