The Data Platforms (DP) function comprises a team of data professionals dedicated to building game-changing data capabilities for Westpac Group and shaping the future of banking. We understand the big picture, and we focus on delivering initiatives that will significantly and positively impact the Westpac Group.
With a significant transformation agenda over the next few years, we are committed to making a meaningful impact. The Group operates within a complex Information Technology environment that must adapt to the rapidly changing needs of our customers and employees, the evolving Data & Technology landscape, and regulatory and statutory requirements.
As an Engineer within DP, you will be a strong technical expert, responsible for building real-time and batch data solutions. You will handle multiple programs concurrently and be a go-to SME for the development of solutions.
Your responsibilities include:
* Enhancement, maintenance, and support of the technical deliverables within your domain. This involves hands-on coding, ensuring the quality of your team’s output, representing your team in various forums, and aligning your team’s contributions with the overall strategic roadmap.
* Designing and implementing scalable, reliable, and secure solutions, and collaborating with cross-functional teams to gather requirements and ensure successful delivery.
* Collaborating with fellow Engineers across the domain to define the technical direction for service offerings; working with service owners and business stakeholders to shape the delivery roadmap; and coordinating with support teams to ensure smooth delivery.
What do I need?
* Solid engineering experience developing large-scale applications using Scala and Python.
* Proven expertise in building and operating real-time data processing and streaming pipelines in production environments.
* Demonstrated ability to handle large datasets using PySpark in numerous production workloads.
* Extensive experience in optimising Spark jobs by analysing their execution plans (Directed Acyclic Graphs, or DAGs) and applying a range of performance-tuning strategies.
* Experience working with HDFS, Pig, Hive, HBase, Phoenix, etc.
* Proven experience and understanding of cloud platforms, particularly Azure, having worked with services such as ADLS, Event Hubs, Stream Analytics, HDInsight, Synapse Analytics, Data Factory, Cosmos DB, and Kubernetes. Similar experience with AWS/GCP is also acceptable.
* Strong functional programming skills and the ability to work with complex data structures.
* Understanding and experience in data warehousing concepts.
* Commitment to delivering high-quality, peer-reviewed, and well-tested code.
* Expertise in DevOps functions, contributing to CI/CD pipelines.
* Experience in implementing MLOps pipelines in production is advantageous.
* Proficient in using source control tools, preferably Bitbucket, with a good understanding of branching and merging strategies.
* Tertiary qualifications in Computer Science, Information Technology, or a related field are highly desirable.
* Comfortable working in a fast-paced, agile development environment.
* Excellent problem-solving, analytical, and stakeholder management skills.
Start here. Just click on the APPLY or APPLY NOW button.
At Westpac, we’re all about creating a supportive culture and ensuring our workplaces, branches, products, and services are accessible and inclusive for everyone—our customers, employees, and the wider community. If you’re interested in discussing workplace flexibility, please feel free to mention it in your application.
We invite candidates of all ages, genders, sexual orientations, cultural backgrounds, people with disabilities, neurodiverse individuals, veterans and reservists, and Indigenous Australians to apply.