The people here at Apple don't just create products — they create the kind of wonder that's
revolutionised entire industries. It's the diversity of those people and their ideas that inspires the
innovation that runs through everything we do, from amazing technology to industry-leading
environmental efforts. Join Apple, and help us leave the world better than we found it.
Apple's Sales organisation generates the revenue needed to fuel our ongoing development of products and services. This, in turn, enriches the lives of hundreds of millions of people around the world. Our sales team is, in many ways, the face of Apple to our largest customers.
As the Analytics Infrastructure Engineer, you will be instrumental in the design, development, and implementation of data engineering solutions for the Apple Channel Sales team in ANZSA (Australia, New Zealand and South Asia) that have a direct and measurable impact on Apple Sales and its customers.
Description
Working closely with internal and external partners, you will be responsible for the data that enables our Sales teams to make informed decisions. You will be the main architect behind our data infrastructure, ensuring that data is easily accessible, reliable and actionable. This, in turn, will facilitate the automation of sales processes as well as the creation of insights, reports and AI solutions.
Key responsibilities include building, maintaining and optimising scalable data pipelines for both structured and unstructured data, enabling the development and deployment of AI/ML models.
Collaborating with cross-functional, regional and global teams, you will optimise data processes and ensure that our data practices align with industry standards, best practices and regulations.
This includes defining and disseminating global best practices, setting technical standards for data architecture, coding and modelling conventions, and supporting data governance policies to ensure consistent data generation, processing and reporting.
We are looking for an exceptional individual who lives at the intersection of development, operations, data, and systems engineering to build solutions for scalable data transformation and delivery.
Minimum Qualifications
BS or MS in Computer Science, Engineering or equivalent industry experience.
5+ years of experience in designing, building, and maintaining scalable data solutions for large-
scale analytics.
Preferred Qualifications
Development experience with cloud database environments like Snowflake, Dremio, Redshift or
Databricks.
Proficiency in programming languages like SQL, Python, Java or R.
Experience architecting and developing data pipelines using ETL tools and API integrations with system-based and cloud-based source systems.
Strong understanding of data modelling, data warehousing and ETL concepts.
Familiarity with AI/ML model development lifecycle and data needs for training and deployment.
Experience with and understanding of unstructured data, API development and basic frontend development.
Solid understanding of data governance frameworks.
Experience articulating and translating business questions into data solutions, and a proven ability to lead development projects from start to finish.
Able to balance competing priorities, long-term projects, and ad hoc requirements.
Demonstrated ability to positively influence and collaborate with people across all functional
areas of an organisation.
Demonstrated ability to achieve strategic goals in an innovative and fast-paced environment.
Outstanding ability to problem-solve, develop creative solutions and demonstrate resourcefulness, while maintaining extreme attention to detail.
Eagerness and ability to learn new skills and solve dynamic problems in an encouraging and
expansive environment.
Excellent communication skills, with the ability to inspire non-technical colleagues around the value proposition and impact of data governance.