Overview
Unlock your potential alongside an award-winning, data-driven industry leader. This is a tremendous opportunity for a Data Engineer to collaborate with a highly respected team. In this role, you will spearhead day-to-day data and analytics support within a diverse cross-functional team of Product Owners, Data Architects, and stakeholders. Seize this chance to harness your expertise and drive impactful decisions within a thriving, forward-looking organization at the forefront of innovation. This is a great 6-month contract opportunity, with extensions available, for a mid-level Data Engineer.
Responsibilities
Work with business and technical stakeholders to understand requirements relating to data and analytics initiatives with a focus on delivering business outcomes
Design, develop, test, document, and maintain data models, reports, and dashboards that meet business needs and requirements
Design, develop, test, document, and maintain data pipelines across a range of technologies that extract, load, combine, and transform data from internal and external sources
Translate business requirements into logical and physical data models, ensuring alignment with reporting and BI needs
Understand and implement data governance requirements that support data and analytics solutions, including security, data quality, data classification, trustworthiness, and reusability
Design and implement star and snowflake schemas to support scalable, performance-oriented analytics across business domains
Collaborate with technical and non-technical stakeholders through a range of forums, including cross-functional delivery teams, the data and analytics practice group, and the data and analytics platform team
Qualifications
Extensive expertise in data engineering and data analysis
Proficient in T-SQL, SQL Server, Databricks, PySpark, Power BI, and other related data tools and platforms
Experience with Lambda and Kappa data-processing architectures
Hands-on experience building data streaming solutions using technologies such as Apache Kafka, Azure Event Hub, Confluent, and Apache Flink
Skilled in constructing and optimizing data pipelines, data models, reports, and dashboards
Previous hands-on work with a broad range of Azure services related to Data and Analytics, including Azure Data Factory, Azure Databricks, Delta Lake, and Azure Synapse Analytics
Experience in star/snowflake schema design
Understanding of slowly changing dimensions (SCD Types 1, 2, etc.)
Ability to translate business logic into fact/dimension tables
Proficient in handling large and complex datasets
Ability to mentor engineers, guide technical decisions, and contribute to the overall strategy
For more information on this role, or others like it in the market, apply below or contact Mark.cornwel-smith@mane.com.au