*About the role* This role involves the development of data pipelines and data models to support the integration of new and existing data source systems used in business-critical processes, including operational reporting, analytics and support of Tableau data visualisations/reports.
We are looking for someone with expert SQL (Redshift) and Python skills who can work effectively in a fast-moving, outcome-based delivery environment. The client follows a hybrid working model.
Typical Responsibilities
* Design, build and test data transformation jobs, including modelling of very large data sets.
* Design, engineer and support various ETL data loading patterns such as near real-time, batch, historical and incremental loads.
* Write complex SQL queries to manage, manipulate and automate complex data feeds.
* Develop Python scripts to manage ETL jobs and orchestrate job schedules for near real-time and batch processing.
* Identify and implement efficiencies and optimisation of data transformation using industry best practice methods and tools.
* Author high-quality business and technical documents, including but not limited to design specifications, technical specifications, requirements and test documents.
* Collaborate with stakeholders to understand business requirements and translate them into technical specifications.
* Evaluate ETL requirements for completeness, accuracy, and feasibility.
* Conduct analysis to estimate project timeframes for deliverables.
* Contribute to the design and implementation of the ETL framework and data model.
* Gather requirements and, where appropriate, suggest better alternative solutions to stakeholders and end users.
* Distil complex information and communicate it simply, in language that non-technical staff can understand.
Skills and experience required
* Minimum of 5 years' working experience in the data & analytics domain
* Strong experience with ETL technologies such as Matillion, AWS Glue/Data Pipeline, SSIS or similar with a flair for building complex SQL queries
* Extensive experience with Python (preferably 3+ years) for automation and integration
* Demonstrated working knowledge of databases such as Redshift, SQL Server or PostgreSQL
* Good understanding of RDBMS principles, database design, and query optimisation
* Experience documenting ETL processes and data models (e.g. entity relationship diagrams)
* Strong data modelling skills with the ability to transform staged data into conformed models and BI Datasets
* Knowledge of the utility industry highly desirable
* Strong analytical and interpretative skills
* Tertiary qualification in an IT related discipline
Desirable
* Post-grad qualification in Computer Science or similar
* Experience working with SAP CRM or Billing
This role can be based in Melbourne or Sydney. To be eligible to apply, you must be an Australian or New Zealand citizen or hold permanent residency status in Australia. Please only apply if you meet the above requirements.