Role Overview
Our client is seeking an experienced professional to support critical projects within their data management team. The role requires a candidate with a data management background and associated qualifications who can support operational outcomes by providing subject matter expertise across the full process: identifying data products that meet business requirements; data engineering; data extraction, ingestion and integration; and developing data visualisation products. The successful candidate will operate as a trusted advisor to project stakeholders, delivering pragmatic solutions to support the ongoing operations of the team.
Key Responsibilities
* Working in the Data Engineering and Operations team, you will be primarily responsible for all platform engineering, data engineering and related ITSM activities. This includes, but is not limited to:
* ETL and ELT activities across the platform, as well as ingress and egress of data from data platforms and enterprise data warehouses.
* ITSM and incident management functions across platforms.
* Change management and coordination.
* Platform and application configuration and management including patching.
* Continuous improvement.
* Delivery of continuous support and defined engineering outcomes that underpin the management and sustainment of the Azure data platform.
* Support to business units (the user community) in realising engineering and platform outcomes to enable and support decentralised execution under a federated data mesh.
* Applying and managing data security including but not limited to Attribute Based Access Control and Role Based Access Control under a data centric security model.
* Developing, mentoring and promoting data skills, culture and behaviours.
* Individuals in the Data Engineering and Operations Team will be expected to have strong proficiency in the administration, management and use of the platforms and applications listed below (as they apply to their roles) within the directorate:
* MS Azure (Essential) and AWS (preferred).
* On‐premises infrastructure, e.g. IBM Fusion.
* Azure Data Factory.
* Red Hat and Azure Kubernetes Services.
* Cloudera (Apache Atlas, Cloudera Data Engineering, Apache NiFi, Apache Kafka, Apache Ranger).
* Cloudera Data Visualisation (administration, configuration and management) and Cloudera Machine Learning (administration, configuration and management).
* JFrog.
* Neo4j.
* Power BI.
* MS SQL Server and Analysis Studio.
* InfoArchive and Application builder.
In addition to the above skills and experience, working knowledge of defence security and architectures is advantageous but not essential.
Specific Supporting Information for Data Engineering and Operations Team Members
* The majority of roles within the Data Engineering and Operations Team will require individuals to be Australian Citizens and to hold and maintain an NV2 security clearance. Please note that individuals who hold an NV1 security clearance may be selected based on their ability to obtain an NV2 clearance.
* Candidates must primarily work on premises at a Defence establishment. Physical work locations can be in Canberra or Melbourne.
* Candidates must sign non‐disclosure agreements and confidentiality clauses on commencement (mandatory); conflict‐of‐interest declarations may also be required.
Essential Criteria
* Azure Data Factory experience.
* 1+ year PySpark experience.
* 1+ year Azure DevOps experience.
* Security Clearance NV1.
* Melbourne‐ or Canberra‐based.
Highly Desirable Criteria
* Azure Cloud experience.
* Experience using Cloudera.
* Security Clearance NV2.
Additional Information
* Role Type: Full‐Time.
* Security Clearance Requirement: NV1 with the ability to gain NV2.
* ITAR: No.
* Location: Canberra ACT (remote work is not available; work is to be conducted on customer premises).