Own end-to-end AWS-Snowflake ETL pipelines as a Senior Data Engineer for a large-scale Federal Government department
Your new company
The agency follows its Corporate Plan and the APS Code of Conduct, operating within a clear legislative and regulatory framework. Its core values of putting people first, working together, striving for continuous improvement, and taking responsibility guide every decision.
The Data and Analytics Branch turns raw information into useful insights. It builds and maintains the enterprise data warehouse and reporting systems, creates business intelligence tools and analytics prototypes, and helps staff across the organisation understand and use data. By spotting trends and keeping processes transparent, this branch drives better performance and smarter decisions.
Your new role
The APS6 Operations Data Engineer (Labour Hire) will support and maintain data assets on the Cloud Data Lake, Cloud EDW, Legacy EDW and SAS analytics platform. Day-to-day duties include running the daily data delivery processes, handling month-end workloads and responding to routine data requests in the business-as-usual pipeline.
The engineer will ensure the platform consistently meets its service-level agreements and underpins all operational workflows across the data tooling. This involves monitoring performance, troubleshooting issues, and keeping process documentation current to guarantee reliable, end-to-end data delivery.
Key duties and responsibilities
* Apply hands-on experience with AWS Cloud services such as AWS S3 and AWS Glue, or similar tools, within the cloud environment.
* Provide level 2/3 technical support for AWS, Control-M, Teradata (legacy EDW), Snowflake, and ETL tool-based data integration solutions.
* Follow the team's DevOps process and use its DevOps tools accordingly.
* Program to a high standard, including in supplementary languages such as Python.
* Schedule and monitor workloads with Control-M or a similar scheduling application.
* Operate file transfer tools such as GoAnywhere or similar.
* Demonstrate knowledge of version control and its appropriate use.
* Facilitate continuous improvement in delivery principles, coding standards and documentation, and provide training sessions to the team.
* Prioritise work items and add them to a work queue.
* Understand, analyse and size user requirements.
* Develop and maintain SQL analytical and ETL code (a minimal sketch of a typical load step follows this list).
* Develop and maintain system documentation.
* Work within a state-of-the-art, greenfield DevOps environment.
* Collaborate with data consumers, database developers, testers and IT support teams.
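For context, here is a minimal sketch of the kind of daily S3-to-Snowflake delivery step these duties describe. It is illustrative only: the bucket, stage, table and credential names are hypothetical, and a production job would be scheduled through Control-M and draw credentials from a secrets manager rather than environment variables.

```python
"""Minimal sketch of a daily S3-to-Snowflake load step.

All names (bucket, stage, table, warehouse) are hypothetical stand-ins;
real pipelines would be orchestrated by Control-M with managed secrets.
"""
import os
from datetime import date

import boto3
import snowflake.connector

BUCKET = "example-landing-bucket"              # hypothetical
PREFIX = f"daily/{date.today():%Y/%m/%d}/"     # today's landing path


def landed_files() -> list[str]:
    """List today's landed files so the load can be verified and audited."""
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)
    return [obj["Key"] for obj in resp.get("Contents", [])]


def load_to_snowflake() -> None:
    """COPY the day's files from an external stage into a staging table."""
    conn = snowflake.connector.connect(
        account=os.environ["SF_ACCOUNT"],
        user=os.environ["SF_USER"],
        password=os.environ["SF_PASSWORD"],
        warehouse="LOAD_WH",                   # hypothetical
        database="EDW",
        schema="STAGING",
    )
    try:
        with conn.cursor() as cur:
            # @LANDING_STAGE is a hypothetical external stage over the bucket.
            cur.execute(
                f"COPY INTO STAGING.DAILY_FEED FROM @LANDING_STAGE/{PREFIX} "
                "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
            )
    finally:
        conn.close()


if __name__ == "__main__":
    if landed_files():
        load_to_snowflake()
```

In practice the COPY step would also be wrapped in error handling and audit logging so that a failed delivery surfaces in the morning operational checks.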
What you'll need to succeed
*THIS ROLE IS ONLY AVAILABLE TO AUSTRALIAN CITIZENS*
Essential criteria
* Proficiency in AWS services related to data engineering, e.g. AWS Glue, S3, Lambda, EC2 and RDS
* Data pipeline design and development using ETL/ELT frameworks
* Proficiency with Snowflake as a cloud EDW and Teradata as an on-premises EDW
* Proficiency in programming languages, with Python preferred
* Experience with Control-M (or similar) orchestration and monitoring applications
* Strong experience in operational support processes and working in a BAU environment (a sketch of a typical BAU check follows this list)
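As a rough illustration of that BAU support work, the sketch below checks that today's load met a row-count threshold. The table name, LOAD_DATE column and threshold are all hypothetical stand-ins for whatever SLA measure the platform actually tracks.

```python
"""Minimal sketch of a BAU-style SLA check: confirm today's load arrived.

Names and thresholds are hypothetical; in practice the result would feed
an alert or a Control-M post-condition rather than a print statement.
"""
import os
from datetime import date

import snowflake.connector

MIN_EXPECTED_ROWS = 1_000  # hypothetical SLA threshold


def todays_row_count() -> int:
    """Count the rows loaded today into the (hypothetical) staging table."""
    conn = snowflake.connector.connect(
        account=os.environ["SF_ACCOUNT"],
        user=os.environ["SF_USER"],
        password=os.environ["SF_PASSWORD"],
        warehouse="OPS_WH",                    # hypothetical
        database="EDW",
        schema="STAGING",
    )
    try:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT COUNT(*) FROM STAGING.DAILY_FEED WHERE LOAD_DATE = %s",
                (date.today(),),
            )
            return cur.fetchone()[0]
    finally:
        conn.close()


if __name__ == "__main__":
    n = todays_row_count()
    status = "OK" if n >= MIN_EXPECTED_ROWS else "BREACH"
    print(f"{date.today()}: {n} rows loaded - SLA {status}")
```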
Desirable criteria
* Experience with infrastructure-as-code tools: CloudFormation or Terraform
* Exposure to at least one ETL tool such as dbt, Talend or Informatica
* Strong SQL Proficiency
* SAS (Base, Enterprise Guide, SAS Viya)
What you'll get in return
* 12-month contract with a possible 12-month extension
* Competitive hourly rate
* Hybrid working arrangement
What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now.
If this job isn't quite right for you, but you are looking for a new position, please contact us for a confidential discussion on your career.