Urgent requirement: Technical Data Analyst - Permanent / Contract - Sydney
Your responsibilities
* 5+ years of experience in data modeling and AWS.
* Document technical specifications and integration patterns required for the Data Platform (AWS Tech Stack) across the downstream and upstream environments.
* Analyze, design, and implement data solutions on AWS to support business objectives.
* Work closely with data engineers, analysts, and business teams to optimize and improve cloud-based data pipelines. Build reports and undertake technical data analysis as required, using a working knowledge of SQL.
* Assess business requirements and technical documents to determine the build work required.
* Assist with documentation of User Stories.
* Implement system configuration, e.g., Commercial Off The Shelf (COTS) system changes, to achieve business requirement and/or technical outcomes.
* Produce process flow analysis, data models, and configuration documentation.
* Conduct deep analysis on structured and unstructured datasets to extract meaningful insights.
* Assist in data migration projects from on-premises or legacy systems to AWS.
* Develop dashboards, reports, and analytics using AWS-native tools.
* Collaborate with dev teams to automate processes and enhance data accessibility.
* Be part of a cross-functional team planning, configuring, unit testing and documenting new and existing system features in order to deliver working end-to-end solutions and thus value to stakeholders.
* Analyze interface requirements and document interface mappings.
* Undertake impact analysis on system implementation and design options, making recommendations, assessing and managing associated risks.
* Uphold solution integrity by ensuring that work is compatible with agreed system architectures, adheres to corporate standards, upholds quality, and ensures security.
* Contribute to the development of organizational policies, standards, and guidelines for test environment management.
* Collaborate with non-technical stakeholders to understand whether solutions meet their needs.
Your experience and qualifications must include:
* Proven track record of technical business analysis, taking responsibility across all stages and iterations of the lifecycle, including a strong understanding of best practices and design patterns.
* Experience working within a low-code / configuration environment (e.g., COTS systems), deploying system changes within a defined software delivery lifecycle (SDLC).
* Strong proficiency in AWS data technologies, including:
* Amazon Redshift (Data warehousing), AWS Glue (ETL processes), Amazon S3 (Object storage), AWS Athena (Serverless querying), AWS Lake Formation (Data lake management), AWS QuickSight (Business intelligence and visualization).
* Experience working with SQL and NoSQL databases.
* Understanding of data governance, security, and compliance best practices in cloud environments.
* Exposure to machine learning pipelines using AWS SageMaker (preferred but not required).
* Strong analytical skills and experience interpreting complex datasets.
* Ability to translate business needs into scalable technical solutions.
* Familiarity with integration technologies, including API and event-driven architectures.
* Familiarity with Agile software engineering methodologies and software product implementation.
* Experience with relational databases (SQL Server) and capital markets data products / models.
* Experience working with Java applications, microservices, and data streaming capabilities.
* Detail-oriented individual with the ability to quickly assimilate and apply new concepts, business models, and technologies.
* Experience in unit, integration, and non-functional testing.
Nice to have
* Experience working in the Capital Markets Industry with a solid understanding of product and transaction life cycles.
* Passionate about solving problems, troubleshooting software issues, and triaging environment issues.
* Excellent verbal and written communication skills with the ability to work with internal and external stakeholders.
* AWS Certified Data Analytics – Specialty or Solutions Architect certification.
* Knowledge of big data frameworks (e.g., Apache Spark, Hadoop) on AWS.
* Prior experience with streaming data solutions like AWS Kinesis or Kafka on AWS.
* Lateral thinker – bring forward ideas that will automate repeatable tasks to reduce work effort and ensure quality deliverables.
* Learn quickly and enjoy the challenge of learning new systems.
* Compassionate, empathetic, and self-motivated.
* Able to take ownership and not afraid to acknowledge failure.
* Understand the importance of maintainable code and comprehensive testing.
Duration: Permanent / 6 months with possible extension
Eligibility: Australian / NZ citizens or PR holders only