Analyst Programmer
Location: Sydney, Australia
Salary: AUD 90,000 - 110,000 annually
Employment Type: Full-Time
About INGRITY
INGRITY is a Microsoft Data and AI Solution Partner dedicated to delivering transformative data and AI-driven solutions. We partner with leading ASX-listed companies and medium-sized businesses, working closely with Microsoft to create measurable value for our customers. Our success is built on innovation, customer advocacy, and a strategic partnership with Microsoft.
About the job:
We are looking for a skilled Analyst Programmer to join our delivery team. This role suits a hands-on professional with a solid background in data and analytics development, cloud data architecture, pipeline design, and system integration.
Key Responsibilities:
* Work with technical and business stakeholders across the organization to identify opportunities to leverage data for business solutions and analytics initiatives.
* Design, develop, and maintain scalable data pipelines using modern cloud technologies.
* Participate in data warehouse migration and optimization projects.
* Collaborate with business analysts and developers to define technical requirements and project scope.
* Support cost optimization efforts by tuning and restructuring BigQuery tables and cloud resources.
* Participate in proof-of-concept initiatives and contribute to architecture design discussions.
* Conduct R&D to explore new tools, frameworks, and practices that enhance data integration solutions.
* Provide technical leadership in cross-functional projects, acting as solution architect or senior developer where needed.
Technical Capabilities:
* 10+ years of experience in data and analytics solution development.
* Experience leading a team of junior data engineers and developers to deliver analytical projects across the BFSI domain.
* Experience with cloud platforms such as Azure, GCP (BigQuery, Cloud Run), or AWS (Redshift, S3), plus supporting tooling such as Terraform and Buildkite.
* Expertise in data orchestration and transformation tools (e.g., Airflow, dbt) and containerized environments (Docker).
* Proficient in Python, SQL, Scala, and Spark for data engineering and transformation tasks.
* Background in ETL development and data integration.
* Proficient in version control and CI/CD using Git and related DevOps practices.
* Ability to work within complex and challenging data environments.
* Comfortable working in cross-functional teams to deliver high-quality solutions on time.
Education:
* Bachelor of Science in Computer Engineering or a related discipline.