Role Title: Snowflake Architect
Job Description: Join our dynamic Data & AI practice as a Snowflake Architect, where you’ll deliver high-impact, client-facing data transformation initiatives across industries.
We are seeking a highly skilled and experienced Snowflake Architect to join our team. The ideal candidate will have a strong background in data architecture, data warehousing, and cloud technologies, with a specific focus on Snowflake.
As a Snowflake Architect, you will be responsible for designing, implementing, and optimizing Snowflake-based data solutions to meet the needs of our clients, and for guiding high-performance teams in modern approaches and best practices. You’ll have the opportunity to leverage the latest tools to enable our clients with modern, AI-ready Data Platforms.
Our Snowflake practice continues to go from strength to strength, having been awarded the Snowflake APJ Data Cloud Services Innovation Partner of the Year 2025, Snowflake Marketers & Advertisers Data Cloud Services Partner of the Year 2025, and the Global Data Cloud Services Implementation Partner of the Year in 2024.
We are embedding Gen AI into the solutions we deliver for our clients, and in how we deliver them, making Accenture’s Data & AI team a fantastic place to make an impact and grow your career.
Project Description / Key Responsibilities:
- Design and implement scalable and efficient data architectures using Snowflake.
- Design modern data platforms leveraging Snowflake and a range of modern tools such as dbt, Fivetran, Coalesce.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Provide technical leadership and guidance to development teams on Snowflake best practices.
- Partner with senior technical and business leaders to build relationships, gather and define requirements, provide presales support, and solution projects
- Work with Data Architects and Modelers to advise on relevant, effective and efficient solutions
Key Skills:
- Deep understanding of Snowflake’s architecture, including virtual warehouses, micro-partitioning, query optimization, resource optimization, usage monitoring, caching, clustering strategies, and integration with observability tools (e.g., Splunk, Datadog)
- Strong experience designing and implementing data models (star and snowflake schemas) and applying data warehouse best practices tailored to Snowflake
- Hands-on experience with AWS, Azure, or GCP, and integrating Snowflake with cloud storage, ETL tools (dbt, Fivetran, Matillion, Airflow), APIs, and real-time data pipelines.
- Knowledge of Snowflake’s security features like role-based access control (RBAC), data masking, encryption, row access policies, and compliance with regulatory frameworks (GDPR, HIPAA, SOC2)
- Proficiency in Terraform, CI/CD pipelines, and scripting (Python, SQL, Shell) to automate deployments, environment setup, and monitoring, along with experience using Snowpark (Python, Java, Scala) for data engineering, ML model deployment, and UDF development
- Understanding of Snowflake Cortex’s AI/ML capabilities, including built-in LLMs and vector search, and how to leverage them for business use cases
- Strong communication skills and the ability to learn quickly.
Location: Melbourne, Sydney, Brisbane, Canberra