Role - Data Architect/Lead (Senior Role)
Skills - Azure/AWS, Databricks, PySpark, BI tools and dashboards.
Location - Newcastle, NSW
Duration - Permanent
As a Data Architect, you will be responsible for designing and implementing enterprise-level data architecture that supports advanced analytics, reporting, and operational needs. You will define standards for data modelling, storage, and integration across multiple insurance-related applications on Azure/AWS platforms, ensuring scalability, security, and performance, and will implement Business Intelligence solutions that enable data-driven decision-making across the organisation.
Key Responsibilities:
* Cloud Strategy & Integration:
* Data Quality and Data Governance:
* Must have cross-platform data migration experience
* Develop new, and understand existing, end-to-end data architectures for data lakes, data warehouses, and real-time streaming systems.
* Create conceptual, logical, and physical data models for structured and unstructured data.
* Create source-to-target mappings for ELT/ETL solutions
* Design hybrid and multi-cloud solutions leveraging Azure, Databricks, PySpark, etc.
* Lead cloud migration projects and optimize cost and performance.
* Perform data profiling
* Work closely with business stakeholders, data engineers, and BI teams to align architecture with business goals.
* Evaluate emerging technologies (AWS, Snowflake, Delta Lake) and recommend adoption strategies.
* Implement data quality (DQ) practices and data governance rules
* Apply metadata management and data lineage tracking.
* Design and implement BI architecture, including data models & reporting frameworks
* Collaborate with business teams to gather requirements and translate them into technical solutions
* Optimise BI tools and dashboards for performance and usability
* Mentor junior engineers and enforce best practices in data engineering and reporting.
Technical Skills:
* Data Modelling: Advanced knowledge of normalisation, dimensional modelling, and NoSQL design.
* Cloud Expertise: Deep experience in Azure (ADF, Databricks, PySpark, Python) or AWS (Glue, Kafka, etc.)
* Programming: Proficiency in SQL, Python, PySpark, and big data processing.
* Architecture Tools: ERwin, PowerDesigner, or similar modeling tools.
* Security & Governance: Familiarity with Identity Access Management (IAM), encryption, and compliance frameworks.