• Global software company
• Build a brand-new data and analytics capability
• Reporting directly to the Head of IT
We are partnering with a Sydney-based global software organisation that is transforming how it uses data across the business. As part of this shift, it is establishing a new, lean internal Data, Analytics & AI team. This Senior Data Engineer role is a core hire and will operate with significant autonomy.
This is not a narrow delivery role. You will work directly with end users to understand their challenges, translate those needs into scalable data solutions, and make judgement calls on priorities. Because the team is small, success depends on balancing short‑term delivery against long‑term architectural value while designing systems that enable self‑service rather than creating bottlenecks.
Key responsibilities include:
• Design, build and maintain scalable ETL/ELT pipelines using Databricks (SQL, Python, PySpark)
• Build end‑to‑end integrations with SaaS platforms such as Salesforce, NetSuite, Jira and others
• Implement incremental and historical data processing, including CDC, MERGE operations and SCD (see the sketch after this list)
• Apply Lakehouse and Medallion Architecture principles across Bronze, Silver and Gold layers
• Ensure data quality, reliability and performance through testing, validation and optimisation
• Implement and contribute to data governance, including data lineage, cataloguing and metadata
• Support production operations through monitoring, observability and troubleshooting
• Translate stakeholder requirements into reliable, well‑structured datasets for analytics and AI
• Share knowledge and contribute to documentation, engineering standards and best practice
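To make the incremental processing responsibility concrete: CDC-style loads on Databricks typically land a change feed in the Bronze layer and MERGE it into a Silver Delta table. The following is a minimal PySpark sketch only, not part of the role description; the table names (bronze.customer_changes, silver.customers), the op change-type column and the customer_id key are all assumptions made for illustration.

    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical CDC feed landed in the Bronze layer (all names are assumptions).
    changes = spark.read.table("bronze.customer_changes")

    # Upsert the changes into the Silver table, keyed on customer_id.
    silver = DeltaTable.forName(spark, "silver.customers")
    (
        silver.alias("t")
        .merge(changes.alias("s"), "t.customer_id = s.customer_id")
        .whenMatchedDelete(condition="s.op = 'D'")     # source deletes remove the row
        .whenMatchedUpdateAll(condition="s.op = 'U'")  # updates overwrite in place
        .whenNotMatchedInsertAll()                     # new keys are inserted
        .execute()
    )

As written this is SCD Type 1 (overwrite); a Type 2 dimension would instead close out the matched row with an end date and current-row flag, then insert the new version.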
You will bring:
• Strong commercial experience with Databricks
• Hands‑on experience building ingestion pipelines using Auto Loader, Lakeflow, Fivetran and/or custom API/file integrations
• Strong SQL, Python and Apache Spark (PySpark) skills for large‑scale data processing
• Experience orchestrating pipelines using Databricks Workflows, Jobs, Lakeflow Declarative Pipelines and/or Delta Live Tables
• Deep understanding of incremental processing (CDC), MERGE patterns and Slowly Changing Dimensions
• Proven experience designing Lakehouse‑based data models and Medallion Architecture
• Experience implementing data quality controls using DLT expectations, Great Expectations or similar frameworks (see the sketch after this list)
• Strong understanding of data lineage, governance and metadata management using Unity Catalog
• Experience with Spark optimisation, partitioning, Delta Lake performance tuning and workload efficiency
• Familiarity with Git, code review processes, and promoting pipelines across environments
• Understanding of data security, sensitive data handling and access control in governed environments
• Strong communication skills and ability to collaborate with business users, analysts and technical teams
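On the data quality point: DLT (now Lakeflow Declarative Pipelines) expectations are declarative row-level rules attached to a dataset definition. A minimal sketch, assuming a bronze_orders source table and illustrative rule and column names:

    import dlt
    from pyspark.sql import functions as F

    @dlt.table(comment="Orders with basic quality gates (illustrative).")
    @dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")  # violating rows are dropped
    @dlt.expect("positive_amount", "amount > 0")                   # violations logged, rows kept
    def silver_orders():
        return (
            dlt.read_stream("bronze_orders")
            .withColumn("ingested_at", F.current_timestamp())
        )

A third variant, expect_or_fail, halts the pipeline update when the rule is violated, which suits hard invariants.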
Nice to have:
• Experience with AWS services
• Exposure to DevOps practices such as Terraform, CI/CD pipelines (Jenkins), or Databricks Asset Bundles
• Knowledge of dimensional modelling (Kimball, star schemas, facts & dimensions), illustrated in the sketch after this list
• Basic understanding of ML/AI concepts, including preparing structured/unstructured data for ML or agent use cases
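For the dimensional modelling item: a Kimball-style star schema centres a fact table on surrogate keys into conformed dimensions, which keeps analytical queries to simple joins and aggregations. A minimal PySpark sketch, with all table and column names assumed for illustration:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical Gold-layer star schema: one fact table, two dimensions.
    fact_sales = spark.read.table("gold.fact_sales")      # grain: one row per sale line
    dim_customer = spark.read.table("gold.dim_customer")  # surrogate key: customer_key
    dim_date = spark.read.table("gold.dim_date")          # surrogate key: date_key

    # A typical analytical query: revenue by month and customer segment.
    monthly_revenue = (
        fact_sales
        .join(dim_customer, "customer_key")
        .join(dim_date, "date_key")
        .groupBy("year", "month", "customer_segment")
        .agg(F.sum("sale_amount").alias("revenue"))
    )
    monthly_revenue.show()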
There will also be opportunities to expand your skillset into platform engineering, cloud, and visualisation, supported by mentoring and cross‑functional collaboration.
If you are interested, please reach out via ************@hudson.com with your contact number and a resume in Microsoft Word format.
(Tip for successful consideration: Your resume should clearly outline your hands-on contributions, decision-making responsibility, and commercial impact. We rely solely on the information provided in your application and cannot infer experience not explicitly stated.)
Candidates must be based in Sydney NSW or willing to relocate and have full working rights in Australia.
Diversity, Equity & Inclusion at Hudson
Hudson is committed to helping you find a workplace where you feel respected, supported, and free to thrive. We welcome applications from all backgrounds, identities, and lived experiences, because when different voices come together, amazing things happen.
Casual Loading
Please note that for all Australian-based contract and temporary roles, the pay rate is inclusive of the mandatory 25% casual loading. This excludes permanent and fixed-term roles.