Build the data foundations of our monetisation platform.
Thanks is building a customer-first monetisation platform that delivers growth without compromise – for advertisers, publishers, and customers. We're operating at scale today and entering a phase where data reliability, performance, and intelligence are critical to everything we do.
We're hiring a Staff Data Engineer to build our data foundations from the ground up. This is our first dedicated data engineering hire: a senior individual contributor who will design and deliver the data architecture, models, and products that will scale the success of Thanks.
This role is deeply hands-on, highly influential, and foundational to the future of our engineering and product organisation.
What you'll do
* Build the data platform: Design and deliver a scalable platform that serves as the primary engine for the Thanks Network. You'll move us beyond operational databases into a well-modelled environment that supports both business intelligence and high-scale feature engineering.
* Own the models: You won't just move data; you will own the data science, technical implementation, and performance of our ranking systems. You'll be responsible for the models that determine how we prioritise, personalise, and deliver content across the network.
* Build for real-time inference: Own the end-to-end lifecycle of our models – from training and validation to real-time inference. You'll ensure our ranking system is fast, reliable, and fed by high-quality, near-real-time data.
* Unlock model experimentation: You will build the framework that allows us to run experiments on our ranking systems, ensuring we can accurately measure lift, attribution, and model success.
* Own pipelines & observability: Build robust batch and near-real-time pipelines that are resilient and observable. You'll ensure that the data feeding our models and experimentation frameworks is accurate, complete, and trustworthy.
* Enable self-serve analytics: Design clean, trusted datasets and data marts that allow product, engineering, and commercial teams to answer their own questions without bottlenecks.
* Set the data direction: Be opinionated about tooling, architecture, and trade-offs – helping define what to build, what to buy, and what to retire as our data needs evolve.
* Lead through expertise: Act as the go-to expert for data across the business, influencing roadmaps and decisions through strong technical judgment rather than formal authority.
Requirements
What we're looking for
* Experience operating as a senior, hands-on individual contributor in high-growth environments – able to build for scale without over-engineering too early
* Deep strength in both data engineering and applied data science – equally comfortable writing production-grade Python and complex, performance-optimised SQL
* Experience building and operating data pipelines in cloud environments
* Hands-on experience with analytical databases and comfort working across both operational and analytical data stores
* Familiarity with streaming or event-driven data architectures
* Comfortable operating as a senior IC in a greenfield environment – balancing long-term direction with hands-on delivery
* Excellent communication skills and the ability to partner effectively across Product, Engineering, and Commercial teams
* Thoughtful use of AI to augment exploration, modelling, and engineering workflows – accelerating experimentation, debugging, and analysis while maintaining high standards for data quality, correctness, and ownership
* Strong internal drive – you care deeply about performance, correctness, and building systems that last
Nice to have
* Experience in adtech, marketplaces, or performance-driven platforms
* Exposure to experimentation frameworks and attribution models
* Experience enabling analytics for non-technical teams
Technical skills
* Data Engineering: PySpark, dbt, strong SQL skills (must have)
* Data Workflow Pipeline: Airflow, Dagster, Step Functions, or equivalent (must have, at least 1)
* DevOps / DataOps: Terraform, CloudFormation, Azure ARM, Kubernetes (must have, at least 1)
* Data Warehouse: Databricks, Snowflake, BigQuery, ClickHouse, Redshift, etc. (must have, at least 1)
* Data Catalog / Feature Store: Databricks Unity Catalog, Atlas (nice to have)
* Event Streaming: Kafka, Kinesis, or equivalent (nice to have)
* Data Analytics / Reporting: Experience working with or supporting reporting tools such as Tableau, Power BI, Superset, etc. (nice to have)
* Data QA: Great Expectations, dbt tests, etc. (nice to have)
Benefits
Why Thanks?
At Thanks, we're building a customer-first monetisation platform that delivers growth without compromise – for advertisers, publishers, and customers alike. We power growth for the world's leading brands, delivering tens of millions of high-value "thanks" moments every month.
This is a genuine inflection point for the business. What makes this role different:
* Foundational ownership: You'll build and own core data foundations from the ground up – shaping how ranking, reporting, and decision-making work as the business scales.
* Impact you can see: Your work directly influences product performance, experimentation, and how the business learns and scales.
* Strategy meets execution: This is a hands-on role – operating deep in the data while setting direction for a fast-growing, complex platform.
* Growth, without chaos: You'll work closely with our founders and Head of Product in a culture that values courage over comfort, high standards without ego, and kindness without complacency.
* Attractive compensation: Including meaningful equity.
* Flexibility with intent: We're Sydney-headquartered and value in-person collaboration, ideally a couple of days per week. That said, we care more about leadership, impact, and outcomes than rigid rules – and we're open to exceptional candidates across Australia's east coast. #LI-Hybrid
We're building something deliberately – not copying what already exists. If you're excited by foundational ownership, complex data problems, and building systems that genuinely matter, we'd love to hear from you.
Let's build something extraordinary.