We are looking for a mid-level Dataiku Developer to support the migration of analytics and data processing workflows from Lavastorm to Dataiku DSS within a Snowflake data platform environment. You will build and enhance Dataiku workflows, develop Python and SQL transformations, and support testing and reconciliation to ensure migrated workloads meet functional and performance expectations.
This role also requires hands-on use of AI-assisted coding and automation tools such as Claude Code and GitHub Copilot to accelerate development, refactoring, and documentation while maintaining strong engineering discipline.
Key Responsibilities
Dataiku Workflow Development
* Build and modify Dataiku workflows using visual recipes and Python code recipes
* Translate Lavastorm logic into Dataiku implementations with guidance from the architect
* Refactor existing flows for maintainability, readability, and performance
* Create scenarios for scheduling, monitoring, alerts, and failure handling
Snowflake Data Engineering
* Develop and optimise SQL transformations in Snowflake
* Support ELT patterns including incremental loads and structured transformations
* Assist with data validation and reconciliation activities (row counts, data quality checks, business-rule comparisons)
* Troubleshoot and resolve workflow failures and data issues
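To illustrate the kind of reconciliation check described above, a minimal sketch in Python; the table names are hypothetical, and an in-memory SQLite connection stands in for a real Snowflake session:

```python
import sqlite3


def reconcile_counts(conn, source_table, target_table):
    """Compare row counts between a source and target table.

    Returns (source_count, target_count, match). In a real migration
    the tables would be the legacy Lavastorm extract and the
    Dataiku-built output; names here are illustrative only.
    """
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return src, tgt, src == tgt


if __name__ == "__main__":
    # Demo data: SQLite stands in for Snowflake purely for illustration.
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE legacy_orders (id INTEGER);
        CREATE TABLE migrated_orders (id INTEGER);
        INSERT INTO legacy_orders VALUES (1), (2), (3);
        INSERT INTO migrated_orders VALUES (1), (2), (3);
        """
    )
    print(reconcile_counts(conn, "legacy_orders", "migrated_orders"))
```

In practice a count check like this is usually the first gate in a reconciliation suite, followed by column-level checksums and business-rule comparisons.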
Migration Delivery Support
* Assist with migration discovery tasks: mapping inputs/outputs, documenting dependencies, identifying edge cases
* Execute migration build tasks and support test cycles across DEV/UAT/PROD
* Support cutover activities and post-migration hypercare (monitoring, defect fixes, tuning)
AI Tooling and Coding Automation
* Use Claude Code and GitHub Copilot to accelerate development of Python/SQL, test scripts, and documentation
* Apply validation practices for AI-generated code: unit tests, peer review, and performance checks
* Contribute to automation scripts for reconciliation, regression testing, and workflow scaffolding
Collaboration and Communication
* Work closely with the Dataiku Architect, data engineers, and offshore team members
* Document workflows, decisions, and operational procedures
* Communicate clearly with stakeholders and contribute to a positive, culturally diverse team environment
Required Skills and Experience
* Hands-on experience developing in Dataiku DSS (workflow build and enhancement)
* Strong Python skills for data transformations and automation
* Solid SQL experience; Snowflake experience strongly preferred
* Experience with GitHub or similar version control (commits, branching, pull requests)
* Familiarity with workflow scheduling/automation and troubleshooting production issues
Desirable / Nice to Have
* Prior ETL/analytics migration experience (Lavastorm or similar tools)
* Familiarity with data quality checks and reconciliation methods
* Exposure to performance tuning across SQL and Python workloads
* Experience working with offshore delivery teams
Personal Attributes
* Strong problem-solving mindset and attention to detail
* Quick learner with a proactive, self-driven approach
* Strong interpersonal skills and effective communication
* Comfortable working in culturally diverse, distributed teams