Technology Recruiting Solutions – Melbourne VIC
We're seeking an experienced Data Engineer with strong SSIS and ETL skills to join a major data migration and platform transition program within a large Financial Services enterprise.
You'll be part of a collaborative data team working across onshore and offshore streams, building and maintaining pipelines to move, transform, and reconcile large data sets between core systems.
You'll work closely with the Lead Data Engineer, business analysts, and testing teams to design, build, and support data integration solutions across SQL Server and related technologies.
This role requires strong SQL development experience, hands-on SSIS delivery, and attention to data quality, validation, and cutover readiness.
Key Responsibilities
Develop, test, and deploy ETL/ELT workflows using SQL Server Integration Services (SSIS).
Write and optimise T-SQL stored procedures, functions, and queries to support data extraction, transformation, and load.
Build and maintain data mappings and transformation logic between multiple source and target systems.
Conduct data validation, reconciliation, and quality checks to ensure accuracy and completeness.
Troubleshoot and resolve ETL performance and data-related issues.
Collaborate with team members across architecture, testing, and project delivery to meet key milestones.
Contribute to documentation, peer reviews, and best-practice standards for ETL development.
Skills & Experience
5+ years' experience as a SQL Developer / ETL Developer / Data Engineer in enterprise environments.
Expert-level SQL skills including T-SQL, stored procedures, performance tuning, and data analysis.
Strong experience with SSIS – developing, maintaining, and troubleshooting ETL packages.
Proven background in data migration or system integration projects (preferably within Financial Services, Insurance, or other regulated sectors).
Experience with data validation, reconciliation, and production cutover processes.
Solid understanding of relational databases, data modelling, and referential integrity.
Desirable: exposure to cloud or modern data tools (Azure Data Factory, dbt, Snowflake, Redshift, Python).