We are seeking to engage a suitably qualified and experienced Data Factory Engineer to deliver and support a range of ICT projects/programs. Data Factory Engineers are responsible for designing, developing, and maintaining data integration solutions that connect on-premises and cloud-based enterprise applications, leveraging Microsoft Azure tools and technologies.
This role involves creating efficient data pipelines, ensuring data quality, optimising performance, and adhering to security and compliance standards. It also involves collaborating with cross-functional teams to support testing and quality assurance, monitoring, troubleshooting, and issue resolution.
Data Factory Engineers will be responsible for:
* Collaborating with team members and stakeholders to design data integration solutions that meet business requirements, utilising Microsoft Azure tools such as Azure Data Factory and SQL.
* Developing, implementing, and maintaining ETL pipelines to ensure efficient and secure data transfer between various systems, adhering to best practices for data processing and transformation.
* Designing and maintaining data models, ensuring data accuracy, consistency, and availability while considering scalability and performance optimisations.
* Implementing data validation and quality checks to identify and rectify data anomalies, errors, and inconsistencies within integrated systems.
* Continuously monitoring and optimising data integration processes to enhance performance, reduce latency, and minimise resource consumption.
* Creating and maintaining comprehensive documentation for data integration solutions, including technical specifications, data flow diagrams, and operational procedures.
* Working closely with cross-functional teams, including developers and system analysts, to understand data requirements and ensure integrations align with business objectives.
Candidates must hold either a Baseline or NV1 security clearance.