Strong Fabric experience needed. Hybrid role.

About Our Client

This large organisation plays a key role in the transport and distribution industry. With a focus on innovation and operational excellence, the company offers a supportive environment for professional growth.

Job Description

Purpose of the Role

You will be responsible for maintaining and evolving an Azure-based Microsoft Fabric Data Platform. You will ensure the stability, performance, and scalability of the platform and its associated data pipelines, models, and data products. You will help enable high-quality analysis and data-driven decision-making by ensuring the Data Platform is reliable, well-structured, secure, and responsive to business needs.

Key Responsibilities

- Collaborate with analysts, business users, and vendors to improve data delivery performance and usability.
- Proactively identify opportunities to streamline platform operations and support new use cases.
- Oversee the operation and optimisation of the Microsoft Fabric Data Platform, including configuration, capacity management, monitoring, security, and automation of recurring tasks.
- Design cloud-native architecture to support structured, semi-structured, and unstructured data workloads.
- Build and maintain data ingestion and transformation pipelines that ensure accurate and timely delivery of data.
- Ensure the platform effectively supports downstream analytics tools, reporting environments, APIs, and other data services.
- Ensure the data platform is secure, performant, scalable, and available to meet enterprise-grade non-functional requirements.
- Uphold standards for data security, access control, documentation, and data quality monitoring.
- Contribute to the implementation of the organisation's Information Governance Framework.
- Promote architectural standards, reusable patterns, and best practices across teams.
- Provide training and support to end users on platform capabilities, reports, and data interpretation, fostering a data-driven culture across the organisation.

The Successful Applicant

Technical Skills & Education Required

- A degree in a relevant field such as Data Science, Computer Science, Information Technology, Mathematics, or Engineering, or equivalent experience.
- 5+ years of technical experience designing, building, and managing BI and data product workloads.
- Demonstrated proficiency with modern data platform skills:
  - Programming languages: Python, SQL (and variants), PowerShell, Power Query.
  - Data ingestion: Azure Data Factory, SQL Server Integration Services, Databricks, Apache Spark, or equivalent tooling.
  - DevOps: enterprise-level CI/CD pipelines, Infrastructure-as-Code practices, Azure DevOps.
- Experience building secure, resilient data integrations for API-based, event-driven, and batch ETL/ELT processes across ERP, IoT, and SaaS applications.
- Demonstrable knowledge of data governance, metadata management, and data quality frameworks.
- Demonstrable understanding of modern cloud and big data technology.
- Experience in data warehouse migrations.
- Demonstrated experience with Microsoft Azure technologies, including:
  - Microsoft Fabric
  - Azure SQL Server
  - OneLake
  - Fabric Data Factory
  - Azure Purview
  - Azure DevOps
  - Apache Spark
- Experience with ITIL, Agile, and DevOps methodologies.
- Strong analytical and entrepreneurial problem-solving skills, with experience in troubleshooting and optimising data product workloads.
- Excellent written and verbal communication skills for collaborating with cross-functional teams and conveying technical concepts to non-technical stakeholders.

What's on Offer

- Competitive salary.
- Permanent role with long-term career prospects.
- Opportunities to work on cutting-edge data technologies.
- Supportive company culture with a focus on innovation.
- Additional benefits to be confirmed during the hiring process.

If you are passionate about data engineering and want to make a significant impact in the transport and distribution industry, apply now.