We have an exciting new position for a Data Engineer to join our IT Integrated Services team. We are in the process of moving from our on-premises data warehouse to a state-of-the-art solution in the Snowflake Cloud, and this role will play a key part in implementing and supporting our journey. In this role you will develop and maintain data solutions that are scalable and efficient and that meet business requirements.
We are looking for an enthusiastic individual who has an in-depth understanding of existing, new and emerging data trends. Ideally you will be able to collaborate with both the business and IT teams to define the business problem, refine the requirements, and design and develop data deliverables accordingly.
Primary responsibilities for this position:
* Collaborate with a cross-functional team to design, build, and maintain efficient, reusable, and reliable data pipelines.
* Recommend and implement ways to improve data reliability, efficiency, and quality by creating and maintaining optimal data pipeline architecture.
* Assemble large, complex data sets that meet functional and technical business requirements.
* Develop and fine-tune data processing workflows utilising big data technologies and frameworks.
* Build and execute complex SQL queries for database operations and data manipulation.
* Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, re-designing infrastructure for greater scalability, data discovery and onboarding, etc.
* Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using 'big data' technologies.
* Work with JBS and partner stakeholders including project managers, business analysts, data scientists and developers to assist with data-related technical issues and support their data infrastructure needs.
Your background
To be successful in this role you will have:
* A bachelor's degree in Computer Science, Engineering, Information Technology or a related field, or equivalent practical experience
* 3+ years' experience with SQL, CI/CD pipelines, data integration and data delivery
* 3+ years' experience in data engineering & ETL
* 3+ years' experience in data modelling and design
* 1–2 years' Snowflake experience desired
* Qlik Replicate desired
* Qlik Data Integration (QDI) – data modelling tool
* End-to-end software development lifecycle expertise (requirements, design, architecture, development, testing, deployment, release and support)
Technology Stack
* Snowflake
* MS SQL Server and Management Studio
* Pentaho PDI (Pentaho Data Integration)
* Docker
* MS Visual Studio and TFS
* AWS console/CLI
Key Interactions:
* Business & IT Leadership teams
* Business Users
* Key partners and third-party vendors as appropriate
Job Types: Full-time, Permanent
Pay: $115,000.00 – $130,000.00 per year
Benefits:
* Employee discount
* Health insurance
Work Location: In person