As part of ongoing data transformation efforts, we're seeking a Data Engineer to help build and optimize data pipelines for critical business applications at one of our leading clients. This role is hands-on, focusing on implementing modern data engineering solutions that streamline data flows, enhance internal operations, and improve customer-facing experiences.
You'll work with tools like Snowflake, Databricks, dbt, and Fivetran to develop and maintain scalable data pipelines, collaborating closely with senior engineers and business stakeholders.
Key Responsibilities:
* Data Pipeline Development: Build and optimize end-to-end data pipelines using Snowflake, Databricks, dbt, and Fivetran, ensuring data is structured and ready for business applications.
* Data Transformation & Modeling: Develop dbt models to transform raw data into structured datasets for analytics and operational use.
* System Optimization: Support the design and implementation of efficient data structures that improve internal data flows and performance.
* Collaboration & Communication: Work with cross-functional teams, including senior engineers, analysts, and business stakeholders, to ensure data solutions meet business needs.
* Code Quality & Best Practices: Write clean, efficient SQL and Python code while following best practices for data engineering, testing, and performance optimization.
* Agile Development: Participate in Scrum ceremonies, contributing to sprint planning, standups, and retrospectives.
Skills & Experience:
* Hands-on data engineering experience building data pipelines.
* Strong knowledge of Snowflake and Databricks for data processing and warehousing.
* Experience with dbt to create transformations and build reusable data models.
* Familiarity with Fivetran for data integration and pipeline automation.
* Proficiency in SQL and data modeling, with a focus on optimizing query performance.
* Experience working with cloud platforms (AWS, GCP, or Azure) and ETL processes.
* Ability to troubleshoot data pipeline issues and optimize performance.
* Strong problem-solving skills and ability to work both independently and collaboratively in a team environment.
* Experience with Agile/Scrum methodologies.
Preferred:
* Experience with Python for automation or additional data engineering tasks.
* Knowledge of business-specific data concepts like unit costs, SKUs, and menu data is a plus.
* Previous experience in foodservice or a similar industry is beneficial.
This is an exciting opportunity to grow your expertise in modern data engineering while working on impactful projects within a collaborative team environment.
Seniority level
Mid-Senior level
Employment type
Contract
Job function
Information Technology
Industries
IT Services and IT Consulting