Our company values efficiency and scalability in data management. We're seeking a skilled DataOps Engineer to join our team.
The ideal candidate will be responsible for designing, implementing, and maintaining automated data workflows using cloud-based infrastructure.
This role requires expertise in DevOps, cloud data platforms, infrastructure automation, and data pipeline orchestration.
Key Responsibilities:
* Streamline data pipeline deployment through continuous integration and delivery (CI/CD).
* Ensure observability, performance, and security in cloud data environments.
* Automate infrastructure management and data workflow orchestration.
* Drive best practices for DataOps, governance, and cost optimization.
Requirements:
* Bachelor's or Master's degree in Computer Science, Data Engineering, Cloud Computing, or a related field, or equivalent relevant experience.
* 3+ years of experience in DevOps, Data Engineering, or Cloud Infrastructure roles.
* Proficiency in CI/CD tools such as GitHub Actions, Jenkins, or Azure DevOps.
* Strong understanding of ETL/ELT pipelines, workflow scheduling, and automation.
* Hands-on experience with cloud data platforms such as AWS, Azure, or GCP.
* Proficiency in Terraform for infrastructure-as-code (IaC).
* Experience with observability tools such as Datadog, Prometheus, Grafana, or CloudWatch.
* Strong troubleshooting skills for diagnosing data pipeline failures, latency issues, and performance bottlenecks.
* Proficiency in Python, Bash, or PowerShell for automation and scripting.
* Experience with SQL and NoSQL databases.
What We Offer:
* A competitive salary and generous commission scheme.
* A dynamic work environment with a highly motivated team.
* Professional development opportunities and career growth.
* A comprehensive employee benefits program.