Unlock the Power of Data Engineering
The Data Engineer is a vital role in our organization, responsible for designing, building, and operating the large-scale data systems that drive critical decisions and product features.
Key Responsibilities:
* Data Pipeline Development: Develop, test, and maintain scalable data pipelines using technologies such as Dataflow, Composer, and BigQuery, ensuring high throughput and robustness.
* Data Modeling: Architect and optimize data models in BigQuery for query performance and cost efficiency, maximizing the value of our data assets.
* Collaboration: Work closely with Data Scientists, Analysts, and Software Engineers to understand data needs and ensure data quality across teams.
* Operational Excellence: Monitor and troubleshoot production data systems, and implement CI/CD best practices for data pipelines.
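The pipeline work described above generally follows an extract-transform-load shape. A minimal sketch in plain Python (standing in for a Beam/Dataflow pipeline; the event records and per-user aggregation are illustrative assumptions, not an actual schema):

```python
from collections import defaultdict

# Extract: in production this stage would read from a source such as
# Pub/Sub or Cloud Storage; here it is a hard-coded sample (assumption).
def extract():
    return [
        {"user": "alice", "latency_ms": 120},
        {"user": "bob", "latency_ms": 340},
        {"user": "alice", "latency_ms": 80},
        {"user": "bob", "latency_ms": None},  # malformed record
    ]

# Transform: drop malformed records, then key each event by user.
def transform(events):
    return [
        (e["user"], e["latency_ms"])
        for e in events
        if e.get("latency_ms") is not None
    ]

# Load: aggregate per key; a real pipeline would write to BigQuery instead
# of returning a dict.
def load(keyed):
    totals = defaultdict(int)
    for user, ms in keyed:
        totals[user] += ms
    return dict(totals)

if __name__ == "__main__":
    print(load(transform(extract())))  # per-user latency totals
```

In a managed runner such as Dataflow, each stage would be a parallel transform over an unbounded stream rather than a list, but the shape of the logic is the same.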
What You'll Need:
* Technical Skills: Proficiency in programming languages such as Python, Java, or Scala, and experience with data engineering tools like Apache Beam, Apache Spark, and Cloud Dataflow.
* Domain Knowledge: In-depth understanding of data storage, processing, and analytics, including data warehousing, ETL processes, and big data platforms.
* Soft Skills: Excellent communication and collaboration skills, with the ability to work effectively with diverse teams and stakeholders to achieve business objectives.
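Data-quality work of the kind mentioned above often starts with asserting invariants on each batch before loading it. A hypothetical sketch (the column names, rules, and dead-letter handling are assumptions):

```python
def check_batch(rows, required_columns=("user_id", "event_ts")):
    """Validate a batch of row dicts before loading; returns a list of
    human-readable violations (an empty list means the batch passed)."""
    violations = []
    for i, row in enumerate(rows):
        for col in required_columns:
            if row.get(col) is None:
                violations.append(f"row {i}: missing {col}")
    return violations

# Usage: fail the pipeline run (or route bad rows to a dead-letter table)
# when any violation is found.
rows = [
    {"user_id": "u1", "event_ts": "2024-01-01T00:00:00Z"},
    {"user_id": None, "event_ts": "2024-01-01T00:05:00Z"},
]
print(check_batch(rows))  # -> ['row 1: missing user_id']
```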