Job Opportunity
The GCP Big Data Engineering role involves designing and optimizing BigQuery schemas, including partitioning, clustering, and cost and performance tuning. It also requires building streaming and batch pipelines with Apache Beam/Dataflow and Pub/Sub, covering exactly-once semantics, backpressure handling, and replay strategies.
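As a rough illustration of the partitioning and clustering work described above, here is a minimal BigQuery DDL sketch; the dataset, table, and column names are hypothetical, not part of this role's actual systems:

```sql
-- Hypothetical events table: daily partitions on event_ts,
-- clustered by the most common filter columns to reduce bytes scanned.
CREATE TABLE IF NOT EXISTS analytics.events (
  event_id   STRING NOT NULL,
  user_id    STRING,
  event_type STRING,
  event_ts   TIMESTAMP NOT NULL,
  payload    JSON
)
PARTITION BY DATE(event_ts)
CLUSTER BY user_id, event_type
OPTIONS (
  partition_expiration_days = 90,   -- cap storage cost
  require_partition_filter = TRUE   -- force partition pruning in queries
);
```

Partitioning limits how much data each query scans, while clustering sorts data within partitions by the chosen columns, which is the main lever for the cost tuning mentioned above.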
Mandatory Skill Sets
1. Cloud-based data engineering (BigQuery + Dataform + Pub/Sub)
2. Python programming with orchestration (Airflow/Cloud Composer)
3. Data security and governance on Google Cloud Platform
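The exactly-once and replay requirements mentioned earlier are commonly met with idempotent writes keyed by a stable message ID, so that replaying a Pub/Sub backlog does not double-apply events. The following is a stdlib-only conceptual sketch, not the real Pub/Sub or Beam API; all names are illustrative:

```python
# Conceptual sketch: exactly-once *effects* approximated by deduplicating
# on a stable message ID, so replays are safe to apply repeatedly.
# All classes and names here are hypothetical, for illustration only.

class IdempotentSink:
    """Applies each message at most once, keyed by its message_id."""

    def __init__(self) -> None:
        self._seen: set[str] = set()      # in production: a durable store
        self.totals: dict[str, int] = {}  # example downstream state

    def apply(self, message_id: str, user: str, amount: int) -> bool:
        """Return True if the message was applied, False if it was a replay."""
        if message_id in self._seen:
            return False                  # duplicate delivery: skip side effect
        self._seen.add(message_id)
        self.totals[user] = self.totals.get(user, 0) + amount
        return True

sink = IdempotentSink()
# "m1" is delivered twice, simulating a Pub/Sub redelivery/replay.
batch = [("m1", "alice", 5), ("m2", "bob", 3), ("m1", "alice", 5)]
applied = [sink.apply(*msg) for msg in batch]
```

In a real Dataflow pipeline the dedup state would live in durable, keyed state rather than an in-memory set, but the invariant is the same: re-processing a message must not change the result.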
Key Requirements
Data engineering experience is essential: 7+ years overall is preferred, including 4+ years building and operating production systems on GCP.