Designation / Job title : Developer Programmer - 261312
Primary or mandatory skills:
BigQuery
Python
Good-to-have skills:
Cloud Composer
Looker
Dataflow
Detailed job description:
• Develop, build, review, and test data pipelines on Google Cloud Platform using Dataflow, BigQuery, Cloud Functions, Firestore, and Composer.
• Develop and automate data ingestion pipelines using Dataflow.
• Work with various teams and business stakeholders to develop cost-effective migration strategies.
• Analyse, create, and optimize complex SQL queries in Vertica and BigQuery.
• Create DAGs and schedule pipelines using Apache Airflow.
• Perform server/application maintenance of IBM Master Data Management and DataStage as part of admin work, and resolve any other issues raised by users.
• Automate workloads using Python and Unix shell scripting.
• Meet with multiple teams and business users to gather requirements and create the technical mapping documents needed to develop data pipelines.
• Work closely with business and technical teams.
• Create end-to-end data lineage documents for existing ETL jobs.
• Lead and support the ETL development team.
• Create and manage sprint work in Jira for the development team.
• Resolve data issues raised by business users in existing or newly developed Google Cloud assets (BigQuery, Looker reports) and create RCA (root-cause analysis) documents for them.
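To illustrate the kind of ingestion work described above, here is a minimal sketch of one pipeline step: cleaning raw CSV records into newline-delimited JSON, the format BigQuery load jobs accept. The dataset, field names, and transformation rules are hypothetical; a production pipeline would run this logic inside a Dataflow transform or use the BigQuery client libraries rather than the standard library alone.

```python
import csv
import io
import json

# Hypothetical raw extract; a real pipeline would read from GCS or a source system.
RAW_CSV = """order_id,amount,region
1001, 25.50 ,US
1002,13.00,EU
1003,,APAC
"""

def to_ndjson(raw: str) -> str:
    """Clean CSV rows and emit newline-delimited JSON for a BigQuery load."""
    rows = []
    for rec in csv.DictReader(io.StringIO(raw)):
        amount = rec["amount"].strip()
        rows.append({
            "order_id": int(rec["order_id"]),
            # Keep missing amounts as explicit NULLs instead of 0.0
            "amount": float(amount) if amount else None,
            "region": rec["region"].strip(),
        })
    return "\n".join(json.dumps(r) for r in rows)

if __name__ == "__main__":
    print(to_ndjson(RAW_CSV))
```

The NULL-vs-zero distinction matters because BigQuery aggregations such as AVG ignore NULLs but would be skewed by substituted zeros.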