Description

Skills & Experience Required:

- Minimum 3 years of in-depth experience in Java/Python
- Minimum 2 years of experience building data engineering pipelines and data warehouse systems, with a solid understanding of ETL principles and the ability to write complex SQL queries
- Minimum 5 years of GCP experience working on GCP-based Big Data deployments (batch/real-time) leveraging BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, and Dataproc
- Minimum 2 years of development experience in data warehousing and the Big Data ecosystem: Hive (HQL) and the Oozie scheduler, plus ETL tools including IBM DataStage and Informatica IICS with Teradata
- Minimum 1 year of experience deploying Google Cloud services using Terraform

Education

Bachelor's degree