Description

Top Requirements:
Bachelor's or Master's Degree in a relevant field.
Minimum of 3 years of experience with Java/Python, and at least 2 years of experience building data engineering pipelines and data warehouse systems, with a solid understanding of ETL principles and the ability to write complex SQL queries.
Minimum of 5 years of experience with Google Cloud Platform (GCP), including GCP-based Big Data deployments (batch and real-time) leveraging BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, and Dataproc.
Minimum of 2 years of experience developing within data warehousing and big data ecosystems, including Hive and the Oozie Scheduler.
Minimum of 1 year of experience deploying Google Cloud services using Terraform.

Education

Bachelor's or Master's Degree