Description

Position: GCP Data Engineer
Location: Phoenix, Arizona 85054 (onsite)
Duration: Full-time
Experience: 10+ Years

JOB DESCRIPTION:
GCP Data Engineer with 3+ years of hands-on experience in BigQuery, Dataproc, Airflow, Cloud Composer, GCP Hydra Services, cloud data optimization, Python/PySpark, and Spark architecture.
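
As an illustration of the Cloud Composer/BigQuery portion of this skill set, below is a minimal sketch of an Airflow DAG that schedules a daily BigQuery batch job. The project, dataset, table, and DAG names are hypothetical placeholders, not part of the role's actual environment.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Minimal Cloud Composer / Airflow DAG running a daily BigQuery batch job.
# Project, dataset, and table names below are placeholders.
with DAG(
    dag_id="daily_orders_aggregation",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    aggregate_orders = BigQueryInsertJobOperator(
        task_id="aggregate_orders",
        configuration={
            "query": {
                "query": """
                    SELECT order_date, COUNT(*) AS order_count
                    FROM `my_project.my_dataset.orders`
                    GROUP BY order_date
                """,
                "useLegacySql": False,
            }
        },
        location="US",
    )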

Should be able to perform the below tasks/responsibilities:
- Analyze, design, develop, and support the testing and implementation of system applications.
- Able to program in Core Java or Python/Scala.
- Able to write and understand complex SQL (Hive/PySpark DataFrames), optimizing joins while processing huge volumes of data; a join-optimization sketch follows this list.
- Ability to design and develop optimized data pipelines for batch and real-time data processing.
- Develop and document technical and functional specifications and analyze software and system processing flows.
- Experience working with MapReduce, Hive, and Spark (Core, SQL, and PySpark).
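
As referenced in the list above, here is a minimal PySpark sketch of one common join optimization: broadcasting a small dimension table so the large fact table is not shuffled. The bucket paths, table names, and column names are hypothetical examples, not taken from this role's data.

from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("join_optimization_sketch").getOrCreate()

# Hypothetical inputs: a large fact table and a small lookup/dimension table.
orders = spark.read.parquet("gs://example-bucket/orders/")        # large
countries = spark.read.parquet("gs://example-bucket/countries/")  # small

# Broadcasting the small side avoids shuffling the large fact table,
# which is usually the dominant cost of a sort-merge join at scale.
enriched = orders.join(broadcast(countries), on="country_code", how="left")

enriched.write.mode("overwrite").partitionBy("order_date").parquet(
    "gs://example-bucket/orders_enriched/"
)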
 

Education

Any Graduate