Description

Data Engineering

    Tech Skills - Scala, Spark, GCP, Dataproc, Hadoop, Airflow, SBT, Maven, Docker, Kubernetes, PySpark, Jenkins, BigQuery

    Experience with workflow management tools such as Jenkins and Airflow

  • Experience running Spark/Hadoop workloads using Dataproc, Dataflow, Cloud Composer, EMR, HDInsight, or similar.
  • Proven working expertise with big data technologies such as Spark, PySpark, Hive, and SQL.
  • Demonstrated expertise in writing complex, highly optimized queries across large datasets.
  • Knowledge of and experience with Kafka, Storm, Druid, and Presto.

Education

Any Graduate