Description

Requirements:

  • 8+ years of strong hands-on expertise in cloud platforms (GCP, AWS, Azure), with experience in modern data and analytics frameworks, big data, and DevOps practices, and proficiency in programming languages such as Python, PySpark, and Scala.
  • Proficient in distributed computing, e.g., Spark with Java or Scala; Java is preferred over Scala.
  • Proficient in SQL
  • Proficient in Kafka and Spark streaming (see the streaming sketch after this list)
  • Experience in workflow management using Airflow or a similar tool (see the DAG sketch after this list)
  • Experience handling large volumes of data (flat files, APIs, and streaming)
  • Experience with Google Cloud Platform, including BigQuery and Dataproc, is a plus
  • Knowledge of data modeling
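
To illustrate the Kafka and Spark streaming requirement, here is a minimal PySpark Structured Streaming sketch. The broker address, topic name, and checkpoint path are hypothetical placeholders, and the job assumes the spark-sql-kafka connector package is available on the Spark classpath.

```python
# Minimal sketch: read a Kafka topic with Spark Structured Streaming.
# Assumes the org.apache.spark:spark-sql-kafka-0-10 package is on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (
    SparkSession.builder
    .appName("kafka-stream-sketch")  # hypothetical application name
    .getOrCreate()
)

# Subscribe to a topic; the broker address and topic name are placeholders.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka delivers key/value as binary, so cast the payload to a string.
messages = events.select(col("value").cast("string").alias("message"))

# Print micro-batches to the console; a real job would write to durable
# storage and use a persistent checkpoint location.
query = (
    messages.writeStream
    .format("console")
    .option("checkpointLocation", "/tmp/checkpoints/events")  # placeholder path
    .start()
)
query.awaitTermination()
```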
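
For the workflow-management item, a minimal Airflow DAG sketch follows. The DAG id, schedule, and task callables are illustrative assumptions, using the Airflow 2.x Python operator and the schedule argument introduced in Airflow 2.4.

```python
# Minimal sketch: a two-task Airflow DAG where extract runs before load.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from the source")  # placeholder for a real extract step


def load():
    print("load data into the warehouse")  # placeholder for a real load step


with DAG(
    dag_id="daily_pipeline_sketch",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # enforce ordering: extract, then load
```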

Education

Any Graduate