Job Description:
Degree in Computer Science, Applied Mathematics, Engineering, or another technology-related field (or equivalent work experience)
4+ years of experience working as a big data engineer (required); must be able to articulate the use cases supported and the outcomes driven.
Strong experience with a major cloud platform such as AWS or GCP.
Expertise in the Java language.
Knowledge of the following (candidates are expected to demonstrate these skills live during the interview): PySpark, Spark, Python, Scala, Hive, Pig, and MapReduce
Experience with SQL.
Proven experience designing, building, and operating enterprise-grade data streaming use cases leveraging one or more of the following: Kafka, Spark Streaming, Storm, or Flink
Large-scale data engineering and business intelligence delivery experience
Experience designing large-scale, enterprise-level big data platforms
Experience working with and performing analysis on large data sets
Proven experience working on or with a mature, self-organizing agile team
Bachelor's degree in Computer Science