Requirements
Bachelor's degree in Computer Science, Math, Physics, Electrical Engineering, or a similar field
Deep knowledge of Apache Spark
Ability to develop and maintain complex systems using a variety of programming languages (Python, Scala, Java, Bash, etc.)
Experience with Hadoop technology stack (MapReduce, Hive, Sqoop, etc.)
Working understanding of data structures (stacks, queues, graphs, etc.) and analysis of algorithms
Excellent communication skills