Description

Skills: 
1. Hands-on expertise in one or more big data technologies such as Kafka, Spark, Hadoop, or Hive.
2. Hands-on expertise with AWS.
3. Hands-on experience with scripting languages such as Python.
4. Programming skills, preferably in Java, are a plus.
5. A DevOps mindset is a plus.

Primary Job Duties (quick summary of day-to-day activity):
1. Build and maintain the big data platform (Kafka, Spark, Storm, Hadoop/Hive).
2. Build automation tools for the data platform.
3. Develop and maintain streaming and batch ingestion pipelines.
4. Build platform- and application-level alerting and monitoring.
5. Architect and build systems for scale (100-200 billion events/day).
Top 2-3 skills to look for when reviewing resumes:
1. One or more of the following technologies: Kafka, Spark, Storm, Hadoop/Hive, key/value stores.
2. Strong familiarity with operating in AWS.
3. Good Linux skills.

Education

Any graduate