Description

Job Description:
Preferred: skills in Python/Java programming
Mandatory: expertise in Big Data (Spark Core, Hive, Airflow)
Experience with Hadoop architecture, including knowledge of Hadoop, MapReduce, and HDFS
Experience with UNIX shell scripting
Experience defining and using CI/CD pipelines
Experience with caching and queuing stacks (Redis, Kafka)

Education

Bachelor's degree