Description

Mandatory Skills

3+ years' experience in software development with big data technologies
Proficiency in at least one of the following programming languages: Java, Scala, or Python
Experience developing and deploying at least one end-to-end data storage/processing pipeline
Basic understanding of relational databases (RDBMS)
Intermediate-level expertise in Apache Spark, HDFS, and Hive
Experience working with Hadoop clusters
Good communication and logical reasoning skills

Nice To Have

Prior experience writing Spark jobs in Java is highly valued
Prior experience working with Cloudera Data Platform (CDP)
Hands-on experience with NoSQL databases such as HBase, Cassandra, or Elasticsearch
Experience using Maven and Git
Familiarity with Agile/Scrum methodologies
BFSI (banking, financial services, and insurance) domain knowledge

Desired Skills and Experience
Big Data, Hadoop, Apache Spark, RDBMS, NoSQL, Cloudera Data Platform

Education

Any Graduate