● Proficiency with Big Data Hadoop platform technologies, Java, Scala, and RDBMS
● Experience with the Hadoop platform, including Hive, HBase, AtScale, Spark, Sqoop, HDFS, and related tools, to build analytical applications
● Experience with both Cloudera and Hortonworks Hadoop distributions is a plus
● Strong working experience with Object-Oriented Design and Development using Java is necessary
● Solid understanding of the principles and APIs of MapReduce, RDD, DataFrame, and Dataset
● Strong working experience implementing Big Data processing using MapReduce algorithms and Hadoop/Spark APIs
● Experience building workflows to perform predictive analysis, multi-dimensional analysis, data enrichment, etc.
● Experience with database fundamentals, RDBMS and NoSQL data modeling, and database programming
● Knowledge of secure coding practices and frameworks is a plus
● Experience with Agile methodologies such as Scrum
● Experience in supporting large enterprise applications
● Aptitude for learning new technologies and taking on challenges
● Bachelor's or Master's degree in Computer Science or another related technology discipline
● Knowledge of Security Principles and Accessibility Best Practices
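
To illustrate the MapReduce-style processing this role calls for, here is a minimal sketch of a map phase (tokenize into per-word records) followed by a reduce phase (sum counts per key), written in plain Java so it runs without a Hadoop or Spark dependency. Class and method names are illustrative, not from any specific framework:

```java
import java.util.*;
import java.util.stream.*;

public class WordCount {
    // Illustrative word count: the canonical MapReduce example.
    static Map<String, Long> count(List<String> lines) {
        return lines.stream()
                // Map phase: split each line into lowercase word tokens
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(w -> !w.isEmpty())
                // Reduce phase: group by word and sum the occurrences per key
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        Map<String, Long> counts = count(List.of("big data big", "data tools"));
        System.out.println(counts);
    }
}
```

In Hadoop or Spark the same shape appears as a `Mapper`/`Reducer` pair or an RDD `flatMap`/`reduceByKey` chain, with the grouping and shuffling handled by the framework across the cluster.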