Required Skills & Experience
• Requires 7+ years of development experience
• Experience working with big data and its ecosystem (e.g. Hadoop, Hive, PySpark, HBase, Sqoop, Impala, Kafka, Flume, Oozie, MapReduce, S3, etc.)
• Experience with high-level programming languages (e.g. Python, Java)
• Experience working with database systems (e.g. Exadata; NoSQL stores such as Cassandra and Neo4j)
• Experience working as part of an Agile team and using Agile SDLC tools (Jira, etc.)
• Ability to partner with other functional areas to ensure execution of development, design, coding, testing, debugging, and documentation of applications
• Ability to develop an awareness of the business function for which the application is being designed, in order to elicit detailed requirements and form effective partnerships
• Analytical ability, independent problem-solving, and good communication skills
• Demonstrated ability to learn new technologies and deliver in a fast-paced Agile environment
• Top 3 required IT/Technical skill-sets: Hadoop, Hive, PySpark
Education: Any graduate