Description

Design and develop Hadoop applications

Hands-on experience developing jobs in PySpark with Python/Scala (preferred) or Java/Scala
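
For illustration only, a minimal PySpark job sketch of the kind this role involves; the input path, column names, and output location are hypothetical:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Hypothetical example: aggregate daily order totals from a raw dataset on HDFS.
    spark = SparkSession.builder.appName("daily-order-totals").getOrCreate()

    orders = spark.read.parquet("hdfs:///data/raw/orders")    # assumed input path
    daily_totals = (
        orders
        .withColumn("order_date", F.to_date("order_ts"))      # assumed timestamp column
        .groupBy("order_date")
        .agg(F.sum("amount").alias("total_amount"))            # assumed amount column
    )
    daily_totals.write.mode("overwrite").parquet("hdfs:///data/curated/daily_order_totals")

    spark.stop()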

Experience with Core Java, MapReduce programs, Hive programming, and Hive query performance concepts
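
As a hedged illustration of the Hive query performance concepts above (partition pruning in particular), run through Spark SQL; the database, table, and partition column names are assumptions:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("hive-query-example").enableHiveSupport().getOrCreate()

    # Filtering on the partition column (event_date, assumed) lets Hive/Spark prune
    # partitions instead of scanning the whole table.
    result = spark.sql("""
        SELECT user_id, COUNT(*) AS event_count
        FROM analytics.events              -- assumed Hive database and table
        WHERE event_date = '2024-01-01'    -- assumed partition column
        GROUP BY user_id
    """)
    result.show(10)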

Experience with source code management using Git repositories

Secondary skills

Exposure to the AWS ecosystem with hands-on knowledge of EC2, S3, and related services
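
A small boto3 sketch of the hands-on S3 usage implied here; the bucket name and object keys are placeholders:

    import boto3

    s3 = boto3.client("s3")

    # Upload a local file to a placeholder bucket/prefix.
    s3.upload_file("daily_order_totals.parquet", "example-bucket", "curated/daily_order_totals.parquet")

    # List objects under the curated/ prefix to confirm the upload.
    response = s3.list_objects_v2(Bucket="example-bucket", Prefix="curated/")
    for obj in response.get("Contents", []):
        print(obj["Key"], obj["Size"])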

Basic SQL programming

Knowledge of agile methodology for delivering software solutions

Build scripting with Maven/Gradle; exposure to Jenkins

Key Skills
Education

Any Graduate