Job Requirements
· Strong experience in Spark and Scala programming.
· Extensive experience with Hive, Impala, Kafka & HBase.
· Should have good knowledge of Spark Streaming.
· Clear understanding of the Hadoop ecosystem & Spark architecture.
· Must have experience in Unix shell scripting, PL/SQL & the AutoSys scheduler.
· Should have used uDeploy, TeamCity & SharePoint.
· Should be expert in debugging & tracing errors/exceptions that occur during Spark job execution.
· Must have experience working across SDLC phases, including testing support, production deployment & post-production support.
· Flexible, committed, dynamic and willing to take on additional responsibility beyond the BAU role.
· Fluent in English communication.
· Bachelor's degree.