Description

Technical skills:

2-3 years of experience working on projects involving:
Hadoop
Kubernetes
Kafka
Object storage experience
Preferred skills (nice to have):

MinIO or Ceph
Apache Ozone
Object storage technologies (e.g., S3)
Spark, Scala, Hive, HBase, Kudu
Roles & Responsibilities:

7-8 years of experience in Hadoop administration.
Exposure to AWS big data services.
Deploy and maintain Hadoop clusters: add/remove nodes using cluster monitoring tools, configure NameNode high availability, and keep track of all running Hadoop jobs.
Implement, manage, and administer the overall Hadoop infrastructure.
Handle the day-to-day operation of Hadoop clusters.
Work closely with the database, network, BI, and application teams to ensure that all big data applications are highly available and performing as expected.
Perform capacity planning: estimate the requirements for scaling the Hadoop cluster up or down.
Size the Hadoop cluster based on the data to be stored in HDFS.
Ensure that the Hadoop cluster is up and running at all times.
Monitor cluster connectivity and performance.
Manage and review Hadoop log files.
Perform backup and recovery tasks.
Handle resource and security management.
Troubleshoot application errors and ensure they do not recur.
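For context on the NameNode high-availability responsibility above, HA is typically configured in hdfs-site.xml. A minimal sketch is shown below; the nameservice ID "mycluster", the NameNode IDs nn1/nn2, and the example.com hostnames are hypothetical placeholders, and a real deployment also needs shared-edits (e.g., QJM) and fencing settings:

```xml
<!-- hdfs-site.xml: minimal NameNode HA sketch.
     "mycluster", nn1/nn2, and the hostnames are placeholders. -->
<property>
  <name>dfs.nameservices</name>
  <value>mycluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.mycluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn1</name>
  <value>namenode1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mycluster.nn2</name>
  <value>namenode2.example.com:8020</value>
</property>
<property>
  <!-- Client-side proxy that fails over between nn1 and nn2 -->
  <name>dfs.client.failover.proxy.provider.mycluster</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
<property>
  <name>dfs.ha.automatic-failover.enabled</name>
  <value>true</value>
</property>
```

In day-to-day operation, the active/standby state can be checked with `hdfs haadmin -getServiceState nn1`, and node removal is usually done by listing hosts in the exclude file referenced by dfs.hosts.exclude and running `hdfs dfsadmin -refreshNodes` to decommission them gracefully.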

Education

Any Graduate