Description

Roles & Responsibilities:

• Managing and optimizing disk space for data handling

• Performing database backup and recovery procedures

• Monitoring performance and fine-tuning the system as data patterns change

• Software installation and configuration

• Data modeling and design following recognized best practices

• Verifying data connectivity and security measures

• Automating manual tasks to improve efficiency

• Installing patches and upgrading software

• Ability to install and run a Hadoop cluster, add and remove nodes, monitor jobs and all critical parts of the cluster, configure the NameNode, and recover backups

• In-depth knowledge of Unix-based file systems

• Expertise in general operations, including troubleshooting, with a sound understanding of networks and systems

• Networking proficiency

• Experience with open-source configuration management and deployment tools such as Chef and Puppet

• Strong fundamentals of the Linux operating system

• Understanding of Core Java is a plus
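As a small illustration of the monitoring and automation duties above, the sketch below parses DataNode usage figures from the kind of output `hdfs dfsadmin -report` produces and flags nodes running low on disk space. The sample report text and its exact line format are assumptions (the real format varies by Hadoop version), so treat this as a minimal sketch, not a production monitor.

```python
import re

# Simplified sample of `hdfs dfsadmin -report` output; the real format
# differs across Hadoop versions, so this parser is an assumption.
SAMPLE_REPORT = """\
Name: 10.0.0.11:9866 (dn1.example.com)
DFS Used%: 82.50%

Name: 10.0.0.12:9866 (dn2.example.com)
DFS Used%: 41.00%
"""

def flag_full_datanodes(report: str, threshold: float = 80.0):
    """Return (hostname, used_pct) pairs for DataNodes above the threshold."""
    flagged = []
    name = None
    for line in report.splitlines():
        # Each DataNode section starts with a "Name: host:port (hostname)" line.
        m = re.match(r"Name: \S+ \((\S+)\)", line)
        if m:
            name = m.group(1)
            continue
        # Capture the usage percentage for the current DataNode.
        m = re.match(r"DFS Used%: ([\d.]+)%", line)
        if m and name:
            pct = float(m.group(1))
            if pct > threshold:
                flagged.append((name, pct))
    return flagged

print(flag_full_datanodes(SAMPLE_REPORT))  # → [('dn1.example.com', 82.5)]
```

In a real deployment the report text would come from invoking the `hdfs dfsadmin -report` command rather than a hard-coded string, and the flagged list would feed an alert or a rebalancing job.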



Education

Any Graduate