Position Responsibilities
Administration knowledge and experience managing Hadoop clusters on-premises and in the cloud
Required skills: HDFS, MapReduce, Hive 1.1.0, Hue 3.9.0, Pig, Flume, Oozie, Sqoop, CDH5 or higher, Apache Hadoop 2.6, Spark, Solr, Storm, Knox, Cloudera Manager, Red Hat, and Oracle
Implementing, managing, and administering the overall infrastructure
Overseeing the day-to-day operation of data servers
Working closely with the database, network, BI, and application teams to ensure that all big data applications are highly available and performing as expected
Performing backup and recovery tasks
Managing resources and security
Troubleshooting application errors and ensuring they do not recur
Experience in data operations and production support processes
Working as an integrated member of the big data technology team and driving cross-functional, international big data initiatives end to end with various business units
Taking administrative responsibility for all clusters, totaling 100 nodes and ranging from POC (proof-of-concept) to PROD environments
Working with department and team leads to expand the scalability of the current architecture
Helping design, instantiate, and maintain data storage setups for analytics and data science
Interacting with Cloudera support, logging issues in the Cloudera portal, and fixing them per the recommendations
Unique Competencies:
Required:
5+ years' administration experience in various big data integration technologies, with a focus on Cloudera or Hortonworks
Experience in HDFS, Ranger/Sentry, Hive, Spark, Kudu, Kafka
Knowledge of Unix/Linux, AWS Cloud, Kubernetes, OpenShift
Experience with security configurations: Active Directory, LDAP, SAML, Kerberos, SSL, and encryption
Educational Requirements:
Bachelor's Degree in Computer Science, Information Systems, or equivalent