Key Responsibilities and Technology Experience:
Analyzes, designs, creates and implements Big Data infrastructures, including access methods, device allocations, validation checks, organization and security.
Designs data models and logical and physical infrastructure.
Assists in system planning, scheduling, and implementation. Initiates corrective actions to stay on schedule. Installs, upgrades, and tests complex big data deployments.
Develops and implements recovery plans and procedures.
Disciplines: Hadoop design and analysis.
Involved in the analysis, design, development and implementation of software applications. Determines user requirements, leads application design, plans projects, establishes priorities and monitors progress.
Solid administrative knowledge of Apache Hadoop is a must (Cloudera distribution a plus).
BI tool integration with Hadoop.
DBA experience with HBase.
Experience with database replication and scaling.
Design, install, and maintain highly available systems (including monitoring, security, backup, and performance tuning).
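Candidates for this kind of monitoring and availability work are often asked to write small operational scripts. A minimal sketch of a node health check is below; the threshold, function names, and the process pattern are hypothetical illustrations, not taken from this posting.

```shell
#!/bin/sh
# Minimal node health-check sketch -- thresholds and names are hypothetical.

DISK_LIMIT=90  # percent; hypothetical warning threshold

# Flag any filesystem whose usage exceeds DISK_LIMIT; exit nonzero if any do.
check_disk() {
    df -P | awk -v limit="$DISK_LIMIT" '
        NR > 1 { gsub("%", "", $5)
                 if ($5 + 0 > limit) { print "WARN: " $6 " at " $5 "%"; bad = 1 } }
        END { exit bad }'
}

# Report whether a process matching $1 (e.g. a Hadoop daemon JVM) is running.
check_process() {
    if pgrep -f "$1" > /dev/null 2>&1; then
        echo "OK: $1 running"
    else
        echo "WARN: $1 not found"
    fi
}

check_disk && echo "OK: disk usage under ${DISK_LIMIT}%"
check_process "java"  # hypothetical target: Hadoop daemons run in JVMs
```

In practice a script like this would feed a scheduler or monitoring agent rather than print to a terminal, but the same checks (disk thresholds, process liveness) are the usual starting point.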
Linux (RHEL) proficiency a must.
Scripting experience.
Automation experience (Chef/Puppet).
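To illustrate the configuration-management skill asked for above, here is a minimal Puppet manifest sketch; the package, file, and service names are hypothetical and vary by distribution, so treat this as a shape, not a working config.

```puppet
# Hypothetical sketch: keep an HBase region server installed, configured,
# and running. Names below are illustrative, not from this posting.
package { 'hbase-regionserver':
  ensure => installed,
}

file { '/etc/hbase/conf/hbase-site.xml':
  ensure  => file,
  source  => 'puppet:///modules/hbase/hbase-site.xml',  # hypothetical module path
  require => Package['hbase-regionserver'],
  notify  => Service['hbase-regionserver'],
}

service { 'hbase-regionserver':
  ensure => running,
  enable => true,
}
```

The same pattern (package, config file, service, with `notify` restarting the service on config change) is how most Hadoop-stack daemons are managed under Puppet or Chef.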
Must possess good analytical and problem-solving skills.
Any Graduate