● Candidate should have 10+ years of overall experience
● Experience architecting large-scale storage, data center, and/or globally distributed solutions
● Experience designing and deploying large-scale production Hadoop solutions
● Ability to understand and translate customer requirements into technical requirements
● Experience designing queries against data in a Hadoop environment using tools such as Apache Hive, Apache Druid, or Apache Phoenix
● Experience designing and deploying production streaming solutions using tools such as Apache Storm and Apache Flink
● Experience building data pipelines
● Experience installing and administering multi-node Hadoop clusters
● Strong experience implementing software and/or solutions in enterprise Linux or Unix environments
● Strong understanding of enterprise security solutions such as LDAP and/or Kerberos
● Strong understanding of network configuration, devices, protocols, speeds and optimizations
● Strong understanding of Java development, debugging, and profiling
● Significant previous work writing to network-based APIs, preferably REST/JSON or XML/SOAP
● Solid background in database administration and design, along with data modeling using star schemas, slowly changing dimensions, and/or change data capture
● Demonstrated experience implementing big data use cases and an understanding of standard design patterns commonly used in Hadoop-based deployments
● Excellent verbal and written communication skills
Education: Any graduate