Must-have Skills
5 years of work experience with a bachelor's degree, or 2 years of work experience with an advanced degree.
Experience with one or more scripting languages such as Python, Ruby, Bash, and/or Node.js.
Experience designing, building, assembling, and configuring application or technical architecture components from business requirements.
Experience implementing the Kafka platform: provisioning, security and authorization on the cluster, backup and mirroring of Kafka brokers, broker sizing, topic sizing, hardware sizing, performance monitoring, broker security, topic security, and consumer/producer access management via ACLs (see the sketch after this list).
Experience with Kafka, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Kafka MirrorMaker.
Ability to establish best-practice standards for configuring source and sink connectors.
Experience in high-availability cluster setup, maintenance, and post-implementation support.
Knowledge of the Kafka APIs (development experience is a plus).
Knowledge of best practices related to security, performance, and disaster recovery.
Good understanding of operating systems (RHEL/CentOS), SAN storage, networking, load balancers, SSL/TLS, and firewalls.
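A minimal sketch of the ACL and connector tasks above, assuming illustrative broker/Connect host names, a topic named "orders", and a principal named User:orders-consumer (none of these names come from this posting):

    # Grant an assumed consumer principal read access to an assumed topic and group (Apache Kafka's kafka-acls tool)
    bin/kafka-acls.sh --bootstrap-server broker1:9092 --command-config admin.properties \
      --add --allow-principal User:orders-consumer \
      --operation Read --topic orders --group orders-app

    # Register a simple sink connector through the Kafka Connect REST API
    # (FileStreamSinkConnector ships with Apache Kafka; the output file path is illustrative)
    curl -s -X POST http://connect1:8083/connectors -H 'Content-Type: application/json' -d '{
      "name": "orders-file-sink",
      "config": {
        "connector.class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
        "tasks.max": "1",
        "topics": "orders",
        "file": "/tmp/orders.out"
      }
    }'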
Desired Skills
Experience setting standards to automate deployments using Chef or Jenkins.
Experience with container technologies such as Kubernetes and Docker.
Familiarity with open-source monitoring products (Prometheus, Grafana); see the monitoring sketch after this list.
Experience with big data technology (the Hadoop ecosystem) is a plus.
Excellent verbal and written communication ability
Ability to work well within a globally distributed team and maintain a positive attitude while supporting a high-throughput platform with short deadlines.
Passion for cutting-edge open-source technology and staying current with industry trends.
Strong problem-solving skills, with the ability to identify and resolve issues.
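A minimal monitoring sketch, assuming a Prometheus JMX exporter agent on each broker and a Prometheus job labeled "kafka" (the port, metric names, and host names are assumptions that depend on the exporter and scrape configuration):

    # Check the assumed JMX exporter endpoint on a broker for under-replicated partition metrics
    curl -s http://broker1:7071/metrics | grep -i underreplicated

    # Ask the Prometheus HTTP API whether the assumed "kafka" scrape targets are up
    curl -s -G http://prometheus:9090/api/v1/query --data-urlencode 'query=up{job="kafka"}'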
Any Graduate