Description

Technologies and Tools used:
- Java/Python
- Flink
- Cassandra/MemSQL/others
- OpenShift / K8s
- Kafka
Responsibilities:
- Create new, and maintain existing, Flink jobs written in Java/Python and deploy them on OpenShift (a minimal illustrative sketch of such a job follows the qualifications below)
- Produce unit and system tests for all code
- Participate in design discussions to improve our existing frameworks
- Define scalable calculation logic for interactive and batch use cases
- Interact with infrastructure and data teams to produce complex analysis across data

Required Qualifications:
- A minimum of 3 years of experience developing stream processing systems
- A minimum of 5 years of programming experience
- Experience with Flink real-time data streaming
- Knowledge of and experience with cloud-based technologies, preferably OpenShift
- Experience in batch processing
- Experience with the Flink Kubernetes Operator
- Familiarity with open-source configuration management and development tools
- Ability to adapt to conventional big-data frameworks and open-source tools if the project demands it
- Deep knowledge of troubleshooting and tuning streaming applications to achieve optimal performance
- Knowledge of design strategies for developing a scalable, resilient, always-on data lake
- Strong development/automation skills
- Must be very comfortable reading and writing Java/Python code
- An aptitude for analytical problem-solving
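To illustrate the kind of work described in the responsibilities above, here is a minimal sketch of a Flink streaming job in Java that reads events from Kafka. It assumes Flink's KafkaSource connector; the broker address, topic name, consumer group, and transformation are hypothetical placeholders, not details from this posting.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class ExampleStreamingJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Hypothetical broker address, topic, and group id (placeholders for illustration only).
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("kafka:9092")
                .setTopics("events")
                .setGroupId("example-group")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-events")
           .map(String::toUpperCase)  // placeholder transformation standing in for real calculation logic
           .print();                  // a real job would write to a sink such as Cassandra or MemSQL

        env.execute("example-streaming-job");
    }
}
```

In practice a job like this would be packaged and deployed to OpenShift, for example via the Flink Kubernetes Operator mentioned in the qualifications.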

Education

Any Graduate