Your Responsibilities:
Develop Flink data processing applications to handle streaming data with high throughput.
A senior engineer who can guide the Flink development team, helping them implement custom solutions with Flink.
Develop applications with good usability and scalability principles that read from various sources and write to various sinks.
Integrate Flink with other technologies, e.g., Kafka, MongoDB, etc.
Collaborate with the team to design, develop, test, and refine deliverables that meet the objectives.
Provide design and architectural solutions to business problems.
Conduct frequent brainstorming sessions, motivate the team, and drive innovation.
Experience in messaging and data processing, preferably with Flink on a cloud platform (Azure, GCP, or AWS).
Mandatory Skills:
10+ years of Java development with expertise in transforming data.
5+ years of experience consuming streaming data from Kafka with Flink.
5+ years of experience building high-throughput pipelines with Flink.
Hands-on experience with Continuous Integration and Deployment (CI/CD).
Product & Design Knowledge: experience with large enterprise-scale integrations (preferably in the design/development of customer-facing large enterprise applications).
Experience in digital banking, e-commerce, or other complex customer-facing applications.
Excellent business communication skills
Seasoned Java developer familiar with all aspects of the SDLC.
Desired Skills:
Experience in Apache Flink (Stream, Batch, Table APIs)
Experience in Apache Spark (Structured streaming, Batch processing)
Experience working with document databases, preferably MongoDB
Experience working on Kafka with Flink
Working experience in Agile methodologies
Knowledge of cloud platforms, preferably Azure
Experience of working across one or more geographic territories or regions
Any Graduate