Description

Responsibilities
Design, develop, and deploy scalable and reliable systems that can handle high volumes of data
Develop and implement efficient and scalable solutions using Confluent Kafka, Java, and Spring Boot technologies
Work closely with other engineers to ensure that data pipelines are integrated with existing systems and meet the needs of the business
Design, develop, and deploy Kafka-based data pipelines
Work with AWS services to manage and process data
Mentor and train junior engineers
Provide expertise in integration architecture and best practices, ensuring optimal system performance and reliability
Collaborate with stakeholders to identify opportunities for process improvement and optimization within the retail industry domain
Conduct code reviews, identify areas for improvement, and ensure adherence to coding standards
Troubleshoot and resolve integration issues, ensuring smooth data flow and system interoperability
Stay up to date with the latest trends, technologies, and advancements in the Confluent Kafka ecosystem and integration domain

Required Qualifications
7 years of professional experience as an AWS MSK / Apache Kafka / Confluent Kafka developer with a deep understanding of Kafka development
Well versed in Kafka and streaming internals, Java, Spring Boot, and similar technologies
Direct experience with or working knowledge of AWS technologies, including ECS/EKS, Lambda, API Gateway, S3, Terraform, and Kafka
Sound knowledge of authentication and authorization approaches such as OAuth and SSO/SAML
Experience with build tools such as Maven and Gradle and unit testing tools such as JUnit and Mockito
Foundational experience with service orchestration and messaging technology
Experience with Kafka governance
Experience with Agile development methodologies

Education

Bachelor's degree