Description

As an Integration Developer specializing in Confluent Cloud Kafka as a Service, you will design, develop, and maintain integration solutions built on Apache Kafka. You will perform the initial configuration and setup of the Kafka as a Service environment in Confluent Cloud and work closely with stakeholders to ensure seamless data flow across systems, enabling real-time data integration and analytics. You will play a critical role in enhancing our Kafka-based services and ensuring their reliability, scalability, and performance.

Responsibilities:
• Set up and configure the initial Kafka as a Service environment in Confluent Cloud.
• Design and implement integration solutions using Apache Kafka to support real-time data streaming and processing.
• Develop and maintain Kafka producers, consumers, and streams to facilitate data exchange between multiple systems and applications.
• Design, develop, test, and maintain Kafka-based applications and APIs.
• Collaborate with software engineers, data engineers, and other stakeholders to gather requirements and design scalable integration architectures.
• Ensure high availability, performance, and reliability of Kafka clusters by monitoring, tuning, and optimizing the infrastructure.
• Implement data pipelines and workflows that facilitate ETL/ELT processes.
• Troubleshoot and resolve issues related to Kafka integrations, including connectivity, performance, and data consistency.
• Develop and enforce best practices for Kafka development, including code reviews, testing, and documentation.
• Stay up to date with the latest trends and advancements in Apache Kafka, Confluent Cloud, and related technologies.
• Contribute to the continuous improvement of our Kafka as a Service platform, identifying and implementing enhancements to improve user experience and system efficiency.
• Provide technical guidance and mentorship to junior developers and other team members.
• Implement scalable architectural models for data processing and storage using Kafka.
• Develop RESTful services using the Spring and Spring Boot frameworks.

Education

Any Graduate