Description

Required Skills and Experience:
Technical Expertise:
Extensive experience designing and implementing large-scale Kafka architectures, including Kafka Streams, Kafka Connect, and KSQL.
Strong knowledge of distributed systems, event-driven architecture, and message queues.
Proficiency with related technologies such as ZooKeeper, schema registries, and serialization formats (Avro, Protobuf).
Experience in performance tuning, monitoring (e.g., Prometheus, Grafana), and troubleshooting Kafka clusters.
Hands-on experience with cloud-based Kafka services (e.g., AWS MSK, Confluent Cloud) and containerized environments (Docker, Kubernetes).
Development Experience:
Strong programming skills in Java, Scala, Python, or Go.
Familiarity with CI/CD pipelines and Infrastructure as Code (IaC) practices (e.g., Terraform, Ansible).
Experience with relational (SQL) and NoSQL databases, as well as stream processing frameworks (e.g., Flink, Spark Streaming).
Problem-Solving and Analytical Skills:
Ability to troubleshoot and optimize Kafka performance under high-throughput conditions.
Strong analytical skills to handle complex data integration scenarios and improve system efficiency.

Preferred Qualifications:
Kafka developer or operations certification (e.g., Confluent Certified Developer for Apache Kafka).
Experience in real-time analytics platforms, data warehousing, or big data environments.
Familiarity with other streaming and messaging platforms like RabbitMQ, Pulsar, or ActiveMQ.
Knowledge of

Education

Bachelor's degree