4+ years of professional experience in Stream/Batch Processing systems at scale.
Strong programming skills in Java and Python.
Experience in Public Cloud is a must. Experience with GCP and GCP managed services is a strong plus.
Experience with cloud messaging/stream processing systems such as Pub/Sub, Kafka, Kinesis, Dataflow, or Flink, and/or
Experience with batch processing systems such as Hadoop, Pig, Hive, or Spark. Experience with Dataproc is a strong plus.
Knowledge of DevOps principles and tools (e.g., CI/CD, IaC/Terraform).
Strong understanding of containerization technologies (e.g., Docker, Kubernetes).
Strong problem-solving and critical-thinking skills.
Strong written/verbal communication skills with the ability to thrive in a remote work environment.
(For senior leads/architects) Ability to explore new areas and problems, and to design and architect scalable Stream/Batch Processing solutions.
Ability to technically lead a team of engineers on a project/component.