Strong understanding of distributed systems and cloud computing principles.
Demonstrated experience with Apache Spark and its components, especially Spark Structured Streaming and the related data stores (Iceberg, Delta Lake, Kafka, etc.) and file formats (Parquet, etc.); see the streaming sketch after this list for an illustration.
Extensive hands-on experience with, and strong understanding of, Kubernetes environments, preferably on AWS.
Extensive experience with major cloud providers, especially AWS services for compute (EKS, EC2), storage (S3), and networking (VPC, NLB, Ingress).
Proficiency in at least two programming languages among Java, Scala, and Python (Java preferred).
Experience setting up alerting, monitoring dashboards, and remediation automation in a large-scale distributed environment (see the metrics sketch after this list).
Ability to identify opportunities for automation, optimization, and cost savings within the cloud infrastructure.
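
For illustration, a minimal sketch of the kind of Spark Structured Streaming pipeline this role works with, assuming a Kafka source and a Parquet sink on S3; the broker address, topic name, and bucket paths are placeholders, not actual project values:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;
import org.apache.spark.sql.streaming.Trigger;

public class KafkaToParquetJob {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("kafka-to-parquet")
                .getOrCreate();

        // Subscribe to a Kafka topic as a streaming source (placeholder broker and topic).
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "broker-1:9092")
                .option("subscribe", "events")
                .load()
                .selectExpr("CAST(key AS STRING) AS key",
                            "CAST(value AS STRING) AS value",
                            "timestamp");

        // Write micro-batches as Parquet files to S3; the checkpoint location
        // lets the query recover its Kafka offsets after a restart.
        StreamingQuery query = events.writeStream()
                .format("parquet")
                .option("path", "s3a://example-bucket/raw/events/")
                .option("checkpointLocation", "s3a://example-bucket/checkpoints/events/")
                .trigger(Trigger.ProcessingTime("1 minute"))
                .start();

        query.awaitTermination();
    }
}
```

With the appropriate connector dependencies, the same writeStream call could target an Iceberg or Delta Lake table instead of raw Parquet files by swapping the sink format and catalog options.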
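Similarly, a minimal sketch of the kind of instrumentation that feeds alerting and dashboards, assuming the micrometer-registry-prometheus artifact (package names vary by Micrometer version, and the metric name and tag here are hypothetical):

```java
import io.micrometer.core.instrument.Counter;
import io.micrometer.prometheus.PrometheusConfig;
import io.micrometer.prometheus.PrometheusMeterRegistry;

public class PipelineMetrics {
    public static void main(String[] args) {
        // Registry that renders metrics in the Prometheus exposition format.
        PrometheusMeterRegistry registry = new PrometheusMeterRegistry(PrometheusConfig.DEFAULT);

        // Hypothetical counter for failed streaming micro-batches; an alert rule
        // and a dashboard panel would be defined on top of this series.
        Counter batchFailures = Counter.builder("pipeline.batch.failures")
                .description("Number of failed streaming micro-batches")
                .tag("job", "kafka-to-parquet")
                .register(registry);

        batchFailures.increment();

        // The scrape output is what a Prometheus server would collect over HTTP.
        System.out.println(registry.scrape());
    }
}
```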