Requirements
6+ years of experience developing scalable Big Data applications or solutions on distributed platforms
4+ years of experience working with distributed technology tools, including Spark, Python, and Scala
Working knowledge of data warehousing, data modelling, data governance, and data architecture
Proficient in Amazon Web Services (AWS), with 3+ years of hands-on experience across S3, Managed Airflow, EMR/EC2, IAM, etc.
Experience working in Agile and Scrum development processes
Experience architecting data products on streaming, serverless, and microservices architectures and platforms
Experience creating/configuring Jenkins pipelines for a smooth CI/CD process for managed Spark jobs, building Docker images, etc.
Working knowledge of reporting and analytical tools such as Tableau, Amazon QuickSight, etc.
Any Graduate