Mandatory Skills: PySpark, Spark, EMR, Scripting, Programming Language Experience, Cloud Engineering
Job Description:
- 9 years of DevOps/SRE experience migrating infrastructure (EC2, EKS), applications, and data stores (S3, RDS, DynamoDB, etc.) from on-premises to AWS
- 5 years of solid experience in programming languages such as Python, Bash, and Java
- 5 years of experience building CI/CD infrastructure and pipelines with tools such as Jenkins, AWS CodeDeploy, AWS CodePipeline, and Terraform
- 5 years of operations experience: automation using IaC for backup, patching, provisioning, configuration, release management, and monitoring
- 5 years of centralized monitoring and logging using CloudWatch, Grafana, and Datadog
- 5 years of provisioning and configuration management of Dev, QA, and Prod environments
- 3 years of experience supporting Spark/PySpark data pipelines and debugging issues in production
- 3 years of security experience: IAM roles, OAuth, SSO, CloudTrail, CloudWatch
- Databricks SRE Experience is preferred.
- A solid understanding of containerization and orchestration technologies such as Docker, Kubernetes, or ECS
- Experience preparing and executing systems resilience / disaster recovery strategies for data pipelines and web applications
- In addition to strong hands-on technical expertise, the candidate must be a quick learner with an excellent work ethic and proactive communication