Qualifications:
Expertise in reducing costs and increasing speed and efficiency in large-scale data platform deployments.
Proficiency in Terraform and Python scripting, along with GitHub Actions, for deploying infrastructure as code (a Terraform sketch follows this list).
Deep understanding of data governance policies and implementation best practices, with hands-on experience implementing policy as code.
Architecture, design, and implementation of unified data access using AWS Lake Formation, Amazon DataZone, IAM, Okta, and other tools.
Experience in the architecture and security of AWS EC2, EBS, S3, EKS, Athena, Redshift, RDS, Kafka, Glue, and other data management services.
Experience designing and implementing granular data access without losing the speed and agility of access provisioning.
Expertise in CI/CD processes and deployment technologies using tools such as GitHub and GitHub Actions.
Expertise in logging, monitoring, and reliability engineering.
Strong knowledge of networking concepts and experience managing network-related issues.
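To illustrate the policy-as-code pattern referenced above, here is a minimal Terraform sketch that grants a reader role SELECT access across a Glue database through Lake Formation. The role name, database name, and region are hypothetical placeholders, not details taken from this posting.

terraform {
  required_providers {
    aws = { source = "hashicorp/aws" }
  }
}

provider "aws" {
  region = "us-east-1" # illustrative region, not specified in this posting
}

# Hypothetical IAM role representing the data-analyst persona.
resource "aws_iam_role" "analysts" {
  name = "data-analysts"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Principal = { Service = "glue.amazonaws.com" }
      Action    = "sts:AssumeRole"
    }]
  })
}

# Hypothetical Glue database holding the governed tables.
resource "aws_glue_catalog_database" "sales" {
  name = "sales"
}

# Read-only grant on every table in the database, managed through Lake
# Formation so entitlements live in version control instead of the console.
resource "aws_lakeformation_permissions" "analyst_read" {
  principal   = aws_iam_role.analysts.arn
  permissions = ["SELECT", "DESCRIBE"]

  table {
    database_name = aws_glue_catalog_database.sales.name
    wildcard      = true
  }
}

Keeping grants like this in version control means every access change goes through the same pull-request review as any other code.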
Key Responsibilities:
Architect, design, and implement unified data access for personas such as data engineers, data analysts, and data stewards through a simplified access control mechanism that supports governance of both AWS and non-AWS assets.
Design, build, and optimize CI/CD pipelines that let data engineers deploy and promote code across multiple environments.
Develop and maintain infrastructure as code (IaC) with Terraform for reproducible and scalable deployments.
Design & implementation of AWS cloud infrastructure using best practices and industry standards by working closely with internal & external stakeholders like Information Security, Cloud Infrastructure, Data Engineering teams, etc.
Work closely with data governance teams to implement data governance policies and to periodically review and optimize repositories from a security and governance perspective.
Continuously improve the cloud data platform to enhance speed, agility, and cost efficiency.
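As a companion to the best-practices item above, the following is a minimal Terraform sketch of a hardened data-lake bucket with default encryption and public access blocked. The bucket name is illustrative, and the sketch assumes the provider configuration from the earlier example.

# Hypothetical data-lake bucket; the name is a placeholder.
resource "aws_s3_bucket" "lake" {
  bucket = "example-data-lake"
}

# Encrypt all objects at rest with KMS by default.
resource "aws_s3_bucket_server_side_encryption_configuration" "lake" {
  bucket = aws_s3_bucket.lake.id

  rule {
    apply_server_side_encryption_by_default {
      sse_algorithm = "aws:kms"
    }
  }
}

# Block every form of public access to the bucket.
resource "aws_s3_bucket_public_access_block" "lake" {
  bucket                  = aws_s3_bucket.lake.id
  block_public_acls       = true
  block_public_policy     = true
  ignore_public_acls      = true
  restrict_public_buckets = true
}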
Education:
Any graduate.