Description

10+ years of experience in data-related areas, including Cloud (GCP/AWS/Azure)
Candidate should be hands-on with GCP
Migrate on-premises Hadoop data lake workloads to GCP using BigQuery, Cloud Dataproc, Cloud SQL, etc.
Experience with scripting (Shell, Python, etc.) and a good understanding of Unix/Linux
Knowledge of GCP products such as GKE, Cloud SQL, Spanner, Cloud Functions, etc.
Expertise in build automation and continuous integration/delivery ecosystem: GitLab, Maven, Jenkins
Strong problem-solving, domain technical, and analytical skills; advanced task estimation and planning skills
Must be a team player with great interpersonal and communication skills; time-management skills are critical.

Skills:
Primary Skills: Terraform, Kafka, Kubernetes
Certification: Google Cloud Professional Cloud Engineer certified
Nice-to-Have Skills: Azure, multi-cloud, CI/CD DevOps
Cloud Platforms: Proficiency in working with cloud platforms such as GCP, Azure, or AWS. GCP preferred.
Programming: Strong programming and/or scripting skills in languages such as Python, shell scripting, etc.
Infrastructure as Code: Experienced in building infrastructure using Terraform for API and data pipeline hosting (RESTful APIs, Kafka pipelines)
CI/CD: Experienced in building CI/CD pipelines using GitHub Actions and/or Jenkins
Networking: Understanding of network concepts, including virtual networks, subnets, load balancers, security groups, firewalls, etc.
Containers and Orchestration: Solid understanding of Docker and Kubernetes
Security: Good understanding of identity and access management, encryption, and data security
Monitoring, Logging, and Telemetry: Experience with monitoring tools such as Prometheus and Grafana; experienced with log aggregation tools
Database Management: Proficiency in managing databases such as MongoDB

Education

Bachelor's degree