Description

Description: Our client is currently seeking a Big Data Engineer - IV

 

JOB TITLE: GCP Senior Big Data Engineer

We are looking for a Senior Big Data Engineer/Architect on Google Cloud Platform to help strategize, architect and implement solutions to migrate data hosted on our on-prem platform to Google Cloud Platform (GCP).

 

The architect will design and implement enterprise infrastructure and platforms required for setting up data engineering pipelines utilizing the tools available on the GCP Platform.

 

As a GCP Platform Architect, you will work on advanced data engineering products using Google big data technologies such as GCS, Dataproc, Airflow, Datastore and BigQuery. The role requires very strong leadership and communication skills, exhibiting the right negotiating posture with customer and program teams to make the right decisions.

 

Experience leading one or more of the following areas of a cloud transformation journey: strategy, design, and application migration planning and implementation for private and public clouds.

 

- Cloud foundation design and build/implementation
- Cloud transformation and migration
- Cloud managed services (IaaS and PaaS)

 

MUST HAVE SKILLS (Most Important):

- Google Cloud Certified Professional Cloud Architect certification
- Bachelor's degree with 3-5 years' experience on Google Cloud, with deep understanding, design and development experience with GCP products across Infrastructure, Data Management, Application Development, Smart Analytics, Artificial Intelligence, Security and DevOps
- Extract, Transform and Load (ETL) and big data tools: BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Pub/Sub, Cloud Composer, Google Data Studio, Google Cloud Storage
- NoSQL databases: Cloud Bigtable, Cloud Firestore, Firebase Realtime Database, Cloud Memorystore
- Search technologies: Lucene and Elasticsearch
- Relational databases: Cloud Spanner, Cloud SQL

 

DESIRED SKILLS:

- Strong knowledge of Google Cloud Storage data lifecycle management
- Strong knowledge of BigQuery slots management
- Cost optimization for Dataproc workload management
- Experience designing, building, and deploying production-level data pipelines using tools from the Hadoop stack (HDFS, Hive, Spark, HBase, Kafka, NiFi, Oozie, Splunk, etc.)
- Development and deployment technologies (e.g. JIRA, GitHub, Jenkins, Nexus, Artifactory)
- Software development background with a solid understanding of and experience in the software development life cycle (SDLC), DevOps, and CI/CD
- At least 2 years' experience architecting in enterprises using Agile methodologies
- Experience with data visualization tools such as Kibana, Grafana and Tableau, and their associated architectures

 

JOB DUTIES:

- Provide subject matter expertise in cloud and hybrid-cloud computing with Google Cloud and related products, becoming a trusted advisor to influential decision makers
- Provide end-to-end technical guidance and expertise on how to effectively use Google Cloud to build solutions, creatively applying cloud infrastructure and platform services to help solve business problems, and communicating these approaches to different business users
- Design and implement Google solution architectures with products such as Google App Engine, BigQuery, Kubernetes Engine and AutoML; assess architecture needs for projects; work with development leads and managers to scope and craft proposals
- Harvest best practices and document lessons learned as part of continuous improvement, and aid in company-wide data governance
- Periodically update senior management on project status, with excellent written and verbal communication skills

 

EDUCATION/CERTIFICATIONS: Bachelor's degree with 3-5+ years of experience on Google Cloud, with deep understanding, design and development experience with GCP products across Infrastructure, Data Management, Application Development, Smart Analytics, Artificial Intelligence, Security and DevOps


 

Education

Any Graduate