Description

  • 9+ years of experience as a Data Engineer, with a strong focus on GCP or comparable cloud platforms such as Azure or AWS.
  • Extensive hands-on experience with GCP services (or their Azure/AWS equivalents) and tools such as Terraform, BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, Cloud Build, Airflow, Cloud Composer, Tekton, and Vertex AI.
  • Proficiency in designing and implementing large-scale data solutions using GCP services, ensuring scalability, reliability, and performance.
  • Strong knowledge of data integration, transformation, and processing techniques, leveraging GCP services and tools.
  • Experience with infrastructure automation using Terraform for GCP resource provisioning and management.
  • Solid understanding of CI/CD practices and experience with Tekton and other relevant tools for building data engineering pipelines.
  • In-depth knowledge of data storage and retrieval mechanisms using GCP services such as BigQuery, Bigtable, and Google Cloud Storage.
  • Familiarity with data orchestration and workflow management using GCP services like Dataproc, Cloud Build, and Airflow (a minimal orchestration sketch follows this list).
  • Strong proficiency in big data technologies, including HDFS, Hive, Sqoop, Spark, PySpark, Scala, and Python (a minimal PySpark sketch follows this list).
  • Proven experience in building end-to-end machine learning pipelines and deploying ML models in production.
  • Familiarity with ML frameworks such as TensorFlow, PyTorch, or scikit-learn.
  • Good Python programming skills.
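
For illustration, a minimal sketch of the kind of pipeline work described above: a PySpark job that reads raw files from Google Cloud Storage and writes an aggregate to BigQuery via the spark-bigquery connector. All bucket, dataset, and table names are hypothetical placeholders, and the sketch assumes the connector is available on the cluster (as it is by default on Dataproc).

```python
# Minimal illustrative sketch only; bucket, dataset, and table names
# below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("gcs-to-bigquery-example")
    .getOrCreate()
)

# Read raw event data from a (hypothetical) GCS bucket.
events = (
    spark.read
    .option("header", "true")
    .csv("gs://example-bucket/events/*.csv")
)

# Simple transformation: daily event counts per user.
daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("user_id", "event_date")
    .agg(F.count("*").alias("event_count"))
)

# Write to BigQuery via the spark-bigquery connector
# (assumes the connector is on the cluster, e.g. a Dataproc default).
(
    daily_counts.write
    .format("bigquery")
    .option("table", "example_dataset.daily_event_counts")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save()
)
```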
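
Similarly, a minimal sketch of the workflow orchestration referenced in the list: an Airflow DAG, of the kind run on Cloud Composer, that submits the PySpark job above to a Dataproc cluster. Project, region, cluster, and path names are hypothetical.

```python
# Minimal illustrative sketch only; project, cluster, and GCS paths
# are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

# Dataproc job spec pointing at the (hypothetical) PySpark script.
PYSPARK_JOB = {
    "reference": {"project_id": "example-project"},
    "placement": {"cluster_name": "example-cluster"},
    "pyspark_job": {
        "main_python_file_uri": "gs://example-bucket/jobs/daily_counts.py"
    },
}

with DAG(
    dag_id="daily_event_counts",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit the PySpark job to Dataproc once per day.
    submit_job = DataprocSubmitJobOperator(
        task_id="submit_daily_counts",
        job=PYSPARK_JOB,
        region="us-central1",
        project_id="example-project",
    )
```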