• Proficiency in data technologies, such as relational databases, data warehousing, big data platforms (e.g., Hadoop, Spark), data streaming (e.g., Kafka), and cloud services (e.g., AWS, GCP, Azure).
• Strong programming skills in languages such as Python (NumPy, pandas, PySpark), Java (Core Java, Spark with Java, functional interfaces, lambdas, Java collections), or Scala, with experience in automation and scripting (see the PySpark sketch after this list).
• Experience with containerization and orchestration tools like Docker and Kubernetes is a plus.
• Experience with data governance (Dataplex), data security, and compliance best practices on GCP.
• Experience with data quality assurance and testing on GCP.
• Proficiency with GCP data services (BigQuery, Dataflow, Cloud Data Fusion, Dataproc, Cloud Composer, Pub/Sub, Google Cloud Storage); see the BigQuery sketch after this list.
• Bachelor's degree in Computer Science.
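
By way of illustration, the following is a minimal PySpark sketch of the kind of batch transformation the programming-skills bullet describes. The input file, column names, and output path (orders.csv, order_id, amount, order_date, daily_revenue) are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Start a local session; in production this would run on a cluster
# (e.g., Dataproc).
spark = SparkSession.builder.appName("orders-daily-rollup").getOrCreate()

# Hypothetical input: order events with order_id, amount, order_date columns.
orders = spark.read.csv("orders.csv", header=True, inferSchema=True)

# Aggregate daily revenue and order counts, a typical batch-pipeline step.
daily = (
    orders
    .withColumn("order_date", F.to_date("order_date"))
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("revenue"),
        F.count("order_id").alias("orders"),
    )
)

# Write partitioned Parquet output for downstream consumers.
daily.write.mode("overwrite").partitionBy("order_date").parquet("daily_revenue")

spark.stop()
```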
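Likewise, a minimal sketch of the GCP-side work covered by the data-quality and GCP-services bullets, assuming the google-cloud-bigquery client library and application-default credentials. The project, dataset, table, and column names are invented for illustration.

```python
import datetime

from google.cloud import bigquery

# Requires application-default credentials
# (gcloud auth application-default login). Project name is hypothetical.
client = bigquery.Client(project="my-gcp-project")

# Parameterized query that doubles as a simple data-quality check:
# count rows in the day's partition and look for null join keys.
sql = """
    SELECT
      COUNT(*) AS row_count,
      COUNTIF(customer_id IS NULL) AS null_customer_ids
    FROM `my-gcp-project.sales.orders`
    WHERE order_date = @run_date
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter(
            "run_date", "DATE", datetime.date(2024, 1, 1)
        ),
    ]
)

row = next(iter(client.query(sql, job_config=job_config).result()))

# Fail loudly if the partition is empty or has null keys -- the sort of
# check a Cloud Composer task might run before promoting data downstream.
assert row.row_count > 0, "no rows loaded for run_date"
assert row.null_customer_ids == 0, "null customer_id values found"
```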