Description

If the JD works for you, please reply with your employer details.

•             5+ years of hands-on experience building data pipelines (ETL/ELT) on a cloud platform

•             GCP knowledge strongly preferred; cloud experience is required, though not necessarily on GCP

•             5+ years of hands-on experience building and operationalizing data processing systems

•             Strong Python scripting experience is a key requirement

•             Working knowledge of distributed data processing (Beam, Spark, MapReduce)

Nice to have:

•             2+ years’ experience with NoSQL databases and close familiarity with technologies/languages such as Python/R, Scala, Java, Hive, Spark, Kafka

•             2+ years’ experience working with data platforms (data warehouse, data lake, ODS)

•             2+ years’ experience working with tools to automate CI/CD pipelines (e.g., Jenkins, Git, Control-M)

Will be using the following Google tools/services (for reference only):

•             Python

•             Cloud Dataflow / Dataproc (Apache Beam)

•             Cloud Pub/Sub

•             Cloud Functions

•             Whistle map SDK

•             Google Cloud Healthcare API / FHIR store


 

Education

Any Graduate