Description

This position requires an individual with a strong background in multiple database technologies, who is process-oriented and has knowledge of Big Data and the GCP environment.


Skills Required:

- Experience with GCP-based Big Data deployments (batch/real-time) leveraging BigQuery, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, Cloud Functions, etc.
- Experience with relational SQL databases
- Experience building and optimizing "big data" data pipelines, architectures, and data sets
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience supporting and working with cross-functional teams in an agile mode


Skills Preferred:

Proficient in Python and SQL


Experience Required:

- 10+ years of application development experience
- 3+ years of GCP experience
- 4+ years of experience with large-scale solutioning and operationalization of data warehouses and/or data lakes



Education

Any Graduate