Description

Basic Qualifications:

Minimum 3 years of designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with 3rd-party tools - Spark, Cloud Dataproc, Cloud Dataflow, Apache Beam, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Functions, etc.
Minimum 1 year of hands-on experience analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on GCP using GCP or 3rd-party services
Minimum 1 year of designing and building production data pipelines from ingestion to consumption within a hybrid big data architecture, using Java, Python, Scala, etc.
Minimum 1 year of designing and implementing data engineering, ingestion, and curation functions on GCP using GCP-native services or custom programming
Minimum 1 year of experience performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP
Hands-on GCP experience with a minimum of 1 solution designed and implemented at production scale

Preferred Qualifications:

1 year of hands-on experience architecting and designing data lakes on GCP serving analytics and BI application integrations
Minimum 1 year of experience designing and optimizing data models on GCP using GCP data stores such as BigQuery and Bigtable
Minimum 1 year of experience integrating GCP or 3rd-party KMS and HSM with GCP data services to build secure data solutions
Minimum 1 year of experience introducing and operationalizing self-service data preparation tools (e.g., Trifacta, Paxata) on GCP
Minimum 1 year of architecting and implementing metadata management on GCP
Architecting and implementing data governance and security for data platforms on GCP
Designing operations architecture and conducting performance engineering for large-scale data lakes in a production environment
Craft and lead client design workshops and provide tradeoff analyses and recommendations for building solutions
2+ years of experience writing complex SQL queries, stored procedures, etc.
Google Cloud Platform certification is a plus
Experience with CI/CD pipelines such as Concourse or Jenkins
Experience with AtScale and Airflow (DAGs) is a plus

Education

Bachelor's degree in Computer Science