Description

Job Title: GCP Data Engineer (100% remote)

 

Note: No C2C with employers (Only W2 or 1099)

 

Job Description:

You will be responsible for developing scalable big data pipeline solutions on Google Cloud Platform (GCP).

 

Skills Required:

·       5+ years of experience with the following:

·       Data design, data architecture, and data modeling (both transactional and analytical)

·       Building Big Data pipelines for operational and analytical solutions

·       Running and tuning queries in databases including BigQuery, SQL Server, and Hive

·       Data management, including running queries and compiling data for analytics

·       Developing code in Java (Spring Boot / Apache Beam)

·       2+ years of experience with the following:

·       GCP cloud data implementation projects (Apache Beam, Spring Boot, Dataflow, Airflow, BigQuery, Cloud Storage, Cloud Build, Cloud Run, etc.)

·       Agile methodologies

 

Skills Preferred:

·          Certification: Google Professional Data Engineer

·          7 years of experience in a software engineering role building complex data pipelines for operational and analytical solutions

Education:

Any Graduate