Description

This is a work-from-home opportunity (100% remote).

We have an urgent requirement with our premium client.

Experience: 7+ years

Job Title: Senior Data Engineer Developer 

Location: Remote

Roles and Responsibilities

➔ Collaborate with stakeholders, data architects, and other team members to understand data requirements and translate them into scalable data engineering solutions on GCP.

➔ Design and develop data pipelines, ETL (Extract, Transform, Load) processes, and data integration workflows using GCP services such as Dataflow, Pub/Sub, BigQuery, Cloud Storage, and others (a pipeline sketch follows this list).

➔ Implement data transformation and data cleansing operations to ensure data quality and consistency throughout the data pipelines.

➔ Build and manage data storage systems, including databases, data lakes, and data warehouses, leveraging GCP services like BigQuery, Cloud Storage, and Cloud Spanner (a warehouse load sketch also follows this list).

➔ Optimize data pipelines and data storage systems for performance, scalability, and cost-effectiveness, considering factors such as data volume, velocity, variety, and quality.

➔ Implement data security measures and ensure compliance with data governance and privacy policies.

➔ Monitor, troubleshoot, and optimize data pipelines and data infrastructure to ensure the availability, reliability, and efficiency of data processing.

➔ Collaborate with DevOps teams to design and implement monitoring, logging, and alerting solutions for data pipelines and infrastructure (a logging sketch follows this list).
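
For a concrete (illustrative only) picture of this kind of work, below is a minimal sketch of a streaming pipeline using the Apache Beam Python SDK, the programming model behind Dataflow. The project, topic, table, and field names are placeholders, not the client's actual resources, and the cleansing rules are invented for the example.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def cleanse(record: dict) -> dict:
    # Basic cleansing invented for the example: trim strings, default bad values.
    return {
        "user_id": str(record.get("user_id", "")).strip(),
        "event_type": str(record.get("event_type", "unknown")).strip().lower(),
        "amount": float(record.get("amount") or 0.0),
    }


def run():
    options = PipelineOptions(streaming=True)  # runner/project come from CLI flags
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")  # placeholder topic
            | "ParseJson" >> beam.Map(json.loads)
            | "Cleanse" >> beam.Map(cleanse)
            | "DropEmptyUsers" >> beam.Filter(lambda r: r["user_id"] != "")
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # placeholder table
                schema="user_id:STRING,event_type:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()

A job like this would typically be submitted to the Dataflow runner with the usual --runner, --project, and --region flags.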
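In the same illustrative spirit, a sketch of loading files landed in a Cloud Storage data lake into a BigQuery warehouse table, assuming the google-cloud-bigquery client library; the bucket, table, and schema are placeholders.

from google.cloud import bigquery

client = bigquery.Client()  # project picked up from the environment

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    schema=[
        bigquery.SchemaField("user_id", "STRING"),
        bigquery.SchemaField("event_type", "STRING"),
        bigquery.SchemaField("amount", "FLOAT"),
    ],
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Load CSVs landed in a Cloud Storage data lake into the warehouse table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/landing/events/*.csv",  # placeholder bucket/path
    "my-project.analytics.events",          # placeholder table
    job_config=job_config,
)
load_job.result()  # wait for completion and surface errors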
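And a hedged sketch of the logging side, assuming the standard google-cloud-logging client library; alerting policies would normally be layered on top in Cloud Monitoring, and the function and field names here are invented for illustration.

import logging

import google.cloud.logging  # pip install google-cloud-logging

client = google.cloud.logging.Client()  # uses application default credentials
client.setup_logging()  # route stdlib logging to Cloud Logging

log = logging.getLogger("pipeline.events")


def record_batch_result(batch_id: str, rows_ok: int, rows_dropped: int) -> None:
    # Emit a structured log line that a log-based alert can match on.
    if rows_dropped > 0:
        log.warning("batch=%s dropped=%d ok=%d", batch_id, rows_dropped, rows_ok)
    else:
        log.info("batch=%s ok=%d", batch_id, rows_ok)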

 

Skill Set Requirements

➔ At least 3 years of experience with Google Cloud Platform (especially BigQuery).

➔ Experience with Java, Python, and Google Cloud SDK and API scripting (a scripting sketch follows this list).

➔ Experience with GCP migration activities is an added advantage.

➔ Strong knowledge of Google Cloud Platform (GCP) services and technologies, including but not limited to Dataflow, Pub/Sub, BigQuery, Cloud Storage, containers (Docker), and Cloud Spanner.

➔ Proficiency in programming languages such as Python, SQL, or Java for data processing and scripting tasks.

➔ Experience in designing and building data pipelines and ETL processes using GCP data engineering tools.

➔ Familiarity with data modeling, schema design, and data integration techniques.

➔ Knowledge of data warehousing concepts and experience with data warehouse solutions like BigQuery.

➔ Experience with version control systems, CI/CD pipelines, and infrastructure automation tools like Git, Jenkins, or Terraform.
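
As a small, illustrative example of the "Google Cloud SDK & API scripting" expectation, here is a sketch using the google-cloud-bigquery client library; the dataset and query are placeholders, not the client's actual resources.

from google.cloud import bigquery

client = bigquery.Client()  # project picked up from the environment

# Enumerate datasets the way an operational script might during a migration audit.
for dataset in client.list_datasets():
    print(dataset.dataset_id)

# Run an ad hoc aggregate query and iterate over the result rows.
job = client.query(
    "SELECT event_type, COUNT(*) AS n FROM `my-project.analytics.events` "
    "GROUP BY event_type ORDER BY n DESC LIMIT 10"  # placeholder table
)
for row in job.result():
    print(row.event_type, row.n)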

Soft Skills

➔ Excellent verbal and written communication skills.

➔ Attention to detail.

➔ Strong problem-solving abilities.

➔ The ability to adapt to changing requirements and technologies while ensuring accuracy and precision in data analysis, report design, and development.

Nidhi Tiwari

Technical Recruiter

WhatsApp: 9595741193

nidhi@iitjobs.com

Post your jobs at the world’s 1st & only Global Technology Job Portal, www.iitjobs.com.

Refer a suitable candidate and earn Rs 50,000.

 

 

Education

Any graduate; a Computer Science degree will be preferred.

Salary

INR 1,000,000 to 1,100,000