Description

About the job
NOTE: This position is not remote (candidates must be open to a hybrid schedule or relocation to Temple Terrace, FL – 3 days/week onsite)

MUST HAVE: GCP and Big Data development experience.

Please do not apply if you are not open to the hybrid schedule or do not have the must-have skills.

Job Title: GCP Big Data Engineer

Duration: Up to 30 months (2.5 years), including extensions

Client: A top Fortune 50 company

Job Description

Must have strong hands-on Big Data development experience.

Summary

The Artificial Intelligence and Data team is looking for a Google Cloud Dataflow Engineer (3-4 years of experience) with expert-level experience developing enterprise data applications on Google Cloud Platform.

We are seeking a skilled Google Cloud Dataflow Engineer with a proven track record of leveraging Google Cloud Platform (GCP) tools to design, build, and maintain scalable and reliable data solutions. The ideal candidate will possess deep expertise in GCP tools, especially Dataflow, BigQuery, and Cloud Composer, and will work collaboratively with cross-functional teams to address data-related technical challenges and enhance our data infrastructure.

Key Responsibilities

Design, build, and deploy scalable and robust data pipelines using Google Cloud Dataflow.
Harness the capabilities of BigQuery for data analytics, ensuring optimized performance and cost efficiency.
Manage workflow orchestration and automation using Cloud Composer.
Monitor, troubleshoot, and optimize data pipelines for performance, ensuring data quality and integrity.
Stay updated with GCP's latest features and best practices to ensure the company's data infrastructure remains cutting-edge.
Document data architectures, processes, and data lineage for transparency and maintainability.

Qualifications

3-4 years of experience as a data engineer, with significant exposure to Google Cloud Platform.
Strong expertise in GCP tools, particularly Dataflow, BigQuery, and Cloud Composer.
Strong GCP Dataflow, Java/Python, and Apache Beam skills; experience leading delivery streams.
Experience working with streaming/messaging systems such as Kafka, Pulsar, GCP Pub/Sub, and RabbitMQ, including connectors for systems like Cassandra.
Familiarity with other GCP services such as Cloud Storage and Dataproc.
Strong analytical and problem-solving skills.
Familiarity with other cloud platforms (e.g., AWS, Azure) is a plus.
Excellent communication skills, both written and verbal.
Bachelor's degree in Computer Science, Engineering, or a related field.

Additional Requirements

Demonstrated ability to work in a fast-paced environment, managing multiple projects simultaneously.
Commitment to continuous learning and adapting to the rapidly evolving data landscape.
Proven ability to collaborate effectively with both technical and non-technical stakeholders.

Education

Any Graduate