Description

Position Summary:
Seeking a skilled Google Cloud Data Flow Engineer with a proven track record in leveraging Google Cloud Platform (GCP) tools to design, build, and maintain scalable and reliable data solutions. The ideal candidate will possess deep expertise in GCP tools, especially Dataflow, BigQuery, and Cloud Composer, and will work collaboratively with cross-functional teams to address data-related technical challenges and enhance our data infrastructure.

Key Responsibilities:
1. Design, build, and deploy scalable and robust data pipelines using Google Cloud Dataflow.
2. Harness the capabilities of BigQuery for data analytics, ensuring optimized performance and cost efficiency.
3. Manage workflow orchestration and automation using Cloud Composer.
4. Monitor, troubleshoot, and optimize data pipelines for performance, ensuring data quality and integrity.
5. Stay updated with GCP's latest features and best practices to ensure the company's data infrastructure remains cutting-edge.
6. Document data architectures, processes, and data lineage for transparency and maintainability.

Qualifications:

1. 3-4 years of experience as a data engineer with significant exposure to the Google Cloud Platform.
2. Strong expertise in GCP tools, particularly Dataflow, BigQuery, and Cloud Composer.
3. Strong skills in GCP Dataflow, Java/Python, and Apache Beam; experience leading delivery streams.
4. Experience working with streaming/messaging systems such as Kafka, Pulsar, GCP Pub/Sub, and RabbitMQ, including connectors for systems like Cassandra.
5. Familiarity with other GCP services such as Cloud Storage and Dataproc.
6. Strong analytical and problem-solving skills.
7. Familiarity with other cloud platforms (e.g., AWS, Azure) is a plus.
8. Excellent communication skills, both written and verbal.
9. Bachelor's degree in Computer Science, Engineering, or a related field.

Education

Bachelor's degree in Computer Science, Engineering, or a related field