Key Responsibilities:
1. Design, build, and deploy scalable and robust data pipelines using Google Cloud Dataflow.
2. Harness the capabilities of BigQuery for data analytics, ensuring optimized performance and cost efficiency.
3. Manage workflow orchestration and automation using Cloud Composer.
4. Monitor, troubleshoot, and optimize data pipelines for performance, ensuring data quality and integrity.
5. Stay updated with GCP's latest features and best practices to ensure the company's data infrastructure remains cutting-edge.
6. Document data architectures, processes, and data lineage for transparency and maintainability.
Qualifications:
1. 3-4 years of experience as a data engineer with significant exposure to the Google Cloud Platform.
2. Strong expertise in GCP tools, particularly Dataflow, BigQuery, and Cloud Composer.
3. Strong skills in GCP Dataflow, Apache Beam, and Java/Python; experience leading delivery streams.
4. Experience working with streaming/messaging systems such as Kafka, Pulsar, GCP Pub/Sub, and RabbitMQ, including connectors for systems like Cassandra.
5. Familiarity with other GCP services such as Cloud Storage and Dataproc.
6. Strong analytical and problem-solving skills.
7. Familiarity with other cloud platforms (e.g., AWS, Azure) is a plus.
8. Excellent communication skills, both written and verbal.
9. Bachelor's degree in Computer Science, Engineering, or a related field.
Additional Requirements:
1. Demonstrated ability to work in a fast-paced environment, managing multiple projects simultaneously.
2. Commitment to continuous learning and adapting to the rapidly evolving data landscape.
3. Proven ability to collaborate effectively with both technical and non-technical stakeholders.
Any Graduate