Description

Skills Required: 

8+ years of experience in data engineering or data architecture, with at least 2 years in a GCP environment.

Data Integration: Develop and implement ETL/ELT processes to ingest data from various sources into BigQuery, ensuring data quality and integrity.

GCP Expertise: Strong knowledge of Google Cloud services, especially BigQuery, Dataflow, Cloud Storage, and Pub/Sub.

SQL Proficiency: Proficient in SQL, with experience optimizing complex queries in BigQuery.

Data Modeling: Experience in data modeling techniques and best practices for data warehousing.

IRIS: 2+ years of hands-on experience designing and delivering IRIS integrations.

Programming Skills: Proficiency in programming languages such as Python, Java, or Scala for data manipulation and automation.

Soft Skills

To oversee quality assurance processes, ensuring adherence to coding standards and implementation of best practices, and to perform value-creation and knowledge management (KM) activities.

To ensure process improvement and compliance, participate in technical design discussions, and review technical documents.

Responsible for shaping the overall project strategy: working closely with stakeholders to define project scope, objectives, and deliverables, and tracking the schedule to ensure on-time delivery to the defined quality standards.

To work closely with the development team and on-site engineers to understand technical requirements and to address and resolve technical issues.

Identify and flag potential risks and issues that may impact project timelines or quality, develop mitigation strategies and contingency plans to address them, and provide regular project updates to key stakeholders.

Education

Any Graduate