Description


Responsibilities

Partner with product owners, document business requirements, and implement a robust data platform.

Produce solution designs, interface designs, high-level data flow diagrams, data standards, and naming conventions, and evaluate the consistency and integrity of the model and repository.

Design, develop, and implement scalable and efficient data pipelines, ETL processes, and workflows to collect, process, and store data from various sources.

Build and maintain data warehouses, data lakes, and other data storage systems to facilitate data accessibility, reliability, and performance.

Develop and optimize database schemas, data models, and queries to support efficient data retrieval and analysis.

Implement data quality assurance processes, including data validation, data cleansing, and error handling to ensure data accuracy and integrity.

Monitor and troubleshoot data pipelines and systems to identify and resolve issues, ensuring high availability and performance.

Document data engineering processes, data flows, and system architectures to ensure knowledge sharing and maintain system documentation.

Skillset Needed:

More than 8 years of relevant experience with data engineering tools such as Spark, Hadoop, Azure Data Pipelines, Azure Data Factory, Azure Data Lake Storage, Databricks, Delta Lake, Python, and SQL Server, plus experience with other cloud providers (AWS, GCP); experience in the insurance domain is preferred.

Strong knowledge of and experience designing and implementing solutions on databases such as Hadoop, SQL Server, Oracle, and Snowflake.
Demonstrated experience building and tuning data pipelines on Spark, cloud-native tools such as EMR and Azure Data Factory, and ETL tools such as Informatica and Ab Initio.
Understanding of both batch and streaming ingestion of data from source databases such as SQL Server, Oracle, MySQL, and other RDBMSs.
Ability to demonstrate and discuss a wide variety of data engineering tools and architectures across cloud providers, as well as open-source tools and packages.
Experience working with business and development teams to produce designs that deliver data solutions following best practices and standards.

Education

Any Graduate