Description

Bachelor's or Master's degree in IT, Computer Science, Software Engineering, or a related field
7 years of Data Warehousing experience, including 3 years of hands-on experience implementing Snowflake concepts (see the sketch below)
Experience following best practices for code versioning through GitHub, and an understanding of CI/CD
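
For illustration, a minimal Snowflake loading sketch of the kind of hands-on work this role involves; the stage, bucket, table, and column names are hypothetical and not part of any actual environment.

    -- Minimal Snowflake load sketch (illustrative only; all object names are hypothetical)
    CREATE OR REPLACE STAGE policy_stage
      URL = 's3://example-bucket/policy/'            -- hypothetical external location
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    CREATE OR REPLACE TABLE stg_policy (
      policy_id       NUMBER,
      effective_date  DATE,
      written_premium NUMBER(18,2)
    );

    -- Bulk load from the stage into the staging table
    COPY INTO stg_policy
    FROM @policy_stage
    ON_ERROR = 'CONTINUE';                           -- skip bad rows; review them later via load history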

Strong SQL and PL/SQL skills; experience with or understanding of Unix shell and Python scripting

Good to have: StreamSets knowledge and Data Vault knowledge

Able to communicate well with clients and understand their requirements

Strong P&C (Property & Casualty) Insurance knowledge, especially in the Commercial Property, General Liability, Inland Marine, Product Liability, Auto, and Umbrella business units

Proficiency and prior experience with ELT/ETL tools is a must
Participate in design discussions with the data modeler and architecture team
Participate in data modeling sessions and develop data mapping specifications for the relevant processes
Prepare high-level and detailed design documents for the necessary processes
Design and develop the system components for the extraction, transformation, or conversion of data from source systems to the target application(s)
Analyze system capabilities and requirements by storing and manipulating data inputs
Develop audit, balancing, reconciliation, and error-handling processes using extract development tools or custom-developed procedures (see the reconciliation sketch after this list)
Identify components that require performance optimization and provide inputs for tuning the relevant jobs
Analyze user needs and system requirements to determine the feasibility of a design within the available constraints
Design and document jobs for the required scheduling tools
Perform internal design review and code review
Design and document data flows and explain them to internal and external teams as needed
Interact with clients regularly and meet with them to present work
Hands-on experience with or basic knowledge of the Agile Scrum methodology
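
As a hedged illustration of the audit, balancing, and reconciliation work referenced above, the sketch below compares source and target row counts and logs the result; the audit log, source, and target table names are hypothetical (the target reuses the stg_policy table from the earlier sketch).

    -- Reconciliation sketch (illustrative only; table names are hypothetical)
    CREATE TABLE IF NOT EXISTS audit_recon_log (
      run_ts      TIMESTAMP,
      entity_name STRING,
      src_count   NUMBER,
      tgt_count   NUMBER,
      status      STRING
    );

    -- Record whether source and target row counts balance for this run
    INSERT INTO audit_recon_log (run_ts, entity_name, src_count, tgt_count, status)
    SELECT CURRENT_TIMESTAMP,
           'policy',
           s.src_count,
           t.tgt_count,
           CASE WHEN s.src_count = t.tgt_count THEN 'BALANCED' ELSE 'OUT_OF_BALANCE' END
    FROM  (SELECT COUNT(*) AS src_count FROM src_policy) s
    CROSS JOIN
          (SELECT COUNT(*) AS tgt_count FROM stg_policy) t;

In practice this kind of check would be extended with column-level balancing (for example, summing premium amounts) and an error-handling path that flags out-of-balance runs for review.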

Education

Bachelor's or Master's degree in IT, Computer Science, or Software Engineering