Role Description:
- Experience in data warehousing, AWS Cloud, Business Intelligence, Informatica, and Databricks (data flow design, development, enhancement, and maintenance).
- Real-time data ingestion using StreamSets.
- Experience with the UDM model.
- Experience in dimensional modelling, data migration, data cleansing, data conversion and bridging, and ETL techniques across multiple databases and operating systems.
- Experience in Python, Databricks, Perl, Spark, Kubernetes, Docker, and other cloud-native tools and technologies.
- Data Lakehouse development across the Ingestion, Harmonization, and Curation layers (a minimal sketch follows this list).
- Experience in developing an application framework to ingest third-party data.
- P&C insurance domain knowledge.
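To illustrate the layered Lakehouse item above: a minimal PySpark sketch, assuming a Databricks-style environment where the ingestion, harmonization, and curation databases already exist; every bucket, table, and column name here is hypothetical, not taken from this posting.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("lakehouse-sketch").getOrCreate()

    # Ingestion layer: land third-party files as-is (hypothetical S3 path).
    raw = spark.read.json("s3://example-bucket/raw/policies/")
    raw.write.mode("append").saveAsTable("ingestion.policies_raw")

    # Harmonization layer: standardize column names and types.
    harmonized = (
        raw.withColumnRenamed("polNum", "policy_number")
           .withColumn("effective_date", F.to_date("effective_date"))
    )
    harmonized.write.mode("overwrite").saveAsTable("harmonization.policies")

    # Curation layer: business-ready latest effective date per policy.
    curated = harmonized.groupBy("policy_number").agg(
        F.max("effective_date").alias("latest_effective_date")
    )
    curated.write.mode("overwrite").saveAsTable("curation.policy_latest")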
Competencies: Python, Digital: Databricks
Essential Skills:
- Data warehousing, AWS Cloud, Business Intelligence, Informatica, and Databricks (data flow design, development, enhancement, and maintenance).
- Real-time data ingestion using StreamSets.
- Experience with the UDM model.
- Experience in dimensional modelling, data migration, data cleansing, data conversion and bridging, and ETL techniques across multiple databases and operating systems.
- Python, Databricks, Perl, Spark, Kubernetes, Docker, and other cloud-native tools and technologies.
- Implement data integration mappings using transformations such as Joiner, Lookup, Aggregator, Sorter, XML Parser, Router, Expression, and Sequence Generator.
- Convert Linux-based DIF Framework Perl scripts to Windows-based code for use in the Pet Windows-based environment.
- NW application experience.
- Experience creating stored procedures in Snowflake to load data from AWS S3 buckets into Snowflake via Snowpipe (see the sketch after this list).
- Experience in UNIX, Linux, shell scripting, SQL Server, stored procedures, and AutoSys JILs.
- Experience in resource optimization and performance tuning.
- Experience with CI/CD tools such as Concourse, CircleCI, Azure DevOps, Bitbucket, Bamboo, XL Release, Git, and GitHub.
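For the Snowpipe item above: a minimal one-time setup sketch using the snowflake-connector-python package; every identifier (account, stage, pipe, table, bucket) and all credentials are placeholders, not values from this posting.

    import snowflake.connector

    # Placeholder connection parameters; substitute real account details.
    conn = snowflake.connector.connect(
        account="<account>",
        user="<user>",
        password="<password>",
        warehouse="<warehouse>",
        database="<database>",
        schema="<schema>",
    )
    cur = conn.cursor()

    # External stage over the S3 bucket holding the source files.
    cur.execute("""
        CREATE STAGE IF NOT EXISTS claims_stage
          URL = 's3://<bucket>/claims/'
          CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>')
          FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    """)

    # Snowpipe that auto-ingests new stage files into the target table.
    cur.execute("""
        CREATE PIPE IF NOT EXISTS claims_pipe AUTO_INGEST = TRUE AS
          COPY INTO claims FROM @claims_stage
    """)

    cur.close()
    conn.close()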
Desirable Skills: P&C insurance domain knowledge
Keywords: Databricks
Any Graduate