·12+ years of relevant experience delivering data solutions across a variety of data warehousing, big data, and cloud data platforms.
·Experience implementing Slowly Changing Dimension (SCD) techniques (Type 1, Type 2, and Type 3); an illustrative Type 2 sketch follows this list.
·Experience working with distributed data technologies (e.g., Spark, Kafka) to build efficient, large-scale big data pipelines.
·Strong software engineering experience, with proficiency in at least one of Python, Scala, or Spark (or an equivalent language/framework).
·Experience working with AWS S3 and the Snowflake cloud data warehouse.
·Experience transforming and integrating data in Redshift/Snowflake.
·Experience handling large, complex data sets such as JSON, ORC, Parquet, and CSV files from sources such as AWS S3 (see the S3 staging sketch after this list).
·Good exposure to Snowflake cloud architecture, SnowSQL, and Snowpipe for continuous data ingestion (see the Snowpipe sketch below).
·Hands-on experience bulk loading data into and unloading data from Snowflake tables (see the COPY INTO sketch below).
·Experience writing complex SQL scripts using statistical aggregate and analytical (window) functions to support ETL in the Snowflake cloud data warehouse (see the window-function sketch below).
·Bachelor's degree
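
The sketches below illustrate, in minimal SQL, several of the techniques referenced in the list above; every object name they use (dim_customer, stg_customer, raw_stage, events_pipe, and so on) is hypothetical and stands in for real project objects. First, a sketch of an SCD Type 2 load, assuming a dim_customer dimension tracked with effective_date, end_date, and is_current columns and fed from a stg_customer staging table:

    -- Hypothetical names; SCD Type 2 pattern: expire changed rows, then insert new versions.
    -- Step 1: close out current dimension rows whose tracked attributes changed in staging.
    UPDATE dim_customer
    SET    end_date = CURRENT_DATE, is_current = FALSE
    FROM   stg_customer s
    WHERE  dim_customer.customer_id = s.customer_id
      AND  dim_customer.is_current = TRUE
      AND  (dim_customer.customer_name <> s.customer_name OR dim_customer.city <> s.city);

    -- Step 2: insert a fresh current version for changed and brand-new customers.
    INSERT INTO dim_customer (customer_id, customer_name, city, effective_date, end_date, is_current)
    SELECT s.customer_id, s.customer_name, s.city, CURRENT_DATE, NULL, TRUE
    FROM   stg_customer s
    LEFT JOIN dim_customer d
           ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHERE  d.customer_id IS NULL;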
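
A sketch of staging and transforming S3 data in Snowflake, assuming Parquet files and a pre-configured storage integration named s3_int; the bucket path, stage, columns, and target table are illustrative only:

    -- Hypothetical names; external stage over an S3 prefix.
    CREATE OR REPLACE STAGE raw_stage
      URL = 's3://my-bucket/raw/events/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = PARQUET);

    -- Query the staged semi-structured files directly, casting fields as needed.
    SELECT $1:event_id::STRING        AS event_id,
           $1:event_ts::TIMESTAMP_NTZ AS event_ts,
           $1:amount::NUMBER(12,2)    AS amount
    FROM   @raw_stage;

    -- Transform and integrate on load into a curated target table.
    COPY INTO curated_events (event_id, event_ts, amount)
    FROM (SELECT $1:event_id::STRING, $1:event_ts::TIMESTAMP_NTZ, $1:amount::NUMBER(12,2)
          FROM @raw_stage);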
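
A sketch of continuous ingestion with Snowpipe, reusing the hypothetical raw_stage; with AUTO_INGEST, S3 event notifications must be routed to the pipe's SQS queue (that cloud-side setup is not shown):

    -- Hypothetical names; landing table with a single VARIANT column for raw records.
    CREATE OR REPLACE TABLE events_raw (record VARIANT);

    -- Snowpipe: automatically COPY new files as they arrive in the external stage.
    CREATE OR REPLACE PIPE events_pipe
      AUTO_INGEST = TRUE
    AS
    COPY INTO events_raw
    FROM @raw_stage;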
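
A sketch of bulk loading and unloading with COPY INTO, assuming a sales_orders table and reusing the hypothetical raw_stage:

    -- Hypothetical names; bulk load staged CSV files into a table, skipping the header row.
    COPY INTO sales_orders
    FROM @raw_stage/orders/
    FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1)
    ON_ERROR = 'ABORT_STATEMENT';

    -- Bulk unload: export a query result back to the stage as compressed CSV files.
    COPY INTO @raw_stage/exports/orders_
    FROM (SELECT * FROM sales_orders WHERE order_date >= '2024-01-01')
    FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
    OVERWRITE = TRUE;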
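
A sketch of analytical (window) and aggregate functions in an ETL-style query, assuming a staging_orders table with a load_ts column used to deduplicate to the latest version of each order:

    -- Hypothetical names; keep only the latest record per order, then compute per-customer analytics.
    SELECT order_id,
           customer_id,
           order_ts,
           amount,
           SUM(amount) OVER (PARTITION BY customer_id ORDER BY order_ts) AS running_amount,
           AVG(amount) OVER (PARTITION BY customer_id)                   AS avg_customer_amount
    FROM (
        SELECT o.*,
               ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY load_ts DESC) AS rn
        FROM   staging_orders o
    ) latest
    WHERE  latest.rn = 1;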