Architecture, design, implementation and operationalization of large-scale data and analytics solutions on Snowflake Cloud Data Warehouse.
Hands-on experience with Snowflake features and utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, the query optimizer, metadata management, data sharing, stored procedures, and Python integration.
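For illustration, a minimal sketch of the Stream-plus-Task pattern referenced above, driven from Python with the snowflake-connector-python library; all object names (RAW_ORDERS, ETL_WH, CURATED.ORDERS, etc.) are hypothetical:

```python
import os
import snowflake.connector

# Hypothetical connection details; in practice these come from a secrets store.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()

# Capture change data on a source table with a stream.
cur.execute("CREATE STREAM IF NOT EXISTS RAW_ORDERS_STREAM ON TABLE RAW_ORDERS")

# Schedule a task that folds captured changes into a curated table
# whenever the stream has data.
cur.execute("""
    CREATE TASK IF NOT EXISTS MERGE_ORDERS_TASK
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      INSERT INTO CURATED.ORDERS
      SELECT * FROM RAW_ORDERS_STREAM WHERE METADATA$ACTION = 'INSERT'
""")
cur.execute("ALTER TASK MERGE_ORDERS_TASK RESUME")

# Time Travel: query the source table as it looked one hour ago.
cur.execute("SELECT COUNT(*) FROM RAW_ORDERS AT(OFFSET => -3600)")
print(cur.fetchone())
```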
Experience in data warehousing: OLTP, OLAP, dimensions, facts, and data modeling.
Working knowledge of MS Azure configuration items as they relate to Snowflake.
Developing EL pipelines into and out of the data warehouse using a combination of Databricks, Python, and SnowSQL.
Developing scripts (UNIX shell, Python, etc.) to extract, load, and transform data.
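A minimal sketch of the extract-and-load work described in the two items above, assuming a local CSV extract and a hypothetical internal stage and ORDERS target table:

```python
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()

# Upload the extract to an internal stage (SnowSQL's PUT command is also
# available through the Python connector).
cur.execute("CREATE STAGE IF NOT EXISTS ORDERS_STAGE")
cur.execute("PUT file:///tmp/orders_extract.csv @ORDERS_STAGE AUTO_COMPRESS=TRUE")

# Load the staged file into the target table.
cur.execute("""
    COPY INTO ORDERS
    FROM @ORDERS_STAGE
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    ON_ERROR = 'ABORT_STATEMENT'
""")
print(cur.fetchall())  # per-file load status returned by COPY INTO
```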
Provide production support for data warehouse issues such as data load failures and transformation/translation problems.
Translate mapping specifications to data transformation design and development strategies and code, incorporating standards and best practices for optimal execution.
Understand data pipelines and modern ways of automating them using cloud-based testing, and clearly document implementations so that others can easily understand the requirements, implementation, and test conditions.
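A minimal sketch of automatable pipeline testing with pytest; the normalize_order transform is a hypothetical example, and in practice such tests would run in CI against a disposable schema:

```python
import pytest

def normalize_order(row: dict) -> dict:
    """Example transform under test: trims fields and casts the amount."""
    return {
        "order_id": row["order_id"].strip(),
        "amount": round(float(row["amount"]), 2),
    }

def test_normalize_order_trims_and_casts():
    raw = {"order_id": "  A-100 ", "amount": "19.999"}
    assert normalize_order(raw) == {"order_id": "A-100", "amount": 20.0}

def test_normalize_order_rejects_missing_amount():
    with pytest.raises(KeyError):
        normalize_order({"order_id": "A-101"})
```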
Perform code reviews to ensure fit to requirements, optimal execution patterns and adherence to established standards.
Establishing and monitoring Operational Level Agreements (OLAs) for the health, performance, and cost of the warehouse environment (loads, queries, data quality).
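A minimal monitoring sketch against Snowflake's ACCOUNT_USAGE views; the five-minute runtime threshold and 24-hour window are example OLA parameters, not prescribed values:

```python
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)
cur = conn.cursor()

# Queries from the last 24 hours that ran longer than 5 minutes.
cur.execute("""
    SELECT query_id, warehouse_name, total_elapsed_time / 1000 AS seconds
    FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
    WHERE start_time > DATEADD('hour', -24, CURRENT_TIMESTAMP())
      AND total_elapsed_time > 5 * 60 * 1000
    ORDER BY total_elapsed_time DESC
""")
for query_id, warehouse, seconds in cur.fetchall():
    print(f"OLA breach: {query_id} on {warehouse} ran {seconds:.0f}s")

# Credits consumed per warehouse over the same window (the cost dimension).
cur.execute("""
    SELECT warehouse_name, SUM(credits_used) AS credits
    FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
    WHERE start_time > DATEADD('hour', -24, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
""")
print(cur.fetchall())
```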
Basic Qualifications:
Minimum 3 years of designing and implementing operational, production-grade, large-scale data solutions on Snowflake Data Warehouse on MS Azure.
Including hands-on experience with productionized data ingestion and processing pipelines using Python, Databricks, and SnowSQL.
Excellent understanding of Snowflake internals and of integrating Snowflake with other data processing and reporting technologies.
Excellent presentation and communication skills, both written and verbal, and the ability to problem-solve and design in an environment with unclear requirements.
Ability to lead and drive the performance of a team of developers, both local and offshore.
Preferred Skills:
Bachelor’s degree in Computer Science, Engineering, or a related technical science.
3 years of technical architecture and build experience with large-scale data warehouse solutions, including code-optimization expertise.
Experience building data ingestion pipelines using Python and Databricks on MS Azure.
3 years’ experience in the finance/banking industry, with some understanding of securities and banking products and their data footprints.