Responsibilities:
• Design and develop large-scale data solutions in Python, building robust data pipelines and dynamic systems.
• Design and deploy ETL pipelines using Python and Snowflake (a minimal sketch follows this list).
• Write and optimize Python and SnowSQL scripts.
• Support teams in deploying tools and configuring environments for development, testing, and production.
• Collaborate with client teams to develop and maintain long-term relationships with key stakeholders.
• Engage and consult closely with business teams to understand current and future needs.
• Work with the client team to establish design patterns and development standards; conduct code reviews and oversee unit testing.
• Coordinate between onshore and offshore teams on project planning, code reviews, QA, and deployments.
• Brainstorm with the development team on optimizing existing data flows, quality and performance tuning, and building proofs of concept.
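For context, the sketch below illustrates the kind of Python-to-Snowflake ETL pipeline this role involves, assuming the snowflake-connector-python package. The connection parameters, CSV layout, and staging_orders table are hypothetical placeholders, not details taken from this posting.

```python
# Minimal ETL sketch (hypothetical): extract rows from a CSV file, apply a light
# transform, and load them into a Snowflake staging table via snowflake-connector-python.
# All connection parameters, table names, and column names are placeholders.
import csv
import snowflake.connector

def load_orders(csv_path: str) -> None:
    conn = snowflake.connector.connect(
        user="<user>",
        password="<password>",
        account="<account_identifier>",
        warehouse="<warehouse>",
        database="<database>",
        schema="<schema>",
    )
    try:
        # Extract + transform: normalize the order id and cast the amount.
        with open(csv_path, newline="") as f:
            rows = [
                (row["order_id"].strip(), float(row["amount"]))
                for row in csv.DictReader(f)
            ]
        cur = conn.cursor()
        cur.execute(
            "CREATE TABLE IF NOT EXISTS staging_orders (order_id STRING, amount FLOAT)"
        )
        # Load: bulk-insert the transformed rows into the staging table.
        cur.executemany(
            "INSERT INTO staging_orders (order_id, amount) VALUES (%s, %s)", rows
        )
    finally:
        conn.close()
```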
Minimum qualifications:
• Graduate or post-graduate degree in Computer Science or a related information technology discipline
• 4+ years of relevant experience with SnowSQL and Python scripting
Preferred skills:
• Subject matter expertise in Python and its libraries, ETL pipelines, and web services.
• Expert-level programming experience in Python.
• Expertise in Snowflake and SnowSQL.
• Expert-level shell scripting.
• Experience processing large amounts of structured and unstructured data, including integrating data from multiple sources.
• Experience handling the entire software lifecycle: requirements gathering, project planning, and status reporting to stakeholders.
• End-to-end project ownership: gathering requirements, designing, and implementing effective pipelines.