Description

Key Requirements and Technology Experience:

Strong SQL and Python skills.
Experience with Tableau, DBT, and Airflow is a plus.
Design, develop, and maintain scaled ETL processes to deliver meaningful insights from large and complex data sets.
Play a key role in building out a semantic layer through the development of ETLs and virtualized views.
Collaborate with Engineering teams to discover and leverage new data being introduced into the environment.
Work as part of a team to build out and support the data warehouse, implementing solutions in Python to process structured and unstructured data.
Support existing ETL processes written in SQL; troubleshoot and resolve production issues.
Hands-on experience with Apache Airflow or equivalent tools (e.g., AWS MWAA) for orchestrating data pipelines (see the sketch after this list).
Create and maintain report specifications and process documentation as part of the required data deliverables.
Serve as a liaison between business and technical teams to achieve project objectives, delivering cross-functional reporting solutions.
Troubleshoot and resolve data, system, and performance issues.
Communicate with business partners, other technical teams, and management to collect requirements, articulate data deliverables, and provide technical designs.
Ability to multitask and prioritize an evolving workload in a fast-paced environment. 
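For illustration only, pipeline orchestration of the kind described above might look like the minimal Apache Airflow sketch below (assuming Airflow 2.4+ with the TaskFlow API); the DAG id, schedule, and task logic are hypothetical placeholders, not details of this role's actual environment.

# Minimal Apache Airflow DAG sketch: a daily extract -> transform -> load pipeline.
# All names (DAG id, tasks, sample data) are illustrative placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    dag_id="example_daily_etl",      # hypothetical DAG id
    schedule="@daily",               # run once per day
    start_date=datetime(2024, 1, 1),
    catchup=False,
)
def example_daily_etl():
    @task
    def extract() -> list[dict]:
        # Placeholder extract step; a real pipeline would pull from a source system.
        return [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 25.5}]

    @task
    def transform(rows: list[dict]) -> float:
        # Simple aggregation standing in for heavier SQL/Python transformations.
        return sum(row["amount"] for row in rows)

    @task
    def load(total: float) -> None:
        # Placeholder load step; a real pipeline would write to the warehouse.
        print(f"Daily total loaded: {total}")

    load(transform(extract()))


example_daily_etl()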

Education

Any Graduate