Developing, debugging, assessing, and monitoring the quality of ETL logic and outputs.
Investigating data anomalies, reporting upstream issues, and uncovering areas for improvement.
Leading ongoing reviews of requirements and roadmaps, and developing optimization strategies.
Working within the AWS environment to manage jobs, run scripts, test code changes, etc.
Creating and maintaining documentation on reporting requirements, metric definitions, etc.
Effectively communicating findings to teammates and stakeholders.
Gathering feedback and action items from meetings and translating them into tickets / team notes.
Ensuring data integrity is maintained throughout the release of new product features by testing and monitoring data alongside test groups both pre- and post-launch.
Managing multiple projects, developing plans, and monitoring performance.
Updating, implementing, and maintaining procedures.
Monitoring deliverables and ensuring timely completion of projects.
Querying, combining, and analyzing data in Hive and Athena to answer business questions (see the illustrative query sketch after this list).
Supporting the development, validation, and management of dashboards and insight solutions.
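For illustration only, a minimal sketch of the kind of Athena-style (Presto SQL) query this work involves; the schema, table, and column names (analytics.orders, order_date, region, revenue, customer_id) are hypothetical, not part of any actual environment:

  -- Hypothetical business question: weekly revenue and unique customers
  -- by region over the last 90 days. All object names are illustrative.
  SELECT
      date_trunc('week', order_date)  AS order_week,
      region,
      SUM(revenue)                    AS total_revenue,
      COUNT(DISTINCT customer_id)     AS unique_customers
  FROM analytics.orders
  WHERE order_date >= date_add('day', -90, current_date)
  GROUP BY 1, 2
  ORDER BY 1, 2;

A result set like this could feed an ad hoc answer for stakeholders or a dashboard, as described above.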
Must Haves:
Must have experience developing ETL packages using SQL; AWS Redshift experience is a nice-to-have.
Strong SQL experience: able to take a business question and answer it by querying the data.
Must have experience writing stored procedures to automate reporting (an illustrative sketch follows this list).
Experience acting as a thought leader on how to solve business problems and answer questions using data (this person will be building the framework AND doing a lot of investigation).
Someone who is technical and can solve complex issues by creating simple solutions.
Familiarity with front-end BI tools (Tableau, Power BI, Qlik, etc.).
Experience working with Tableau or any other reporting tool would be a plus.
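As a rough, non-authoritative illustration of the stored-procedure requirement above, here is a minimal Amazon Redshift (PL/pgSQL) sketch that refreshes a hypothetical daily reporting table; the schema, table, and column names (reporting.daily_sales, staging.orders, report_date, region, revenue) are assumptions:

  CREATE OR REPLACE PROCEDURE reporting.refresh_daily_sales()
  AS $$
  BEGIN
      -- Rebuild yesterday's rows in the reporting table.
      -- All schema, table, and column names are illustrative.
      DELETE FROM reporting.daily_sales
      WHERE report_date = CURRENT_DATE - 1;

      INSERT INTO reporting.daily_sales (report_date, region, total_revenue)
      SELECT CURRENT_DATE - 1, region, SUM(revenue)
      FROM staging.orders
      WHERE order_date = CURRENT_DATE - 1
      GROUP BY region;
  END;
  $$ LANGUAGE plpgsql;

A procedure like this could then be run on a schedule (e.g. CALL reporting.refresh_daily_sales();) so the report refreshes without manual steps.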