Job Description:
- The Data Analyst will be a key part of the Enterprise Data Services team, which is responsible for transforming data from disparate systems to provide insights and analytics for business stakeholders.
- Specifically, this role will support a multi-year project to migrate legacy data jobs and reports owned by business areas across the company to cloud-based technology solutions and modern data tools.
- You will collaborate with Data Engineers, Data Analysts, DBAs, cross-functional teams, and business teams.
- You will analyze, classify, and migrate or redesign data jobs/reports into modern data tools, following Agile methodology, empowering users to make informed business decisions.
- Progress will be documented and validated against current report outputs.
- You are self-motivated, work independently, and have direct experience with data tools, data analysis, data engineering, or data consulting.
- You have a deep understanding of the full data lifecycle and the role that high-quality data plays across applications, business analytics, and reporting.
- Strong candidates will exhibit solid critical thinking skills, the ability to resolve technical problems, and a talent for transforming data into solutions that meet business requirements.
Requirements:
- Bachelor of Science degree in Computer Science or equivalent
- 5+ years of post-degree professional experience
- Required: experience in SQL and other query languages, to convert legacy jobs into modern data tools
- Strongly preferred: data analysis experience creating, publishing, and/or managing reports and data visualizations in Tableau, Power BI, Amazon QuickSight, or Amazon DataZone (i.e., our target solutions)
- Highly desirable: experience with performance tuning of Tableau dashboards
- Highly desirable: experience working with the TOAD data tool
- Demonstrated experience building and preparing data for analytics
- Strong knowledge of database technologies and data development languages such as Python, PL/SQL, etc.
- Understanding of how to build and modify data queries/applications, including performance tuning and the use of indexes and materialized views to improve query performance
- Ability to identify the business rules needed for extracting data, along with functional or technical risks related to data sources (e.g., data latency, frequency, etc.)
- Basic understanding of creating test cases for profiling data, validating analysis, testing assumptions, driving data quality assessment specifications, and defining a path to deployment
- Comprehension of best practices for data ingestion and data design
Nice to Have:
- Cloud experience with AWS, Azure, or Snowflake, with any relevant certifications
About the Practice/Project: