Description

Building integrations with data sources to ingest data into the central data lake using various technologies

Cleansing, joining, preparing, and transforming data from raw sources into models suited for analytics purposes

Leveraging DataOps practices such as automated data testing, automated quality checks, and automated deployment to ensure high quality and shorten time to delivery

Collaborating with members of the BI team and others within the organization to ensure data needs are met

Supporting end users of the data and analytics by responding to tickets and inquiries from business partners when data quality issues occur

Maintaining data governance by documenting data solutions through ERDs, Confluence pages, or external tools

Ensuring our enterprise data is timely and accurate

Requirements

Bachelor’s degree (B.A.) in Information Systems or a related field from a four-year college or university, or an equivalent combination of education and experience

2+ years of relevant experience

Strong SQL skills

Experience with cloud-based data warehouses; BigQuery a plus

Data modeling skills and understanding of analytical data warehousing

Understanding of data exploration, visualization, and BI tools such as Looker, Tableau, and Power BI

Experience with data pipeline and workflow management tools such as Azkaban, Luigi, or Airflow

Experience with data processing using Python

Knowledge of Git

Education

Bachelor's