Description

Our technology stack includes: Airflow, DBT, Python, Snowflake, AWS, GCP, Amplitude, Fivetran, and more.

What You'll Do:

Building and continuously improving our data gathering, modeling, and reporting capabilities, as well as our self-service data platforms

Working closely with Data Engineers, Data Analysts, Data Scientists, Product Owners, and Domain Experts to identify data needs

What You'll Bring:

Relevant Bachelor's degree – preferably CS, Engineering/Information Systems, or an equivalent software engineering background

8+ years of experience as a Data/BI engineer

Strong SQL skills and hands-on experience with SQL and NoSQL databases, including analysis and performance optimization

Hands-on experience in Python or an equivalent programming language

Experience with data warehouse solutions (like BigQuery/Redshift/Snowflake)

Experience with data modeling, data catalog concepts, data formats, and data pipeline/ETL design, implementation, and maintenance

Experience with AWS/GCP cloud services such as S3/GCS, Lambda/Cloud Functions, EMR/Dataproc, Glue/Dataflow, Athena

Experience with Airflow and DBT - Advantage

Experience with data visualization tools and infrastructure (like Tableau/Sisense/Looker/other) - Advantage

Experience with development practices – Agile, CI/CD, TDD - Advantage

Experience with Infrastructure as Code practices (Terraform) - Advantage
