Description

Our technology stack includes: Airflow, DBT, Python, Snowflake, AWS, GCP, Amplitude, Fivetran, and more.

What will you actually be doing?

Building and continuously improving our data gathering, modeling, and reporting capabilities, as well as our self-service data platforms

Working closely with Data Engineers, Data Analysts, Data Scientists, Product Owners, and Domain Experts to identify data needs

Required Experience

Relevant Bachelor's degree – preferably in CS, Engineering/Information Systems, or an equivalent software engineering background

6+ years of experience as a Data/BI engineer

Strong SQL skills and hands-on experience with SQL and NoSQL databases, including analysis and performance optimization - must have

Hands-on experience with Python or an equivalent programming language - must have

Experience with data warehouse solutions (e.g., BigQuery/Redshift/Snowflake) - must have

Experience with data modeling, data catalog concepts, data formats, and data pipeline/ETL design, implementation, and maintenance

Experience with AWS/GCP cloud services such as S3/GCS, Lambda/Cloud Functions, EMR/Dataproc, Glue/Dataflow, Athena - must have

Experience with development practices – Agile, CI/CD, TDD – advantage


Education

Any graduate