Description

Responsibilities:

Design and develop ELT / ETL pipelines to load data from internal sources into an on-premises ODS (MongoDB) and the data lake & data warehouse on Azure.
Create APIs, data structures and schemas to store and retrieve data.
Create and maintain data mappings, data definitions, architecture and data flow diagrams.
Implement solutions to validate and test data quality.
Build proof-of-concepts to determine viability of new processes and technologies.
Deploy and manage code in non-prod and prod environments.
Troubleshoot data related issues and fix defects.
Optimize and performance tune databases and queries.
Participate in code reviews and train other team members.
Document code and create code templates based on industry best practices.
Experience:

Bachelor's degree and/or equivalent experience required.
Minimum 5 years of experience working as a Data Engineer, ETL Engineer, or in a similar role.
Skills:

5+ years of experience in Python. 
3+ years of hands-on experience with Airflow.
3+ years of experience with Spark. 
3+ years of experience working with relational databases: Oracle, SQL Server, PostgreSQL, or similar.
3+ years of experience writing SQL code.
2+ years of experience in columnar databases: Snowflake, Azure Synapse, or similar.
2+ years of Azure / AWS experience.
Pluses (not required): work experience in any of the following:

ETL / ELT: Apache NiFi, Kafka, Luigi, Prefect.
Languages: Java, Scala, R.
Platforms: Databricks, Azure Synapse.
Database: MongoDB.
Data Orchestrators: Control-M, Autosys.

Education

Bachelor's degree in Computer Science.