Description

Job Description Summary

(Data Engineer)

Work with technology and business stakeholders to understand data requirements.

Create low-level design artifacts, including mapping specifications.

Build scalable and reliable data pipelines to support data ingestion (batch and/or streaming) and transformation from multiple data sources using SQL, AWS, Snowflake, and data integration technologies (see the ingestion sketch after this list).

Create unit/integration tests and implement automated build and deployment.

Participate in code reviews to ensure standards and best practices.

Deploy, monitor, and maintain production systems.

Create and update user stories in the backlog.

Collaborate with the product owner, analysts, architects, QA, and other team members.
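
By way of illustration only (the posting itself includes no code), a batch ingestion step of the kind described above might look like the following Snowflake SQL sketch; the table, stage, and column names are hypothetical.

    -- Minimal batch-ingestion sketch in Snowflake SQL. raw.orders and
    -- @orders_stage are hypothetical names, not part of this posting.
    CREATE TABLE IF NOT EXISTS raw.orders (
        order_id    NUMBER,
        customer_id NUMBER,
        order_ts    TIMESTAMP_NTZ,
        amount      NUMBER(12, 2)
    );

    -- Load CSV files from an external stage (assumed here to point at
    -- an S3 landing bucket, per the AWS reference above).
    COPY INTO raw.orders
    FROM @orders_stage
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    ON_ERROR = 'ABORT_STATEMENT';

In a role like this, a step of this kind would typically run inside an orchestrated, tested pipeline rather than ad hoc.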

The Experience You Will Bring

Minimum 2 years of hands-on ETL development experience using dbt

Minimum 2 years of hands-on experience working with SQL and Snowflake

Minimum 1 year of hands-on experience (not just training or POC work) using Git and Python

Agile Scrum work experience

Strong communication skills

What Will Make You Stand Out

Hands-on experience in ELT development using Matillion

Experience working with Azure DevOps, including Build and Release CI/CD pipelines

Experience working with AWS and Control-M

Experience coding complex transformations (not just extract/load mappings) in dbt, as illustrated below
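
As a purely illustrative sketch (not part of the role's requirements), a dbt model that goes beyond a simple extract/load mapping might deduplicate staged rows and derive windowed metrics; the model name, ref target, and columns below are all hypothetical.

    -- models/marts/customer_order_metrics.sql (hypothetical dbt model;
    -- stg_orders and its columns are assumed for illustration).
    {{ config(materialized='table') }}

    with deduped as (

        -- Keep only the most recently loaded row per order_id
        -- (QUALIFY is Snowflake syntax).
        select *
        from {{ ref('stg_orders') }}
        qualify row_number() over (
            partition by order_id
            order by loaded_at desc
        ) = 1

    )

    select
        order_id,
        customer_id,
        order_ts,
        amount,
        -- Running total per customer: a derived metric, not a 1:1 mapping.
        sum(amount) over (
            partition by customer_id
            order by order_ts
        ) as customer_running_total
    from deduped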

Typical Qualifications

3-5+ years of data engineering experience

BS degree in IT, Computer Science, or Engineering

Education

Any Graduate