Description

Top Skills:

  • dbt
  • Snowflake
  • Azure


Job Description: Data Ingestion Engineer

About the Role

We are seeking a highly motivated and experienced Data Ingestion Engineer to join our growing team. In this role, you will design, develop, and implement data pipelines that ingest data from a variety of sources, both structured (e.g., databases) and unstructured (e.g., logs, social media). You will play a key role in ensuring the quality, consistency, and timeliness of data for downstream analytics and reporting.


Responsibilities:

Design and develop dbt models:

  • Create reusable dbt models to transform raw data into a consistent and well-defined format for Snowflake.
  • Utilize advanced dbt features such as sources, tests, and documentation to ensure data quality and maintainability.


Implement CI/CD pipelines in Azure:

  • Configure Azure DevOps pipelines to automate the building, testing, and deployment of dbt models and data pipelines.
  • Integrate data pipeline execution with code version control for traceability and rollback capabilities.
  • Implement automated data quality checks and alerts within the CI/CD pipeline.


Design, develop, and maintain data pipelines using dbt and Azure Data Factory (ADF):

  • Extract, transform, and load (ETL) data from various sources including databases, APIs, cloud storage (Azure Blob Storage), and potentially mobile applications (Apple/Android).
  • Cleanse and validate data to ensure accuracy and consistency.
  • Develop and implement data models for Snowflake.
  • Leverage DevOps principles to automate data pipeline deployments and monitoring beyond CI/CD (e.g., infrastructure provisioning, configuration management).
  • Collaborate with business teams to understand data requirements.
  • Work effectively in a team environment and mentor junior team members (including potential offshore team members).
  • Stay up to date with the latest data ingestion technologies and best practices.


Required Skills:

  • Minimum 2 years of experience with dbt
  • Minimum 2 years of experience with Snowflake
  • Minimum 2 years of experience with the Azure platform, including Azure Data Factory (ADF), Azure Blob Storage, and Azure DevOps tools
  • Experience working with structured and unstructured data formats
  • Strong SQL and scripting skills (e.g., Python, Bash)
  • Excellent problem-solving and analytical skills
  • Strong communication and collaboration skills
  • Ability to work independently and as part of a team
  • Experience working with business teams to understand data requirements (a plus)
  • Experience with data integration from mobile applications (Apple/Android) (a plus)


Education

Bachelor's degree