Data Platform Architect / SME

Job Description:

We are looking for a skilled Data Platform Architect / SME with a strong background in designing, implementing, and maintaining data pipelines using Airflow, AWS, and Snowflake. The ideal candidate will have experience developing reusable scripts, procedures, and workflows to streamline data processing and analysis tasks. As a Data Platform Architect / SME, you will play a key role in building a scalable and efficient data platform that enables our clients to unlock the full potential of their data, and in ensuring compliance of the orchestration platform.

 

Responsibilities:

 

  • Design, develop, and maintain data pipelines using AWS and Apache Airflow to automate the extraction, transformation, and loading (ETL) of data from various sources into Snowflake.
  • Collaborate with cross-functional teams to understand data requirements and design scalable and efficient data models and architectures.
  • Develop reusable scripts, procedures, and workflows to standardize data processing tasks and ensure consistency and reliability across pipelines.
  • Optimize and tune data pipelines for performance, scalability, and cost-effectiveness, leveraging best practices and industry standards.
  • Implement monitoring and alerting solutions to proactively identify and address issues in data pipelines, ensuring high availability and reliability.
  • Document data engineering processes, procedures, and best practices, and provide training and support to team members as needed.
  • Design and implement reusable Directed Acyclic Graphs (DAGs) in Apache Airflow to orchestrate complex workflows and dependencies between tasks within data pipelines.
  • Define task dependencies, scheduling intervals, retries, and error-handling strategies within DAGs to ensure the reliable execution of data processing tasks (see the first sketch after this list).
  • Implement dynamic DAG generation and parameterization techniques to support flexible and scalable pipeline configurations (see the second sketch after this list).
  • Stay current with emerging technologies and industry trends in Airflow, data engineering, and analytics, continuously evaluating and incorporating new tools and techniques to improve our data platform offerings.
  • Apply a strong understanding of Snowflake features such as Snowpipe, Tasks, Dynamic Data Masking, Row Access Policies, Object Tagging, RBAC, and Streams (see the third sketch after this list).
  • Design, develop, and maintain data transformation pipelines using DBT to support various analytics and reporting needs.
  • Provide technical guidance and support to data analysts and other team members on best practices for using DBT.
  • Apply strong analytical and problem-solving skills to troubleshoot and optimize SQL queries and DBT models.
  • Design, implement, and maintain CI/CD pipelines using DevOps tools such as Terraform, CloudFormation, and Jenkins for automated build, test, and deployment processes.
  • Create, manage, and optimize infrastructure on AWS, ensuring high availability, scalability, and cost-effectiveness.
  • Build frameworks to automate CI/CD deployments on Snowflake.
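
As a concrete illustration of the DAG design work described above, here is a minimal sketch of a reusable Airflow DAG with a schedule, per-task retries, a simple error-handling hook, and explicit task dependencies. It assumes Airflow 2.4+; the DAG name, callables, and retry settings are illustrative, not prescriptive.

```python
# Minimal ETL DAG sketch: schedule, retries, error handling, dependencies.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    """Pull raw records from a source system (placeholder)."""


def transform(**context):
    """Clean and reshape the extracted data (placeholder)."""


def load(**context):
    """Write the transformed data into Snowflake (placeholder)."""


default_args = {
    "owner": "data-platform",
    "retries": 3,                        # retry policy applied to every task
    "retry_delay": timedelta(minutes=5),
    "email_on_failure": True,            # simple error-handling hook
}

with DAG(
    dag_id="example_snowflake_etl",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                   # scheduling interval
    catchup=False,
    default_args=default_args,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Explicit task dependencies: extract -> transform -> load
    t_extract >> t_transform >> t_load
```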
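
Dynamic DAG generation, also mentioned above, typically means one parameterized factory that emits a DAG per configuration entry. A sketch under the same Airflow assumptions; the config dict and naming scheme are hypothetical (in practice the config often comes from a YAML file or an Airflow Variable):

```python
# Dynamic DAG generation: one factory, one DAG per configured source.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical per-source configuration.
SOURCES = {
    "orders": {"schedule": "@hourly"},
    "customers": {"schedule": "@daily"},
}


def build_dag(source: str, schedule: str) -> DAG:
    with DAG(
        dag_id=f"ingest_{source}",
        start_date=datetime(2024, 1, 1),
        schedule=schedule,
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="ingest",
            python_callable=lambda: print(f"ingesting {source}"),
        )
    return dag


# Register each generated DAG at module level so the scheduler discovers it.
for name, cfg in SOURCES.items():
    globals()[f"ingest_{name}"] = build_dag(name, cfg["schedule"])
```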
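
For the Snowflake governance features listed above, here is a sketch of applying one of them (Dynamic Data Masking) through the official Python connector. The connection details, table, column, role, and policy names are all placeholders:

```python
# Create a masking policy and attach it to a column via snowflake-connector-python.
import snowflake.connector

CREATE_POLICY = """
CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
    ELSE '***MASKED***'
  END
"""

APPLY_POLICY = """
ALTER TABLE customers MODIFY COLUMN email
SET MASKING POLICY email_mask
"""

conn = snowflake.connector.connect(
    account="my_account",   # placeholder connection details
    user="my_user",
    password="my_password",
    warehouse="my_wh",
    database="my_db",
    schema="public",
)
try:
    cur = conn.cursor()
    cur.execute(CREATE_POLICY)
    cur.execute(APPLY_POLICY)
finally:
    conn.close()
```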

 

Requirements:

  • Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent experience.
  • 6+ years of experience in data engineering roles, with a focus on building and maintaining data pipelines using Airflow and Snowflake.
  • Proficiency in Apache Airflow, including designing and orchestrating complex workflows, creating custom operators, and managing dependencies.
  • Strong SQL skills and experience working with Snowflake or other cloud-based data warehouse platforms.
  • Experience developing reusable scripts and procedures in Python or other programming languages for data processing and automation tasks.
  • Experience with version control systems such as Git, and with CI/CD pipelines (e.g., Jenkins, Harness) for automated testing and deployment of data pipelines.
  • SnowPro certification is preferred.
  • Strong analytical and problem-solving skills, with the ability to understand complex data requirements and design appropriate solutions.
  • Excellent communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
  • Willingness to also take on platform operational work using a DevOps stack (e.g., Terraform).

Education

Bachelor's degree