Description

Job Title: Python, Snowflake, AWS ETL Developer

Job Location: Remote

Job Duration: Long-Term

Job description

Delivery Responsibilities

 

  • Lead the technical planning, architecture, estimation, development, and testing of ETL solutions
  • Knowledge and experience in most of the following architectural styles: layered architectures, transactional applications, PaaS-based architectures, and SaaS-based applications; experience developing ETL-based cloud PaaS and SaaS solutions
  • Create data models aligned with client requirements
  • Design, develop, and support ETL mappings; strong SQL skills with experience developing ETL specifications
  • Create ELT pipelines, data model updates, and orchestration using dbt, Snowflake Streams and Tasks, and Astronomer, including testing
  • Focus on ETL aspects including performance, scalability, reliability, monitoring, and other operational concerns of data warehouse solutions
  • Design reusable assets, components, standards, frameworks, and processes to support and facilitate end-to-end ETL solutions
  • Experience gathering requirements and defining the strategy for third-party data ingestion from sources such as SAP HANA and Oracle
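The orchestration tools named above (dbt, Snowflake Tasks, Astronomer-managed Airflow) all reduce to running pipeline steps in dependency order. A minimal standard-library sketch of that idea, with purely hypothetical step names, looks like:

```python
from graphlib import TopologicalSorter

# Hypothetical ELT steps mapped to the steps they depend on.
# In practice these would be dbt models, Snowflake Tasks, or Airflow tasks.
pipeline = {
    "load_raw_orders": set(),
    "load_raw_customers": set(),
    "stg_orders": {"load_raw_orders"},
    "stg_customers": {"load_raw_customers"},
    "fct_orders": {"stg_orders", "stg_customers"},
}

# static_order() yields a valid execution order (raises on cycles).
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

The sketch captures the design constraint common to all three tools: staging steps cannot run before their raw loads, and the fact table runs only after both staging steps complete.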

 

Expert Hands-on Experience In The Following

 

  • Technologies such as Python, Teradata, MySQL, SQL Server, RDBMS, Apache Airflow, AWS S3, AWS data lakes, Unix scripting, AWS CloudFormation, DevOps, GitHub
  • Demonstrate best practices in Airflow orchestration, such as creating DAGs, with hands-on knowledge of Python libraries including pandas, NumPy, and boto3, plus connectors to different databases and APIs
  • Data modelling, master and operational data stores, data ingestion and distribution patterns, ETL/ELT technologies, relational and non-relational DBs, DB optimization patterns
  • Develop virtual warehouses using Snowflake for data-sharing needs for both internal and external customers.
  • Create Snowflake data-sharing capabilities that will create a marketplace for sharing files, datasets, and other types of data in real-time and batch frequencies
  • At least 8 years of ETL/data development experience
  • Working knowledge of Fact / Dimensional data models and AWS Cloud
  • Strong experience in creating technical design documents, source-to-target mappings, and test cases/results
  • Understand the security requirements and apply RBAC, PBAC, and ABAC policies to the data
  • Build data pipelines in Snowflake leveraging Data Lake (S3/Blob), Stages, Streams, Tasks, Snowpipe, Time travel, and other critical capabilities within Snowflake
  • Ability to collaborate, influence, and communicate across multiple stakeholders and levels of leadership, speaking at the appropriate level of detail to both business executives and technology teams
  • Excellent communication skills with a demonstrated ability to engage, influence, and encourage partners and stakeholders to drive collaboration and alignment
  • Highly organized, with individual initiative, a results- and solution-oriented mindset, and personal accountability and resiliency
  • Demonstrated learning agility, ability to make decisions quickly and with the highest level of integrity
  • Demonstrable experience of driving meaningful improvements in business value through data management and strategy
  • Must have a positive, collaborative leadership style with a colleague- and customer-first attitude
  • Should be a self-starter and team player, capable of working with a team of architects, co-developers, and business analysts
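Several bullets above center on source-to-target mappings documented in ETL specifications and technical design documents. A minimal Python sketch of that pattern, using hypothetical column names and no real database connectors, is:

```python
# Hypothetical source-to-target mapping of the kind an ETL
# specification would document (illustrative column names only).
SOURCE_TO_TARGET = {
    "CUST_NM": "customer_name",
    "CUST_EMAIL_TXT": "email",
    "ORD_AMT": "order_amount",
}

def transform_row(source_row: dict) -> dict:
    """Rename mapped source columns to target columns; drop unmapped ones."""
    return {
        target: source_row[source]
        for source, target in SOURCE_TO_TARGET.items()
        if source in source_row
    }

row = {"CUST_NM": "Acme", "CUST_EMAIL_TXT": "ops@acme.test",
       "ORD_AMT": 42.5, "LEGACY_FLAG": 1}
print(transform_row(row))
```

Keeping the mapping as data rather than code is one common design choice: the same dictionary can drive the transform, the design document, and the test cases that validate source-to-target results.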

 

Preferred Qualifications

 

  • Experience with Azure Cloud, DevOps implementation
  • Ability to work collaboratively in a team, mentoring and training junior team members
  • Position requires expert knowledge across multiple platforms, data ingestion patterns, processes, data/domain models, and architectures.
  • Candidates must demonstrate an understanding of the following disciplines: enterprise architecture, business architecture, information architecture, application architecture, and integration architecture.
  • Ability to focus on business solutions and understand how to achieve them according to the given timeframes and resources.
  • Recognized as an expert/thought leader. Anticipates and solves highly complex problems with a broad impact on a business area.
  • Experience with Agile Methodology / Scaled Agile Framework (SAFe).
  • Outstanding oral and written communication skills including formal presentations for all levels of management combined with strong collaboration/influencing.


 

Key Skills

SAP HANA, Oracle, Python, Teradata, MySQL, SQL Server, RDBMS, Apache

Education

ANY GRADUATE

  • Category: Python Developer
  • Tenure: Any