Description

Description: This role is responsible for creating data orchestration with Azure Data Factory pipelines and dataflows. The key responsibility is to understand business requirements and implement them using Azure Data Factory.
Location: The candidate can be based anywhere in the USA and may work remotely
Duration: Long term (ongoing project)
Must have: At least 7-8 years of software development experience, including at least 1 year of experience with Azure Data Factory
Budget: Open for now

Roles & Responsibilities:
Understand business requirements and actively provide input from a data perspective
Understand the underlying data and how it flows through the system
Build simple to complex pipelines & dataflows
Work with other Azure stack modules such as Azure Data Lake and Azure SQL Data Warehouse
Implement modules that have security and authorization frameworks
Recognize and adapt to changes in processes as the project evolves in size and function

Knowledge, Skills & Abilities:
Expert-level knowledge of Azure Data Factory
Expert-level knowledge of Azure SQL Database and data warehousing
Proficiency in at least one programming language
Ability to analyze and understand complex data
Knowledge of Azure data lake is required
Knowledge of other Azure services, such as Azure Analysis Services and Azure SQL Database, will be an added advantage
Excellent interpersonal/communication skills (both oral/written) with the ability to communicate at various levels with clarity & precision
Qualifications & Experience:
Bachelor’s or master’s degree in Computer Science or Data Engineering

Education

Any Graduate