Job Description:
• The Data Integration/ETL Architect is a seasoned and broadly experienced IT professional able to look at portfolios of applications, services, components, infrastructures, and databases in a holistic manner.
• Defining architecture, guiding principles, and integration models for the latest cloud and on-premises tech stack.
• Working with client business owners, internal technical teams, and third-party technology providers to evolve the architecture for individual use cases.
• Defining architecture, guiding principles, and integration models for the OSS stack.
• Identify and define system data collection requirements.
• Lead and provide data-driven solutions to business partners through the architecture, design, and implementation of a Data Warehouse ecosystem in Azure, AWS, or Google Cloud, using cloud-native ETL solutions such as Matillion, IICS, Fivetran, ADF, Glue, etc.
• Develop and re-engineer data acquisition, storage, processing, security, data management, and analysis using cloud and on-premises technologies, leading toward a modern data platform.
• Define and develop best practices for data integration/engineering, including ETL/ELT, replication, ESB, APIs, etc., based on a modern cloud and on-premises data platform.
• Recommend and establish standards for advanced data integration and data processing, designing practical solutions for optimal performance that balance availability, reliability, and security.
• Establish Data Integration/Engineering standards and conduct review sessions with developers.
Required:
• 12+ years of experience with ETL architecture using tools such as Informatica, Matillion, Fivetran, AWS Glue, ADF, etc.
• 5+ years of experience in cloud platforms (AWS, Azure, or Google Cloud).
• 5+ years of experience with data migrations from on-premises to cloud.
• 7+ years of experience with SQL, stored procedures, Python, Java, and shell scripting.
• 5+ years of experience with data replication tools such as Oracle GoldenGate, Attunity, Striim, etc.
• Experience with DevOps processes.
• At least 3 major data warehouse project implementations from scratch.
• 3-5 years of hands-on experience with dimensional modeling, OLTP, and implementation strategies.
• Excellent oral and written communication skills.
• Ability to work quickly and accurately under pressure and time constraints.
• Demonstrated critical thinking and analytical skills are required; the ability to apply systems knowledge to troubleshoot and analyze new issues is critical.
• Must be a self-starter, able to work independently with a proactive working style.
Any Graduate