Description

● Manage the backend data ingestion/integration pipeline development lifecycle, including architecture, design, development, testing, and deployment.
● Explore and discover new data sources and quickly familiarize with available APIs or other data acquisition methods, such as web scraping, to ingest data.
● Build quick proofs of concept for new data sources to showcase data capabilities and help the analytics team identify key metrics and dimensions.
● Design, develop, and maintain data ingestion and integration pipelines from various sources, which may include contacting primary or third-party data providers to resolve questions and inconsistencies or to obtain missing data.
● Design, implement, and manage near real-time ingestion and integration pipelines.
● Analyze data to identify outliers and missing, incomplete, or invalid data; ensure the accuracy of all data from source to final deliverable by creating automated quality checks.
● Evangelize an extremely high standard of code quality, system reliability, and performance.
● 3+ years of experience using an ETL tool such as Pentaho (enterprise or community edition), and at least 5+ years of total experience in ETL or web application development.
● 10+ years of experience building enterprise-level software solutions.
● 4+ years of experience architecting cloud-based software solutions.
● 4+ years of experience in API-based development using Python or Java.

Education

Any Graduate