Job Key Responsibilities:
Writing optimized, high-performance SQL queries and database procedures/functions for data loading and integration with other applications; maintaining data quality and overseeing database security.
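As a minimal illustration of what "optimized data loading" typically means in practice, the sketch below uses Python's built-in `sqlite3` module with a hypothetical `staging_orders`/`orders` schema (all table and column names are invented for this example): a single set-based `INSERT ... SELECT` with de-duplication, rather than a row-by-row loop.

```python
import sqlite3

# Hypothetical schema for illustration only; real tables, keys, and
# engines will differ by project.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_orders (order_id INTEGER, amount REAL);
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL);
""")
conn.executemany(
    "INSERT INTO staging_orders VALUES (?, ?)",
    [(1, 10.0), (2, 25.5), (1, 10.0)],  # note the duplicate row
)

# Set-based load: one INSERT ... SELECT with de-duplication, which is
# usually far faster than inserting one row at a time from client code.
conn.execute("""
    INSERT INTO orders (order_id, amount)
    SELECT DISTINCT order_id, amount FROM staging_orders
""")
loaded = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(loaded)  # 2
```

In a production warehouse the same pattern would live in a stored procedure or an ETL tool's SQL component rather than in Python.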
Lead the delivery of extract, transform, and load (ETL) processes that move data from disparate sources into a form consumable by analytics, for projects of moderate complexity, applying strong technical capabilities and a good sense of database performance.
Batch Processing – Implement efficient processing of high volumes of data, where groups of transactions are collected over a period of time and processed together.
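The batch pattern described above can be sketched in a few lines: collect items from any source into fixed-size groups, then process each group as a unit. The `batches` helper and the toy transaction stream are assumptions for illustration, not part of any specific tool.

```python
from itertools import islice

def batches(iterable, size):
    """Yield successive fixed-size batches from any iterable."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

# Hypothetical transaction stream; real volumes would arrive from
# files or queues accumulated over a period of time.
transactions = range(10)
processed = [sum(b) for b in batches(transactions, 4)]
print(processed)  # [6, 22, 17]
```

Processing per batch (one database round-trip or one job run per group) is what keeps high-volume loads efficient compared with per-transaction handling.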
Data Quality, Profiling, and Cleansing – Review (profile) a data set to establish its quality against a defined set of parameters, and highlight records where corrective action (cleansing) is required to remediate the data.
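A minimal sketch of profiling against defined parameters: the quality rules here (non-null `id`, non-negative `amount`) are hypothetical examples; a real engagement would take its rules from the project's data-quality specification.

```python
# Hypothetical quality rules for illustration: id must be present,
# amount must be non-negative.
rows = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},   # fails: missing id
    {"id": 3, "amount": -2.0},     # fails: negative amount
]

def profile(rows):
    """Split rows into clean ones and ones flagged for cleansing."""
    clean, dirty = [], []
    for r in rows:
        if r["id"] is not None and r["amount"] >= 0:
            clean.append(r)
        else:
            dirty.append(r)
    return clean, dirty

clean, dirty = profile(rows)
print(len(clean), len(dirty))  # 1 2
```

The flagged (`dirty`) rows are what a cleansing step would then correct or quarantine before the load proceeds.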
Stream Systems – Discover, integrate, and ingest all available data from the machines that produce it, as fast as it is produced, in any format and at any quality.
Excellent interpersonal skills to build a network across a variety of departments in the business, understand the data, and deliver business value; interface and communicate with program teams, management, and stakeholders as required to deliver small- to medium-sized projects.
The Role Offers:
• 7+ years of experience in Ab Initio
• Nice to have: experience with AWS and Python programming
• Designing and creating the data warehouse
• Performing all related extract, transform, and load (ETL) functions in the company; the nature of the work may vary based on the tool being used
• Ability to take a big-picture view of the company's data situation
• Strong analytical mindset
• Strong skills in SQL, UNIX, Autosys, and ETL/data-modeling concepts such as 3NF, dimensional models, fact loads, surrogate keys, joins, data comparison across tables, automated job execution, and shell scripting
Any Graduate