Knowledge of IBM InfoSphere DataStage ETL is required
Experience building and supporting ETL code in InfoSphere DataStage to load enterprise data warehouses on relational databases (Oracle, DB2, Sybase), applying transformations and business rules as defined in requirements, is a must
Knowledge of Oracle SQL and PL/SQL (cursors, triggers, packages, stored procedures)
Experience integrating data from multiple sources (MQ, Solace, XML, Excel files, unstructured data) and loading it into staging/warehouse tables
Understanding of dimensional data modelling - star schema and snowflake schema
Extensive experience creating jobs, routines, and sequences to build an end-to-end ETL framework in DataStage
Experienced in maintaining DataStage code versions using GitHub/Microsoft Azure
Knowledge of code migration and production deployment using an automated migration process
Experienced in performance tuning DataStage ETL processes using appropriate data partitioning algorithms
Must have extensive knowledge of Unix/Linux commands and shell scripting
Knowledge of scheduling tools such as Control-M
Knowledge of Azure DevOps and Agile methodology
Knowledge of Apache Kafka
Nice to have: some knowledge of ServiceNow and ITSM concepts
Understanding of enterprise architecture patterns and best practices
Excellent oral and written communication skills are required
Analytical and problem-solving skills
A positive, goal-oriented attitude with a focus on service delivery
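As a hedged illustration of the Unix/Linux shell scripting skills listed above (the feed-file name, path, and header layout are hypothetical, not taken from any actual job or system), a minimal sketch of a pre-load check that a DataStage sequence might call before processing a delimited feed:

```shell
#!/bin/sh
# Hypothetical pre-load check: count the data rows in a delimited
# feed file (excluding the header line) before handing it to an ETL job.

count_data_rows() {
    feed_file="$1"
    # tail -n +2 skips the header line; wc -l counts the remaining rows;
    # tr strips the padding some wc implementations emit
    tail -n +2 "$feed_file" | wc -l | tr -d ' '
}

# Usage example with a small sample feed (illustrative data only)
printf 'ID,NAME\n1,Alice\n2,Bob\n' > /tmp/sample_feed.csv
count_data_rows /tmp/sample_feed.csv
```

A check like this would typically gate the downstream load: if the row count falls below an expected threshold, the wrapper script aborts and alerts instead of invoking the DataStage job.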