Description
1) Strong working experience with Oracle, AWS Redshift, and MS SQL Server databases, and high proficiency with complex SQL scripting to validate source/target data loads and transformations
2) Experience in big data pipeline testing with the Hadoop/Spark framework on AWS cloud infrastructure
3) Knowledge of applying Spark DataFrame API concepts using PySpark to complete data manipulation tasks (see the illustrative sketch after this list)
4) Very good understanding of business intelligence concepts, architecture, and building blocks in areas such as ETL processing, data warehousing, dashboards, and analytics
5) Knowledge of scripting languages such as Python for creating automated test scripts, or of automation tools such as TOSCA, is an added advantage
6) Some functional knowledge of Salesforce CRM / Oracle Eloqua Marketing applications is good to have
7) Experience with different types of testing, including Black Box, Smoke, Functional, System Integration, End-to-End, Regression, and User Acceptance Testing (UAT); involvement in Load, Performance, and Stress Testing as needed
8) Hands-on experience and strong understanding of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC) in Agile Scrum environments
9) Expertise in using JIRA / Excel for writing test cases and tracking defects
10) Participation in defect triage during major releases and tracking of defects to resolution/closure
11) An effective communicator with strong analytical abilities
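For illustration only, a minimal PySpark sketch of the kind of source-to-target validation and DataFrame manipulation referred to in items 1-3. The table name, S3 path, key column, and uppercase transformation rule are hypothetical placeholders, not part of any actual project codebase.

```python
# Minimal sketch, assuming PySpark is available; all names below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-validation-sketch").getOrCreate()

# Source extract (hypothetical staged copy of the upstream table).
source_df = spark.table("source_db.orders")

# Target load (hypothetical curated-zone location in the data lake).
target_df = spark.read.parquet("s3://example-bucket/curated/orders/")

# 1. Row-count reconciliation between source and target.
src_count, tgt_count = source_df.count(), target_df.count()
print(f"source={src_count}, target={tgt_count}, diff={src_count - tgt_count}")

# 2. Column-level check: re-apply the documented transformation to the source
#    (here, an assumed uppercase rule on "status") and compare with the target.
expected = source_df.select("order_id", F.upper("status").alias("expected_status"))
actual = target_df.select("order_id", F.col("status").alias("actual_status"))
mismatches = (
    expected.join(actual, on="order_id", how="inner")
            .where(F.col("expected_status") != F.col("actual_status"))
)
print(f"transformation mismatches: {mismatches.count()}")
```

The same checks could equally be written as SQL queries against Oracle, Redshift, or SQL Server; the PySpark form is shown only because the role also calls for DataFrame API experience.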

Education

Any Graduate