Description

REQUIRED EXPERTISE IN TOOLS & TECHNOLOGIES:

  • MUST HAVE
    • Hadoop HDFS, Spark, Scala, Hive, Sqoop, Postgres, and Amazon Redshift
    • Unix shell scripting (AIX or Linux)
    • Amazon Web Services or Microsoft Azure
  • PREFERRED
    • Certification in Hadoop / Big Data

OTHER SKILLS/EXPERIENCE REQUIRED:

  • More than 3 years of experience working in big data/Hadoop environments using Spark/Scala, Hive, Postgres, and Redshift
  • Working knowledge of Informatica Data Quality is preferred
  • Excellent understanding of data warehousing concepts
  • Ability to clearly communicate fundamental concepts during the interview and demonstrate prior hands-on experience in each of these areas
  • Experience with Business Intelligence reporting tools such as Cognos and SAP BusinessObjects
  • Experience in Oracle database programming using partitioning, materialized views, and OLAP
  • Experience tuning Oracle queries and processes, including the use of performance management tools
  • Experience with job scheduling tools such as Tivoli and AutoSys
  • Strong data modeling skills (normalized and multidimensional)
  • Strong business and communication skills

Education

Any Graduate