Description

Develop and configure systems to migrate legacy applications to Hadoop using ETL concepts, MapReduce, Sqoop, Hive, HDFS, PySpark, COBOL, JCL, DB2, CA7, and CONTROL-M, and perform unit testing. Responsible for mapping source and target data and setting up the Hadoop architecture using components such as MapReduce, HDFS, YARN, Sqoop, Pig, Hive, HBase, Oozie, and ZooKeeper. Perform logical and physical data structure design and DDL generation to implement database tables and columns in SQL Server, Snowflake (AWS Cloud), and Oracle database schema environments. Will work in Manchester, CT and/or at various client sites throughout the U.S. Must be willing to travel and/or relocate.
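As a hedged illustration of the ingestion work described above, the sketch below shows a minimal PySpark job that reads a legacy DB2 table over JDBC and lands it in a Hive table on HDFS. The host, database, schema, table, and credential values (legacy-host, LEGACYDB, SCHEMA1.CUSTOMER, etl_user) are hypothetical placeholders rather than details from this posting, and the cluster is assumed to have Hive support and the DB2 JDBC driver available.

from pyspark.sql import SparkSession

# Start a Spark session with Hive support so tables can be written to the metastore.
spark = (
    SparkSession.builder
    .appName("legacy-db2-to-hive")  # hypothetical job name
    .enableHiveSupport()
    .getOrCreate()
)

# Read a legacy DB2 table over JDBC; Sqoop performs an equivalent parallel import.
source_df = (
    spark.read.format("jdbc")
    .option("driver", "com.ibm.db2.jcc.DB2Driver")
    .option("url", "jdbc:db2://legacy-host:50000/LEGACYDB")  # hypothetical host/DB
    .option("dbtable", "SCHEMA1.CUSTOMER")                   # hypothetical source table
    .option("user", "etl_user")                              # hypothetical credentials
    .option("password", "***")
    .load()
)

# Land the mapped data in a Hive table backed by HDFS.
source_df.write.mode("overwrite").saveAsTable("staging.customer")

In practice, a Sqoop command such as sqoop import --connect jdbc:db2://legacy-host:50000/LEGACYDB --table CUSTOMER --hive-import would perform a similar parallel import, with PySpark handling downstream transformation and source-to-target mapping.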

Education

Any Graduate