Skills Needed:
• Must have 6+ years of recent experience working for a major bank or brokerage house in the US.
• Must have 12+ years of experience maintaining applications utilizing Java, J2EE, SDLC, and WebSphere.
• Must have 6 years of recent experience working with Cassandra, Hadoop, MongoDB, Apache Spark, HDFS, YARN, MapReduce, Pig & Hive, Flume & Sqoop, and ZooKeeper.
• Must have 6 years of experience maintaining Tier-1 data-driven apps.
• Must have experience with 24/7 uptime and strict SLAs.
• Extensive experience maintaining data pipelines and aggregating and transforming raw data from a variety of data sources.
• Extensive experience optimizing data delivery and helping redesign systems to improve performance, as well as handling, transforming, and managing Big Data using Big Data frameworks.
• Extensive experience processing data in parallel on top of distributed Hadoop storage using MapReduce.
• Must have experience with SOA design principles.
• Must have 5+ years of programming experience in Scala, Java, Python, or Go.
• Must have 5+ years developing on Hadoop/Spark.
• Must have 6+ years developing on an RDBMS such as Microsoft SQL Server or PostgreSQL.
• Must have experience with large data sets, regularly transforming and querying tables or sets of greater than 20 million records.
• Exposure to data hygiene routines and models.
• Experience in database maintenance.
• Ability to identify problems and effectively communicate solutions to the team.
ANY GRADUATE