Job Description:
Develop and maintain Flink applications using Java or Scala.
Hadoop/Omnia Development Engineers develop and deliver code for assigned work in accordance with time, quality, and cost standards.
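As a rough illustration of the Flink development responsibility above, here is a minimal sketch of a Flink streaming job in Scala. It assumes the Flink 1.x Scala DataStream API; the socket source, port, and word-count logic are hypothetical placeholders, not part of the role's actual codebase.

    import org.apache.flink.streaming.api.scala._

    object WordCount {
      def main(args: Array[String]): Unit = {
        // Obtain the streaming execution environment.
        val env = StreamExecutionEnvironment.getExecutionEnvironment

        // Hypothetical source: read lines from a local socket (e.g. nc -lk 9999).
        val lines = env.socketTextStream("localhost", 9999)

        // Split lines into words and count occurrences per word.
        val counts = lines
          .flatMap(_.toLowerCase.split("\\W+"))
          .filter(_.nonEmpty)
          .map(word => (word, 1))
          .keyBy(_._1)
          .sum(1)

        counts.print()
        env.execute("Word Count")
      }
    }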
Key Responsibilities
Interact with business stakeholders and designers to understand and implement business requirements.
Hadoop development and implementation.
Load data from disparate data sets.
Good to have: Teradata knowledge.
Must have working experience with IntelliJ IDEA, AutoSys, WinSCP, PuTTY, and GitHub.
Design, build, install, configure, and support Hadoop.
Transform data using Spark and Scala (see the sketch after this list).
Translate complex functional and technical requirements into detailed design.
Perform analysis of vast data stores and uncover insights.
Maintain security and data privacy.
Create scalable and high-performance web services for data tracking.
Enable high-speed querying.
Manage and deploy HBase.
Test prototypes and oversee handover to operational teams.
Propose best practices/standards.
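For the "Transform data using Spark and Scala" item above, a minimal sketch of a Spark batch transformation in Scala follows. The input path, column names, and aggregation are hypothetical examples, not the team's actual pipeline.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object CustomerTransform {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("CustomerTransform")
          .getOrCreate()

        // Hypothetical input: a raw CSV extract with a header row.
        val raw = spark.read.option("header", "true").csv("/data/raw/customers.csv")

        // Clean and aggregate: drop rows without an id, parse the signup date,
        // then count customers per region.
        val cleaned = raw
          .filter(col("customer_id").isNotNull)
          .withColumn("signup_date", to_date(col("signup_date"), "yyyy-MM-dd"))
          .groupBy("region")
          .agg(count("customer_id").as("customer_count"))

        // Write the curated result as Parquet.
        cleaned.write.mode("overwrite").parquet("/data/curated/customer_counts")
        spark.stop()
      }
    }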
Required Skills:
Hadoop, IntelliJ IDEA, AutoSys, WinSCP, PuTTY
Bachelor's or Master's degree in Computer Science