Description

Job Description:

· Bachelor's Degree with specialized coursework in Computer Science or Management Information Systems.

· 4+ years of experience developing enterprise-grade data integration solutions.

· Good knowledge of Java and the JVM, Python/JavaScript, C, and Linux systems (5+ years' experience required); you should be capable of programming in both compiled and dynamic languages.

· Hands-on experience with Big Data and Hadoop platforms.

· Good understanding of Kafka.

· Expertise in data stores (both transactional and non-transactional) and the ability to code in a highly concurrent environment.

· Prior ETL development experience required; 2+ years ideal.

· Ability to operate effectively in ambiguous situations.

· Ability to learn quickly, work independently, and collaborate as a team player.

· Familiarity with Agile software development methodologies to ensure early and continuous delivery of software.

· Experience with Cassandra.

· Modelling and API development experience.

· Experience with Hadoop/Spark/Kafka.

Education

Any Graduate