Job Description:
Required:
· Collaborate with our quants and global technology teams to create data pipelines for a new data analytics platform and to build interfaces for extracting and analyzing data.
· Develop and manage the ingestion of various data sources into a global Data Lake.
· Collaborate with quants to convert their models into strategic software whilst still enabling them to adjust and adapt those models at will. Work with IT to align as closely as possible with the rest of the business, whilst ensuring that the quants can innovate, research, and act with agility.
· Data warehousing, ETL development, and testing experience in Big Data technologies is essential, whilst experience with relational databases such as Oracle 11g or higher will be beneficial.
· Define and manage best practices for configuring and managing the data lake.
Essential Skills:
· Strong technical skills across the Hadoop stack (HDFS, HBase, Phoenix, Hive, Pig) and SQL
· Familiarity with web, FTP, API, SQL, and related ETL technologies
· Database knowledge should extend to SQL, PL/SQL, and Transact-SQL; Oracle is a plus
· Knowledge of modern NoSQL data stores
· Hands-on experience with J2EE (JDBC, JNDI, JMS); Tomcat a plus
· Experience with Spring Framework development (Spring Core and Spring Batch in particular; Spring Integration a plus)
· Strong knowledge of REST web services
· Experience with Git and Maven
· Experience handling data in various file types: flat files, XML, Parquet, data frames, etc.
· Experience resolving maintenance issues, data issues, and bugs
· Knowledge of performance tuning and optimization techniques
· Knowledge of production support and application maintenance
· Experience working directly on the command line and with shell scripting
· Operational knowledge of data systems, such as schedulers, query performance, and security/encryption
Any graduate