Responsibilities
● Deploy modern data management tools to curate our most important data sets, models, and processes, while identifying opportunities for process automation and further efficiency gains.
● Evaluate, select, and acquire new internal and external data sets that contribute to business decision-making.
● Engineer streaming data processing pipelines.
● Drive adoption of cloud technology for data processing and warehousing (AWS, Azure).
● Engage with data consumers and producers to design data models that suit their needs.
Skills and Experience we are looking for
● 8 - 11 years (expert) / 11+ years (advanced) of relevant work experience in a team-focused environment
● A bachelor’s degree (Masters preferred) in a computational field (Computer Science, Applied Mathematics, Engineering, or in a related quantitative discipline)
● Working knowledge of more than one programming language (Python, Java, C, C#, etc.)
● Deep understanding of the multi-dimensionality of data, data curation, and data quality, including traceability, security, performance, latency, and correctness across supply and demand processes
● In-depth knowledge of relational and columnar SQL databases, including database design
● General knowledge of business processes, data flows and the quantitative models that generate or consume data
● Excellent communication skills and the ability to work with subject matter experts to extract critical business concepts
● Independent thinker, willing to engage, challenge, and learn
● Ability to stay commercially focused and push for quantifiable commercial impact
● Strong work ethic, a sense of ownership and urgency
● Strong analytical and problem-solving skills
● Ability to collaborate effectively across global teams and communicate complex ideas in a simple manner
● Strong understanding of data structures and algorithms
● Functional knowledge of the buy-side business is a plus
● Experience working with data feeds from Bloomberg, Reuters, FactSet, etc. is a plus.