Key Responsibilities:
- As a Senior Database Specialist, you will play a pivotal role in driving and shaping Pershing X – Trading and Rebalancing's data strategy as we embark on exciting initiatives to build a highly scalable and resilient Portfolio Management system in the Pershing X space. Working closely with the team and your peers, you will be responsible for:
- The design and delivery of data solutions using MongoDB, with an emphasis on quality, automation and continuous integration.
- Supporting the development and implementation of a data strategy that enables current and future data processing, analytics and reporting requirements, in alignment with the overall Pershing X data strategy and priorities.
- Contributing to the development and implementation of appropriate practices, standards, governance and risk management processes for data delivery.
- Evaluating different architectural and software solutions, understanding trade-offs, and designing and delivering appropriate solutions.
- In this business-facing role, you will work in a fast-paced environment and should be able to share ideas and fresh perspectives based on your prior experience, collaborating with peers and contributing to new technology discussions.
- Building strong relationships with business stakeholders, collaborating on requirements, and ensuring that technology deliveries are aligned with business goals and strategy.
- Writing reusable, testable and efficient code, and establishing best practices for information security and data protection.
Job-Specific Competencies: Technical Skills/System Knowledge (and associated skill level)
- Bachelor's degree in computer science, engineering or a related discipline, or equivalent work experience, required.
- A minimum of 8 years of experience in software development required. A good understanding of financial markets, with prior experience building and supporting front-office trading systems, is a plus.
- A minimum of 5 years of experience building highly scalable systems using MongoDB.
- Demonstrable experience building data pipelines in Python.
- Experience building on the public cloud (preferably Azure or AWS).
- Experience using cloud-based PaaS data platforms such as Snowflake and Databricks.
- Experience with modern open-source ETL/ELT and orchestration tools such as Airbyte or Airflow.
- Experience with data modelling for both operational and analytical datastores.
- Experience with data lake concepts and a strong understanding of data design patterns, culminating in microservices development.
- Experience developing data pipelines to support near-real-time, event-driven messaging and data processing.
- Expertise in Agile development.
- Experience handling high-volume data processing through concurrency and multi-threading techniques to build scalable, performant and resilient applications.
- Ability to collaborate cross-functionally with data engineers, business users, project managers and other engineers to achieve elegant solutions.