Participate in an agile team, working with the product/tech owner and team members to understand and implement data requirements.
Design ETL flows based on the requirements of the specific use case.
Use the appropriate tools and frameworks available to develop the data acquisition and ingestion process in accordance with the approved design.
Experience with cloud technologies, preferably AWS RDS.
Design and build the data access layer using REST APIs.
Assist other team members in building out the data model.
Include the necessary data validation steps to ensure completeness and accuracy of the data.
Perform initial validation of the process and inspection of the data.
Ensure proper entitlements are established on data objects.
Provide support for data jobs once deployed in production to ensure SLAs are met for data consumers.
Partner with application teams to assess third-party data models and/or support the creation of proprietary logical and physical data models that are fit for purpose and in line with the project's business goals.
Leverage workflow tools (e.g., JIRA) to maintain an accurate status of assigned tasks.
Help to test and fix performance issues.
Follow the SDLC and adopt strategic, enterprise information architecture data standards, including naming and modelling standards.
Qualifications
A strong understanding of Master Data Management (MDM) platforms.
Experience with MDM Party, Security Master, and other financial reference data domains.
Hands-on experience delivering MDM implementations using TIBCO EBX or other related MDM tools.
Experience implementing data workflows and data validations within an MDM platform.
Hands-on experience writing complex SQL (queries, stored procedures, triggers, etc.) in the course of development and while debugging issues.
Hands-on experience with Oracle, Sybase, and SQL Server is a must, from initial logical design to physical deployment, for both new builds and change management.
Proficiency in Python.
Experience debugging issues within MDM tool implementations and bringing them to resolution.
Experience establishing and executing best practices for support and operations.
Experience designing and building data APIs and their interactions with data consumers.
Experience with test-driven development.
Working experience with data management and governance solutions (e.g., data warehouses, data lakes, and reservoirs) in both relational technologies (e.g., Oracle) and NoSQL solutions (e.g., Hadoop), for both on-prem and cloud (AWS) deployments, is required.