Establish Data Architecture principles. Develop and maintain the Conceptual Architecture and Data Models for the Data Lake and Data Marts.
Understand client data requirements and translate them into plans for data platform modernization.
Extract and transform data from source systems using automated tools, ensuring that data integrity is maintained.
Migrate extracted data to new systems, optimizing its storage and ensuring its security.
Work with QA teams to ensure that the new system works as intended, and resolve any identified bugs.
Travel and/or relocation to various unanticipated locations throughout the U.S. required.
Work with the following technologies: Azure, Python, Snowflake, SQL Scripting, Oracle, Power BI, and Tableau.
Requirements
Position requires a Bachelor’s Degree in Computer Science, Engineering, or a related field and five (5) years of progressively responsible experience.
Two (2) years of the aforementioned five years of progressively responsible experience must have included: Azure, Python, Snowflake, SQL Scripting, Oracle, Power BI, and Tableau.
Travel and/or relocation to various unanticipated locations throughout the U.S. required.