Job Description:

  • The DataLake Architect will be an integral part of the Innovation team.
  • Core responsibilities will involve operating, managing, and monitoring the data lake infrastructure to enable efficient data storage, processing, and analysis.
  • The DataLake Architect will determine core data storage and replication needs and build Extract, Transform, Load (ETL) processes and pipelines to the applicable data stores.
  • The DataLake Architect will utilize various on-prem and cloud systems to populate and manage the growing Azure and AWS infrastructures.
  • Databricks will be used for analytical processing within the DataLake.
  • The DataLake Architect will be responsible for ensuring that data is available to support self-service analytics and that data automations from source systems to the DataLake run as programmed.


Roles and Responsibilities:

  • Operate and maintain the data lake infrastructure, ensuring its reliability, scalability, and security.
  • Collaborate with DOA on data ingestion and storage needs (data from APIs, on-prem, cloud, and legacy systems).
  • Design and develop data ingestion pipelines to efficiently collect and integrate data from various sources into the data lake.
  • Monitor data lake performance metrics and identify opportunities for optimization.
  • Optimize data processing workflows to enable efficient data retrieval and analysis.
  • Design initial wireframes and proofs of concept, and lead ideation of data lake architecture and ETL processes.
  • Research methods and best practices for creating a secure, monitored, and reliable source of truth.
  • Provide updates on work as it is accomplished.
  • Ensure data policy adherence, row-level security, and data privacy in produced data models and pipelines.
  • Test for proper operation of newly deployed pipelines and data distributions.
  • Assist in end-user training as required.
  • Document data lake infrastructure, processes, and workflows.
  • The DataLake Architect will assist with new data pipeline initiatives that need direct design, creation, and deployment to production for divisional consumption.


Required Qualifications:

  • Proficiency with Azure and AWS data storage and the Databricks analytics platform.
  • Three or more years of experience with Azure and AWS data warehousing.
  • Must be able to discuss requirements effectively with internal and external stakeholders.
  • Understanding of Data Storage and Design Process of AIS.
  • Able to tell the story of business challenges and how the data lake can solve them.
  • Able to create user stories and divisional personas for DOA.
  • Able to research and recommend best practices for at-scale digital data deployments and integrations.
  • The contractor must be able to manage multiple tasks concurrently.
  • Must be a self-starter & be able to use own judgment/initiative to undertake activities with minimal supervision.
  • Must have excellent oral and written communication skills, as well as the ability to work alone or within a team environment.
  • Must be able to work collaboratively with others to achieve team and organizational goals; prioritize projects and/or tasks; provide constructive input to achieve team goals; deliver a customer-focused, responsive service; support efforts to enhance business efficiency and effectiveness; demonstrate a positive, can-do attitude; and respond constructively to new information, changing conditions, and unexpected obstacles.


Education

Any Graduate