Description

Roles and Responsibilities:

  • Skilled in business intelligence/data warehousing, data and technical architecture, methodology, data governance, data modeling, ETL tools, big data, and data lakes. Strong expertise in applying industry best-practice methods and sound enterprise architecture, data architecture/management, and integration techniques across domains.
  • End-to-end responsibility for data modeling (OLTP and OLAP).
  • Data analysis experience writing complex queries involving unions, joins, and aggregations.
  • Data analysis experience using major RDBMS and cloud data warehouse solutions such as Snowflake and Redshift.
  • Experience with ETL/ELT processes using AWS Glue and AWS Glue DataBrew transformations.
  • Use Apache Airflow to schedule and orchestrate data transfers.
  • Understanding of deeply nested JSON structures and data partitioning using Kafka, Spark, Scala, S3, etc.
  • Work with the data science team to build segmentations, ML use cases, forecasting, etc.
  • Experience working with Amazon SageMaker and Jupyter notebooks for in-depth data analysis.
  • Work with BI specialists to design, develop, and enhance connectors to better support business use cases.
  • Migrate existing custom pipelines to a normalized connector approach.
  • Help educate CT teams on data integration and validation standards, and drive clean ingestion/egress patterns for the platform.
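The query-writing skills listed above (joins and aggregations) can be sketched with a minimal, self-contained example; the `customers`/`orders` schema here is hypothetical, and SQLite stands in for a warehouse engine such as Snowflake or Redshift:

```python
import sqlite3

# In-memory database with a hypothetical two-table schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'EMEA'), (2, 'AMER');
    INSERT INTO orders VALUES (10, 1, 100.0), (11, 1, 50.0), (12, 2, 75.0);
""")

# Join orders to customers, then aggregate order count and revenue per region.
rows = conn.execute("""
    SELECT c.region, COUNT(o.id) AS order_count, SUM(o.amount) AS revenue
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()

for region, order_count, revenue in rows:
    print(region, order_count, revenue)
```

The same join/group-by pattern carries over directly to warehouse SQL dialects; only the connection layer differs.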

Experience and Qualifications:

  • Work with APIs to extract and ingest data.
  • Work with virtual warehouses and configure them for optimal performance and efficiency.
  • Conduct ETL data integration and cleansing transformations using AWS Glue Spark scripts.
  • Aggregate data coming from applications and APIs and store the results in a historical table.
  • Experience with streaming data analytics and building streaming pipelines and connectors.
  • Experience connecting to BI solutions such as Tableau, including configuration of roles and policies within AWS.
  • Leverage AWS Lambda, Glue, and Step Functions to cleanse and transform data.
  • Work with DWH-related technologies such as EC2, S3, Redshift, Athena, and Snowflake to process large data sets and partition them into readable formats in real time.
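Partitioning large data sets into readable formats, as described above, is commonly done with Hive-style `key=value` prefixes on S3, which engines like Athena, Glue, and Spark can prune on. A minimal sketch; the bucket name and prefix are hypothetical:

```python
from datetime import datetime, timezone

def partition_key(prefix: str, event_time: datetime) -> str:
    """Build a Hive-style year=/month=/day= partition prefix for an object store path."""
    return (
        f"{prefix}/year={event_time.year:04d}"
        f"/month={event_time.month:02d}"
        f"/day={event_time.day:02d}"
    )

# Hypothetical event landing in a raw-events prefix.
ts = datetime(2023, 4, 7, 12, 30, tzinfo=timezone.utc)
print(partition_key("s3://example-bucket/raw-events", ts))
# s3://example-bucket/raw-events/year=2023/month=04/day=07
```

Writing objects under such prefixes lets a query engine skip partitions that fall outside a date filter instead of scanning the whole data set.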

Education

Bachelor’s Degree