Description

Design and implement data products and features in collaboration with product owners, data analysts, and business partners.
Contribute to the overall architecture, frameworks, and patterns for processing and storing large data volumes.
Evaluate and adopt new technologies, tools, and frameworks for high-volume data processing.
Translate product backlog items into engineering designs and logical units of work.
Profile and analyze data to inform the design of scalable solutions.
Define and apply appropriate data acquisition and consumption strategies for given technical scenarios.
Design and implement distributed data processing pipelines using tools and languages prevalent in the big data ecosystem.
Build utilities, user-defined functions, libraries, and frameworks to enable common data flow patterns.
Implement complex automated routines using workflow orchestration tools.
Anticipate, identify, and resolve data management issues to improve data quality.
Build and incorporate automated unit tests and participate in integration testing efforts.

Education

Bachelor's degree