Job Description:
In addition to the technical competencies outlined in the job description, the ideal candidate will have experience with at least a subset of the following:
*This role requires a blend of engineering skills and a strong background in data analysis, forecasting, and data pipeline infrastructure.
*Automation of infrastructure delivery with configuration orchestration and management solutions (e.g. Ansible/AAP and Terraform)
*Understanding of coding practices and source code control (GitLab), and experience developing in a preferred scripting language (e.g. Python, UNIX shell scripting).
*Strong grounding in Linux and Windows systems engineering principles.
*Strong understanding of application resiliency and data resiliency across multiple platforms.
*Strong hands-on experience with data observability tools such as Datadog.
*Experience in data pipeline engineering and architecture, with a focus on extracting, aggregating, and analyzing infrastructure performance data alongside business data to forecast infrastructure capacity needs.
*This engineer should be able to take a data analysis problem, produce recommendations and an architecture, and build a pipeline that delivers a solution to the problem.
**Collaboration and Communication:
*Work closely with cross-functional teams to understand business requirements and drivers.
*Communicate findings and recommendations to stakeholders through reports and presentations.
*Create and maintain comprehensive documentation for data processes and models.
Bachelor's degree in Computer Science
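To illustrate the kind of capacity-forecasting work described above, here is a minimal, hypothetical Python sketch. All names and data are illustrative assumptions, not part of this role's actual toolchain: a real pipeline would pull performance metrics from an observability tool (e.g. Datadog) and business data from a warehouse, rather than use hard-coded lists.

```python
# Hypothetical sketch of a capacity-forecasting step: fit a linear trend to
# historical usage and project when it will exceed a provisioned limit.
# Data and thresholds below are made up for illustration.

def fit_linear_trend(values):
    """Least-squares fit of y = a + b*x over x = 0..n-1."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    b = cov / var
    a = mean_y - b * mean_x
    return a, b

def forecast(values, periods_ahead):
    """Extrapolate the fitted trend `periods_ahead` steps past the data."""
    a, b = fit_linear_trend(values)
    return a + b * (len(values) - 1 + periods_ahead)

def months_until_capacity(values, limit):
    """First future month at which projected usage reaches `limit`."""
    month = 1
    while forecast(values, month) < limit:
        month += 1
        if month > 120:  # give up beyond a 10-year horizon
            return None
    return month

# Illustrative monthly storage usage (TB), trending upward.
usage_tb = [40, 42, 45, 47, 50, 52, 55, 58, 60, 63]

print(forecast(usage_tb, 3))              # projected usage 3 months out
print(months_until_capacity(usage_tb, 80))  # months until the 80 TB limit
```

In practice the trend model would be swapped for a proper forecasting method (seasonal decomposition, Prophet, etc.), but the shape of the pipeline, extract history, fit a model, project against a limit, is the same.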