Job Description:

• 5+ years experience in AWS Cloud Technologies

• Proficiency with Python, Java, shell scripting (Bash and PowerShell), and SQL

• Experience working with streaming data and data extraction from different databases (Oracle, DB2, MySQL, etc.)

• Experience deploying and managing infrastructure based on Docker, Kubernetes, or OpenShift

• Experience with scalable data extraction tools is a plus

• Experience working with Kafka, Aurora, AWS Glue, and Redshift is a plus

• Understanding of data engineering, real-time streaming and/or eventing, and JSON parsing

• Experience automating application deployment, continuous delivery, and continuous integration (Jenkins, Ansible, etc.)

• Experience building microservices and API architectures

• Strong debugging and troubleshooting skills

• Must be flexible, with a passion for learning and collaborating with colleagues

• Must have strong oral and written communication skills

• Business Intelligence/Analytics experience is a plus

• Embraces diverse people, thinking, and styles

• Consistently makes the safety and security of self and others the priority

 

Responsibilities:

• Develop data APIs and data delivery services that support operational and analytical applications for Delta’s internal business operations, customers and partners

• Develop solid and supportable modular designs for data streaming, Cloud transformation/migration, and API product development in support of critical applications

• Work within automated testing and CI/CD processes

• Be an expert on the products we build

• Document solutions in written and diagram form, and communicate across teams

• Apply developer coding standards to ensure code meets design goals and business needs

• Identify technical issues and articulate their impact and the need for prioritization

• Communicate proactively with both the team and leadership

Education

Any Graduate