Qualifications:
Bachelor’s degree in Computer Science, Engineering, Information Systems, or a similar field, or an equivalent combination of relevant professional experience, education, and training
4+ years of deep experience developing and deploying on public cloud infrastructure (AWS preferred: DynamoDB, S3, Lambda, OpenSearch, TypeScript, Python, Node.js, Java, APIs, Kafka Streams, security groups, etc.)
Diverse and deep experience in data management, template technologies, microservices, and RESTful API design principles such as versioning, pagination, error handling, and authentication
Advanced working knowledge of NoSQL data stores such as DynamoDB (graph databases a plus) and other AWS services such as Lambda, API Gateway, RDS, SQS, SNS, etc.
Experience with observability tools such as New Relic, SignalFx, and Splunk
Familiarity with serverless architecture and event-driven programming
Experience working in a large-scale agile environment
Knowledge of continuous delivery patterns and CI/CD pipelines, version control systems (e.g., Git), and automated testing frameworks
Strong problem-solving skills, an innovative mindset, and a proactive approach to challenges
Excellent communication skills, both written and verbal, with the ability to explain technical concepts to non-technical stakeholders
Experience working in an Agile development environment; Scrum/Kanban experience required, design thinking preferred