Description

Key Responsibilities:

Lead a group of engineers building data pipelines on medium- to large-scale datasets using technologies such as Java/Python, Spark, ETL, Kafka, and AWS Glue

Apply experience with distributed systems and microservices; work effectively both independently and as part of a team

Use Java/Python to create software solutions

Apply experience with AWS cloud platform services (ECS, ECR, EC2, S3, SNS, SQS, Lambda, etc.)

Continuously improve software engineering practices

Stay up to date with the latest software engineering trends and technologies

Minimum Qualifications:

10+ years of experience developing software solutions with Java/Python, Spark, ETL, Kafka, and the AWS Glue cloud platform

Experience building next-generation distributed streaming data pipelines and analytics data stores with streaming frameworks (e.g., Spark Streaming) using programming languages such as Java, Scala, and Python

Experience with emerging and traditional technologies such as Java, J2EE, design patterns, TDD, Spring, Spring Boot, microservices, RESTful services, SQL databases (Oracle, PostgreSQL), MongoDB, and AWS cloud infrastructure

Understanding of containerization technologies (Docker, Kubernetes, etc.)

Experience with CI/CD systems

Desired Qualifications: 

Experience with the full suite of Go (Golang) frameworks and tools

Experience in the banking/finance domain

Education

Any Graduate