Description

About You – experience, education, skills, and accomplishments

Bachelor's degree in computer science or equivalent experience in software development.

A minimum of 5 years of experience building and maintaining data pipelines and ETL processes, and leveraging cloud-based services on AWS.

Proficiency in Unix/Linux environments.

Hands-on experience with Docker and Kubernetes.

Hands-on experience with AWS cloud services (EKS, EC2, S3, RDS, Lambda, OpenSearch).

Strong programming skills in Python for data pipeline development.

Solid understanding of SQL databases.

It would be great if you also had:

Familiarity with CI/CD pipelines and DevOps practices.

Hands-on experience with the full Software Development Life Cycle (SDLC).

Experience working in an Agile/Scrum environment.

Proven experience leading projects and driving them to completion.


What will you be doing in this role?

Design, develop, and maintain scalable data pipelines and ETL processes.

Utilize AWS services to architect and implement cloud-based data solutions.

Develop and manage data models and data storage solutions.

Optimize data workflows and processes for performance, reliability, and scalability.

Implement and maintain infrastructure using Docker and Kubernetes.

Write efficient, maintainable, and reusable code using Python and Django.

Ensure data quality, integrity, and security across all projects.

Troubleshoot and resolve issues related to data processing and workflows.

Education

Bachelor's degree in computer science, or equivalent experience in software development.