5+ years of experience designing and implementing data pipelines, ETL processes, and data storage solutions that support data-intensive applications
3+ years of experience developing, testing, and maintaining data architectures, including databases and large-scale data processing systems, using tools such as Spark, Databricks, and AWS
Experience with Java/Python, data structures, and algorithms
3+ years of experience in cloud development with the AWS platform
3+ years of experience designing and delivering transactional, warehouse, analytical, and reporting data platforms leveraging modern cloud data technologies
Hands-on experience with engineering practices such as TDD, DevSecOps, software automation, and CI/CD; strong understanding of Agile and XP practices
Excellent problem-solving and critical-thinking abilities.
Strong communication skills to convey technical concepts to non-technical stakeholders.
Ability to work independently and as part of a team in a fast-paced environment.
Bachelor's degree