Job Details
Technical/Functional Skills:
- 10 years of relevant industry experience in Data Engineering, working with large-scale, data-driven systems.
- Must have 3+ years' experience in data engineering using the AWS platform and Python.
- At least 3 years' experience with Amazon Web Services (AWS).
- At least 3 years' experience with the Python scripting language.
- At least 3 years' experience in a data pipeline development role.
- Familiar with AWS services such as EC2, S3, Redshift/Spectrum, Glue, Athena, RDS, Lambda, and API Gateway.
Roles & Responsibilities:
- Design, build, and maintain data infrastructure that powers both batch and real-time processing of billions of records a day.
- Build applications using Python, SQL, Databricks, and AWS.
- Develop sustainable, scalable, and adaptable data pipelines.
- Build the Data Lake using AWS technologies such as S3, EKS, ECS, AWS Glue, and EMR.
- Develop data pipelines that provide fast, optimized, and robust end-to-end solutions.
- Operationalize data pipelines to support advanced analytics and decision-making.