Required Skills:
At least 6 years of experience designing and developing data pipelines for data ingestion or transformation using Scala or Python
At least 4 years of experience with Python and Spark
At least 3 years of experience working with AWS technologies
Experience designing, building, and deploying production-level data pipelines using AWS Glue, Lambda, and Kinesis, with Aurora and Redshift as databases
Experience with Spark programming (PySpark or Scala)
Hands-on experience with AWS components (EMR, S3, Redshift, Lambda, API Gateway, Kinesis) in production environments
Any Graduate