Description

  • Experience: 10 to 14 years

  Skillsets we are looking for:

  • 8+ years of working experience in Data Engineering.
  • 6+ years’ experience in PySpark.
  • 5+ years’ experience in AWS Cloud.
  • 5+ years’ experience in AWS Glue.
  • 5+ years’ experience in AWS Redshift.
  • 5+ years’ experience with AWS CI/CD pipelines using services such as CodeBuild, CodeCommit, CodeDeploy, and CodePipeline.
  • Experience in ETL tools.
  • Strong proficiency in AWS services such as S3, EC2, EMR, SNS, Lambda, Step Functions, and EventBridge.
  • Experience implementing automated testing frameworks such as PyTest.
  • Strong proficiency in Python, Hadoop, Spark, and/or PySpark is required.
  • Ability to write clean, readable, well-commented, and easily maintainable code.
  • Understanding of fundamental design principles for building scalable solutions.
  • Ability to write reusable libraries.
  • Proficiency with code versioning tools such as Git, SVN, and TFS.
  • Bachelor's degree or higher.

Education

  • Bachelor's degree: BE, B.Tech, or MCA