Bachelor’s degree in Computer Science Engineering or equivalent.
· Big Data: 5-6 years of hands-on experience with Hadoop/Spark (Scala or Python), HBase, Hive, and Sqoop.
· Databases: Knowledge of RDBMS and NoSQL database architectures; able to write optimized, high-performance SQL queries.
· AWS: Working knowledge of Lambda, EC2, ECS, EMR, Athena, and S3.
· Working knowledge of Git/Bitbucket.
· AWS Debugging: Proficient at debugging issues using AWS CloudWatch.
· Strong hands-on coding skills, preferably in Scala or Python.
· Data-processing constructs such as joins and MapReduce (a brief Spark join sketch follows this list for illustration).
· Unit Testing: Able to write and execute unit test cases covering both positive and negative scenarios (see the test sketch after this list).
· Good communication skills and a positive attitude.
· Experience working in an Agile environment.
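For illustration only, a minimal sketch of the kind of Spark join and aggregation work referenced above, written in Scala against a local SparkSession; the orders/customers DataFrames and their column names are assumptions, not part of the role description.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.sum

object JoinSketch {
  def main(args: Array[String]): Unit = {
    // Local SparkSession for the sketch; a real job would typically run on EMR/YARN.
    val spark = SparkSession.builder()
      .appName("JoinSketch")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Small in-memory DataFrames standing in for Hive/HBase-backed tables (hypothetical data).
    val orders    = Seq((1, "A-100", 250.0), (2, "A-101", 75.5)).toDF("customer_id", "order_id", "amount")
    val customers = Seq((1, "Asha"), (2, "Ravi")).toDF("customer_id", "name")

    // Inner join on the shared key, then a simple per-customer aggregation.
    val totals = orders
      .join(customers, Seq("customer_id"), "inner")
      .groupBy($"name")
      .agg(sum($"amount").as("total_amount"))

    totals.show()
    spark.stop()
  }
}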
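Similarly, a minimal ScalaTest sketch showing one positive and one negative unit-test scenario; AmountParser.parseAmount is a hypothetical helper used only to illustrate the pattern.

import org.scalatest.funsuite.AnyFunSuite

object AmountParser {
  // Parses a currency string such as "250.00"; returns None for malformed or negative input.
  def parseAmount(s: String): Option[Double] =
    scala.util.Try(s.trim.toDouble).toOption.filter(_ >= 0)
}

class AmountParserSpec extends AnyFunSuite {
  test("positive: a well-formed amount parses") {
    assert(AmountParser.parseAmount("250.00").contains(250.0))
  }

  test("negative: malformed and negative input is rejected") {
    assert(AmountParser.parseAmount("abc").isEmpty)
    assert(AmountParser.parseAmount("-5").isEmpty)
  }
}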