Required Qualifications:
Bachelor's degree or foreign equivalent from an accredited institution is required. Three years of progressive experience in the specialty will also be considered in lieu of each year of education.
At least 5 years of experience with PySpark and Spark on Hadoop distributed frameworks, handling large volumes of data across the Spark and Hadoop ecosystems, including data pipeline creation, deployment, maintenance, and debugging (a minimal sketch of such a pipeline follows this list).
Experience scheduling and monitoring jobs and creating automation tools.
At least 4 years of experience with Scala and Python required.
Proficiency in SQL with any RDBMS.
Strong verbal and written communication skills, with the ability to communicate across internal and external teams at all levels.
Ability to work to deadlines and to prioritize and execute tasks effectively.
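
For illustration only, a minimal PySpark pipeline sketch of the kind of work described above: read, transform, and write, with the same aggregation expressed through both the DataFrame API and Spark SQL. The paths, column names, and view name are hypothetical placeholders, not part of this posting; the snippet assumes an available Spark session and HDFS storage.

    # Minimal PySpark pipeline sketch: extract, transform, load.
    # Paths, columns, and the temp view name are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("example-pipeline").getOrCreate()

    # Extract: load raw events (hypothetical Parquet path on HDFS).
    events = spark.read.parquet("hdfs:///data/raw/events")

    # Transform: filter, derive a date column, and aggregate with the DataFrame API.
    daily_counts = (
        events
        .filter(F.col("status") == "ok")
        .withColumn("event_date", F.to_date("event_ts"))
        .groupBy("event_date")
        .count()
    )

    # The same aggregation can be expressed in SQL against a temp view.
    events.createOrReplaceTempView("events")
    daily_counts_sql = spark.sql("""
        SELECT to_date(event_ts) AS event_date, COUNT(*) AS cnt
        FROM events
        WHERE status = 'ok'
        GROUP BY to_date(event_ts)
    """)

    # Load: write the curated output (hypothetical destination path).
    daily_counts.write.mode("overwrite").parquet("hdfs:///data/curated/daily_counts")

    spark.stop()
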
Preferred Qualifications:
At least 1 year of AWS development experience is preferred (a minimal sketch follows this list).
Experience driving automation initiatives.
DevOps knowledge is an added advantage.
Bachelor's degree.
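
For illustration only, a minimal AWS development sketch using boto3: upload a file to S3 and list the objects under a prefix. The bucket name, object key, and local file are hypothetical, and the snippet assumes AWS credentials are already configured in the environment.

    # Minimal boto3 sketch: upload a file to S3 and list objects under a prefix.
    # Bucket, key, and file names are hypothetical placeholders.
    import boto3

    s3 = boto3.client("s3")

    # Upload a local file to a hypothetical bucket/key.
    s3.upload_file("daily_counts.parquet", "example-bucket", "curated/daily_counts.parquet")

    # List what landed under the prefix.
    response = s3.list_objects_v2(Bucket="example-bucket", Prefix="curated/")
    for obj in response.get("Contents", []):
        print(obj["Key"], obj["Size"])
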