Mandatory to have: Python (advanced level), PySpark, data flow pipelines in AWS, distributed systems, Snowflake, Redshift, ETL testing, QE (quality engineering) knowledge
Required Skills:
· 8+ years of experience with strong programming in Python/PySpark – functional programming experience is a definite plus.
· Experience with cloud and building data flow pipelines is mandatory – AWS is preferred, but any cloud is fine.
· Should know how to program distributed systems
· Expected to perform quality engineering of data pipelines and data warehouses, including Snowflake, Redshift, etc.
· Knowledge of test automation is preferred.
Any Graduate