Job Description
We are looking for candidates with hands-on experience in Databricks and PySpark, plus working knowledge of AWS (S3, IAM, etc.).

Candidates should have experience with the following core Databricks topics:
• PySpark coding experience, such as writing notebooks
• Databricks concepts such as Delta Lake and compute
• Experience creating Workflows/Jobs in Databricks
• Awareness of optimization techniques for diagnosing and resolving performance issues

Databricks skills — clusters, notebooks, PySpark, ETL, and the migration journey — are mandatory, and candidates should be highly proficient in them. Only candidates who are strong in Databricks will be interviewed on AWS and Python. Kindly submit profiles on high priority.

Education

Any graduate