Description

Job Description:

Must-Have:
· Databricks expert – an experienced resource who can design solutions and make recommendations to the client.
· PySpark – critical.
· SQL – critical.
· Airflow – strong hands-on experience.

Experience:
· Experience implementing AWS data lakes and data publication using Databricks, Airflow, and AWS S3.
· Experience in Databricks data engineering, building data lake solutions with AWS services.
· Knowledge of Databricks clusters and SQL warehouses; experience handling Delta and Parquet files (see the brief sketch after this list).
· Experience in data engineering and data pipeline creation on Databricks.
· Experience with dbt (Data Build Tool) using Python and SQL.
· Extensive experience in SQL and PL/SQL, including complex joins and aggregation functions, plus dbt, Python, DataFrames, and Spark.
· Experience with Airflow for job orchestration, dependency setup, and job scheduling.
· Knowledge of Databricks Unity Catalog and data consumption patterns.
· Knowledge of GitHub and CI/CD pipelines, and AWS infrastructure such as IAM roles, secrets, and S3 buckets.
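To illustrate the kind of work these bullets describe, here is a minimal PySpark sketch of a Databricks-style job that reads raw Parquet from S3, applies an aggregation, and publishes the result as a Delta table. All bucket names, paths, and column names are hypothetical; a real pipeline would typically be triggered from an Airflow DAG and registered in Unity Catalog.

```python
# Minimal sketch (hypothetical paths and columns) of a Databricks-style job:
# read raw Parquet from S3, aggregate, and publish as a Delta table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_publish").getOrCreate()

# Read raw Parquet landed in the data lake (path is illustrative).
orders = spark.read.parquet("s3://example-lake/raw/orders/")

# Aggregate: daily order totals per customer. More complex joins and
# aggregations, as called for in the requirements, would slot in here.
daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("order_count"),
    )
)

# Publish as a partitioned Delta table for downstream consumption,
# e.g. queried through a Databricks SQL warehouse.
(daily.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save("s3://example-lake/published/orders_daily/"))
```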

Education

Bachelor's degree