Description

Technical Expertise:

  • Proven experience with Apache Spark, including Spark SQL, Spark Streaming, and PySpark.
  • Strong proficiency in using Apache Airflow for workflow orchestration.
  • Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud.

Programming Skills:

  • Proficiency in programming languages such as Python, Scala, or Java.
  • Solid understanding of SQL and experience with relational databases.

Data Engineering:

  • Experience with data modeling, data warehousing, and data integration.
  • Familiarity with Big Data technologies and frameworks.

Additional Skills:

  • Knowledge of containerization technologies such as Docker and orchestration tools such as Kubernetes is a plus.
  • Understanding of DevOps practices and CI/CD pipelines.
  • Strong problem-solving skills and attention to detail.
  • Ability to work in a fast-paced, agile environment.

Education

Any Graduate