Job Description:
Minimum 7 years of experience in data engineering and data pipelines
Minimum 5 years of extensive experience in Python programming
Minimum 3 years of extensive experience in SQL and Unix/Linux shell scripting
Hands-on experience writing complex SQL queries and exporting/importing large volumes of data using database utilities.
Minimum 3 years of AWS experience
Basic knowledge of CI/CD
Excellent communication skills and a strong customer focus.
Nice to Haves:
Prior experience with data migration projects
Experience with Kafka Streams or building data-intensive streaming applications (stream processing, e.g. Kafka, Spark Streaming)
Experience with or knowledge of Scala or Java programming
Experience with at least one cloud data warehouse, such as Snowflake
Experience with distributed computing platforms
Any graduate degree