Job Description:
Proficient in programming languages and frameworks such as Python, PySpark, Scala, and Java.
Working knowledge of Spark, XML, ETL, APIs, and web services.
Experience integrating data from multiple data sources.
Experience with NoSQL databases such as HBase, Cassandra and MongoDB.
Knowledge of various ETL techniques and frameworks such as Flume.
Experience with various messaging systems such as Kafka or RabbitMQ.
Experience building stream-processing systems using solutions such as Storm or Spark Streaming.
Working knowledge of and hands-on experience with big data services from at least one major cloud provider (AWS, Azure, or GCP) is a plus.
Expertise in Snowflake implementations, with 7 to 10 years of experience in the cloud and data industry.
7+ years of experience in cloud data engineering.
5+ years of strong experience with SQL stored procedures and ELT in a Snowflake environment.
2+ years of experience in Python development for data engineering.
Experience with cloud and on-premises DevOps tools for the orchestration, scheduling, and logging required for data engineering development.
In-depth understanding of Snowflake architecture, including SnowSQL, performance tuning, compute, and storage.
Hands-on experience selecting and integrating any big data tools and frameworks required to provide the requested capabilities.
Responsible for end-to-end data analytics design and architecture for Snowflake on Azure.
Extensive experience writing and tuning SQL queries.
Any Graduate