4 years of experience with Big Data solutions on GCP. Expertise in Python, BigQuery, Kubernetes, and Airflow is a must-have.
Summary
The main function of the Data Engineer is to develop, evaluate, test, and maintain architectures and data solutions within our organization. The typical Data Engineer executes plans, policies, and practices that control, protect, deliver, and enhance the value of the organization's data assets.
Necessary Skills And Attributes
3+ years of code-based ETL development using Python and SQL
3+ years of experience writing complex SQL queries
3+ years of Python development experience
2+ years of experience with GCP services such as BigQuery, Kubernetes, and Cloud Composer
2+ years of hands-on experience with Apache Airflow
Experience developing high-performance, reliable, and maintainable code
Analytical and problem-solving skills applied to the Big Data domain
Experience with and understanding of Big Data engineering concepts
End-to-end exposure to and understanding of data engineering projects
Experience with Spark and Dataproc is a plus
Proven understanding of and hands-on experience with GitHub and development IDEs such as VS Code
B.S. or M.S. in Computer Science or Computer Engineering (Bachelor's degree required)