Engage with customers to gather and analyze requirements
Design and build end-to-end data pipelines to deliver data to customers
Perform unit testing of pipelines and support user acceptance testing (UAT)
Handle deployment and provide post-production support
Understand the code, pipelines and scripts currently in production and fix issues when incidents occur
Adapt existing scripts, code and pipelines to changing requirements
Review design, code and other deliverables created by your team to ensure high-quality results
Own proofs of concept (PoCs) and deliver results within a reasonable timeframe
Take on and complete ad-hoc assignments
Challenge current practices, give feedback to colleagues and encourage software development best practices to build the best solutions for our users
Job Qualifications
Bachelor’s or Master’s degree in Computer Science or Information Technology, or equivalent work experience
2+ years of full-time data engineering experience
2+ years of hands-on experience with Azure, including Azure Data Factory, Storage Accounts and Notebooks
Skilled in ETL/ELT processes, with good working knowledge of ETL tools
Experience with Hadoop/Big Data ecosystems
Experience migrating data from on-premises systems to the cloud
Expertise in SQL and PL/SQL
Exposure to log analytics and debugging
Familiarity with DevOps, continuous deployment and testing techniques is a plus
Agile development experience
Fluent in spoken and written English