Description

Essential Skills

ETL Expertise: Strong hands-on experience with ETL tools, specifically Ab Initio (or comparable tools such as Informatica or Talend)
Big Data: Experience working with big data in a large, complex organization
Java: Proficiency in core Java development (strong Scala or Python experience may be considered in lieu of Java)
ETL Pipeline Development: Proven experience in designing, building, and maintaining ETL pipelines

Preferred Skills

Cloud: Experience with AWS cloud services
Big Data Processing: Knowledge of Hadoop and Spark (especially relevant for the legacy ETL migration)
Data Streaming: Familiarity with Kafka
Scripting: Experience with Unix scripting and Python

Responsibilities

Design, develop, and maintain ETL pipelines using Ab Initio or similar tools
Work on the migration of legacy ETL processes to a Spark-based framework
Collaborate with data analysts and other stakeholders to understand data requirements
Optimize ETL processes for performance and efficiency
Troubleshoot and resolve issues related to data extraction, transformation, and loading
Ensure data quality and integrity throughout the ETL process

Education

Bachelor's degree in any discipline