Description

Key Responsibilities:
ETL Development: Design, develop, and maintain ETL processes using Informatica PowerCenter and Python scripts.
Data Integration: Work with a variety of data sources (e.g., relational databases, NoSQL stores, and APIs) to extract, transform, and load data into data warehouses and databases.
Automation: Develop Python scripts to automate data processing and integration tasks across extraction, transformation, and loading.
Data Quality: Implement data quality checks and validation processes to ensure data accuracy and integrity.
Performance Optimization: Tune ETL processes for performance and scalability, and troubleshoot bottlenecks as they arise.
Collaboration: Collaborate with data architects, data scientists, and business stakeholders to understand requirements and provide effective data solutions.
Documentation: Create and maintain technical documentation for ETL processes, data flows, and Python scripts.
Support and Maintenance: Provide support for existing ETL processes and workflows, including troubleshooting and resolving issues.
Required Qualifications:
Experience: 8+ years of experience in ETL development using Informatica PowerCenter and Python.
Technical Skills:
Strong proficiency in Python programming for data manipulation and automation.
Experience with Informatica PowerCenter, including workflow and mapping design.
Proficiency in SQL and database management systems (e.g., Oracle, SQL Server, PostgreSQL).
Understanding of data warehousing concepts and ETL best practices.
Analytical Skills: Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
Communication Skills: Excellent communication skills to work effectively with cross-functional teams and stakeholders.
Education: Bachelor’s degree in Computer Science, Information Systems, Data Science, or a related field.
Preferred Qualifications:
Experience with cloud platforms such as AWS, Azure, or Google Cloud.
Knowledge of data modeling, data governance, and data quality frameworks.
Familiarity with big data technologies (e.g., Hadoop, Spark).
Experience with RESTful APIs and web services for data integration.
