Job Description

We are seeking a skilled Data Engineer with expertise in building and managing data pipelines, data transformation, and integration processes. The ideal candidate will have strong experience handling data in Dimension and Fact tables, as well as migrating and managing data on a Common Data Platform (CDP). You will be responsible for ensuring the integrity and availability of data across systems, supporting both Technical and Business QA processes, and collaborating with cross-functional teams to deliver high-quality data solutions.

Experience: 5–12 years

Key Responsibilities:

Data Pipeline Development:

Design, develop, and maintain data pipelines for integration and ingestion from various sources.
Implement data cleaning, enrichment, and aggregation processes within the Common Data Platform, following the Bronze, Silver, and Gold architecture layers.
Ensure data pipelines are robust, scalable, and efficient, feeding data to consumer systems in a timely and reliable manner.
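The Bronze/Silver/Gold layering above can be sketched in a few lines of plain Python. This is an illustrative toy, not the platform's actual code: in practice these stages would be PySpark jobs on the CDP, and all record and field names below are made up.

```python
# Illustrative Medallion-layer sketch: records flow Bronze -> Silver -> Gold.
# All data and names here are hypothetical.

raw_events = [  # Bronze: data exactly as ingested, warts and all
    {"order_id": "1", "amount": "100.5", "region": "EU"},
    {"order_id": "2", "amount": "bad",   "region": "EU"},  # malformed amount
    {"order_id": "3", "amount": "49.5",  "region": "US"},
]

def to_silver(records):
    """Clean and enrich: drop unparseable rows, cast types."""
    silver = []
    for r in records:
        try:
            silver.append({**r, "amount": float(r["amount"])})
        except ValueError:
            continue  # skip (or quarantine) malformed rows
    return silver

def to_gold(records):
    """Aggregate: total amount per region, ready for consumer systems."""
    totals = {}
    for r in records:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

gold = to_gold(to_silver(raw_events))
print(gold)  # {'EU': 100.5, 'US': 49.5}
```

The point is the contract between layers: Bronze keeps everything as ingested, Silver is typed and validated, and Gold is aggregated for downstream consumers.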
 

Data Handling & Management:

Manage data in Dimension and Fact tables, ensuring data integrity and accuracy.
Update Entity-Relationship (E-R) models and maintain data cataloging within the platform.
Migrate local data platforms to the Common Data Platform using tools like Talend or Informatica.
Replicate existing ETL transformations from local sources to the CDP, ensuring consistency and accuracy.
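The Dimension/Fact handling above can be illustrated with a minimal star-schema join, again as a hedged plain-Python sketch (hypothetical tables and keys; on the platform itself this would be SQL or PySpark):

```python
# Hypothetical star-schema sketch: Fact rows reference a Dimension table
# by surrogate key; a lookup join yields the denormalized view.

dim_customer = {  # Dimension: surrogate key -> descriptive attributes
    1: {"name": "Acme", "country": "DE"},
    2: {"name": "Globex", "country": "US"},
}

fact_sales = [  # Fact: measures plus foreign keys into dimensions
    {"customer_key": 1, "amount": 250.0},
    {"customer_key": 2, "amount": 99.0},
    {"customer_key": 1, "amount": 75.0},
]

def enrich(facts, dim):
    """Join each fact row with its dimension attributes (inner join)."""
    return [
        {**f, **dim[f["customer_key"]]}
        for f in facts
        if f["customer_key"] in dim  # referential-integrity check
    ]

rows = enrich(fact_sales, dim_customer)
print(rows[0])  # {'customer_key': 1, 'amount': 250.0, 'name': 'Acme', 'country': 'DE'}
```

The referential-integrity check is where "ensuring data integrity" shows up concretely: a fact row whose key has no matching dimension row is surfaced rather than silently joined.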
 

Technical & Business QA Support:

Collaborate with Technical and Business QA teams to validate data pipelines and transformations.
Troubleshoot and resolve data-related issues, ensuring that all data processes meet the required quality standards.
 

Technology Skills:

Must-Have:

Proficiency in PySpark & SQL.

Should-Have:

Scripting experience with Python & Bash.
AWS knowledge.
CI/CD experience with GitHub Actions.

Could-Have:

Terraform for infrastructure management.
Airflow for workflow automation.
Knowledge of graph databases like Neptune or Neo4j.
Understanding of Medallion Architecture for data structuring.
Experience with Atlan or Unity Catalog for data governance.
 

Qualifications:

Bachelor’s degree in Computer Science, Information Technology, or a related field.
3+ years of experience in Data Engineering, with a focus on data pipeline development, data integration, and data transformation.
Strong problem-solving skills and attention to detail.
Excellent communication skills and the ability to work collaboratively in a team environment.
 

Preferred Qualifications:

Experience with Talend, Informatica, or similar ETL tools.
Familiarity with cloud platforms and data migration strategies.

Education

Bachelor's Degree