Description

Data Engineer (Child Welfare) - Durham, NC (Hybrid)

[Durham, NC] |   2024-05-29 07:54:02


Job Code : 2024-MY3TECH1016

Job: Data Engineer

Location: Research Triangle Park, Durham NC (Hybrid)

 

Q1. Please list all professional certifications and the year each was received. Certifications will be validated.

Q2. Please provide a brief description of recent professional development work in a data conversion effort.

 

Client requires the services of specialists in the role of Data Engineer who will analyze, map, build, test, and maintain database pipeline architectures using Oracle and DB2 for Child Welfare data conversion efforts.

 

Client requires the assistance of contract resources to serve as Data Engineers. The primary purpose of these positions is to analyze, map, build, test, and maintain database pipeline architectures using Oracle and DB2 for Child Welfare Data Conversion efforts as the legacy system migrates to Salesforce. The work involves collaborating with various stakeholders to understand business requirements and translating them into efficient and scalable technical solutions for the Child Welfare program.

 

The persons in this position will play a crucial role in designing, implementing, and optimizing data pipelines and ETL processes to support the extraction, transformation, and loading of data from various Child Welfare sources into Salesforce. They will collaborate with cross-functional teams to understand data requirements and translate them into technical solutions, and will need excellent problem-solving skills and the ability to work independently as well as on a team. The work involves developing and maintaining data models, schemas, and databases to support data conversion needs, and monitoring and troubleshooting data pipelines to ensure data quality, completeness, integrity, and availability. The position also involves implementing best practices for data governance, security, and compliance to ensure regulatory requirements are met.

 

Responsibilities and Duties:

  • Analyze, map, build, test, and maintain database pipeline architectures using Oracle and DB2
  • Collaborate with SMEs to understand and execute data conversion objectives to transform, mask, and load data into Salesforce objects
  • Create and maintain table schemas
  • Create new data validation methods to ensure data accuracy, completeness, and cleanliness
  • Ensure compliance with data governance and security policies
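To illustrate the kind of transform, mask, and validate work described above, the sketch below is a minimal, hypothetical example (all names such as `mask_ssn`, `validate_row`, and `REQUIRED_FIELDS` are invented for illustration; the actual pipeline would read from Oracle/DB2 and stage records for Salesforce objects):

```python
import hashlib

# Hypothetical sketch of a transform-and-mask step in a data conversion
# pipeline. Field names and rules are illustrative only.

REQUIRED_FIELDS = {"case_id", "child_name", "ssn"}

def mask_ssn(ssn: str) -> str:
    """Replace an SSN with a deterministic one-way hash so records can
    still be joined across tables without exposing the raw value."""
    return hashlib.sha256(ssn.encode("utf-8")).hexdigest()[:12]

def validate_row(row: dict) -> list:
    """Return a list of data-quality problems found in a source row."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS - row.keys()]
    if "ssn" in row and len(row["ssn"].replace("-", "")) != 9:
        problems.append("malformed ssn")
    return problems

def transform(row: dict) -> dict:
    """Mask sensitive fields before the row is staged for loading."""
    out = dict(row)
    out["ssn"] = mask_ssn(out["ssn"])
    return out

source_row = {"case_id": "C-1001", "child_name": "Jane Doe", "ssn": "123-45-6789"}
assert validate_row(source_row) == []      # row passes validation
staged = transform(source_row)
assert staged["ssn"] != source_row["ssn"]  # raw SSN never reaches the target
```

In practice, deterministic hashing (rather than random substitution) is one common masking choice because it preserves joins across converted tables.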

 

 

Required/Desired Skills:

Skill | Requirement | Years
Experience with Oracle | Required | 5
Experience with PL/SQL | Required | 5
Experience with SQL | Required | 7
Experience with data migrations from legacy systems | Required | 5
Experience with data cleansing | Required | 1
Experience with DB2 | Highly desired | 1
Experience with ETL processes involving Salesforce | Highly desired | 1
Experience with Talend | Highly desired | 1
Experience with GIT | Highly desired | 1
Experience with Jira | Highly desired | 1
Experience with Java | Highly desired | 5
Experience with Python | Highly desired | 3
Experience with cloud platforms (e.g., AWS and Azure) and associated services for data processing and storage | Highly desired | 1
Experience with Azure Synapse | Highly desired | 2
Experience with Azure Databricks | Nice to have | 1
Experience with Cloud Computing | Nice to have | 1
Experience with Power BI | |

Education: Any Graduate