Description

Azure Data Engineer

Remote Job | 2022-04-28 13:21:12


Job Code: Capgemini018

Hi All, 

 

We have an immediate requirement for an "Azure Data Engineer" (remote).

 

Role: Azure Data Engineer  

Duration: 12+ months contract

Location: Remote

 

Top 3-5 skills:  

Azure DevOps

Databricks

Data Factory

 

Overview: The team is shifting from a waterfall to an agile methodology. There is no production release yet; this role will assist a current employee who wears five different hats. The product will be released in a couple of months, at which point the team structure will be reassigned. The role sits on the cloud engineering (Azure DevOps) side rather than the reporting side, and will focus on migration and building out the cloud environment. A senior-level candidate would be a plus, but all experience levels are open for consideration. The manager has had success with candidates experienced in Informatica and DataStage, who transitioned well into the environment.

 

JOB SUMMARY: The Data Engineer I is responsible for designing, developing, implementing, and supporting data warehouse and cloud storage solutions that support company analytics.

 

Must Haves:

  • Azure DevOps
  • Databricks
  • Data Factory

 

Nice to Have:

  • Erwin and Collibra (data management tools)
  • Informatica and DataStage

 

KEY SELECTION CRITERIA - Minimum qualifications include:

  • Bachelor’s degree or equivalent work experience in Computer Science, Management Information Systems (MIS), Information Technology (IT), or related field
  • 2+ years of experience in Structured Query Language (SQL) programming; experience with other programming languages preferred
  • 2+ years of experience with cloud services (AWS, Azure (preferred), Google Cloud)
  • Familiarity with business intelligence tools such as Power BI, Tableau, MicroStrategy, Business Objects, DAX, and Power Query preferred
  • Strong analytical skills, detail oriented, and organized
  • Strong communication skills and self-motivated

 

ESSENTIAL JOB FUNCTIONS:

  • Consult with business counterparts to understand new data requirements. Design data models to support the requirements.
  • Perform data profiling and source system analysis, presenting insights to peers and business partners to support the end use of data
  • Collaborate with senior engineers and architects to ensure data models fit within the company data and systems architecture.
  • Develop, test, and implement Extraction, Transform and Load (ETL) processes to acquire and load data from internal and external sources to the data lake or data warehouse to be used for analytical purposes.
  • Design, build and test data products based on feeds from multiple systems using a range of different storage technologies and/or access methods
  • Monitor and support ETL jobs. Research poorly performing ETL jobs and collaborate with database administrators and other resources to improve the overall efficiency and stability of the data warehouse environment.
  • Support and partner with business analytics users by identifying relevant data and delivering views, cubes, models, and other semantic objects to ensure ease of access to data for non-technical individuals.
  • Deliver data solutions in accordance with agreed organizational standards that ensure services are resilient, scalable, and future-proof
  • Provide technical and project documentation, utilizing agile project management methodologies.

 

 

STANDARD CORPORATE DUTIES:

  • Actively pursue personal continuous learning, developing skills and knowledge in job-related technical and professional areas
  • Support corporate efforts for safety and government compliance
  • Support and follow all corporate policies and procedures
  • Perform other related duties as required and assigned
  • Promote the values of a diverse workforce

Education

Any Graduate