Description

Responsibilities

A Cloud Data Engineer is responsible for designing, implementing, and maintaining data pipelines and infrastructure in cloud environments. One key responsibility for this role is moving data from on-prem sources to the Azure cloud platform using Azure Data Factory (ADF) and Event Hub. The person will play a critical role in ensuring that data is collected, stored, and processed efficiently and securely to support the data analytics and business intelligence needs of the organization. Here is a detailed job description:

Data Pipeline Development: Design, develop, and maintain data pipelines that extract, transform, and load (ETL) data from various on-prem sources into cloud-based data warehouses or data lakes (in this case, Azure, using ADF and Event Hub).

Data Integration: Integrate data from diverse sources, including databases, APIs, and streaming data, into a unified and structured format for analysis and reporting.

Cloud Platform Expertise: Work with cloud platforms (Azure) to deploy and manage data infrastructure and services.

Data Transformation: Apply data transformation and cleansing techniques to ensure data accuracy and consistency.

Monitoring and Optimization: Monitor data pipelines and infrastructure, identifying and addressing performance bottlenecks and issues proactively.

Documentation: Maintain comprehensive documentation of data engineering processes, data flows, and system configurations.

Collaboration: Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that meet their needs.

Qualifications

Knowledge of the Azure ecosystem, especially hands-on work with ADF, Event Hub, etc.

Experience moving data from on-prem sources to the Azure cloud platform

Educational Background: A bachelor's degree in computer science, information technology, or a related field is typically required. A master's degree can be advantageous.

Cloud Certification: Certifications in the Azure cloud platform.

Programming Skills: Proficiency in programming languages like Python, Java, or Scala for data processing and scripting.

Database Skills: Strong knowledge of relational and NoSQL databases, as well as SQL query optimization.

ETL Tools: Experience with ETL tools and cloud-native ETL services.

Version Control: Proficiency in version control systems like Git.

Problem-Solving: Strong analytical and problem-solving skills to address complex data engineering challenges.

Communication Skills: Effective communication and collaboration skills to work with cross-functional teams.

If this role matches your profile, send your resume to sandhiya.s@cortexconsultants.com. Looking forward to connecting with you!

Education

Any graduate