Description

#Hiring ParamInfo is looking for a Data Engineer Lead (Onsite) based in Abu Dhabi.
 
 Mail me at [email protected]

Role - Data Engineer Lead (Onsite)

Job details:

  • The Data Engineering Lead is responsible for the development and support of internally created or supported ETL and database programs (Big Data, data warehouse), including business requirement gathering, data model design, and solution development.

Job Role & Responsibilities:

  • Responsible for understanding and documenting business requirements
  • Deep understanding of ETL methodologies, Big Data and Data Warehousing principles, approaches, technologies, and architectures including the concepts, designs, and usage of data warehouses 
  • Works closely with business and technical teams to understand, document, design, develop, code, and test ETL processes 
  • Demonstrated experience in ETL design and in improving load and extract performance
  • Translate source to target mapping documents into ETL processes 
  • Building and integrating APIs 
  • Deploying ML solutions in the cloud 
  • Design, develop, test, optimize, and deploy ETL code and stored procedures to perform all ETL-related functions
  • Works in an Agile environment and applies CI/CD and DevOps best practices
  • Design the data model for Big Data solutions
  • Responsible for designing/implementing/managing ETL processes using Talend and Azure Data Factory 
  • Responsible for administering and maintaining the Azure/MS SQL Server databases and the Microsoft enterprise data warehouse
  • Developing Spark jobs in Scala or Java
  • Write, implement, and maintain appropriate ETL processes 
  • Lead, train and support the work of other staff engaged in similar functions. 
  • Monitor and maintain all databases and ETL components
  • Design, code (in Python and SQL), orchestrate, and monitor jobs in Azure Databricks, Snowflake, or any cloud data warehouse

Education & Experience Required

Essential: 

  • Bachelor’s Degree in Computer Science, Engineering, or a related field
  • 5+ years of experience within the relevant area of expertise
  • 5+ years’ experience of ETL development using Talend / Airflow 
  • 5+ years’ experience of developing SQL queries, stored procedures, and views 
  • 3+ years’ experience developing/coding in Python 
  • 3+ years’ experience of database administration 
  • 3+ years’ experience working with any cloud solutions 

Desirable:

  • 3+ years’ experience of Azure Databricks or any cloud data warehouse such as Snowflake or Redshift
  • 2+ years’ experience of Azure Data Factory

Certifications Desirable:

  • Advanced certification in Azure (Azure Data Factory, Azure ML)
  • Certification in Azure Databricks or any cloud data warehouse
  • Any certification in data science is a big plus. 

Thanks