Description

▪    8-10+ years of relevant experience in ETL, Data Management or Data Integration.

▪    5+ years in Talend as a Senior Data Engineer.

▪    Strategic/Management consulting experience is highly desired.

▪    Must have experience with one or more Talend modules – TAC, TMC, Data Integration, Data Quality, ESB, API Design and Services, and Big Data Frameworks.

▪    Must have worked on Talend Data Integration for 5+ years in a cloud environment, with experience using the Talend Administration Center (TAC).

▪    Experience working with the Talend Cloud Management Console Public API, which enables users to manage the Talend Management Console (TMC) from an external system (for example, a scheduler, script, or program).

▪    Experienced Talend architect, proficient in designing and developing mappings, transformations, sessions, and workflows, and in deploying integration solutions.

▪    Extensive experience in ETL methodology for performing data profiling, data migration, and extraction, transformation, and loading using Talend.

▪    Experience in delivering quality work on time with multiple, competing priorities.

▪    Excellent troubleshooting and problem-solving skills; must be able to consistently identify critical elements, variables, and alternatives to develop solutions.

▪    Experience in identifying, analyzing, and translating business requirements into conceptual, logical, and physical data models in complex, multi-application environments.

▪    Experience with Agile and Scaled Agile Frameworks.

▪    Experience in identifying and documenting data integration issues and challenges, such as duplicate, non-conformed, and unclean data. Multi-platform development experience.

▪    Strong experience in performance tuning of ETL processes using Talend.

▪    Experience with cloud technologies such as AWS, Azure, or Google Cloud.

▪    Experience using cloud components and connectors to make API calls for accessing data from cloud storage (Amazon S3) in Talend Open Studio. Apache Spark design and development experience using Scala, Java, or Python with DataFrames and Resilient Distributed Datasets (RDDs).

▪    Development experience with databases such as Oracle, AWS Redshift, AWS RDS, Postgres, Databricks, and/or Snowflake.

▪    Hands-on professional work experience with Python is highly desired.

▪    Experience in real-time or batch data ingestion.

▪    Strong communication and teamwork skills for interfacing with development team members, business analysts, and project management. Excellent analytical skills.

Education

Any Graduate