Description

ETL Developer I

**This position is on-site, not remote/hybrid**

Description of Position: ETL is the backbone of any data-led project. As an ETL developer, you will spend most of your time building ETL pipelines. You will analyze complex client data from various banks and processors and devise methods to clean and organize it. You will write code to bring vast amounts of data seamlessly into our system, and you will build reports that provide insights into that data.

Duties and Responsibilities: Primary responsibilities include designing, implementing, and continuously expanding data pipelines by performing extraction, transformation, and loading activities. Investigate data to identify potential issues within ETL pipelines, notify end users, and propose adequate solutions. Develop and implement data collection systems and other strategies that optimize statistical efficiency and data quality. Acquire data from primary or secondary data sources and maintain databases and data systems. Identify, analyze, and interpret trends or patterns in complex data sets.
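
For illustration only (not part of the posting): a minimal Python sketch of the extract-transform-load cycle described above. The file name, columns, and SQLite target are hypothetical, chosen just to keep the example self-contained.

```python
import csv
import sqlite3

# Hypothetical source file with columns date, amount, merchant; hypothetical SQLite target.
SOURCE_FILE = "transactions.csv"
TARGET_DB = "warehouse.db"

def extract(path):
    """Read raw rows from the source CSV."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Clean and normalize rows; skip records that cannot be parsed."""
    for row in rows:
        try:
            yield {
                "date": row["date"].strip(),
                "amount": float(row["amount"]),
                "merchant": row["merchant"].strip().upper(),
            }
        except (KeyError, ValueError):
            continue  # malformed record; a real pipeline would log or quarantine it

def load(records, db_path):
    """Write cleaned records into the target table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS transactions (date TEXT, amount REAL, merchant TEXT)"
        )
        conn.executemany(
            "INSERT INTO transactions VALUES (:date, :amount, :merchant)",
            records,
        )

if __name__ == "__main__":
    load(transform(extract(SOURCE_FILE)), TARGET_DB)
```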

 

Other duties may include: 

 

  • Work closely with management to prioritize business and information needs. 
  • Locate and define new process improvement opportunities. 
  • Prepare documentation for further reference. 
  • Maintain high attention to detail.
  • Bring passion for complex data structures and problem solving.
  • Examine data and understand its patterns in order to produce an ETL pipeline.
  • Handle unorganized Big Data from different banks and processors.
  • Create innovative solutions for technical problems in the ETL pipeline.

 

Qualifications needed to be successful: 

 

  • Bachelor’s degree in computer science, electrical engineering, or information technology. 
  • Experience with HPCC and ECL is a plus.
  • Experience working with complex data sets. 
  • Knowledge of at least one high-level programming language (Python, Java, JavaScript, C++, etc.).
  • Familiarity with Kafka on-premises architecture and the ELK stack (see the sketch after this list).
  • Understanding of cross-cluster replication, index lifecycle management, and hot-warm architectures.
  • Preferred: Experience with Hadoop, Spark, or any other Big Data tools.
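
Purely illustrative, not part of the posting: a short Python sketch of a pipeline that consumes records from Kafka and indexes them into Elasticsearch. It assumes the kafka-python and elasticsearch (8.x) client libraries are available; the broker address, topic, and index names are made up.

```python
import json

from kafka import KafkaConsumer          # kafka-python (assumed available)
from elasticsearch import Elasticsearch  # elasticsearch-py 8.x (assumed available)

# Hypothetical broker, topic, and index names.
consumer = KafkaConsumer(
    "raw-transactions",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
es = Elasticsearch("http://localhost:9200")

for message in consumer:
    record = message.value
    # Minimal transform step: normalize one field before indexing.
    record["merchant"] = record.get("merchant", "").strip().upper()
    es.index(index="clean-transactions", document=record)
```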

 

What we offer: 

  • Career/Professional Development Training 
  • Full Medical, Dental and Vision 
  • Supplemental Life Insurance 
  • Maternity/Paternity Leave 
  • Employee Stock Options 
  • Employee Activities 
  • 401k Contribution 
  • Paid Time Off 
  • 7 Paid Holidays 
  • On-site Gym 
  • Generous Employee Referral Program 

 

Key Skills

Python, Java, JavaScript, C++

Education

Any Graduate

  • Category: Data/ETL Developer
  • Tenure: Any