Description

Essential Functions:

•       Write code for moderately complex system designs, including programs that span platforms. Code and/or create Application Programming Interfaces (APIs).

•       Write code to enhance existing programs or develop new programs.

•       Review code developed by other IT Developers.

• Design, implement, and maintain big data solutions using Spark/Scala

• Work with the team to identify business needs and pain points that can be addressed with data and analytics

• Understand complex data sets and ETL processes, and how they can be optimized using Spark

• Design and develop high-performance Spark jobs for data transformation, cleansing, and enrichment (an illustrative sketch follows this list)

• Tune Spark jobs for optimal performance and resource utilization

• Monitor Spark cluster health and performance, and take corrective action as needed

• Collaborate with other teams to integrate Spark jobs into the overall data pipeline

• Write unit tests and integration tests for Spark jobs

• Debug production issues and provide root cause analysis

• Keep abreast of new features and capabilities in Spark, and how they can be leveraged to improve our data processing

•  Document Spark jobs and related processes

•       Provide input to and drive programming standards.

•       Write detailed technical specifications for subsystems. Identify integration points.

•       Report missing elements found in system and functional requirements and explain their impact on the subsystem to team members.

•       Consult with other IT Developers, Business Analysts, Systems Analysts, Project Managers and vendors.

•       Scope the time, resources, and other inputs required to complete programming projects. Seek review of estimates from other IT Developers, Business Analysts, Systems Analysts, or Project Managers.

•       Perform unit testing and debugging. Set test conditions based upon code specifications. May need assistance from other IT Developers and team members to debug more complex errors.

•       Support the transition of the application throughout the Product Development life cycle. Document what has to be migrated. Subsystems may require additional coordination points.

•       Research vendor products and alternatives. Conduct vendor product gap analysis and comparison.

•       Accountable for including IT Controls and following standard corporate practices to protect the confidentiality, integrity, and availability of the application and the data it processes or outputs.

•       The essential functions listed represent the major duties of this role; additional duties may be assigned.
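
For context on the Spark and Scala duties above, the sketch below illustrates the kind of job this role designs, tunes, and tests: a small cleansing-and-enrichment step with the transformation factored into its own function so it can be covered by unit tests. The object name, column names, and paths (CustomerCleanseJob, customer_id, email, region, the HDFS locations) are hypothetical placeholders for illustration, not details taken from this posting.

    // Illustrative only: a minimal Spark/Scala cleansing-and-enrichment job.
    // All names and paths below are hypothetical.
    import org.apache.spark.sql.{DataFrame, SparkSession, functions => F}

    object CustomerCleanseJob {

      // Transformation factored out so unit tests can call it directly.
      def cleanse(raw: DataFrame): DataFrame =
        raw
          .filter(F.col("customer_id").isNotNull)               // drop rows missing the key
          .withColumn("email", F.lower(F.trim(F.col("email")))) // normalize for matching
          .dropDuplicates("customer_id")

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("customer-cleanse")
          .getOrCreate()

        val raw = spark.read
          .option("header", "true")
          .csv("hdfs:///data/raw/customers")             // hypothetical source

        cleanse(raw)
          .repartition(F.col("region"))                  // example of tuning partitioning for downstream reads
          .write
          .mode("overwrite")
          .parquet("hdfs:///data/curated/customers")     // hypothetical target

        spark.stop()
      }
    }

A matching unit test, sketched here with ScalaTest and a local SparkSession (the test framework is an assumption, not specified by this posting), might look like:

    import org.apache.spark.sql.SparkSession
    import org.scalatest.funsuite.AnyFunSuite

    class CustomerCleanseJobSuite extends AnyFunSuite {
      test("cleanse drops rows without a customer_id and normalizes email") {
        val spark = SparkSession.builder()
          .master("local[1]")
          .appName("cleanse-test")
          .getOrCreate()
        import spark.implicits._

        val input = Seq(("1", "  A@Example.COM"), (null, "b@example.com"))
          .toDF("customer_id", "email")

        val result = CustomerCleanseJob.cleanse(input).collect()

        assert(result.length == 1)
        assert(result.head.getAs[String]("email") == "a@example.com")

        spark.stop()
      }
    }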

Job Requirements:

•       Experience with and understanding of unit testing, release procedures, coding design and documentation protocols, and change management procedures

•       Proficiency using version control tools.

•       Thorough knowledge of Information Technology fields and computer systems

•       Demonstrated organizational, analytical and interpersonal skills

•       Flexible team player

•       Ability to manage tasks independently and take ownership of responsibilities

•       Ability to learn from mistakes and apply constructive feedback to improve performance

•       Must demonstrate initiative and effective independent decision-making skills

•       Ability to communicate technical information clearly and articulately

•       Ability to adapt to a rapidly changing environment

•       In-depth understanding of the systems development life cycle

•       Proficiency programming in more than one object-oriented programming language

•       Proficiency using standard desktop applications such as the Microsoft Office suite and flowcharting tools such as Visio

•       Proficiency using debugging tools

•       Strong critical thinking skills to evaluate alternatives and present solutions that are consistent with business objectives and strategy

Specific Tools/Languages Required:

Hadoop

Spark

Scala

Ab Initio

Required Education/Experience:

Bachelor's degree in an IT-related field or relevant work experience

•       5-8 years of related work experience as an IT development/programming/coding professional within a Hadoop environment (or an equivalent combination of transferable experience and education)

•       Strong, demonstrated experience as a senior developer using Hadoop, Spark, and Scala (all three required)

•       Experience with Agile methodology

•       Experience with Ab Initio technology preferred

Education

Bachelor's degree