Job Description

This role focuses on big data and advanced analytics products and solutions. You will contribute to several high-quality data solutions and enhance your technical skills across many disciplines.

Responsibilities:

  • Design, develop, and maintain end-to-end data solutions using open-source, modern data lake, and enterprise data warehouse technologies (Hadoop, Spark, Cloud, etc.)
  • Contribute to multiple data solutions throughout their entire lifecycle (conception to launch)
  • Partner with business stakeholders to understand and meet their data requirements
  • Provide ongoing maintenance and enhancements to existing data solutions
  • Maintain solution security in accordance with established security policies
  • Participate in an Agile development environment

Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or Information Management (or equivalent) with 8+ years of relevant work experience
  • Professional experience designing, creating, and maintaining scalable data pipelines
  • Hands-on experience with a variety of big data technologies (Hadoop / Cloudera, Spark, Cloud)
  • Experience with object-oriented programming and scripting languages: Java (required), Python, etc.
  • Advanced knowledge of SQL and experience with relational databases
  • Experience with UNIX shell scripts and commands
  • Experience with version control (Git), issue tracking (Jira), and code reviews
  • Proficient in Agile development practices
  • Ability to clearly document operational procedures and solution designs
  • Ability to communicate effectively (both verbal and written)
  • Ability to work collaboratively in a team environment

Education

B.Tech