● Have the opportunity to contribute to several high-quality data solutions and enhance your technical skills across many disciplines.

Key Responsibilities
● Design, develop, and maintain end-to-end data solutions using open-source, modern data lake, and enterprise data warehouse technologies (Hadoop, Spark, Cloud, etc.)
● Contribute to multiple data solutions throughout their entire lifecycle (conception to launch)
● Partner with business stakeholders to understand and meet their data requirements
● Provide ongoing maintenance and enhancements to existing data solutions
● Maintain security in accordance with Bank security policies
● Participate in an Agile development environment
● Bachelor's degree in Computer Science, Engineering, or Information Management (or equivalent)
● 5+ years of relevant work experience
● Professional experience designing, creating, and maintaining scalable data pipelines
● Hands-on experience with a variety of big data technologies (Hadoop/Cloudera, Spark, Cloud, etc.)
● Experience with object-oriented languages: Java (required), Python, etc.
● Advanced knowledge of SQL and experience with relational databases
● Experience with UNIX shell scripts and commands
● Experience with version control (Git), issue tracking (Jira), and code reviews
● Proficiency in agile development practices
● Ability to clearly document operational procedures and solution designs
● Ability to communicate effectively, both verbally and in writing
● Ability to work collaboratively in a team environment
● Ability to balance competing priorities and expectations