Job Description (Dev Python/ETL):
The Enterprise Capital Management technology team is looking for a strong individual performer who can work in a fast-paced agile environment. The team's objective is to ensure that Client and its subsidiaries remain compliant with regulatory capital reporting requirements under the Basel rules, delivered on a new stable, scalable, and sustainable capital reporting and analysis platform. The project elaborates requirements for the data elements used to calculate risk-weighted assets, details the business logic for computing these financial elements, and defines the process flow and formulas for generating Basel rules-based capital ratios. These requirements are captured for various entities at Client, including the bank holding company (BHC) and related depository institution entities (banks).
This role will be responsible for hands-on application development to support the current and target processes, as well as partnering with multiple Technology teams to implement the target architecture and the migration to the strategic platform. Typically requires 8+ years of applicable experience.
Required Skills:
- 4+ years of Python development experience is a must
- 2+ years of experience with big data technologies such as Spark and Hadoop
- 2+ years of SQL programming experience, preferably with databases such as Oracle Exadata
- 2+ years of data wrangling/ETL experience, preferably with vendor products such as Alteryx/Trifacta or Informatica
- Knowledge of performance tuning for data-intensive applications
- Expertise in performance profiling, ability to identify performance improvements and memory optimizations
- Strong coding, debugging, and analytical skills
- Experience in large scale enterprise application design and implementation
- Creative individual with a track record of designing and implementing innovative technology-based solutions
Desired Skills:
- BS/MS in Computer Science, Engineering, or another quantitative discipline, preferably from a top university
- Background in capital calculations
- Knowledge and/or experience working within the Hadoop or other big data distributed ecosystem
- Knowledge of cloud computing or distributed computing
- 2+ years of Java development experience is preferred
- Experience working with agile methodologies and SDLC processes would be preferred
- 1+ years of UNIX scripting experience and familiarity with unit test mocking frameworks would be preferred
- Experience with Quartz (internal bank platform) would be a plus