Description

 

Top Skills Details

1) SQL Server and SSIS experience: writing stored procedures, object-oriented development, querying, and DDL. For SSIS: knowing how to connect to a source, what the transformations are and how to apply them, and how to implement logging.
- How do you connect to a source?
- What are the transformations?
- How do you do logging?
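In SSIS these steps are configured in the designer (connection managers, Data Flow transformations, log providers); the same source → transform → log flow can be sketched in plain Python for illustration. The table, field names, and uppercase transformation below are hypothetical, and the commented-out pyodbc connection is just one common way to reach SQL Server.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def transform(row):
    """Example transformation (hypothetical schema): trim and uppercase the 'name' field."""
    row = dict(row)
    row["name"] = row["name"].strip().upper()
    return row

def run_pipeline(source_rows):
    """Mimics an SSIS data flow: read from a source, transform each row, log progress."""
    out = []
    for row in source_rows:
        try:
            out.append(transform(row))
        except Exception:
            # SSIS would route such rows to an error output; here we just log them.
            log.exception("Row failed: %r", row)
    log.info("Processed %d of %d rows", len(out), len(source_rows))
    return out

if __name__ == "__main__":
    # In production the source would be SQL Server, e.g. via pyodbc (details hypothetical):
    # import pyodbc
    # conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};SERVER=...;DATABASE=...")
    # rows = conn.cursor().execute("SELECT name FROM dbo.customers").fetchall()
    print(run_pipeline([{"name": "  alice "}, {"name": "bob"}]))
```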

2) Hadoop experience: Python scripting, querying Hive data, and enriching that data through Spark. The team will eventually be migrating to Hadoop, so that experience is needed. Also needed: experience moving data between SQL Server and Hadoop, using SSIS to pull and massage data from SQL.
- How do you query hive data?
- How do you enrich this data through Spark?
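A minimal sketch of what answering those two questions might look like in code, assuming a Hive table `cyber.events` and an enrichment that derives a severity label from a score (table, columns, and threshold are all hypothetical). The enrichment logic is factored into a plain function so it can be tested off-cluster; the PySpark portion is guarded so the file also runs without a cluster.

```python
def enrich(event):
    """Derive a severity label from an event's score (hypothetical 0-100 scale)."""
    event = dict(event)
    event["severity"] = "high" if event["score"] >= 80 else "low"
    return event

if __name__ == "__main__":
    # On the cluster: query Hive data with spark.sql, then enrich it through Spark.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()
    df = spark.sql("SELECT id, score FROM cyber.events")       # query Hive data
    enriched = spark.createDataFrame(
        df.rdd.map(lambda r: enrich(r.asDict()))               # enrich through Spark
    )
    enriched.write.mode("overwrite").saveAsTable("cyber.events_enriched")
```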

3) Experience discussing requirements with the business team and translating them into technical requirements. Needs to know the SDLC process and the Jira process, and should know Scrum and Kanban.

4) Experience with data analysis: taking incoming requests and reporting requirements, examining the relevant metrics, figuring out what the issues are, and collecting that data.
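As an illustration of that loop (request → metrics → issue identification), a small sketch in Python; the record shape, success-rate metric, and 95% threshold are assumptions, not a requirement from the role.

```python
from statistics import mean

def find_issues(records, threshold=0.95):
    """Flag sources whose success rate falls below the threshold (assumed reporting metric)."""
    by_source = {}
    for rec in records:
        by_source.setdefault(rec["source"], []).append(1.0 if rec["ok"] else 0.0)
    return {src: mean(vals) for src, vals in by_source.items() if mean(vals) < threshold}

# Example: feed A is failing half the time, feed B is healthy.
records = [
    {"source": "feed_a", "ok": True}, {"source": "feed_a", "ok": False},
    {"source": "feed_b", "ok": True}, {"source": "feed_b", "ok": True},
]
print(find_issues(records))  # only feed_a falls below the threshold
```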

 

Experience Level

Expert Level

 

EVP

Looking for this to be a longer-term role, with potential to convert to FTE down the road, anywhere from 3 months in to 12+ months.

Strategy: The more tactical need (2-3 years) is to implement a robust on-premise big data platform to meet Wells Fargo’s Cyber Security BI/analytics/reporting and data science/ML needs. This includes building a custom data pipeline solution in Python, using Spark and Airflow on top of the Hadoop platform. In parallel, we would like to start onboarding select early-adopter use cases to our target-state Google Cloud Platform (GCP) starting Q1 2023. Portability of our on-premise solutions to GCP is critical. As we learn and gain momentum on GCP, we will start to accelerate our journey to the public cloud; expect that to be around Q3/Q4 of 2023.
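The pipeline described above (Spark jobs orchestrated by Airflow on Hadoop, in Python) might be wired together along these lines. The DAG name, schedule, and task bodies are hypothetical stubs, not the team's actual design; the Airflow portion is guarded so the task logic can be exercised without an Airflow installation.

```python
from datetime import datetime

def extract():
    """Pull raw rows from the SQL Server source (stub returning sample data)."""
    return [{"id": 1}]

def enrich():
    """Run the Spark enrichment job (stub); in practice this step might submit a Spark job."""

if __name__ == "__main__":
    # Hypothetical Airflow DAG: extract from SQL Server, then enrich with Spark.
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    with DAG("cyber_data_pipeline", start_date=datetime(2023, 1, 1),
             schedule_interval="@daily", catchup=False) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_enrich = PythonOperator(task_id="enrich", python_callable=enrich)
        t_extract >> t_enrich  # extract runs before enrich
```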