Key Requirements and Technology Experience:
Skills: Snowflake, SQL, PL/SQL, and DB2.
6+ years of data engineering experience (DB2, Snowflake, Oracle, MongoDB, Redshift, PostgreSQL).
SQL, PL/SQL, Python, and shell scripting languages (approximately 50% PostgreSQL, 30% Python, 20% shell scripting).
The interview will include complex SQL query coding challenges, as well as PostgreSQL and shell scripting exercises.
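For illustration only (the actual interview questions are not specified here), the sketch below shows the style of "complex SQL" commonly meant: window functions, run from Python against PostgreSQL with psycopg2. The table, columns, and connection details are hypothetical.

```python
import psycopg2

# Hypothetical connection details; adjust for your environment.
conn = psycopg2.connect("dbname=sales user=analyst password=secret host=localhost")

# A representative "complex SQL" pattern: window functions computing a
# per-customer running total and a rank of order amounts.
query = """
SELECT customer_id,
       order_date,
       amount,
       SUM(amount) OVER (PARTITION BY customer_id ORDER BY order_date) AS running_total,
       RANK()      OVER (PARTITION BY customer_id ORDER BY amount DESC) AS amount_rank
FROM orders;
"""

with conn.cursor() as cur:
    cur.execute(query)
    for row in cur.fetchall():
        print(row)
conn.close()
```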
Must have experience working with ETL and streaming tools and processes (Kafka is preferred, but other tools such as Adeptia or Alteryx would also be considered).
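As a minimal sketch of the Kafka side of this requirement, a consumer loop using the kafka-python package; the topic name, broker address, and consumer group are assumptions.

```python
import json
from kafka import KafkaConsumer

# Hypothetical topic, broker, and consumer group.
consumer = KafkaConsumer(
    "orders-events",
    bootstrap_servers="localhost:9092",
    group_id="etl-loader",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

# Each consumed record would feed a downstream transform/load step.
for message in consumer:
    record = message.value
    print(record)
```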
Bachelor’s or master’s degree in a technology-related field such as Computer Science or Engineering, with demonstrated ability.
Ensures alignment with enterprise data architecture strategies.
Improves data availability via APIs and shared services, and recommends optimization solutions using cloud technologies for data processing, storage, and advanced analytics.
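A minimal sketch of exposing data through a shared API, here with FastAPI; the endpoint and in-memory data source are hypothetical stand-ins for a real database query.

```python
from fastapi import FastAPI

app = FastAPI()

# Hypothetical in-memory data standing in for a database query.
CUSTOMERS = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}]

@app.get("/customers")
def list_customers():
    """Expose customer records as a shared service for other teams."""
    return CUSTOMERS
```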
Performs SQL tuning and data integration activities.
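SQL tuning typically starts from the execution plan. A minimal sketch, assuming a PostgreSQL database and a hypothetical orders table; the index choice is illustrative, not prescriptive.

```python
import psycopg2

conn = psycopg2.connect("dbname=sales user=analyst host=localhost")

with conn.cursor() as cur:
    # Inspect the plan for a filtered query. EXPLAIN ANALYZE executes the
    # statement and reports actual timings, one plan line per row.
    cur.execute("EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = 42;")
    for (line,) in cur.fetchall():
        print(line)

    # If the plan shows a sequential scan on a selective predicate,
    # an index is often the first tuning step.
    cur.execute("CREATE INDEX IF NOT EXISTS idx_orders_customer ON orders (customer_id);")
    conn.commit()
conn.close()
```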
Provides technical guidance on cybersecurity for database technologies.
Performs risk assessments and executes tests of data processing systems to ensure proper functioning of data processing activities and security measures.
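Tests of a data processing system can be expressed as automated data-quality checks. A minimal sketch with pandas and plain assertions; the table shape and rules are hypothetical.

```python
import pandas as pd

def check_orders(df: pd.DataFrame) -> None:
    """Basic data-quality assertions for a processed orders extract."""
    assert df["order_id"].is_unique, "duplicate order ids"
    assert df["amount"].notna().all(), "null amounts found"
    assert (df["amount"] >= 0).all(), "negative amounts found"

# Example run against a small in-memory frame.
df = pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 5.5, 0.0]})
check_orders(df)
print("all checks passed")
```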
Performs independent and complex technical and functional analysis for multiple projects supporting several divisional initiatives.
Builds the technical infrastructure required for efficient Extraction, Transformation, and Loading (ETL) of data from a wide variety of data sources, leveraging object-oriented/functional scripting languages such as Python.
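A minimal Python ETL sketch using pandas and SQLAlchemy; the file path, transform logic, and connection string are assumptions.

```python
import pandas as pd
from sqlalchemy import create_engine

# Extract: hypothetical CSV source.
df = pd.read_csv("orders.csv")

# Transform: normalize a key column and derive a field.
df["customer_id"] = df["customer_id"].astype(int)
df["amount_usd"] = df["amount_cents"] / 100.0

# Load: hypothetical PostgreSQL target table.
engine = create_engine("postgresql://analyst:secret@localhost/sales")
df.to_sql("orders_clean", engine, if_exists="append", index=False)
```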
Expertise with relational databases, Splunk, Snowflake, YugabyteDB, Aerospike, S3 and similar data management platforms.
Experience working with and handling large data sets in DB2 using SQL, including writing sophisticated stored procedures to process data.
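Invoking a stored procedure from Python follows the generic DB-API pattern sketched below. For DB2 the driver would be ibm_db/ibm_db_dbi rather than psycopg2, and the procedure name and parameter here are illustrative only.

```python
import psycopg2

conn = psycopg2.connect("dbname=sales user=analyst host=localhost")
with conn.cursor() as cur:
    # CALL a server-side procedure that aggregates a day's orders.
    # Procedure name and parameter are hypothetical.
    cur.execute("CALL summarize_daily_orders(%s);", ("2024-01-31",))
    conn.commit()
conn.close()
```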
Object-oriented Python programming and solid experience with machine learning libraries such as Pandas, NumPy, scikit-learn, and TensorFlow.
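A minimal scikit-learn sketch of the kind of workflow implied here, on synthetic data; the model choice is illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic feature matrix and labels, for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```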
Data parsing/analytics experience on large data sets using Python, scripting, and other similar technologies, including integrating with and consuming APIs.
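Consuming an API and flattening the response into tabular form is a typical parsing task; a minimal sketch with requests and pandas, where the URL is a placeholder.

```python
import pandas as pd
import requests

# Placeholder endpoint; substitute the real API and its auth scheme.
resp = requests.get("https://api.example.com/v1/orders", timeout=30)
resp.raise_for_status()

# Flatten nested JSON records into a DataFrame for analysis.
df = pd.json_normalize(resp.json())
print(df.head())
```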
Familiarity with quantitative techniques and methods, statistics, and econometrics, including probability, linear regression, time series analysis, and optimization.
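For the linear regression side, a minimal statsmodels sketch fitting ordinary least squares to synthetic data.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic linear relationship with noise, for illustration only.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=100)

# Ordinary least squares with an intercept term.
X = sm.add_constant(x)
model = sm.OLS(y, X).fit()
print(model.params)    # estimated intercept and slope
print(model.rsquared)  # goodness of fit
```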
Knowledge of hybrid on-prem and cloud data architectures and services, especially data streaming, storage, and processing functionality.
Awareness of event-based systems, functional programming, emerging technologies, and messaging frameworks such as Kafka.
Experience with Agile methodologies (Kanban and Scrum) is a plus.