General tasks and responsibilities will include:
• Lead the construction, maintenance, and optimization of data pipelines.
• Drive automation through effective metadata management, using innovative, modern tools, techniques, and architectures to improve productivity.
• Lead the renovation of the data management infrastructure to drive automation in data integration and management.
• Collaborate with multiple Analytics Center of Excellence teams and the EIM team to refine their data requirements for various DnA initiatives and data consumption needs.
• Identify gaps in new data initiatives and determine how to address new data requirements.
• Transfer data and/or domain knowledge to help address new data requirements.
• Propose appropriate (and innovative) data ingestion, preparation, integration, and operationalization techniques to optimally address data requirements.
• Work with data governance teams to ensure that data scientists and data consumers use data responsibly, in line with data governance and compliance initiatives.
• Design and implement data quality monitoring systems, including source-to-target data validations and anomaly detection (see the sketch below).
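A minimal, illustrative sketch of the kind of source-to-target validation and anomaly detection described above. It uses in-memory SQLite databases as stand-ins for the real source and target systems (e.g., an RDBMS source loading into Snowflake); the orders table, the amount column, and the z-score threshold are hypothetical choices for illustration, not details taken from this posting.

```python
# Sketch only: SQLite stands in for the actual source/target systems.
# Table name, column name, and thresholds are illustrative assumptions.
import sqlite3
import statistics


def row_count(conn, table):
    # Basic reconciliation metric: total rows per table.
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]


def column_checksum(conn, table, column):
    # Cheap content check: the sum of a numeric column should match end to end.
    return conn.execute(f"SELECT COALESCE(SUM({column}), 0) FROM {table}").fetchone()[0]


def validate_source_to_target(src, tgt, table, amount_col):
    # Compare a few reconciliation metrics between source and target.
    checks = {
        "row_count": (row_count(src, table), row_count(tgt, table)),
        "amount_sum": (column_checksum(src, table, amount_col),
                       column_checksum(tgt, table, amount_col)),
    }
    return {name: {"source": s, "target": t, "match": s == t}
            for name, (s, t) in checks.items()}


def is_anomalous(history, today, z_threshold=3.0):
    # Flag today's load volume if it deviates more than z_threshold
    # standard deviations from the historical mean.
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return today != mean
    return abs(today - mean) / stdev > z_threshold


if __name__ == "__main__":
    src = sqlite3.connect(":memory:")
    tgt = sqlite3.connect(":memory:")
    for conn in (src, tgt):
        conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])
    tgt.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0)])  # simulated missing row

    print(validate_source_to_target(src, tgt, "orders", "amount"))
    print("anomalous load volume:", is_anomalous([1000, 1020, 980, 1010], today=400))
```

In practice the same pattern would run against the real source and Snowflake connections on a schedule, with failed checks routed to an alerting channel.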
Qualifications and Experience Requirements:
• Bachelor’s, master’s, or advanced degree in computer science, statistics, applied mathematics, data management, information systems, information science, or a related field.
• Expertise in and excellent understanding of Snowflake internals and the integration of Snowflake with other data processing and reporting technologies.
• Preferred: experience working with data science teams to refine and optimize data science and machine learning models and algorithms.
• Full-lifecycle data warehouse and data mart development experience is a big plus.
• Experience with Agile software development methodologies.
• Preferred: experience with a job scheduling/automation tool such as AutoSys or Control-M.
• Minimum 10 years of work experience in data management disciplines, including data integration, modeling, optimization, and data quality, and/or other areas directly relevant to data engineering responsibilities and tasks.
• Minimum 2 years designing and implementing fully operational, production-grade, large-scale data solutions on the Snowflake Data Warehouse.
• Minimum 10 years of hands-on experience designing and implementing production-grade data warehousing solutions on large-scale databases such as SQL Server, Oracle, or DB2.
• 5+ years of data pipeline development experience with Informatica PowerCenter and IBM DataStage.
• 3+ years of experience with advanced analytics tools for object-oriented/object-function scripting using languages such as R, Python, Java, C++, Scala, or others.
• 3+ years of experience with shell/Perl scripting.
• 5+ years of experience with popular database programming languages, including SQL and PL/SQL (required) for relational databases; certifications on emerging NoSQL/Hadoop-oriented databases such as MongoDB, Cassandra, or others for non-relational databases are preferred.

Minimum Qualifications:
A Developer assists in designing, developing, and supporting applications. These applications include systems developed solely for the web environment as well as development efforts designed to web-enable end-user applications. This individual may also assist in the creation and ongoing management of corporate web sites and intranet communities. The Developer will have a thorough understanding of programming techniques and tools, web development, and system management tools.
General Tasks and Responsibilities Will Include:
• Develop and understand all features of each module and their impact on other modules.
• Analyze and develop an understanding of how data flows through the system and apply that knowledge to the current business model.
• Identify and define system errors and/or deficiencies.
• Liaise with other developers (i.e., software, application, web, etc.) to determine best-path solutions and manage priorities through collaboration with internal stakeholders.
• Plan and coordinate testing and implementation of system patches and upgrades, leading a cross-functional team of power users.
• Extract key business performance metrics.
• Analyze user needs and software requirements to determine areas of opportunity for increased efficiencies.
• Create written user documentation, instructions, and procedures for the purpose of training new employees and improving overall user proficiency.
• Evaluate new technologies for potential implementation, including development of business justification and project implementation planning, when appropriate.
• Utilizing in-house or external resources, manage the network and related hardware, and ensure that help-desk tasks are promptly addressed and resolved.
Educational Level:
• A Baccalaureate Degree from an accredited college or university with a major in Computer Science, Systems Engineering, Applied Mathematics, Business Administration, Economics/Statistics, Telecommunications, Data Communications, or a related field of study; and
• Five (5) years of progressive, responsible experience in the field of data processing, computer systems, and applications.
• Operations Specialty requires supervisory experience (5 years).
• Network Services requires a telecommunications background and experience.
• Broad knowledge and expertise in the characteristics of computers, peripheral devices, communications systems and hardware capabilities, programming languages, E.D.P. applications, systems analysis methodology, data management and retrieval techniques; or
• A satisfactory equivalent combination of training, education, and experience.