Job Summary
If you love developing high-quality software and are eager to learn new technologies and develop new skills, then we have a great opportunity for you: join our PDI family and work closely with other talented PDI engineers to deliver solutions that delight our customers every day!
As a Data Engineer II, you will be part of an agile data services team responsible for developing and maintaining our industry-leading cloud-based big data and data analytics infrastructure serving major global Fortune 500 companies. You will help with the design, development, unit testing, performance testing, deployment, troubleshooting, defect resolution, and support of our data services infrastructure.
Responsibilities
Utilize architectural and design skills for a variety of data-oriented tasks
Establish and maintain strong relationships with business stakeholders
Build high reliability systems that provide complete, secure, accurate and timely data for analytics, data modeling, new product features and services - including financial reporting
Assist the development and analytics organization in meeting its delivery goals with regard to data design, flow, and infrastructure
Perform data modeling, architecture design, and implementation to support OLTP and warehousing applications, and assist with data flow design
Mentor other team members
Work with business users to understand business requirements, issues and business and/or client processes
Hands on coding of data pipelines
Hands on administration of existing databases and big data systems
Develop, test, and maintain high-performance data systems that meet the requirements of the business and/or clients while adhering to departmental standards
Perform quality assurance testing for all work performed
Prepare required documentation as outlined by departmental standards
Meet with agile teams as required to define and document application requirements
Follow project development & deployment process
Maintain industry standards and best practices
Maintain security and organization of the company’s data
Provide off-hour support as assigned
Provide high level estimates for new business requirements/features
Work with manager to ascertain the company’s data requirements
Install new databases and data pipelines; maintain existing databases and data pipelines
Design and implement processes and solutions for data distribution and data archiving
Develop and implement backup and recovery plans to mitigate the possibility of data loss
Monitor storage space, storage capacity, and system performance
Identify, analyze and repair data problems as needed
Provide recommendations for application and system improvements
Plan work to meet project deadlines, accommodate demands by development teams, set priorities and escalate issues appropriately
Provide recommendations to development teams for data restructuring and performing complex maintenance
Deploy and manage changes in development, staging and production
Assist development and DevOps teams with SQL queries and tuning
Knowledge, Skills & Abilities
Strong management skills with at least 4 years of experience building, managing, and maintaining a team of Data Engineers/DBAs
At least 4 years of experience in basic-level administration of both data infrastructure and data
At least 3 years of coding experience in Java, Python, R, or another equivalent programming language
At least 4 years of big data and data architecture experience
Proficient in a variety of big-data tools and technologies, including Hadoop, Spark, etc.
Proficient in SQL and PL/SQL, query tuning, optimization, ETL, ELT, and data warehousing
Experience in custom archive and purge procedures and scripts
Proficient in Business Intelligence (BI) and analytic databases and products; able to take business requirements and translate them into database and pipeline designs and tasks
General data-specific concepts, frameworks and processes
Agile development practices
Working within an SDLC
Designing data warehouses including definition, structure, documentation, maintenance, long-range requirements, operational guidelines, and protection
Linux operating systems
Data integration and processing including extraction, transformation, and loading (ETL), data analysis, and metadata
Other Skills And Experience
Cloud experience, such as AWS, GCP or Azure
Excellent oral and written communication
Multi-tasking and managing multiple priorities
Working well in a team-oriented, collaborative environment with people from different disciplines and varying degrees of technical experience
Working in an Agile team environment
Any Graduate