HYBRID ON-SITE REQUIRED: Expected 2 days on site and 3 days remote
Clear communication skills are important.
Skill matrix (skill and years of experience):
• 7 years of experience with Data Engineering, working with large-scale data processing and ETL pipelines.
• 5 years of hands-on experience with data modeling, architecture, and management.
• 5 years of experience with Relational Database Systems, Data Design, RDBMS Concepts, and ETL.
• 5 years of experience working with data in cloud environments such as AWS (preferred), Azure, or GCP.
• 3 years of experience in T-SQL, SQL, and ELT/ETL performance tuning.
• Programming experience in Snowflake, Hadoop, or other Data Warehouse technologies; Snowflake preferred.
• Experience in Microsoft SQL Server 2008 R2 or newer.
• Experience with SSIS or equivalent ETL tool.
• Experience in SQL/Stored Procedure development.
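As a hedged illustration of the SQL and ETL development skills listed above, the sketch below runs a minimal extract-transform-load step. All table and column names are hypothetical, and an in-memory SQLite database stands in for a production RDBMS such as SQL Server:

```python
import sqlite3

# Minimal ETL sketch: extract rows from a staging table, transform them
# (trim text, cast types), and load them into a cleaned target table.
# SQLite stands in for a production RDBMS; names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: a hypothetical staging table holding raw feed data as text.
cur.execute("CREATE TABLE staging_orders (id INTEGER, customer TEXT, amount TEXT)")
cur.executemany(
    "INSERT INTO staging_orders VALUES (?, ?, ?)",
    [(1, "  Acme ", "10.50"), (2, "Globex", "7.25"), (3, None, "3.00")],
)

# Target: a typed, constrained table the warehouse would query.
cur.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT NOT NULL, amount REAL)"
)

# Transform + load: trim names, cast amounts to REAL,
# and drop rows that fail a basic data-quality check (NULL customer).
cur.execute(
    """
    INSERT INTO orders (id, customer, amount)
    SELECT id, TRIM(customer), CAST(amount AS REAL)
    FROM staging_orders
    WHERE customer IS NOT NULL
    """
)
conn.commit()

rows = cur.execute("SELECT id, customer, amount FROM orders ORDER BY id").fetchall()
print(rows)  # [(1, 'Acme', 10.5), (2, 'Globex', 7.25)]
```

In a T-SQL environment the transform-and-load step would typically live in a stored procedure or an SSIS data flow rather than inline application SQL.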
Data Integration Specialist
Description of Work:
• Works with advanced technical principles, theories, and concepts; is well versed in technical products; works on complex technical problems and provides innovative solutions; and collaborates with highly experienced technical resources.
• Demonstrates ability to communicate technical concepts to non-technical audiences both in written and verbal form.
• Assembles large, complex data sets to meet business requirements.
• Works in tandem with Data Architects to align on the data architecture requirements they provide.
• Creates and maintains optimal data pipeline architecture.
• Identifies, designs, and implements internal process improvements, such as automating manual processes and optimizing data delivery.
• Implements big data and NoSQL solutions by developing scalable data processing platforms to drive high-value insights to the organization.
• Supports development of Data Dictionaries and Data Taxonomy for product solutions.
• Demonstrates strong understanding of coding and programming concepts used to build data pipelines (e.g., data transformation, data quality, data integration).
• Builds data models with Data Architect and develops data pipelines to store data in defined data models and structures.
• Demonstrates strong understanding of data integration techniques and tools (e.g., Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT)) and of database architecture.
• Demonstrates strong understanding of database storage concepts (data lakes, relational databases, NoSQL, graph databases, data warehousing).
• Identifies ways to improve data reliability, efficiency, and quality of data management.
• Conducts ad-hoc data retrieval for business reports and dashboards.
• Assesses the integrity of data from multiple sources.
• Manages database configuration including installing and upgrading software and maintaining relevant documentation.
• Monitors database activity and resource usage.
• Performs peer reviews of other Data Engineers' work.
• Assists with development, building, monitoring, maintaining, performance tuning, troubleshooting, and capacity estimation.
• Sources data from the operational systems.
• Prepares the database-loadable file(s) for the Data Warehouse.
• Manages deployment of the data acquisition tool(s).
• Monitors and maintains Data Warehouse/ELT.
• Monitors, reports on, and resolves data quality issues.
• Works closely with all involved parties to ensure system stability and longevity.
• Supports and maintains Business Intelligence functionality.
• Evaluates, understands, and implements patches to the Data Warehouse environment.
• Applies data-loading best practices and designs multidimensional schemas.
• Attends all customer technical discussion, design, and development meetings and provides technical input to further enhance code quality and processes.
• Provides guidance and support to junior- and mid-level developers.
• Impacts functional strategy by developing new solutions, processes, standards, or operational plans that position Leidos competitively in the marketplace.
• Performs all other duties as assigned or directed.
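The data-quality monitoring and integrity-assessment duties above might look something like the following in practice. This is a minimal, hypothetical sketch; the field names, rules, and report shape are illustrative assumptions, not a prescribed design:

```python
# Hypothetical sketch of a row-level data-quality check and summary report,
# the kind of monitoring/reporting a data-quality duty might involve.

def check_record(record: dict) -> list:
    """Return a list of data-quality issues found in one record."""
    issues = []
    if not record.get("id"):
        issues.append("missing id")  # illustrative rule: id must be present
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        issues.append("invalid amount")  # illustrative rule: non-negative number
    return issues

def quality_report(records: list) -> dict:
    """Summarize issue counts across a batch for monitoring and reporting."""
    report = {"total": len(records), "failed": 0, "issues": []}
    for rec in records:
        issues = check_record(rec)
        if issues:
            report["failed"] += 1
            report["issues"].append((rec.get("id"), issues))
    return report

# Sample batch drawn from two hypothetical source systems.
batch = [
    {"id": 1, "amount": 10.5},
    {"id": 2, "amount": -3},      # fails: negative amount
    {"id": None, "amount": 7.0},  # fails: missing id
]
report = quality_report(batch)
print(report["failed"])  # 2
```

In production, checks like these would typically run inside the pipeline (e.g., as a validation stage before warehouse load), with the report feeding dashboards or alerting rather than a print statement.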
ANY GRADUATE