Job Description
We are seeking a seasoned Technical Data Lead/Architect with over 10 years of experience leading data programs and a deep understanding of data architecture, data engineering, and data analysis. The successful candidate will play a pivotal role in understanding data requirements and in designing, developing, testing, and maintaining ETL architectures and databases for large-scale data processing systems.
Key Responsibilities:
Provide technical leadership and oversight of the data programs.
Collaborate with business stakeholders, data analysts, and data scientists to identify business requirements and analyze opportunities for data-driven improvements and solutions.
Design, develop, deploy, test, and maintain highly scalable data solutions.
Lead the development and maintenance of data architecture, data modeling, and ETL processes.
Collaborate with teams to integrate systems and data quickly and effectively, navigating technical challenges and complexities.
Ensure the performance, reliability, and security of databases.
Develop and implement data standards, procedures, and guidelines to ensure data integrity, dependability, and regulatory compliance.
Anticipate future demands of initiatives related to data and make recommendations to higher management on necessary upgrades or new systems.
Qualifications
Over 10 years of experience leading data programs.
More than 5 years of hands-on experience with cloud databases such as Snowflake, with a deep understanding of Snowflake's architecture and how to use its features to manage and analyze data efficiently. This includes features such as the data lakehouse, data sharing, virtual warehouses, SnowSQL, Snowpipe, Snowpark, and data masking.
Must be a US citizen or a permanent resident.
Eligible to obtain a government security clearance.
5+ years of data engineering experience with ETL (Extract, Transform, Load) tools and processes, including data migration and data integration from various sources into Snowflake, and expertise in at least one ELT tool.
Proficiency in SQL is essential, including the ability to write complex queries and stored procedures.
Experience working with legacy and modern sources and file formats such as JSON, CSV, and XML, as well as classified data containing PII, PHI, etc.
Experience with Python for legacy and modern data ingestion is essential, including the use of data engineering libraries such as pandas, SQLAlchemy, and Snowpark.
Familiarity with at least one cloud platform, such as AWS, Azure, or Google Cloud, and an understanding of how Snowflake integrates with these platforms. Proficiency in Azure services such as Azure Data Factory and Azure DevOps is advantageous.
Excellent knowledge of data backup, recovery, security, and integrity.
Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
Commitment to staying up to date with the latest features and updates from Snowflake.
Certifications, such as the SnowPro Core Certification or the Azure Solutions Architect or Azure Data Engineer Associate, can demonstrate a high level of expertise and commitment to the role.
Strong leadership and communication skills.
A degree in Computer Science, Engineering, or a related field.