Description

Job Description:
- Top skills: Snowflake, DBT, Fivetran, Python, and ideally AWS; open to candidates with an Azure background if they have the other technologies. Must have strong communication skills.
- Need someone who can create policies, procedures, and best practices for mitigating Snowflake costs, and put controls and monitoring around them.
- Not looking for an architect, but for someone who can lead and frame controls and configurations within Snowflake.
- Must be confident, with excellent communication skills, to work in conjunction with the Architecture and Data teams.
- Will work directly with Snowflake (the vendor), so vendor experience is nice to have.
- Will be the number-one person on the team for Snowflake administration.
- The team is moving off on-prem (DB2 on AIX) and currently uses Informatica PowerCenter as its ETL tool.

Responsibilities:
- Design, implement, and maintain Snowflake data warehouse solutions to fulfill business requirements effectively.
- Deploy and optimize DBT models and transformations to streamline data processing and facilitate analytics.
- Manage Fivetran data pipelines to enable seamless data integration and synchronization.
- Utilize AWS Glue for data cataloging, executing ETL jobs, and performing data transformation operations.
- Apply Python and SageMaker for advanced analytics, machine learning initiatives, and data science projects.
- Employ DB2, SQL Server, and Informatica as necessary for data migration, integration, and administration tasks.
- Collaborate with data engineers, analysts, and stakeholders to understand data needs and deliver actionable insights.
- Develop and enforce data governance and security protocols to uphold data quality and regulatory compliance.
- Monitor system performance, troubleshoot issues, and implement performance optimization strategies.
- Stay abreast of industry best practices and emerging technologies in data warehousing and cloud computing.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in a data engineering or analytics role.
- Proven experience working with Snowflake, DBT, Fivetran, AWS Glue, Python, SageMaker, DB2, SQL Server, and Informatica.
- Certifications in Snowflake, DBT, AWS Glue, Python, SageMaker, or related technologies.
- Knowledge of data governance frameworks and data privacy regulations (e.g., GDPR, CCPA).
- Excellent SQL skills and experience with data modeling, ETL/ELT processes, and data pipeline management.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Experience with scripting languages (e.g., Python, Bash) for automation and orchestration tasks.
- Excellent analytical, problem-solving, and communication skills.
- Ability to work independently and collaboratively in a fast-paced environment.

Education

ANY GRADUATE