Job Description

Ability to work with data engineers, troubleshoot access issues, and help with automation

Experience with Databricks platform administration

Utilize Terraform for infrastructure as code to automate and manage cloud resources effectively

Understanding of data lake architecture, including the medallion architecture

Google Cloud Platform management, typically via Terraform

Proficiency in Linux and Bash scripting to maintain and troubleshoot our systems and to support Docker automation

Experience in Python or Ruby for automation

Exposure to Scala is a plus, but not mandatory

Administration experience in Kubernetes, ensuring smooth operation and scalability of containerized applications

Comfortable working on a MacBook, our preferred development environment

A background in DevOps or system administration is highly desirable

Familiarity with configuration management tools such as Chef, Ansible, Puppet, or Salt is a bonus

Working understanding of streaming vs batch operations, with experience in technologies such as Kafka

Working proficiency in SQL for data management tasks; mastery is not essential

Proficiency in GitLab CI/CD pipelines

Proficiency in version control systems such as Git

Experience with Airflow and an understanding of DAGs for orchestrating complex workflows and data pipelines