Data Ops Engineer - JD

Qualifications:

  1. 5+ years of overall software engineering experience, including hands-on software development and data engineering
  2. 3+ years of hands-on coding experience in SQL, Python, and PySpark (see the PySpark sketch after this list)
  3. 3+ years of experience with advanced orchestration tools such as Apache Airflow (a minimal DAG sketch also follows this list)
  4. 2+ years of experience on at least one cloud platform (Azure, AWS, GCP), preferably GCP
  5. Experience in building CI/CD processes and pipelines
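
As a hedged illustration of the hands-on SQL/Python/PySpark coding in item 2, the sketch below deduplicates a raw table and builds a daily rollup, expressing the same aggregation once in the DataFrame API and once in Spark SQL. All table and column names (raw_orders, order_id, order_ts, amount, curated.orders_daily) are illustrative assumptions, not part of this posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_rollup").getOrCreate()

# Read a raw ingestion table (hypothetical name) and deduplicate on the key.
raw = spark.table("raw_orders").dropDuplicates(["order_id"])

# Daily rollup via the DataFrame API.
daily = (
    raw.withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date")
       .agg(F.count("order_id").alias("order_count"),
            F.sum("amount").alias("total_amount"))
)

# The same logic expressed in Spark SQL.
raw.createOrReplaceTempView("raw_orders_dedup")
daily_sql = spark.sql("""
    SELECT to_date(order_ts) AS order_date,
           COUNT(order_id)   AS order_count,
           SUM(amount)       AS total_amount
    FROM raw_orders_dedup
    GROUP BY to_date(order_ts)
""")

# Persist the curated output (hypothetical target table).
daily.write.mode("overwrite").saveAsTable("curated.orders_daily")
```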
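
And a minimal Apache Airflow DAG sketch for the orchestration experience in item 3, assuming Airflow 2.4+. The DAG id, schedule, and task callables are placeholders, not an actual pipeline from this role.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from the source system")  # placeholder step


def transform():
    print("clean and reshape the extracted data")  # placeholder step


def load():
    print("write the result to the warehouse")  # placeholder step


with DAG(
    dag_id="daily_ingest",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Dependencies mirror the acquisition-to-consumption flow.
    t_extract >> t_transform >> t_load
```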

Roles and Responsibilities:

  1. Work in an agile environment
  2. Proactively identify and assist in solving recurring data quality or data availability issues
  3. Monitor, support, and triage data pipelines that ingest, move, transform, and integrate data as it moves from acquisition to consumption layers
  4. Apply exceptional problem-solving and troubleshooting skills; analyze data to identify issues and patterns
  5. Communicate effectively with technical and business teams
  6. Be efficient, thorough, and proactive
  7. Develop queries and metrics for data-platform ad-hoc reporting and/or ETL batch triage (see the triage sketch after this list)
  8. Maintain knowledge-base and FAQ documentation with instructions for resolving problems that jobs commonly run into.
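
For item 7 above, a hedged example of an ad-hoc triage check: comparing daily ingest volumes over the past week and flagging outliers. The raw_orders table and load_ts column are hypothetical names, and the 0.5x/2x thresholds are arbitrary choices for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch_triage").getOrCreate()

# Row counts per load day over the past week (raw_orders and load_ts
# are hypothetical names for an ingestion table and its load timestamp).
stats = spark.sql("""
    SELECT to_date(load_ts) AS load_date,
           COUNT(*)         AS row_count
    FROM raw_orders
    WHERE load_ts >= date_sub(current_date(), 7)
    GROUP BY to_date(load_ts)
    ORDER BY load_date
""")
stats.show()

# Flag days whose volume deviates sharply from the weekly average,
# a simple heuristic for spotting partial loads or upstream outages.
avg_count = stats.agg(F.avg("row_count")).first()[0]
anomalies = stats.filter((F.col("row_count") < 0.5 * avg_count) |
                         (F.col("row_count") > 2.0 * avg_count))
anomalies.show()
```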