Data Ops Engineer - JD
Qualifications:
- 5+ years of overall software engineering experience, including hands-on software development and data engineering
- 3+ years of hands-on coding experience in SQL, Python, and PySpark
- 3+ years of experience with advanced orchestration tools such as Apache Airflow
- 2+ years of experience with at least one cloud platform (Azure, AWS, or GCP), preferably GCP
- Experience building CI/CD processes and pipelines
Roles and Responsibilities:
- Work in an agile environment
- Proactively identify and assist in solving recurring data quality or data availability issues
- Monitor, support, and triage data pipelines that ingest, move, transform, and integrate data as it moves from acquisition to consumption layers
- Apply exceptional problem-solving and troubleshooting skills, analyzing data to identify issues and patterns
- Communicate effectively with technical and business teams
- Be efficient, thorough, and proactive
- Develop queries and metrics for data platform ad-hoc reporting and ETL batch triage
- Maintain knowledge base and FAQ documentation with instructions for resolving problems that jobs commonly encounter.