Description

Core Job Responsibilities:

•	Develop end-to-end ML pipelines covering the full ML lifecycle: data ingestion, data transformation, model training, model validation, model serving, and ongoing model evaluation.

•	Collaborate closely with AI scientists to accelerate the productionization of ML algorithms.

•	Set up CI/CD/CT pipelines and a model repository for ML algorithms.

•	Deploy models as a service, both in the cloud and on-premises.

•	Learn and apply new tools, technologies, and industry best practices.

Key Qualifications

•	MS in Computer Science, Software Engineering, or an equivalent field

•	Experience with cloud platforms, especially GCP, and related skills: Docker, Kubernetes, edge computing

•	Familiarity with ML workflow and orchestration tools such as MLflow, Kubeflow, Airflow, Vertex AI, and Azure ML

•	Fluency in at least one general-purpose programming language; Python required

•	Strong skills in the following: Linux/Unix environments, testing, troubleshooting, automation, Git, dependency management, and build tools (GCP Cloud Build, Jenkins, GitLab CI/CD, GitHub Actions, etc.)

•	Data engineering skills are a plus, such as Beam, Spark, Pandas, SQL, Kafka, and GCP Dataflow

•	5+ years of experience, including academic experience, in any of the above

Education

Any Graduate