Description

About the job
DataOps: Proficiency in core and advanced Python for development and data pipelining.
Strong understanding of data structures, pandas, NumPy, scikit-learn, concurrency, and design patterns.
DevOps: Experience in deploying applications using CI/CD tools such as Jenkins, JFrog, Docker, Kubernetes, and the OpenShift Container Platform.
Microservices & REST APIs: Familiarity with FastAPI, Flask, and Tornado for developing microservices and REST APIs.
Cloud: Knowledge of building and deploying applications on cloud platforms.
Databases & SQL: Proficiency in working with databases such as Postgres, ClickHouse, and MongoDB.
Caching & Queuing: Experience with pub/sub messaging (RabbitMQ), Redis, and DiskCache for caching and queuing purposes.
Operating systems: Strong understanding of both Linux and Windows.
Monitoring and Logging: Familiarity with Splunk for monitoring and logging applications.

Good To Have Skills Include

Generative AI: Knowledge of the LangChain framework and ChatGPT for generative AI applications.
MLOps: Experience with Databricks, MLflow, Kubeflow, and ClearML for managing machine learning operations.
Testing: Proficiency in integration testing, Behave, and pytest for ensuring code quality.
Code quality: Working knowledge of Pylint for maintaining code quality standards.
Logging: Familiarity with Kibana and Elasticsearch for advanced logging and analysis.

Skills: Tornado, MLflow, DevOps, ClearML, OpenShift, Redis, cloud, Windows, Python (core), Jenkins, MongoDB, integration testing, pandas, scikit-learn, Linux, Behave, ChatGPT, Flask, ClickHouse, LangChain, Splunk, DiskCache, NumPy, CI/CD tools, Kibana, Docker, Kubeflow, Python (advanced), Postgres, Kubernetes, Pylint, cloud platforms, Python, Databricks, JFrog, FastAPI, Elasticsearch, RabbitMQ, pytest

Education

Any Graduate