| Field | Value |
| --- | --- |
| Name | mljet |
| Version | 0.7.0 |
| Summary | minimalistic ML-models auto deployment tool |
| upload_time | 2023-05-06 15:19:10 |
| home_page | |
| maintainer | |
| author | |
| docs_url | None |
| requires_python | >=3.8 |
| license | |
| keywords | ml, deploy, autodeploy |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | |

# mljet
<p align="center">
<img width="600" height="250" src="docs/source/mljet-logo-p.png">
</p>
<div align="center">
![Codacy grade](https://img.shields.io/codacy/grade/cc8845c151cc45919bfd193e266df293?style=for-the-badge)
![GitHub branch checks state](https://img.shields.io/github/checks-status/qnbhd/mljet/main?style=for-the-badge)
![Codecov](https://img.shields.io/codecov/c/github/qnbhd/mljet?style=for-the-badge)
![PyPI - Python Version](https://img.shields.io/pypi/pyversions/mljet?style=for-the-badge)
[<img height="40" width="120" src="https://user-images.githubusercontent.com/6369915/200408291-f0a22126-00b4-4680-ad29-6f3fc48b4e2e.png">](https://mljet.readthedocs.io/en/latest/)
</div>
If you have been working on ML models, then you have probably faced the task of deploying these models.
Perhaps you are participating in a hackathon or want to show your work to management.
According to our survey, more than `60%` of the data scientists surveyed have faced this task, and more than `60%` of respondents spent over half an hour building such a service.
The most common solution is to wrap the model in some kind of web framework (such as Flask).
Our team believes that it can be made even easier!
Our tool automatically collects all the necessary files and dependencies, creates a Docker container, and launches it, all from a single line of source code.
# Pipeline
<p align="center">
<img width="800" height="400" src="docs/pipeline.svg">
</p>
1. First, we initialize the project directory for the next steps;
2. Next, we serialize your machine learning models (for example, with Joblib or Pickle);
3. Next, we create the final `.py` file containing the endpoint handlers, based on templates. Handlers are chosen based on the models, and templates based on your preferences (templates are also `.py` files using, for example, Sanic or Flask); a sketch of such a generated file is shown after this list;
4. Then we copy or additionally generate the necessary files (e.g. a Dockerfile);
5. The next step is to compile the API documentation for your project;
6. Finally, we build a Docker container or a Python package, or simply leave the final project directory, which you can then deploy to Kubernetes or Heroku.
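For intuition, the generated service file from step 3 might look roughly like the sketch below. This is an illustration only, not mljet's actual template output: the file names, payload schema, and handler code shown here are assumptions.

```python
# Illustrative sketch, not mljet's actual generated code.
# Approximates a Flask-based service file produced in step 3.
import joblib
import pandas as pd
from flask import Flask, jsonify, request

app = Flask(__name__)

# The model serialized in step 2 (this file name is hypothetical).
model = joblib.load("models/model.joblib")

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body of the form {"data": [{feature: value, ...}, ...]}.
    payload = request.get_json(force=True)
    frame = pd.DataFrame(payload["data"])
    prediction = model.predict(frame)
    return jsonify(prediction.tolist())

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5010)
```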
## Prerequisites
To run locally, you must have Docker and Python >= 3.8 installed.
## Installation
Install `mljet` with pip:
```bash
pip install mljet
```
or with your favorite package manager.
## Example
```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from mljet import cook
X, y = load_iris(return_X_y=True, as_frame=True)
clf = RandomForestClassifier()
clf.fit(X, y)
cook(strategy="docker", model=clf, port=5010)
```
After running the script you will see a new Docker container.
To interact with the service, simply open the URL that is logged once the script finishes.
On this page you will find a Swagger UI where you can test simple requests (examples included).
For direct POST requests you can use curl:
```bash
curl -X POST "http://127.0.0.1:5010/predict" -H "accept: application/json" -H "Content-Type: application/json" -d "{\"data\":[{\"sepal length (cm)\":5.8,\"sepal width (cm)\":2.7,\"petal length (cm)\":3.9,\"petal width (cm)\":1.2}]}"
```
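The same request can also be sent from Python; here is a minimal sketch using the `requests` library, assuming the service from the example above is running on port 5010:

```python
# Minimal sketch: call the generated /predict endpoint from Python.
import requests

sample = {
    "data": [
        {
            "sepal length (cm)": 5.8,
            "sepal width (cm)": 2.7,
            "petal length (cm)": 3.9,
            "petal width (cm)": 1.2,
        }
    ]
}

response = requests.post("http://127.0.0.1:5010/predict", json=sample)
print(response.json())
```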
## Model support
Currently, we support models from the following frameworks (a LightGBM example is sketched after the list):
- `sklearn`
- `xgboost`
- `catboost`
- `lightgbm`
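Since the entry point is the same `cook` call regardless of framework, a model from any of these libraries should deploy analogously. For example, a LightGBM classifier (a sketch, assuming `lightgbm` is installed; the port value here is arbitrary):

```python
# Sketch: deploying a LightGBM model the same way as the sklearn example above.
from lightgbm import LGBMClassifier
from sklearn.datasets import load_iris

from mljet import cook

X, y = load_iris(return_X_y=True, as_frame=True)

clf = LGBMClassifier(n_estimators=50)
clf.fit(X, y)

cook(strategy="docker", model=clf, port=5011)
```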
## RoadMap
1. Deploy to Heroku & clusters
2. Basic model visualization
3. Tighter integration with [LightAutoML](https://github.com/sb-ai-lab/LightAutoML)
4. Support many popular ML-frameworks, such as `XGBoost`, `TensorFlow`, `CatBoost`, etc.
5. *Your ideas!*
## Contribution
We are always open to your contributions!
Please check our issues and open a PR.
Raw data
{
"_id": null,
"home_page": "",
"name": "mljet",
"maintainer": "",
"docs_url": null,
"requires_python": ">=3.8",
"maintainer_email": "Konstantin Templin <1qnbhd@gmail.com>, Kristina Zheltova <masterkristall@gmail.com>",
"keywords": "ML,deploy,AutoDeploy",
"author": "",
"author_email": "Konstantin Templin <1qnbhd@gmail.com>, Kristina Zheltova <masterkristall@gmail.com>",
"download_url": "https://files.pythonhosted.org/packages/41/cb/20c5640765be8f724c89927a47bea07ce4477ec1d8a57dc45c1dea8a67b3/mljet-0.7.0.tar.gz",
"platform": null,
"bugtrack_url": null,
"license": "",
"summary": "minimalistic ML-models auto mljetnt tool",
"version": "0.7.0",
"project_urls": {
"Repository": "https://github.com/qnbhd/mljet"
},
"split_keywords": [
"ml",
"deploy",
"autodeploy"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "99044d801159edba0537543d77bae47f0626d5bc405db1a843b23e41b236686a",
"md5": "3664ef66dd705384d6acf6a0a69f3eed",
"sha256": "cca206e71b2d2364025dfe4d18b29622a92c8558cbeb64c7e91dab63c81b145a"
},
"downloads": -1,
"filename": "mljet-0.7.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "3664ef66dd705384d6acf6a0a69f3eed",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.8",
"size": 51717,
"upload_time": "2023-05-06T15:19:09",
"upload_time_iso_8601": "2023-05-06T15:19:09.458364Z",
"url": "https://files.pythonhosted.org/packages/99/04/4d801159edba0537543d77bae47f0626d5bc405db1a843b23e41b236686a/mljet-0.7.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "41cb20c5640765be8f724c89927a47bea07ce4477ec1d8a57dc45c1dea8a67b3",
"md5": "ee3f973e2fa117381e38a3363bef4f21",
"sha256": "4462578c035129487e5fa95f09561ddd5699de7df4f37054c0b15475c47d9492"
},
"downloads": -1,
"filename": "mljet-0.7.0.tar.gz",
"has_sig": false,
"md5_digest": "ee3f973e2fa117381e38a3363bef4f21",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.8",
"size": 39351,
"upload_time": "2023-05-06T15:19:10",
"upload_time_iso_8601": "2023-05-06T15:19:10.970505Z",
"url": "https://files.pythonhosted.org/packages/41/cb/20c5640765be8f724c89927a47bea07ce4477ec1d8a57dc45c1dea8a67b3/mljet-0.7.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2023-05-06 15:19:10",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "qnbhd",
"github_project": "mljet",
"travis_ci": false,
"coveralls": true,
"github_actions": true,
"requirements": [],
"lcname": "mljet"
}