# MindSpore Serving
[查看中文](./README_CN.md)
<!-- TOC -->
- [MindSpore Serving](#mindspore-serving)
- [Overview](#overview)
- [Installation](#installation)
- [Installing Serving](#installing-serving)
- [Configuring Environment Variables](#configuring-environment-variables)
- [Quick Start](#quick-start)
- [Documents](#documents)
- [Developer Guide](#developer-guide)
- [Community](#community)
- [Governance](#governance)
- [Communication](#communication)
- [Contributions](#contributions)
- [Release Notes](#release-notes)
- [License](#license)
<!-- /TOC -->
## Overview
MindSpore Serving is a lightweight and high-performance service module that helps MindSpore developers efficiently
deploy online inference services in the production environment. After completing model training on MindSpore, you can
export the MindSpore model and use MindSpore Serving to create an inference service for the model.
MindSpore Serving architecture:
<img src="docs/architecture.png" alt="MindSpore Architecture" width="600"/>
MindSpore Serving includes two parts: `Client` and `Server`. On a `Client` node, you can deliver inference service
commands through the gRPC or RESTful API. The `Server` consists of a `Main` node and one or more `Worker` nodes.
The `Main` node manages all `Worker` nodes and their model information, accepts user requests from `Client`s, and
distributes the requests to `Worker` nodes. A `Servable` is deployed on a `Worker` node; it represents a single model or a
combination of multiple models and can provide different services through various methods.
On the server side, when [MindSpore](https://www.mindspore.cn/) is used as the inference backend, MindSpore Serving
supports the Ascend 910/310P/310 and Nvidia GPU environments. When [MindSpore Lite](https://www.mindspore.cn/lite) is
used as the inference backend, MindSpore Serving supports the Ascend 310, Nvidia GPU, and CPU environments. The `Client` does not
depend on specific hardware platforms.
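The request flow described above (`Client` → `Main` → `Worker`) can be illustrated with a small, self-contained Python sketch. Note that `MainNode` and `Worker` here are hypothetical stand-ins that only mirror the roles described; they are not MindSpore Serving's actual classes or APIs.

```python
from itertools import cycle

class Worker:
    """Hypothetical stand-in for a Serving `Worker` hosting one servable."""
    def __init__(self, servable_name):
        self.servable_name = servable_name

    def infer(self, request):
        # A real Worker would run the model; here we just echo the request.
        return {"servable": self.servable_name, "result": request}

class MainNode:
    """Hypothetical stand-in for the `Main` node: it tracks the Workers
    registered for each servable and dispatches incoming Client requests
    to them (round-robin, as one simple distribution policy)."""
    def __init__(self):
        self._workers = {}  # servable name -> round-robin cycle of Workers

    def register(self, servable_name, workers):
        self._workers[servable_name] = cycle(workers)

    def dispatch(self, servable_name, request):
        worker = next(self._workers[servable_name])
        return worker.infer(request)

main = MainNode()
main.register("add", [Worker("add"), Worker("add")])
print(main.dispatch("add", {"x1": 1, "x2": 2}))
```

The real `Main` node also manages model information and health of its `Worker` nodes; this sketch only shows the dispatch role.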
MindSpore Serving provides the following functions:
- gRPC and RESTful APIs on clients
- Pre-processing and post-processing assembled with models
- Batching: multiple instance requests are split and combined to meet the `batch size` requirement of the model
- Simple Python APIs on clients
- Multi-model combination: combined multi-model scenarios and single-model scenarios use the same set of
  interfaces
- Distributed model inference
## Installation
For details about how to install and configure MindSpore Serving, see the [MindSpore Serving installation page](https://www.mindspore.cn/serving/docs/en/master/serving_install.html).
## Quick Start
[MindSpore-based Inference Service Deployment](https://www.mindspore.cn/serving/docs/en/master/serving_example.html)
demonstrates how to use MindSpore Serving.
## Documents
### Developer Guide
- [gRPC-based MindSpore Serving Access](https://www.mindspore.cn/serving/docs/en/master/serving_grpc.html)
- [RESTful-based MindSpore Serving Access](https://www.mindspore.cn/serving/docs/en/master/serving_restful.html)
- [Services Provided Through Model Configuration](https://www.mindspore.cn/serving/docs/en/master/serving_model.html)
- [Services Composed of Multiple Models](https://www.mindspore.cn/serving/docs/en/master/serving_model.html#services-composed-of-multiple-models)
- [MindSpore Serving-based Distributed Inference Service Deployment](https://www.mindspore.cn/serving/docs/en/master/serving_distributed_example.html)
For more information about installation, tutorials, and APIs,
see the [MindSpore Serving Python API](https://www.mindspore.cn/serving/docs/en/master/server.html).
## Community
### Governance
[MindSpore Open Governance](https://gitee.com/mindspore/community/blob/master/governance.md)
### Communication
- [MindSpore Slack](https://join.slack.com/t/mindspore/shared_invite/zt-dgk65rli-3ex4xvS4wHX7UDmsQmfu8w) developer
communication platform
## Contributions
Contributions to MindSpore Serving are welcome.
## Release Notes
[RELEASE](RELEASE.md)
## License
[Apache License 2.0](LICENSE)