# TorchServe Python Client
<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->
## Install
``` sh
pip install torchserve_client
```
## Usage
Using `torchserve_client` is a breeze! It supports both the REST and
gRPC APIs.
### REST Client
To make calls to the REST endpoints, simply initialize a
[`TorchServeClientREST`](https://Ankur-singh.github.io/torchserve_client/rest.html#torchserveclientrest)
object as shown below:
``` python
from torchserve_client import TorchServeClientREST
# Initialize the REST TorchServeClient object
ts_client = TorchServeClientREST()
ts_client
```
    TorchServeClient(base_url=http://localhost, management_port=8081, inference_port=8080)
If you wish to customize the *base URL*, *management port*, or
*inference port* of your TorchServe server, you can pass them as
arguments during initialization:
``` python
from torchserve_client import TorchServeClientREST
# Customize the base URL, management port, and inference port
ts_client = TorchServeClientREST(base_url='http://your-torchserve-server.com',
                                 management_port=8081, inference_port=8080)
ts_client
```
    TorchServeClient(base_url=http://your-torchserve-server.com, management_port=8081, inference_port=8080)
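Under the hood, TorchServe's REST surface is split across the two ports shown in the repr: the inference API (default 8080, e.g. `POST /predictions/{model_name}`) and the management API (default 8081, e.g. `GET /models`). As a rough sketch of the URLs such a client targets (the helper functions and the model name here are illustrative, not part of the package):

``` python
def inference_url(base_url="http://localhost", inference_port=8080,
                  model_name="mnist"):
    """Build the inference endpoint URL, e.g. for a POST with input data.

    `model_name` is a placeholder; use whatever model is registered on
    your TorchServe server.
    """
    return f"{base_url}:{inference_port}/predictions/{model_name}"


def management_url(base_url="http://localhost", management_port=8081,
                   path="models"):
    """Build a management endpoint URL, e.g. to list registered models."""
    return f"{base_url}:{management_port}/{path}"


print(inference_url())   # http://localhost:8080/predictions/mnist
print(management_url())  # http://localhost:8081/models
```

A request against a running server would then be something like `requests.post(inference_url(), data=payload)`.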
### gRPC Client
To create a gRPC client, simply initialize a
[`TorchServeClientGRPC`](https://Ankur-singh.github.io/torchserve_client/grpc.html#torchserveclientgrpc)
object:
``` python
from torchserve_client import TorchServeClientGRPC
# Initialize the gRPC TorchServeClient object
ts_client = TorchServeClientGRPC()
ts_client
```
    TorchServeClientGRPC(base_url=localhost, management_port=7071, inference_port=7070)
To customize the base URL and default ports, pass them as arguments
during initialization:
``` python
from torchserve_client import TorchServeClientGRPC
# Initialize the gRPC TorchServeClient object
ts_client = TorchServeClientGRPC(base_url='http://your-torchserve-server.com',
                                 management_port=7071, inference_port=7070)
ts_client
```
    TorchServeClientGRPC(base_url=your-torchserve-server.com, management_port=7071, inference_port=7070)
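Note that the repr shows the gRPC client drops the `http://` scheme: gRPC channels are addressed as plain `host:port` targets, not URLs. A minimal sketch of that normalization (a hypothetical helper, not the package's actual code):

``` python
def grpc_target(base_url: str, port: int) -> str:
    """Turn a base URL into a gRPC channel target like 'host:7070'.

    Strips any 'http://' or 'https://' scheme and a trailing slash,
    since gRPC expects a bare host:port string.
    """
    host = base_url.split("://", 1)[-1].rstrip("/")
    return f"{host}:{port}"


print(grpc_target("http://your-torchserve-server.com", 7070))
# your-torchserve-server.com:7070
```

A channel would then be opened with something like `grpc.insecure_channel(grpc_target(base_url, inference_port))`.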
With these intuitive APIs at your disposal, you can harness the full
power of the Management and Inference APIs and take your application to
the next level. Happy inferencing! 🚀🔥