open-inference-grpc

- **Name:** open-inference-grpc
- **Version:** 2.0.0a1
- **Summary:** Generated gRPC client library for the open-inference-protocol
- **Home page:** https://github.com/open-inference/python-clients
- **Upload time:** 2023-12-04 20:07:52
- **Author / Maintainer:** Zev Isert
- **Requires Python:** >=3.8,<4.0
- **License:** Apache-2.0
- **Keywords:** mlserver, kserve, triton, seldon, openvino, amdinfer, pytorch-serve, grpc
# Open Inference Protocol gRPC Client

<p>
<a href="https://pypi.org/project/open-inference-grpc/">
    <img src="https://badge.fury.io/py/open-inference-grpc.svg" alt="Package version">
</a>
</p>

`open-inference-grpc` is a client library generated from the gRPC protocol definition tracked in the [open-inference/open-inference-protocol](https://github.com/open-inference/open-inference-protocol/blob/main/specification/protocol/open_inference_grpc.proto) repository.

---

## Installation

This package requires Python 3.8 or greater.

Install it with your preferred package manager from [pypi.org/project/open-inference-grpc/](https://pypi.org/project/open-inference-grpc/), for example:

```console
# with pip
$ pip install open-inference-grpc

# or with poetry
$ poetry add open-inference-grpc
```

> A REST-based python client ([`open-inference-openapi`](../open-inference-openapi/README.md)) also exists for the Open Inference Protocol, and can be installed alongside this gRPC client, as both are distributed as [namespace packages](https://packaging.python.org/en/latest/guides/packaging-namespace-packages/#packaging-namespace-packages).
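
With both installed, the two clients import side by side from the shared `open_inference` namespace. A minimal sketch (the gRPC import path is the one used in the example below; the REST client's module path is an assumption based on its package name, so verify it against that package's README):

```python
# From open-inference-grpc, as used in the example below
from open_inference.grpc.service import GRPCInferenceServiceStub

# From open-inference-openapi -- hypothetical module path, check that package
# from open_inference.openapi import OpenInferenceClient
```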

## Example

```python
# These dependencies are installed by open-inference-grpc
import grpc
from google.protobuf.json_format import MessageToDict

from open_inference.grpc.service import GRPCInferenceServiceStub
from open_inference.grpc.protocol import (
    ServerReadyRequest,
    ModelReadyRequest,
    ModelMetadataRequest,
    ModelInferRequest,
)


with grpc.insecure_channel("localhost:8081") as channel:
    client = GRPCInferenceServiceStub(channel)

    # Check that the server is ready and that the iris model is loaded
    client.ServerReady(ServerReadyRequest())
    client.ModelReady(ModelReadyRequest(name="iris-model"))

    # Make an inference request
    pred = client.ModelInfer(
        ModelInferRequest(
            model_name="iris-model",
            inputs=[
                {
                    "name": "input-0",
                    "datatype": "FP64",
                    "shape": [1, 4],
                    "contents": {"fp64_contents": [5.3, 3.7, 1.5, 0.2]},
                }
            ],
        )
    )

print(MessageToDict(pred))
# {
#     "modelName": "iris-model",
#     "parameters": {"content_type": {"stringParam": "np"}},
#     "outputs": [
#         {
#             "name": "output-1",
#             "datatype": "INT64",
#             "shape": ["1", "1"],
#             "parameters": {"content_type": {"stringParam": "np"}},
#             "contents": {"int64Contents": ["0"]},
#         }
#     ],
# }
```
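
The stub exposes the protocol's other RPCs in the same way. As a brief sketch, the `ModelMetadataRequest` imported above can be used to inspect the model (this reuses `client` from inside the `with` block; the exact metadata returned depends on the server):

```python
# Inside the `with grpc.insecure_channel(...)` block above
meta = client.ModelMetadata(ModelMetadataRequest(name="iris-model"))
# Tensor names, datatypes, and shapes the model accepts and returns
print(MessageToDict(meta))
```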

<details><summary>Async versions of the same APIs are also available: create the channel with <code>grpc.aio</code> instead, then <code>await</code> each request.</summary>

```py
async with grpc.aio.insecure_channel('localhost:8081') as channel:
    stub = GRPCInferenceServiceStub(channel)
    await stub.ServerReady(ServerReadyRequest())
```
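
A complete, runnable version of the same flow might look like this (a sketch; it assumes the same local server on port 8081 as the synchronous example):

```py
import asyncio

import grpc

from open_inference.grpc.service import GRPCInferenceServiceStub
from open_inference.grpc.protocol import ServerReadyRequest


async def main() -> None:
    # grpc.aio channels must be created and used inside a running event loop
    async with grpc.aio.insecure_channel("localhost:8081") as channel:
        stub = GRPCInferenceServiceStub(channel)
        # Each RPC returns an awaitable rather than blocking the thread
        await stub.ServerReady(ServerReadyRequest())


asyncio.run(main())
```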

</details>

## Dependencies

The `open-inference-grpc` Python package depends only on [`grpcio`](https://github.com/grpc/grpc), the Python implementation of the gRPC transport.
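
Because the client is an ordinary `grpcio` stub, standard channel configuration applies. For example, connecting over TLS rather than plaintext might look like the sketch below (the hostname is a placeholder; `grpc.ssl_channel_credentials()` with no arguments trusts the default root certificates):

```python
import grpc

from open_inference.grpc.service import GRPCInferenceServiceStub
from open_inference.grpc.protocol import ServerReadyRequest

# TLS credentials built from the default trust store
creds = grpc.ssl_channel_credentials()

with grpc.secure_channel("inference.example.com:443", creds) as channel:
    client = GRPCInferenceServiceStub(channel)
    client.ServerReady(ServerReadyRequest())
```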

## Contribute

This client is largely generated automatically by [`grpc-tools`](https://grpc.io/docs/languages/python/quickstart/#generate-grpc-code), with a small amount of build post-processing in [build.py](./build.py).

> Run `python build.py` to build this package; it will:
>
> 1. Download `proto/open_inference_grpc.proto` from [open-inference/open-inference-protocol/](https://github.com/open-inference/open-inference-protocol/blob/main/specification/protocol/open_inference_grpc.proto) if it is not already present
> 1. Run `grpcio_tools.protoc` to create the Python client (see the sketch below)
> 1. Postprocess filenames and imports
> 1. Prepend the Apache 2.0 License preamble
> 1. Format with [black](https://github.com/psf/black)
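
For reference, the code-generation step above boils down to an invocation of the `grpc_tools` compiler along these lines (a sketch; the authoritative arguments live in [build.py](./build.py)):

```python
from grpc_tools import protoc

# Equivalent to:
#   python -m grpc_tools.protoc -Iproto --python_out=. \
#       --grpc_python_out=. proto/open_inference_grpc.proto
protoc.main([
    "grpc_tools.protoc",        # argv[0], ignored by the compiler
    "-Iproto",                  # search path for .proto imports
    "--python_out=.",           # protobuf message classes
    "--grpc_python_out=.",      # gRPC service stubs
    "proto/open_inference_grpc.proto",
])
```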

If you want to contribute to the open-inference-protocol itself, please create an issue or PR in the [open-inference/open-inference-protocol](https://github.com/open-inference/open-inference-protocol) repository.

## License

By contributing to the Open Inference Protocol Python client repository, you agree that your contributions will be licensed under its Apache 2.0 License.