# Python Infer Client
[Tests](https://github.com/rabiloo/python-infer-client/actions/workflows/test.yml) · [PyPI](https://pypi.org/project/infer-client)
## About Python Infer Client
[Python Infer Client](https://github.com/rabiloo/python-infer-client) is a Python inference client library. It provides a single interface for interacting with multiple types of inference clients, such as onnxruntime and tritonclient.
## Install
To use the tritonclient backend (only gRPC is supported):
```
$ pip install infer-client[tritonclient]
```
To use the onnxruntime backend (both CPU and GPU are supported):
```
$ pip install infer-client[onnxruntime]
# or, for GPU support:
$ pip install infer-client[onnxruntime-gpu]
```
## Usage
```
import numpy as np

from infer_client.adapters.onnx import OnnxInferenceAdapter
from infer_client.inference import Inference

# Create an onnxruntime-backed adapter and wrap it in the common Inference interface
adapter = OnnxInferenceAdapter(model_name="resources/test_classify", version="1", limit_mem_gpu=-1)
infer_client_obj = Inference(adapter)

# Run inference: map input names to numpy arrays and list the output names to fetch
# (most image-classification ONNX models expect float32 input)
res = infer_client_obj.inference({"input": np.random.rand(1, 3, 224, 224).astype(np.float32)}, ["output"])
```
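Because every adapter is wrapped in the same `Inference` interface, switching backends only means constructing a different adapter. The sketch below illustrates this with a Triton gRPC backend; the module path, class name, and constructor arguments (`TritonInferenceAdapter`, `url`, ...) are assumptions for illustration, not the library's confirmed API, so check the package's `adapters` module for the actual names.

```
import numpy as np

# Assumed import path and class name for the Triton adapter
from infer_client.adapters.triton import TritonInferenceAdapter
from infer_client.inference import Inference

# Assumed constructor: a Triton gRPC endpoint plus the model name and version
adapter = TritonInferenceAdapter(model_name="test_classify", version="1", url="localhost:8001")
infer_client_obj = Inference(adapter)

# Same call as with the ONNX adapter: input-name -> array mapping and requested output names
res = infer_client_obj.inference({"input": np.random.rand(1, 3, 224, 224).astype(np.float32)}, ["output"])
```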
## Changelog
Please see [CHANGELOG](CHANGELOG.md) for more information on what has changed recently.
## Contributing
Please see [CONTRIBUTING](.github/CONTRIBUTING.md) for details.
## Security Vulnerabilities
Please review [our security policy](../../security/policy) on how to report security vulnerabilities.
## Credits
- [Dao Quang Duy](https://github.com/duydq12)
- [All Contributors](../../contributors)
## License
The MIT License (MIT). Please see [License File](LICENSE) for more information.
## Reference
- [Onnxruntime](https://github.com/microsoft/onnxruntime)
- [Triton Client](https://github.com/triton-inference-server/client)