| Field | Value |
| ----------------------- | ----------------------------------- |
| Name | apstone |
| Version | 0.0.8 |
| home_page | https://github.com/ykk648/apstone |
| Summary | ai_power base stone |
| upload_time | 2024-01-09 02:12:13 |
| maintainer | |
| docs_url | None |
| author | ykk648 |
| requires_python | |
| license | |
| keywords | |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
### Introduction
The base stone of AI_power; it maintains the inference wrappers for all AI_power models.
#### Wrapper
- Supplies inference wrappers for different runtimes, including ONNX, TensorRT and Torch JIT;
- Supports the different ONNX Runtime Execution Providers (EPs): cpu/gpu/trt/trt16/int8 (a possible EP mapping is sketched after this list);
- Provides high-level inference wrappers for converted MMLab models, including MMPose and MMDet.
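The cpu/gpu/trt/trt16/int8 shorthand presumably selects an ONNX Runtime Execution Provider under the hood. The exact mapping inside apstone is not documented here, so the sketch below is only a plausible version of it; the shorthand-to-EP pairing is an assumption, while the provider names and TensorRT options are standard ONNX Runtime.

```python
# Hypothetical mapping from apstone's provider shorthand to ONNX Runtime EPs.
# The pairing is an assumption; 'trt_fp16_enable' / 'trt_int8_enable' are
# documented TensorRT EP options in ONNX Runtime.
import onnxruntime as ort

PROVIDER_MAP = {
    'cpu':   ['CPUExecutionProvider'],
    'gpu':   ['CUDAExecutionProvider', 'CPUExecutionProvider'],
    'trt':   [('TensorrtExecutionProvider', {}), 'CUDAExecutionProvider'],
    'trt16': [('TensorrtExecutionProvider', {'trt_fp16_enable': True}), 'CUDAExecutionProvider'],
    'int8':  [('TensorrtExecutionProvider', {'trt_int8_enable': True}), 'CUDAExecutionProvider'],
}

def make_session(onnx_path, provider='cpu'):
    """Create an InferenceSession for the given shorthand provider name."""
    return ort.InferenceSession(onnx_path, providers=PROVIDER_MAP[provider])
```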
#### Model Convert
- torch2jit, torch2onnx, etc. (plain-PyTorch equivalents are sketched after this list)
- detectron2 to onnx
- modelscope to onnx
- onnx2simple2trt
- tf2pb2onnx
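apstone's own conversion helpers are not shown here; for orientation, the plain-PyTorch equivalents of torch2jit and torch2onnx look roughly like the following (the ResNet-18 model and input shape are placeholders):

```python
# Plain-PyTorch sketch of torch2jit and torch2onnx, not apstone's actual helpers.
import torch
import torchvision

model = torchvision.models.resnet18(weights=None).eval()  # placeholder model
dummy = torch.randn(1, 3, 224, 224)                       # placeholder input

# torch2jit: trace the model to TorchScript
torch.jit.trace(model, dummy).save('resnet18.jit')

# torch2onnx: export to ONNX with a dynamic batch dimension
torch.onnx.export(
    model, dummy, 'resnet18.onnx',
    input_names=['input'], output_names=['output'],
    dynamic_axes={'input': {0: 'batch'}, 'output': {0: 'batch'}},
    opset_version=17,
)
```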
#### Model Tools
- torch model editing
- ONNX model shape/speed testing across different EPs (a stock-onnxruntime shape check is sketched after this list)
- common scripts from onnxruntime
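The shape-checking side of the tooling can be approximated with stock onnxruntime; this is only a sketch of the kind of inspection apstone wraps, not its actual code:

```python
# Print the input/output names, shapes and dtypes of an ONNX model.
import onnxruntime as ort

session = ort.InferenceSession('model.onnx', providers=['CPUExecutionProvider'])
for i in session.get_inputs():
    print('input :', i.name, i.shape, i.type)
for o in session.get_outputs():
    print('output:', o.name, o.shape, o.type)
```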
### Usage
#### ONNX model speed test
```python
from apstone import ONNXModel
onnx_p = 'pretrain_models/sr_lib/realesr-general-x4v3-dynamic.onnx'
input_dynamic_shape = (1, 3, 96, 72)  # concrete shape for a dynamic model; pass None to keep the model's own shape
# provider options: 'cpu', 'gpu', 'trt', 'trt16', 'int8'
ONNXModel(onnx_p, provider='cpu', debug=True, input_dynamic_shape=input_dynamic_shape).speed_test()
```
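For comparison, a hand-rolled version of the same speed test with stock onnxruntime would look roughly like this (the warm-up and iteration counts are arbitrary choices, not apstone's defaults):

```python
# Hand-rolled speed test with plain onnxruntime, for comparison with speed_test().
import time
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    'pretrain_models/sr_lib/realesr-general-x4v3-dynamic.onnx',
    providers=['CPUExecutionProvider'],
)
name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 96, 72).astype(np.float32)

for _ in range(5):                        # warm-up runs
    session.run(None, {name: dummy})

n = 50
start = time.perf_counter()
for _ in range(n):
    session.run(None, {name: dummy})
print(f'{(time.perf_counter() - start) / n * 1000:.2f} ms / run')
```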
### Install
```sh
pip install apstone
```
#### Envs
| Provider / backend | Requirement |
| ------------------ | ------------------------------------------------------------ |
| cpu | `pip install onnxruntime` |
| gpu | `pip install onnxruntime-gpu` |
| trt/trt16/int8 | onnxruntime-gpu built with the TensorRT Execution Provider |
| TensorRT | `pip install tensorrt pycuda` |
| torch JIT | install [PyTorch](https://pytorch.org/get-started/locally/) |
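To confirm which of the providers from the table are actually available in an installed onnxruntime build, you can query the runtime directly:

```python
# List the Execution Providers compiled into the installed onnxruntime package.
import onnxruntime as ort

print(ort.get_available_providers())
# e.g. ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider']
# on an onnxruntime-gpu build that was compiled with the TensorRT EP
```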
### Raw data

```json
{
    "_id": null,
    "home_page": "https://github.com/ykk648/apstone",
    "name": "apstone",
    "maintainer": "",
    "docs_url": null,
    "requires_python": "",
    "maintainer_email": "",
    "keywords": "",
    "author": "ykk648",
    "author_email": "ykk648@gmail.com",
    "download_url": "https://files.pythonhosted.org/packages/75/5b/6ac2ffe7fd2dee1e8f522c904acdaa517ab58bd46db9549404f533113eed/apstone-0.0.8.tar.gz",
    "platform": null,
    "description": "### Introduction\n\nBase stone of AI_power, maintain all inference of AI_Power models.\n\n#### Wrapper\n\n- Supply different model infer wrapper, including ONNX/TensorRT/Torch JIT;\n- Support onnx different Execution Providers (EP) , including cpu/gpu/trt/trt16/int8;\n- High level mmlab model (converted) infer wrapper, including MMPose/MMDet;\n\n#### Model Convert\n\n- torch2jit torch2onnx etc.\n- detectron2 to onnx\n- modelscope to onnx\n- onnx2simple2trt\n- tf2pb2onnx\n\n#### Model Tools\n\n- torch model edit\n- onnx model shape/speed test (different EP)\n- common scripts from onnxruntime\n\n### Usage\n\n#### onnx model speed test\n```python\nfrom apstone import ONNXModel\n\nonnx_p = 'pretrain_models/sr_lib/realesr-general-x4v3-dynamic.onnx'\ninput_dynamic_shape = (1, 3, 96, 72) # None\n# cpu gpu trt trt16 int8\nONNXModel(onnx_p, provider='cpu', debug=True, input_dynamic_shape=input_dynamic_shape).speed_test()\n```\n\n### Install\n\n```sh\npip install apstone\n```\n\n#### Envs\n\n| Execution Providers | Needs |\n| ------------------- | ----------------------------------------------------------- |\n| cpu | pip install onnxruntime |\n| gpu | pip install onnxruntime-gpu |\n| trt/trt16/int8 | onnxruntime-gpu compiled with tensorrt EP |\n| TensorRT | pip install tensorrt pycuda |\n| torch JIT | install [pytorch](https://pytorch.org/get-started/locally/) |\n\n",
    "bugtrack_url": null,
    "license": "",
    "summary": "ai_power base stone",
    "version": "0.0.8",
    "project_urls": {
        "Bug Tracker": "https://github.com/ykk648/apstone/issues",
        "Homepage": "https://github.com/ykk648/apstone"
    },
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "755b6ac2ffe7fd2dee1e8f522c904acdaa517ab58bd46db9549404f533113eed",
                "md5": "8a820b749c0b159508e5934469a3bd4a",
                "sha256": "0e93f99275affbe0abf733f2ec065a23cfe6bc64365d11e666c3b3f374bc6f60"
            },
            "downloads": -1,
            "filename": "apstone-0.0.8.tar.gz",
            "has_sig": false,
            "md5_digest": "8a820b749c0b159508e5934469a3bd4a",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": null,
            "size": 41309,
            "upload_time": "2024-01-09T02:12:13",
            "upload_time_iso_8601": "2024-01-09T02:12:13.769820Z",
            "url": "https://files.pythonhosted.org/packages/75/5b/6ac2ffe7fd2dee1e8f522c904acdaa517ab58bd46db9549404f533113eed/apstone-0.0.8.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-01-09 02:12:13",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "ykk648",
    "github_project": "apstone",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "lcname": "apstone"
}
```