# ssi4onnx
**S**imple **S**hape **I**nference tool for **ONNX**.
https://github.com/PINTO0309/simple-onnx-processing-tools
[Downloads](https://pepy.tech/project/ssi4onnx) [PyPI](https://pypi.org/project/ssi4onnx/) [CodeQL](https://github.com/PINTO0309/ssi4onnx/actions?query=workflow%3ACodeQL)
<p align="center">
<img src="https://user-images.githubusercontent.com/33194443/170158744-69bfdb6a-e032-4ed9-982c-ee9ac8889022.png" />
</p>
## 1. Setup
### 1-1. HostPC
```bash
### option
$ echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc \
&& source ~/.bashrc
### run
$ pip install -U onnx \
&& python3 -m pip install -U onnx_graphsurgeon --index-url https://pypi.ngc.nvidia.com \
&& pip install -U ssi4onnx
```
### 1-2. Docker
https://github.com/PINTO0309/simple-onnx-processing-tools#docker
## 2. CLI Usage
```
$ ssi4onnx -h
usage:
  ssi4onnx [-h]
    -if INPUT_ONNX_FILE_PATH
    [-of OUTPUT_ONNX_FILE_PATH]
    [-n]

optional arguments:
  -h, --help
    show this help message and exit.

  -if INPUT_ONNX_FILE_PATH, --input_onnx_file_path INPUT_ONNX_FILE_PATH
    Input onnx file path.

  -of OUTPUT_ONNX_FILE_PATH, --output_onnx_file_path OUTPUT_ONNX_FILE_PATH
    Output onnx file path.

  -n, --non_verbose
    Do not show all information logs. Only error logs are displayed.
```
## 3. In-script Usage
```python
>>> from ssi4onnx import shape_inference
>>> help(shape_inference)
Help on function shape_inference in module ssi4onnx.onnx_shape_inference:
shape_inference(
    input_onnx_file_path: Union[str, NoneType] = '',
    onnx_graph: Union[onnx.onnx_ml_pb2.ModelProto, NoneType] = None,
    output_onnx_file_path: Union[str, NoneType] = '',
    non_verbose: Union[bool, NoneType] = False
) -> onnx.onnx_ml_pb2.ModelProto
    Parameters
    ----------
    input_onnx_file_path: Optional[str]
        Input onnx file path.
        Either input_onnx_file_path or onnx_graph must be specified.
        Default: ''

    onnx_graph: Optional[onnx.ModelProto]
        onnx.ModelProto.
        Either input_onnx_file_path or onnx_graph must be specified.
        If onnx_graph is specified, input_onnx_file_path is ignored and onnx_graph is processed.

    output_onnx_file_path: Optional[str]
        Output onnx file path. If not specified, no ONNX file is output.
        Default: ''

    non_verbose: Optional[bool]
        Do not show all information logs. Only error logs are displayed.
        Default: False

    Returns
    -------
    estimated_graph: onnx.ModelProto
        Shape-estimated onnx ModelProto.
```
## 4. CLI Execution
```bash
$ ssi4onnx --input_onnx_file_path nanodet_320x320.onnx
```
## 5. In-script Execution
```python
from ssi4onnx import shape_inference
estimated_graph = shape_inference(
input_onnx_file_path="crestereo_init_iter2_120x160.onnx",
)
```
## 6. Sample
### Before

### After

## 7. Reference
1. https://github.com/onnx/onnx/blob/main/docs/Operators.md
2. https://docs.nvidia.com/deeplearning/tensorrt/onnx-graphsurgeon/docs/index.html
3. https://github.com/NVIDIA/TensorRT/tree/main/tools/onnx-graphsurgeon
4. https://github.com/PINTO0309/simple-onnx-processing-tools
5. https://github.com/PINTO0309/PINTO_model_zoo
## 8. Issues
https://github.com/PINTO0309/simple-onnx-processing-tools/issues