ssi4onnx


Name: ssi4onnx
Version: 1.0.3
Home page: https://github.com/PINTO0309/ssi4onnx
Summary: Simple Shape Inference tool for ONNX.
Upload time: 2024-04-30 06:29:18
Author: Katsuya Hyodo
Requires Python: >=3.6
License: MIT License
Requirements: none recorded
# ssi4onnx
**S**imple **S**hape **I**nference tool for **ONNX**.


https://github.com/PINTO0309/simple-onnx-processing-tools

[![Downloads](https://static.pepy.tech/personalized-badge/ssi4onnx?period=total&units=none&left_color=grey&right_color=brightgreen&left_text=Downloads)](https://pepy.tech/project/ssi4onnx) ![GitHub](https://img.shields.io/github/license/PINTO0309/ssi4onnx?color=2BAF2B) [![PyPI](https://img.shields.io/pypi/v/ssi4onnx?color=2BAF2B)](https://pypi.org/project/ssi4onnx/) [![CodeQL](https://github.com/PINTO0309/ssi4onnx/workflows/CodeQL/badge.svg)](https://github.com/PINTO0309/ssi4onnx/actions?query=workflow%3ACodeQL)

<p align="center">
  <img src="https://user-images.githubusercontent.com/33194443/170158744-69bfdb6a-e032-4ed9-982c-ee9ac8889022.png" />
</p>

## 1. Setup
### 1-1. HostPC
```bash
### option
$ echo export PATH="~/.local/bin:$PATH" >> ~/.bashrc \
&& source ~/.bashrc

### run
$ pip install -U onnx \
&& python3 -m pip install -U onnx_graphsurgeon --index-url https://pypi.ngc.nvidia.com \
&& pip install -U ssi4onnx
```
### 1-2. Docker
https://github.com/PINTO0309/simple-onnx-processing-tools#docker

## 2. CLI Usage
```
$ ssi4onnx -h

usage:
    ssi4onnx [-h]
    -if INPUT_ONNX_FILE_PATH
    [-of OUTPUT_ONNX_FILE_PATH]
    [-n]

optional arguments:
  -h, --help
        show this help message and exit.

  -if INPUT_ONNX_FILE_PATH, --input_onnx_file_path INPUT_ONNX_FILE_PATH
        Input onnx file path.

  -of OUTPUT_ONNX_FILE_PATH, --output_onnx_file_path OUTPUT_ONNX_FILE_PATH
        Output onnx file path.

  -n, --non_verbose
        Do not show all information logs. Only error logs are displayed.
```

## 3. In-script Usage
```python
>>> from ssi4onnx import shape_inference
>>> help(shape_inference)

Help on function shape_inference in module ssi4onnx.onnx_shape_inference:

shape_inference(
    input_onnx_file_path: Union[str, NoneType] = '',
    onnx_graph: Union[onnx.onnx_ml_pb2.ModelProto, NoneType] = None,
    output_onnx_file_path: Union[str, NoneType] = '',
    non_verbose: Union[bool, NoneType] = False
) -> onnx.onnx_ml_pb2.ModelProto

    Parameters
    ----------
    input_onnx_file_path: Optional[str]
        Input onnx file path.
        Either input_onnx_file_path or onnx_graph must be specified.
        Default: ''

    onnx_graph: Optional[onnx.ModelProto]
        onnx.ModelProto.
        Either input_onnx_file_path or onnx_graph must be specified.
        If onnx_graph is specified, input_onnx_file_path is ignored and onnx_graph is processed.

    output_onnx_file_path: Optional[str]
        Output onnx file path. If not specified, no ONNX file is output.
        Default: ''

    non_verbose: Optional[bool]
        Do not show all information logs. Only error logs are displayed.
        Default: False

    Returns
    -------
    estimated_graph: onnx.ModelProto
        Shape estimated onnx ModelProto.
```
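
For example, an already-loaded `onnx.ModelProto` can be passed via `onnx_graph` instead of a file path; per the help text above, `input_onnx_file_path` is then ignored. A minimal sketch (the file names are placeholders):

```python
import onnx
from ssi4onnx import shape_inference

# Load an existing model into memory (placeholder file name).
model = onnx.load("nanodet_320x320.onnx")

# When onnx_graph is specified, input_onnx_file_path is ignored.
# output_onnx_file_path writes the shape-inferred model to disk.
estimated_graph = shape_inference(
    onnx_graph=model,
    output_onnx_file_path="nanodet_320x320_shapes.onnx",
    non_verbose=True,
)
```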

## 4. CLI Execution
```bash
$ ssi4onnx --input_onnx_file_path nanodet_320x320.onnx
```
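
The output path and log verbosity can be controlled with the documented `-of`/`--output_onnx_file_path` and `-n`/`--non_verbose` options, for example (the output file name is a placeholder):

```bash
$ ssi4onnx \
--input_onnx_file_path nanodet_320x320.onnx \
--output_onnx_file_path nanodet_320x320_shapes.onnx \
--non_verbose
```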

## 5. In-script Execution
```python
from ssi4onnx import shape_inference

estimated_graph = shape_inference(
    input_onnx_file_path="crestereo_init_iter2_120x160.onnx",
)
```
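
Continuing from the snippet above, the returned `onnx.ModelProto` can be saved and inspected with the standard `onnx` API; the intermediate tensor shapes filled in by shape inference appear in `graph.value_info`. A minimal sketch (the output file name is a placeholder):

```python
import onnx

# Persist the shape-inferred model (placeholder file name).
onnx.save(estimated_graph, "crestereo_init_iter2_120x160_shapes.onnx")

# Intermediate tensors annotated by shape inference are listed in value_info.
for value_info in estimated_graph.graph.value_info:
    dims = [d.dim_value if d.dim_value else d.dim_param
            for d in value_info.type.tensor_type.shape.dim]
    print(value_info.name, dims)
```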

## 6. Sample
### Before
![image](https://user-images.githubusercontent.com/33194443/169821344-f3560cfe-476f-4480-9c76-8c71476ebb57.png)

### After
![image](https://user-images.githubusercontent.com/33194443/169821518-bb58ea27-37d7-42d7-84c6-0e40c522760e.png)

## 7. Reference
1. https://github.com/onnx/onnx/blob/main/docs/Operators.md
2. https://docs.nvidia.com/deeplearning/tensorrt/onnx-graphsurgeon/docs/index.html
3. https://github.com/NVIDIA/TensorRT/tree/main/tools/onnx-graphsurgeon
4. https://github.com/PINTO0309/simple-onnx-processing-tools
5. https://github.com/PINTO0309/PINTO_model_zoo

## 8. Issues
https://github.com/PINTO0309/simple-onnx-processing-tools/issues

            
