sbi4onnx

Name: sbi4onnx
Version: 1.0.6
Home page: https://github.com/PINTO0309/sbi4onnx
Summary: A very simple script that only initializes the batch size of ONNX. Simple Batchsize Initialization for ONNX.
Upload time: 2024-04-30 05:52:07
Author: Katsuya Hyodo
Requires Python: >=3.6
License: MIT License
# sbi4onnx
A very simple script that only initializes the batch size of ONNX. **S**imple **B**atchsize **I**nitialization for **ONNX**.

https://github.com/PINTO0309/simple-onnx-processing-tools

[![Downloads](https://static.pepy.tech/personalized-badge/sbi4onnx?period=total&units=none&left_color=grey&right_color=brightgreen&left_text=Downloads)](https://pepy.tech/project/sbi4onnx) ![GitHub](https://img.shields.io/github/license/PINTO0309/sbi4onnx?color=2BAF2B) [![PyPI](https://img.shields.io/pypi/v/sbi4onnx?color=2BAF2B)](https://pypi.org/project/sbi4onnx/) [![CodeQL](https://github.com/PINTO0309/sbi4onnx/workflows/CodeQL/badge.svg)](https://github.com/PINTO0309/sbi4onnx/actions?query=workflow%3ACodeQL)

<p align="center">
  <img src="https://user-images.githubusercontent.com/33194443/170157713-78a1b84e-caf6-4abe-92e4-c9ea51bcaacd.png" />
</p>

# Key concept

- [x] Initializes the ONNX batch size with the specified character string.
- [x] This tool is not a panacea and may fail to initialize models with very complex structures. For example, it may fail on models that contain a `Reshape` whose target shape involves the batch size, or a `Gemm` whose output has a batch dimension other than 1.
- [x] A `Reshape` in a graph cannot contain two or more undefined dimensions, such as `-1`, `N`, `None`, or `unk_*`. Therefore, before initializing the batch size with this tool, make sure that the `Reshape` does not already contain a `-1` dimension. If it does, it may still be possible to initialize the batch size by first rewriting the undefined dimensions of the relevant `Reshape` to static values using **[sam4onnx](https://github.com/PINTO0309/sam4onnx)**. A minimal pre-check sketch follows this list.
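
As a minimal pre-check (not part of sbi4onnx; it uses only the standard `onnx` package, and `model.onnx` is a placeholder path), the sketch below lists `Reshape` nodes whose target-shape initializer already contains a `-1`:

```python
import onnx
from onnx import numpy_helper

model = onnx.load("model.onnx")  # placeholder path
# Only covers target shapes stored as graph initializers.
inits = {i.name: numpy_helper.to_array(i) for i in model.graph.initializer}

for node in model.graph.node:
    if node.op_type == "Reshape" and len(node.input) > 1:
        shape_name = node.input[1]  # the second input of Reshape holds the target shape
        if shape_name in inits and (inits[shape_name] == -1).any():
            # This Reshape already uses an undefined dimension; consider rewriting
            # it to static values (e.g. with sam4onnx) before running sbi4onnx.
            print(node.name or shape_name, inits[shape_name].tolist())
```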

## 1. Setup
### 1-1. HostPC
```bash
### option
$ echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc \
&& source ~/.bashrc

### run
$ pip install -U onnx \
&& python3 -m pip install -U onnx_graphsurgeon --index-url https://pypi.ngc.nvidia.com \
&& pip install --no-deps -U onnx-simplifier \
&& pip install -U sbi4onnx
```
### 1-2. Docker
https://github.com/PINTO0309/simple-onnx-processing-tools#docker

## 2. CLI Usage
```
$ sbi4onnx -h

usage:
  sbi4onnx [-h]
  -if INPUT_ONNX_FILE_PATH
  -of OUTPUT_ONNX_FILE_PATH
  -ics INITIALIZATION_CHARACTER_STRING
  [-dos]
  [-n]

optional arguments:
  -h, --help
      show this help message and exit.

  -if INPUT_ONNX_FILE_PATH, --input_onnx_file_path INPUT_ONNX_FILE_PATH
      Input onnx file path.

  -of OUTPUT_ONNX_FILE_PATH, --output_onnx_file_path OUTPUT_ONNX_FILE_PATH
      Output onnx file path.

  -ics INITIALIZATION_CHARACTER_STRING, --initialization_character_string INITIALIZATION_CHARACTER_STRING
      String to initialize batch size. "-1" or "N" or "xxx", etc...
      Default: '-1'

  -dos, --disable_onnxsim
      Suppress the execution of onnxsim in the backend, deliberately leaving redundant processing in place.

  -n, --non_verbose
      Do not show all information logs. Only error logs are displayed.
```

## 3. In-script Usage
```python
>>> from sbi4onnx import initialize
>>> help(initialize)

Help on function initialize in module sbi4onnx.onnx_batchsize_initialize:

initialize(
  input_onnx_file_path: Union[str, NoneType] = '',
  onnx_graph: Union[onnx.onnx_ml_pb2.ModelProto, NoneType] = None,
  output_onnx_file_path: Union[str, NoneType] = '',
  initialization_character_string: Union[str, NoneType] = '-1',
  non_verbose: Union[bool, NoneType] = False,
  disable_onnxsim: Union[bool, NoneType] = False,
) -> onnx.onnx_ml_pb2.ModelProto

    Parameters
    ----------
    input_onnx_file_path: Optional[str]
        Input onnx file path.
        Either input_onnx_file_path or onnx_graph must be specified.
        Default: ''

    onnx_graph: Optional[onnx.ModelProto]
        onnx.ModelProto.
        Either input_onnx_file_path or onnx_graph must be specified.
        If specified, input_onnx_file_path is ignored and onnx_graph is processed.

    output_onnx_file_path: Optional[str]
        Output onnx file path. If not specified, no ONNX file is output.
        Default: ''

    initialization_character_string: Optional[str]
        String to initialize batch size. "-1" or "N" or "xxx", etc...
        Default: '-1'

    disable_onnxsim: Optional[bool]
        Suppress the execution of onnxsim in the backend, deliberately leaving redundant processing in place.
        Default: False

    non_verbose: Optional[bool]
        Do not show all information logs. Only error logs are displayed.
        Default: False

    Returns
    -------
    changed_graph: onnx.ModelProto
        Changed onnx ModelProto.
```
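
As a hedged sketch of the in-memory path described above (file names are placeholders): when `onnx_graph` is passed and `output_onnx_file_path` is left empty, no file is written, so the returned `ModelProto` has to be saved explicitly:

```python
import onnx
from sbi4onnx import initialize

# Load the model yourself and pass it via onnx_graph; input_onnx_file_path is ignored.
graph = onnx.load("whenet_224x224.onnx")

changed_graph = initialize(
    onnx_graph=graph,
    initialization_character_string="N",
)

# No ONNX file was written (output_onnx_file_path is empty), so save it manually.
onnx.save(changed_graph, "whenet_Nx224x224.onnx")
```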

## 4. CLI Execution
```bash
$ sbi4onnx \
--input_onnx_file_path whenet_224x224.onnx \
--output_onnx_file_path whenet_Nx224x224.onnx \
--initialization_character_string N

$ sbi4onnx \
--input_onnx_file_path whenet_224x224.onnx \
--output_onnx_file_path whenet_Nx224x224.onnx \
--initialization_character_string -1

$ sbi4onnx \
--input_onnx_file_path whenet_224x224.onnx \
--output_onnx_file_path whenet_Nx224x224.onnx \
--initialization_character_string abcdefg
```
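
To confirm that the batch dimension was actually rewritten, the resulting file can be inspected with plain `onnx` (this check is not part of sbi4onnx; the file name matches the commands above):

```python
import onnx

model = onnx.load("whenet_Nx224x224.onnx")
for inp in model.graph.input:
    dims = [
        d.dim_param if d.dim_param else d.dim_value
        for d in inp.type.tensor_type.shape.dim
    ]
    # The first dimension should now be "N", -1, or whatever string was chosen.
    print(inp.name, dims)
```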

## 5. In-script Execution
```python
from sbi4onnx import initialize

onnx_graph = initialize(
  input_onnx_file_path="whenet_224x224.onnx",
  output_onnx_file_path="whenet_Nx224x224.onnx",
  initialization_character_string="abcdefg",
)

# or

onnx_graph = initialize(
  onnx_graph=graph,
  initialization_character_string="abcdefg",
)
```
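
As an optional sanity check (not part of sbi4onnx), the rewritten model can be run with onnxruntime at several batch sizes; the input layout `(N, 3, 224, 224)` below is an assumption based on the whenet example and should be adjusted for your own model:

```python
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("whenet_Nx224x224.onnx", providers=["CPUExecutionProvider"])
input_name = sess.get_inputs()[0].name

for batch in (1, 4):
    # Assumed NCHW layout for the whenet example; adjust for your model.
    x = np.random.rand(batch, 3, 224, 224).astype(np.float32)
    outputs = sess.run(None, {input_name: x})
    print(batch, [o.shape for o in outputs])
```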

## 6. Sample
### Before
![image](https://user-images.githubusercontent.com/33194443/166225839-3b8d6378-e76f-4139-b5d1-db547ba16d16.png)

### After
![image](https://user-images.githubusercontent.com/33194443/166225927-cb39ea2f-85f6-4fdd-afbc-78a46a2475a1.png)

## 7. Reference
1. https://github.com/onnx/onnx/blob/main/docs/Operators.md
2. https://docs.nvidia.com/deeplearning/tensorrt/onnx-graphsurgeon/docs/index.html
3. https://github.com/NVIDIA/TensorRT/tree/main/tools/onnx-graphsurgeon
4. https://github.com/PINTO0309/simple-onnx-processing-tools
5. https://github.com/PINTO0309/PINTO_model_zoo

## 8. Issues
https://github.com/PINTO0309/simple-onnx-processing-tools/issues

            
