sng4onnx


Name: sng4onnx
Version: 1.0.4
Home page: https://github.com/PINTO0309/sng4onnx
Summary: A simple tool that automatically generates and assigns an OP name to each OP in an old-format ONNX file.
Upload time: 2024-05-07 05:44:30
Author: Katsuya Hyodo
Requires Python: >=3.6
License: MIT License
# sng4onnx
A simple tool that automatically generates and assigns an OP name to each OP in an old-format ONNX file.  
**S**imple op **N**ame **G**enerator for **ONNX**.

https://github.com/PINTO0309/simple-onnx-processing-tools

[![Downloads](https://static.pepy.tech/personalized-badge/sng4onnx?period=total&units=none&left_color=grey&right_color=brightgreen&left_text=Downloads)](https://pepy.tech/project/sng4onnx) ![GitHub](https://img.shields.io/github/license/PINTO0309/sng4onnx?color=2BAF2B) [![PyPI](https://img.shields.io/pypi/v/sng4onnx?color=2BAF2B)](https://pypi.org/project/sng4onnx/) [![CodeQL](https://github.com/PINTO0309/sng4onnx/workflows/CodeQL/badge.svg)](https://github.com/PINTO0309/sng4onnx/actions?query=workflow%3ACodeQL)

<p align="center">
  <img src="https://user-images.githubusercontent.com/33194443/195636410-a797d847-365a-469e-8bdd-7a3abb8aa3fd.png" />
</p>

# Key concept

- [x] Automatically generates and assigns an OP name to each OP in an old-format ONNX file.

## 1. Setup
### 1-1. HostPC
```bash
### option
$ echo 'export PATH=$HOME/.local/bin:$PATH' >> ~/.bashrc \
&& source ~/.bashrc

### run
$ pip install -U onnx \
&& python3 -m pip install -U onnx_graphsurgeon --index-url https://pypi.ngc.nvidia.com \
&& pip install -U sng4onnx
```
### 1-2. Docker
https://github.com/PINTO0309/simple-onnx-processing-tools#docker

## 2. CLI Usage
```
$ sng4onnx -h

usage:
  sng4onnx [-h]
  -if INPUT_ONNX_FILE_PATH
  -of OUTPUT_ONNX_FILE_PATH
  [-n]

optional arguments:
  -h, --help
      show this help message and exit.

  -if INPUT_ONNX_FILE_PATH, --input_onnx_file_path INPUT_ONNX_FILE_PATH
      Input onnx file path.

  -of OUTPUT_ONNX_FILE_PATH, --output_onnx_file_path OUTPUT_ONNX_FILE_PATH
      Output onnx file path.

  -n, --non_verbose
      Do not show all information logs. Only error logs are displayed.
```

## 3. In-script Usage
```python
>>> from sng4onnx import generate
>>> help(generate)

Help on function generate in module sng4onnx.onnx_opname_generator:

generate(
    input_onnx_file_path: Union[str, NoneType] = '',
    onnx_graph: Union[onnx.onnx_ml_pb2.ModelProto, NoneType] = None,
    output_onnx_file_path: Union[str, NoneType] = '',
    non_verbose: Union[bool, NoneType] = False
) -> onnx.onnx_ml_pb2.ModelProto

    Parameters
    ----------
    input_onnx_file_path: Optional[str]
        Input onnx file path.
        Either input_onnx_file_path or onnx_graph must be specified.
        Default: ''

    onnx_graph: Optional[onnx.ModelProto]
        onnx.ModelProto.
        Either input_onnx_file_path or onnx_graph must be specified.
        If onnx_graph is specified, input_onnx_file_path is ignored and onnx_graph is processed.
        Default: None

    output_onnx_file_path: Optional[str]
        Output onnx file path. If not specified, no ONNX file is output.
        Default: ''

    non_verbose: Optional[bool]
        Do not show all information logs. Only error logs are displayed.
        Default: False

    Returns
    -------
    renamed_graph: onnx.ModelProto
        Renamed onnx ModelProto.
```
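
Because `onnx_graph` takes precedence over `input_onnx_file_path`, `generate` can also be driven entirely in memory. The following is a minimal sketch under that assumption; `model.onnx` and `model_renamed.onnx` are placeholder file names, not part of the package:

```python
import onnx
from sng4onnx import generate

# Load the graph yourself and hand the ModelProto to generate().
model = onnx.load("model.onnx")

renamed = generate(
    onnx_graph=model,   # takes precedence over input_onnx_file_path
    non_verbose=True,   # only error logs are displayed
)

# output_onnx_file_path was not specified, so no file is written;
# serialize the returned ModelProto explicitly if a file is needed.
onnx.save(renamed, "model_renamed.onnx")
```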

## 4. CLI Execution
```bash
$ sng4onnx \
--input_onnx_file_path emotion-ferplus-8.onnx \
--output_onnx_file_path emotion-ferplus-8_renamed.onnx
```

## 5. In-script Execution
```python
from sng4onnx import generate

onnx_graph = generate(
  input_onnx_file_path="fusionnet_180x320.onnx",
  output_onnx_file_path="fusionnet_180x320_renamed.onnx",
)
```
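
The returned value is a standard `onnx.ModelProto`, so the assigned names can be inspected right after the call above. A small verification sketch, assuming the call succeeded:

```python
# Every node in the renamed graph should now carry a non-empty name.
for node in onnx_graph.graph.node:
    assert node.name, f"unnamed {node.op_type} node remains"
    print(node.op_type, node.name)
```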

## 6. Sample
https://github.com/onnx/models/blob/main/vision/classification/resnet/model/resnet18-v1-7.onnx
### Before
![image](https://user-images.githubusercontent.com/33194443/195632927-75c76b9a-a14b-411c-8932-f114dc2b9f29.png)

### After
![image](https://user-images.githubusercontent.com/33194443/195633029-86b0ebec-3df5-4dc4-b0ec-079f4f063e46.png)

## 7. Reference
1. https://github.com/onnx/onnx/blob/main/docs/Operators.md
2. https://docs.nvidia.com/deeplearning/tensorrt/onnx-graphsurgeon/docs/index.html
3. https://github.com/NVIDIA/TensorRT/tree/main/tools/onnx-graphsurgeon
4. https://github.com/PINTO0309/simple-onnx-processing-tools
5. https://github.com/PINTO0309/PINTO_model_zoo

## 8. Issues
https://github.com/PINTO0309/simple-onnx-processing-tools/issues

            
