torch-onnx


Name: torch-onnx
Version: 0.1.25
Summary: Experimental tools for converting PyTorch models to ONNX
Author email: Justin Chu <justinchu@microsoft.com>
Repository: https://github.com/justinchuby/torch-onnx
Upload time: 2024-09-18 16:39:09
Requires Python: >=3.8
License: MIT License
Keywords: onnx, pytorch, converter, conversion, exporter
# PyTorch to ONNX Exporter

[![PyPI version](https://badge.fury.io/py/torch-onnx.svg)](https://badge.fury.io/py/torch-onnx)

Experimental torch ONNX exporter. Compatible with torch>=2.1.

> [!WARNING]
> This is an experimental project and is not designed for production use.
> Use `torch.onnx.export` for production use cases.

## Installation

```bash
pip install --upgrade torch-onnx
```

## Usage

```python
import torch
import torch_onnx
from onnxscript import ir
import onnx

# Get an exported program with torch.export
exported = torch.export.export(...)
model = torch_onnx.exported_program_to_ir(exported)
proto = ir.to_proto(model)
onnx.save(proto, "model.onnx")

# Or patch the torch.onnx export API
# Set report=True to get a detailed error report if the export fails
torch_onnx.patch_torch(report=True, verify=True, profile=True)
torch.onnx.export(...)

# Use the analysis API to print an analysis report for unsupported ops
torch_onnx.analyze(exported)
```
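
For a concrete end-to-end run, here is a minimal sketch with a toy module; the `MLP` class, the shapes, and the output file name are illustrative, not part of the package.

```python
import torch
import torch_onnx
from onnxscript import ir
import onnx

# A small example module (hypothetical, for illustration only).
class MLP(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = torch.nn.Linear(4, 8)
        self.fc2 = torch.nn.Linear(8, 2)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

# torch.export traces the module into an ExportedProgram.
exported = torch.export.export(MLP(), (torch.randn(1, 4),))

# Convert the ExportedProgram into ONNX IR, serialize, and save.
model = torch_onnx.exported_program_to_ir(exported)
proto = ir.to_proto(model)
onnx.save(proto, "mlp.onnx")
```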

## Design

{dynamo/jit} -> {ExportedProgram} -> {torchlib} -> {ONNX IR} -> {ONNX}

- Use ExportedProgram (inspected in the sketch after this list)
  - Rely on the robustness of the torch.export implementation
  - Reduce complexity in the exporter
  - This does not solve dynamo's limitations, but it avoids the additional breakage that running extra fx passes would introduce
- Flat graph; scope info as metadata, not as functions
  - Because existing tools do not handle nested functions well
- Eager optimization where appropriate
  - Because existing tools are not good at optimizing
- Drop-in replacement for torch.onnx.export
  - Minimal migration effort
- Build graph eagerly in the exporter
  - Give the exporter full control over the graph being built
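
The first stage of this pipeline can be examined directly. A short sketch (the `AddMul` module is made up) that prints the flat FX graph torch.export produces before the exporter translates it through torchlib into ONNX IR:

```python
import torch

class AddMul(torch.nn.Module):
    def forward(self, x, y):
        return (x + y) * y

exported = torch.export.export(AddMul(), (torch.randn(2), torch.randn(2)))

# The ExportedProgram carries a single flat FX graph of aten ops:
# no nested call_module nodes, so each node can be mapped directly
# to an ONNX IR node without extra fx rewrites.
print(exported.graph)
```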

## Why is this doable?

- We need to verify torch.export coverage on Hugging Face Optimum (https://github.com/huggingface/optimum/tree/main/optimum/exporters/onnx), which does not patch torch.onnx itself.
- Patching torch.onnx.export means downstream packages do not need to change a single line to use dynamo (see the sketch below).
- All operators are already implemented and portable.
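
To make the drop-in claim concrete, a hedged sketch of what a downstream package's unchanged export call looks like once the patch is applied; the `Toy` module and output file name are illustrative.

```python
import torch
import torch_onnx

class Toy(torch.nn.Module):
    def forward(self, x):
        return x.relu()

# One line added at startup; everything below is unchanged code.
torch_onnx.patch_torch()

# The same torch.onnx.export call a package already makes now runs
# through the patched exporter.
torch.onnx.export(Toy(), (torch.randn(2, 3),), "toy.onnx")
```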

            
