torch-onnx

Name: torch-onnx
Version: 0.0.31
Home page: None
Summary: Experimental tools for converting PyTorch models to ONNX
Upload time: 2024-07-25 00:32:56
Maintainer: None
Docs URL: None
Author: None
Requires Python: >=3.8
License: MIT License
Keywords: onnx, pytorch, converter, convertion, exporter
VCS: GitHub
Bugtrack URL: None
Requirements: No requirements were recorded.
Travis-CI: No Travis.
Coveralls test coverage: No coveralls.
# PyTorch to ONNX Exporter

[![PyPI version](https://badge.fury.io/py/torch-onnx.svg)](https://badge.fury.io/py/torch-onnx)

Experimental torch ONNX exporter.

> [!WARNING]
> This is an experimental project and is not designed for production use.
> For production, use `torch.onnx.export` instead.

## Installation

```bash
pip install --upgrade torch-onnx
```

## Usage

```python
import torch
import torch_onnx
from onnxscript import ir
import onnx

# Get an exported program with torch.export
exported = torch.export.export(...)
model = torch_onnx.exported_program_to_ir(exported)
proto = ir.to_proto(model)
onnx.save(proto, "model.onnx")

# Or patch the torch.onnx export API
# Set error_report=True to get a detailed error report if the export fails
torch_onnx.patch_torch(error_report=True, profile=True)
torch.onnx.export(...)

# Use the analysis API to print an analysis report for unsupported ops
torch_onnx.analyze(exported)
```
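
For concreteness, here is a minimal end-to-end sketch. The `MLP` module below is an illustrative stand-in, not part of the package; everything else uses the APIs shown above.

```python
import onnx
import torch
import torch_onnx
from onnxscript import ir


class MLP(torch.nn.Module):
    # Toy two-layer model, used only to make the example runnable
    def __init__(self):
        super().__init__()
        self.fc1 = torch.nn.Linear(4, 8)
        self.fc2 = torch.nn.Linear(8, 2)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))


# Export with torch.export, then convert the ExportedProgram to ONNX IR
exported = torch.export.export(MLP(), (torch.randn(1, 4),))
model = torch_onnx.exported_program_to_ir(exported)

# Serialize the IR model to protobuf and save it
onnx.save(ir.to_proto(model), "mlp.onnx")

# Alternatively, patch torch.onnx and keep the familiar export call
torch_onnx.patch_torch(error_report=True)
torch.onnx.export(MLP(), (torch.randn(1, 4),), "mlp_patched.onnx")
```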

## Design

{dynamo/jit} -> {ExportedProgram} -> {torchlib} -> {ONNX IR} -> {ONNX}

- Use ExportedProgram
  - Rely on the robustness of the torch.export implementation
  - Reduce complexity in the exporter
  - This does not solve dynamo limitations, but it avoids introducing additional breakage by running FX passes
- Flat graph; scope info stored as metadata, not as functions (see the sketch below)
  - Because existing tools do not handle functions well
- Eager optimization where appropriate
  - Because existing tools are not good at optimizing
- Drop-in replacement for torch.onnx.export
  - Minimal migration effort
- Build the graph eagerly in the exporter
  - Give the exporter full control over the graph being built
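
As a rough illustration of the flat-graph point, the sketch below walks the nodes of a converted model and prints the metadata attached to each one. It assumes the onnxscript IR exposes node iteration on `model.graph` and a per-node `metadata_props` mapping; exact attribute names may vary across onnxscript versions.

```python
import torch_onnx

# `exported` is an ExportedProgram from torch.export.export (see Usage above)
model = torch_onnx.exported_program_to_ir(exported)

# The graph is flat: one node list, no nested ONNX functions.
# Scope information is attached as node metadata instead.
for node in model.graph:  # assumed: graphs iterate over their nodes
    print(node.op_type, dict(node.metadata_props))
```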

## Why is this doable?

- We need to verify torch.export coverage against Hugging Face Optimum (https://github.com/huggingface/optimum/tree/main/optimum/exporters/onnx); Optimum does not patch torch.onnx itself
- Patching torch.onnx.export means downstream packages do not need to change a single line to use dynamo
- We already have all operators implemented, and they are portable

            

Raw data

{
    "_id": null,
    "home_page": null,
    "name": "torch-onnx",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": null,
    "keywords": "onnx, pytorch, converter, convertion, exporter",
    "author": null,
    "author_email": "Justin Chu <justinchu@microsoft.com>",
    "download_url": "https://files.pythonhosted.org/packages/c5/be/df1556da2342f3971f0bd416e835f27536c3c6a535c40d9152ee0bb9619a/torch_onnx-0.0.31.tar.gz",
    "platform": null,
    "description": "# PyTorch to ONNX Exporter\n\n[![PyPI version](https://badge.fury.io/py/torch-onnx.svg)](https://badge.fury.io/py/torch-onnx)\n\nExperimental torch ONNX exporter.\n\n> [!WARNING]\n> This is an experimental project and is not designed for production use.\n> Use `torch.onnx.export` for these purposes.\n\n## Installation\n\n```bash\npip install --upgrade torch-onnx\n```\n\n## Usage\n\n```python\nimport torch\nimport torch_onnx\nfrom onnxscript import ir\nimport onnx\n\n# Get an exported program with torch.export\nexported = torch.export.export(...)\nmodel = torch_onnx.exported_program_to_ir(exported)\nproto = ir.to_proto(model)\nonnx.save(proto, \"model.onnx\")\n\n# Or patch the torch.onnx export API\n# Set error_report=True to get a detailed error report if the export fails\ntorch_onnx.patch_torch(error_report=True, profile=True)\ntorch.onnx.export(...)\n\n# Use the analysis API to print an analysis report for unsupported ops\ntorch_onnx.analyze(exported)\n```\n\n## Design\n\n{dynamo/jit} -> {ExportedProgram} -> {torchlib} -> {ONNX IR} -> {ONNX}\n\n- Use ExportedProgram\n  - Rely on robustness of the torch.export implementation\n  - Reduce complexity in the exporter\n  - This does not solve dynamo limitations, but it avoids introducing additional breakage by running fx passes\n- Flat graph; Scope info as metadata, not functions\n  - Because existing tools are not good at handling them\n- Eager optimization where appropriate\n  - Because exsiting tools are not good at optimizing\n- Drop in replacement for torch.onnx.export\n  - Minimum migration effort\n- Build graph eagerly in the exporter\n  - Give the exporter full control over the graph being built\n\n## Why is this doable?\n\n- We need to verify torch.export coverage on Huggingface Optimum https://github.com/huggingface/optimum/tree/main/optimum/exporters/onnx; and they are not patching torch.onnx itself.\n- Patch torch.onnx.export such that packages do not need to change a single line to use dynamo\n- We have all operators implemented and portable\n",
    "bugtrack_url": null,
    "license": "MIT License",
    "summary": "Experimental tools for converting PyTorch models to ONNX",
    "version": "0.0.31",
    "project_urls": {
        "Repository": "https://github.com/justinchuby/torch-onnx"
    },
    "split_keywords": [
        "onnx",
        " pytorch",
        " converter",
        " convertion",
        " exporter"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "d10f06d0c657748067674fa6cf7b32ac7f6de5b05dc42f012b4aecda4dc52487",
                "md5": "d518ff93147b33bd803d9eca582979ce",
                "sha256": "c9181e54c1a6606d836629babd03454ec87529660b427a6dbf1a0c5cfdfc69d2"
            },
            "downloads": -1,
            "filename": "torch_onnx-0.0.31-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "d518ff93147b33bd803d9eca582979ce",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8",
            "size": 60209,
            "upload_time": "2024-07-25T00:32:54",
            "upload_time_iso_8601": "2024-07-25T00:32:54.119181Z",
            "url": "https://files.pythonhosted.org/packages/d1/0f/06d0c657748067674fa6cf7b32ac7f6de5b05dc42f012b4aecda4dc52487/torch_onnx-0.0.31-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "c5bedf1556da2342f3971f0bd416e835f27536c3c6a535c40d9152ee0bb9619a",
                "md5": "618191688a171dd0644834ebc77592db",
                "sha256": "93f9d81be292aa2ae509a983e7eb2865d10c86c7752e913f2a73711d115fb8e7"
            },
            "downloads": -1,
            "filename": "torch_onnx-0.0.31.tar.gz",
            "has_sig": false,
            "md5_digest": "618191688a171dd0644834ebc77592db",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8",
            "size": 53771,
            "upload_time": "2024-07-25T00:32:56",
            "upload_time_iso_8601": "2024-07-25T00:32:56.114196Z",
            "url": "https://files.pythonhosted.org/packages/c5/be/df1556da2342f3971f0bd416e835f27536c3c6a535c40d9152ee0bb9619a/torch_onnx-0.0.31.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-07-25 00:32:56",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "justinchuby",
    "github_project": "torch-onnx",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "torch-onnx"
}
        