| Field | Value |
| --- | --- |
| Name | onnxscript |
| Version | 0.1.0.dev20241009 |
| Summary | Naturally author ONNX functions and models using a subset of Python |
| Upload time | 2024-10-09 00:06:24 |
| Requires Python | >=3.8 |
| License | MIT (full text below) |
| Requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| Coveralls test coverage | No coveralls. |

MIT License

Copyright (c) Microsoft Corporation

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
# ONNX Script
[![CI](https://github.com/microsoft/onnxscript/actions/workflows/main.yaml/badge.svg)](https://github.com/microsoft/onnxscript/actions/workflows/main.yaml)
[![Dev Release](https://aiinfra.visualstudio.com/ONNX%20Converters/_apis/build/status%2Fonnxscript-release-dev?branchName=main&label=Dev%20Release)](https://aiinfra.visualstudio.com/ONNX%20Converters/_build/latest?definitionId=1258&branchName=main)
[![PyPI - Version](https://img.shields.io/pypi/v/onnxscript.svg)](https://pypi.org/project/onnxscript)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/onnxscript.svg)](https://pypi.org/project/onnxscript)
[![Ruff](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json)](https://github.com/astral-sh/ruff)
[![Black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
ONNX Script enables developers to naturally author ONNX functions and
models using a subset of Python. ONNX Script is:
* **Expressive:** enables the authoring of all ONNX functions.
* **Simple and concise:** function code is natural and simple.
* **Debuggable:** allows for eager-mode evaluation that provides a
  more delightful ONNX model debugging experience.
This repo also covers:
* **ONNX IR:** an in-memory IR that supports the full ONNX spec, designed
for graph construction, analysis and transformation.
* **ONNX Script Optimizer:** provides functionality to optimize an ONNX
model by performing optimizations and clean-ups such as constant folding,
dead code elimination, etc.
* **ONNX Rewriter:** provides functionality to replace certain patterns in
an ONNX graph with replacement patterns based on user-defined rewrite rules.
Note however that ONNX Script does **not** intend to support the entirety
of the Python language.
Website: [https://onnxscript.ai/](https://onnxscript.ai/)
## Design Overview
ONNX Script provides a few major capabilities for authoring and debugging
ONNX models and functions:
* A converter which translates a Python ONNX Script function into an
ONNX graph, accomplished by traversing the [Python Abstract Syntax Tree][python-ast] to build an ONNX graph equivalent of the function.
* A converter that operates inversely, translating ONNX models and
functions into ONNX Script. This capability can be used to fully round-trip
ONNX Script ↔ ONNX graph.
* A runtime shim that allows such functions to be evaluated
(in an "eager mode"). This functionality currently relies on
[ONNX Runtime][onnx-runtime] for executing every [ONNX Operator][onnx-ops],
and there is a Python-only reference runtime for ONNX underway that
will also be supported.
Note that the runtime is intended to help understand and debug function definitions. Performance is not a goal here.
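To make the first bullet concrete, here is a stdlib-only sketch of walking a function's AST to discover the operator calls it makes. This illustrates the mechanism (Python's `ast` module), not ONNX Script's actual converter; `called_ops` and `SRC` are hypothetical names used only for this example.

```python
import ast

def called_ops(src: str) -> list[str]:
    """Collect the names of op.<Name> calls in the source, in order."""
    ops = []
    for node in ast.walk(ast.parse(src)):
        # Match calls of the form op.<Attr>(...)
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Attribute)
                and isinstance(node.func.value, ast.Name)
                and node.func.value.id == "op"):
            ops.append(node.func.attr)
    return ops

SRC = """
def f(X):
    y = op.Relu(X)
    return op.Identity(y)
"""
print(called_ops(SRC))  # ['Relu', 'Identity']
```

A real converter does much more (type checking, attribute vs. input resolution, control-flow translation), but the starting point is the same: a traversal of the function's syntax tree.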
## Installing ONNX Script
```bash
pip install --upgrade onnxscript
```
### Install for Development
```bash
git clone https://github.com/microsoft/onnxscript
cd onnxscript
pip install -r requirements-dev.txt
pip install -e .
```
### Run Unit Tests
```bash
pytest .
```
## Example
```python update-readme
import onnx

# We use ONNX opset 15 to define the function below.
from onnxscript import FLOAT, script
from onnxscript import opset15 as op


# We use the script decorator to indicate that
# this is meant to be translated to ONNX.
@script()
def onnx_hardmax(X, axis: int):
    """Hardmax is similar to ArgMax, with the result being encoded OneHot style."""

    # The type annotation on X indicates that it is a float tensor of
    # unknown rank. The type annotation on axis indicates that it will
    # be treated as an int attribute in ONNX.
    #
    # Invoke ONNX opset 15 op ArgMax.
    # Use unnamed arguments for ONNX input parameters, and named
    # arguments for ONNX attribute parameters.
    argmax = op.ArgMax(X, axis=axis, keepdims=False)
    xshape = op.Shape(X, start=axis)
    # Use the Constant operator to create constant tensors.
    zero = op.Constant(value_ints=[0])
    depth = op.GatherElements(xshape, zero)
    empty_shape = op.Constant(value_ints=[0])
    depth = op.Reshape(depth, empty_shape)
    values = op.Constant(value_ints=[0, 1])
    cast_values = op.CastLike(values, X)
    return op.OneHot(argmax, depth, cast_values, axis=axis)


# We use the script decorator to indicate that
# this is meant to be translated to ONNX.
@script()
def sample_model(X: FLOAT[64, 128], Wt: FLOAT[128, 10], Bias: FLOAT[10]) -> FLOAT[64, 10]:
    matmul = op.MatMul(X, Wt) + Bias
    return onnx_hardmax(matmul, axis=1)


# onnx_model is an in-memory ModelProto
onnx_model = sample_model.to_model_proto()

# Save the ONNX model at a given path.
onnx.save(onnx_model, "sample_model.onnx")

# Check the model.
try:
    onnx.checker.check_model(onnx_model)
except onnx.checker.ValidationError as e:
    print(f"The model is invalid: {e}")
else:
    print("The model is valid!")
```
The decorator parses the code of the function, converting it into an
intermediate representation. If it fails, it produces an error message
indicating the line where the error was detected. If it succeeds, the
intermediate representation can be converted into an ONNX graph
structure of type `FunctionProto`:
* `onnx_hardmax.to_function_proto()` returns a `FunctionProto`
### Eager Mode Evaluation
Eager mode is mostly used to debug and validate that intermediate results
are as expected. The function defined above can be called as below,
executing in an eager-evaluation mode:
```python
import numpy as np
v = np.array([[0, 1], [2, 3]], dtype=np.float32)
result = onnx_hardmax(v, axis=1)
```
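As a sanity check on the semantics, the same hardmax computation can be reproduced with plain NumPy. `np_hardmax` below is a hypothetical reference helper written for this example; it is not part of onnxscript.

```python
import numpy as np

def np_hardmax(x, axis):
    # Hypothetical NumPy reference (not part of onnxscript): take the argmax
    # along `axis`, then encode the result one-hot along that same axis.
    idx = np.argmax(x, axis=axis)
    depth = x.shape[axis]
    onehot = np.eye(depth, dtype=x.dtype)[idx]  # one-hot along the last axis
    return np.moveaxis(onehot, -1, axis)

v = np.array([[0, 1], [2, 3]], dtype=np.float32)
print(np_hardmax(v, axis=1))  # each row is one-hot at column 1
```

Comparing this against the eager-mode result of `onnx_hardmax` is exactly the kind of intermediate-result validation eager mode is meant for.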
More examples can be found in the [docs/examples](docs/examples) directory.
## ONNX IR
An in-memory IR that supports the full ONNX spec, designed for graph construction, analysis and transformation.
### Features
* **Full ONNX spec support:** all valid models representable by ONNX protobuf,
and a subset of invalid models (so you can load and fix them).
* **Low memory footprint:** mmap'ed external tensors; a unified interface for
  ONNX TensorProto, NumPy arrays, PyTorch tensors, etc. No tensor size
  limitation. Zero copies.
* **Straightforward access patterns:** Access value information and traverse the
graph topology at ease.
* **Robust mutation:** Create as many iterators as you like on the graph while mutating it.
* **Speed:** Performant graph manipulation, serialization/deserialization to Protobuf.
* **Pythonic and familiar APIs:** Classes define Pythonic APIs and still map to
  ONNX protobuf concepts in an intuitive way.
## ONNX Script Tools
### ONNX Optimizer
The ONNX Script Optimizer optimizes an ONNX model by performing optimizations and clean-ups such as constant folding and dead code elimination. To use the optimizer:
```python
import onnxscript
onnxscript.optimizer.optimize(onnx_model)
```
For a detailed summary of all the optimizations applied by the optimizer call, refer to the tutorial [Optimizing a Model using the Optimizer](https://onnxscript.ai/tutorial/optimizer/optimize.html)
### ONNX Rewriter
The ONNX Rewriter replaces certain patterns in an ONNX graph with other patterns based on user-defined rewrite rules. The rewriter supports two methods for rewriting patterns in the graph.
### Pattern-based rewriting
For this style of rewriting, the user provides a `target_pattern` that is to be replaced, a `replacement_pattern` and a `match_condition` (pattern rewrite will occur only if the match condition is satisfied). A simple example on how to use the pattern-based rewriting tool is as follows:
```python
import math

import onnxscript
from onnxscript import ir
from onnxscript.rewriter import pattern

# The target patterns
def erf_gelu_pattern(op, x):
    return 0.5 * (x * (op.Erf(x / math.sqrt(2)) + 1.0))

def erf_gelu_pattern_2(op, x):
    return (x * (op.Erf(x / math.sqrt(2)) + 1.0)) * 0.5

# The replacement pattern
def gelu(op, x: ir.Value):
    return op.Gelu(x, domain="com.microsoft")

# Create multiple rules.
rule1 = pattern.RewriteRule(
    erf_gelu_pattern,  # Target pattern
    gelu,  # Replacement
)
rule2 = pattern.RewriteRule(
    erf_gelu_pattern_2,  # Target pattern
    gelu,  # Replacement
)
# Create a rewrite rule set with multiple rules.
rewrite_rule_set = pattern.RewriteRuleSet([rule1, rule2])
# Apply the rewrites to `model`, a previously loaded ONNX model.
model_with_rewrite_applied = onnxscript.rewriter.rewrite(
    model,  # Original ONNX model
    pattern_rewrite_rules=rewrite_rule_set,
)
```
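As a mental model only (plain Python, not the onnxscript API), a pattern rewrite can be pictured as replacing a matched subexpression in an operator tree. All names below are invented for this toy illustration.

```python
def rewrite(expr, match, replace):
    """Bottom-up rewrite: recurse into args, then replace if `match` fires."""
    if not isinstance(expr, tuple):
        return expr
    expr = tuple(rewrite(a, match, replace) for a in expr)
    captured = match(expr)
    return replace(captured) if captured is not None else expr

# Target pattern: Add(Erf(x), 1.0); on a match, capture x.
def match_erf_plus_one(e):
    if e[0] == "Add" and isinstance(e[1], tuple) and e[1][0] == "Erf" and e[2] == 1.0:
        return e[1][1]
    return None

# Expressions are ("OpName", *args) tuples; "x" stands for a graph input.
g = ("Mul", 0.5, ("Add", ("Erf", "x"), 1.0))
print(rewrite(g, match_erf_plus_one, lambda x: ("ErfPlusOne", x)))
# -> ('Mul', 0.5, ('ErfPlusOne', 'x'))
```

The real rewriter works on ONNX graphs rather than tuples and supports match conditions, commutative matching, and multiple rules, but the target/replacement structure is the same.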
For a detailed tutorial on how to create `target_pattern`, `replacement_pattern`, and `match_condition` blocks in order to use the pattern-based rewriter, refer to the tutorial [Pattern-based Rewrite Using Rules](https://onnxscript.ai/tutorial/rewriter/rewrite_patterns.html).
### Function-based rewriting
This style of rewriting matches a `FUNCTION_KEYWORD` and `PACKAGE_NAME` provided by the user to an existing function within the graph and replaces it with a new function provided by the user.
## Development Guidelines
Every change impacting the converter or the eager evaluation must be
unit tested with class `OnnxScriptTestCase` to ensure both systems
return the same results for the same inputs.
### Coding Style
We use `ruff`, `black`, `isort`, `mypy`, and other tools to check code style, and we use `lintrunner` to run all linters.
You can install the dependencies and initialize with
```sh
pip install lintrunner lintrunner-adapters
lintrunner init
```
This installs lintrunner on your system and downloads all the dependencies needed to run the linters locally.
To see what `lintrunner init` will install, run `lintrunner init --dry-run`.
To lint local changes:
```bash
lintrunner
```
To format files:
```bash
lintrunner f
```
To lint all files:
```bash
lintrunner --all-files
```
Use `--output oneline` to produce a compact list of lint errors, useful when
there are many errors to fix.
See all available options with `lintrunner -h`.
To read more about lintrunner, see the [lintrunner wiki](https://github.com/pytorch/pytorch/wiki/lintrunner).
To update an existing linting rule or create a new one, modify `.lintrunner.toml` or create a
new adapter following the examples in [lintrunner-adapters](https://github.com/justinchuby/lintrunner-adapters).
## Contributing
We're always looking for your help to improve the product (bug fixes, new features, documentation, etc). Currently ONNX Script is under early and heavy development, so we encourage proposing any major changes by [filing an issue](https://github.com/microsoft/onnxscript/issues) to discuss your idea with the team first.
### Report a Security Issue
**Please do not report security vulnerabilities through public GitHub issues.**
Please refer to our guidance on filing [Security Issues](SECURITY.md).
### Licensing Guidelines
This project welcomes contributions and suggestions. Most contributions require you to
agree to a Contributor License Agreement (CLA) declaring that you have the right to,
and actually do, grant us the rights to use your contribution. For details, visit
https://cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether you need
to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the
instructions provided by the bot. You will only need to do this once across all repositories using our CLA.
### Code of Conduct
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).
For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/)
or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.
## Trademarks
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft
trademarks or logos is subject to and must follow
[Microsoft's Trademark & Brand Guidelines](https://www.microsoft.com/en-us/legal/intellectualproperty/trademarks/usage/general).
Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship.
Any use of third-party trademarks or logos is subject to those third parties' policies.
[python-ast]: https://docs.python.org/3/library/ast.html
[onnx-runtime]: https://onnxruntime.ai
[onnx-ops]: https://github.com/onnx/onnx/blob/main/docs/Operators.md
[onnxfns1A.py]: https://github.com/microsoft/onnxscript/blob/main/onnxscript/tests/models/onnxfns1A.py
## Raw data
{
"_id": null,
"home_page": null,
"name": "onnxscript",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.8",
"maintainer_email": null,
"keywords": null,
"author": null,
"author_email": "Microsoft Corporation <onnx@microsoft.com>",
"download_url": "https://files.pythonhosted.org/packages/d8/e5/ce842982b026d0bdbcd748359ccc3d206c355aaf968c09a99679f9982601/onnxscript-0.1.0.dev20241009.tar.gz",
"platform": null,
"bugtrack_url": null,
"summary": "Naturally author ONNX functions and models using a subset of Python",
"version": "0.1.0.dev20241009",
"project_urls": {
"Commit": "https://github.com/microsoft/onnxscript/tree/37b11fcff94681dea368ebfe0c4768844f2d3149",
"Homepage": "https://onnxscript.ai/",
"Repository": "https://github.com/microsoft/onnxscript"
},
"split_keywords": [],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "79f21944864dcfc5167e32805334e84b035c13f77881ec1144e9feba855b4e5a",
"md5": "4a553cd9cee03c234205d61cc583bdd6",
"sha256": "6303fb9b8ba66623e5a10a7d05b4914f3f79e761e6843225f0c72b0c4f865394"
},
"downloads": -1,
"filename": "onnxscript-0.1.0.dev20241009-py3-none-any.whl",
"has_sig": false,
"md5_digest": "4a553cd9cee03c234205d61cc583bdd6",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.8",
"size": 670741,
"upload_time": "2024-10-09T00:06:26",
"upload_time_iso_8601": "2024-10-09T00:06:26.333971Z",
"url": "https://files.pythonhosted.org/packages/79/f2/1944864dcfc5167e32805334e84b035c13f77881ec1144e9feba855b4e5a/onnxscript-0.1.0.dev20241009-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "d8e5ce842982b026d0bdbcd748359ccc3d206c355aaf968c09a99679f9982601",
"md5": "6576562443781f31a8837adbad753fb4",
"sha256": "0ab8955b368d022813f820d8f99379dbd0b4892da8d3931d0b82f96ac986f7fa"
},
"downloads": -1,
"filename": "onnxscript-0.1.0.dev20241009.tar.gz",
"has_sig": false,
"md5_digest": "6576562443781f31a8837adbad753fb4",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.8",
"size": 557445,
"upload_time": "2024-10-09T00:06:24",
"upload_time_iso_8601": "2024-10-09T00:06:24.142498Z",
"url": "https://files.pythonhosted.org/packages/d8/e5/ce842982b026d0bdbcd748359ccc3d206c355aaf968c09a99679f9982601/onnxscript-0.1.0.dev20241009.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-10-09 00:06:24",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "microsoft",
"github_project": "onnxscript",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "onnxscript"
}