pnnx


Name: pnnx
Version: 20250819
Home page: https://github.com/Tencent/ncnn/tree/master/tools/pnnx
Summary: pnnx is an open standard for PyTorch model interoperability.
Upload time: 2025-08-19 06:19:20
Maintainer: None
Docs URL: None
Author: nihui
Requires Python: >=3.7
License: BSD-3
Requirements: No requirements were recorded.
# pnnx
Python wrapper of pnnx. Only Python 3.7+ is supported for now.

Install from pip
==================

pnnx is available as wheel packages for macOS, Windows, and Linux. You can install it with pip:

```bash
pip install pnnx
```

# Build & Install from source

## Prerequisites

**On Unix (Linux, OS X)**

* A compiler with C++14 support
* CMake >= 3.4

**On Mac**

* A compiler with C++14 support
* CMake >= 3.4

**On Windows**

* Visual Studio 2015 or higher
* CMake >= 3.4

## Build & install
1. Clone ncnn.
```bash
git clone https://github.com/Tencent/ncnn.git
```
2. Install PyTorch

Install PyTorch according to https://pytorch.org/. Anaconda is strongly recommended. For example:
```bash
conda install pytorch
```
3. Install pnnx
```bash
cd /pathto/ncnn/tools/pnnx/python
python setup.py install
```

> **Note:**
> If torchvision and pnnx2onnx support are needed, set the following environment variables before running `python setup.py install` to enable them, e.g. on Ubuntu:
>
> ```
> export TORCHVISION_INSTALL_DIR="/project/torchvision"
> export PROTOBUF_INCLUDE_DIR="/project/protobuf/include"
> export PROTOBUF_LIBRARIES="/project/protobuf/lib64/libprotobuf.a"
> export PROTOBUF_PROTOC_EXECUTABLE="/project/protobuf/bin/protoc" 
> ```
>
> To do this, you must install torchvision and Protobuf first.


## Tests
```bash
cd /pathto/ncnn/tools/pnnx/python
pytest tests
```

## Usage
1. Export a model to pnnx
```python
import torch
import torchvision.models as models
import pnnx

net = models.resnet18(pretrained=True)
x = torch.rand(1, 3, 224, 224)

# You can try disabling trace checking if torch tracing raises an error
# opt_net = pnnx.export(net, "resnet18.pt", x, check_trace=False)
opt_net = pnnx.export(net, "resnet18.pt", x)
```

2. Convert an existing TorchScript model to pnnx
```python
import torch
import pnnx

x = torch.rand(1, 3, 224, 224)
opt_net = pnnx.convert("resnet18.pt", x)
```

## API Reference
1. pnnx.export

`model` (torch.nn.Module): model to be exported.

`ptpath` (str): path of the TorchScript file to export.

`inputs` (torch.Tensor or list of torch.Tensor): expected inputs of the model.

`inputs2` (torch.Tensor or list of torch.Tensor): alternative inputs of the model. Usually, it is used together with `inputs` to resolve dynamic shapes (-1) in the model graph.

`input_shapes` (Optional, list of int, or list of lists of int): shapes of the model inputs.
It is used to resolve tensor shapes in the model graph. For example, [1,3,224,224] for a model with one input,
or [[1,3,224,224],[1,3,224,224]] for a model with two inputs.

`input_types` (Optional, str or list of str): types of the model inputs; it should have the same length as `input_shapes`.
For example, "f32" for a model with one input, or ["f32", "f32"] for a model with two inputs. A short sketch combining these options follows the type table below.

| typename | torch type                      |
|:--------:|:--------------------------------|
|   f32    | torch.float32 or torch.float    |
|   f64    | torch.float64 or torch.double   |
|   f16    | torch.float16 or torch.half     |
|    u8    | torch.uint8                     |
|    i8    | torch.int8                      |
|   i16    | torch.int16 or torch.short      |
|   i32    | torch.int32 or torch.int        |
|   i64    | torch.int64 or torch.long       |
|   c32    | torch.complex32                 |
|   c64    | torch.complex64                 |
|  c128    | torch.complex128                |
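
Below is a minimal sketch (not taken from the original docs) showing `input_shapes` and `input_types` passed alongside example tensors. `TwoInputNet` is a hypothetical toy model standing in for any model with two inputs, and a list of tensors is assumed to be accepted for `inputs` as documented above.

```python
# Minimal sketch: export a toy two-input model with explicit shapes and types.
# TwoInputNet is hypothetical; replace it with your own model.
import torch
import torch.nn as nn
import pnnx

class TwoInputNet(nn.Module):
    def forward(self, a, b):
        return a + b

net = TwoInputNet()
a = torch.rand(1, 3, 224, 224)
b = torch.rand(1, 3, 224, 224)

# inputs is a list of tensors; shapes and types mirror the parameters above
opt_net = pnnx.export(net, "two_input.pt", [a, b],
                      input_shapes=[[1, 3, 224, 224], [1, 3, 224, 224]],
                      input_types=["f32", "f32"])
```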

`input_shapes2` (Optional, list of int, or list of lists of int): shapes of alternative model inputs;
the format is identical to `input_shapes`. Usually, it is used together with `input_shapes` to resolve dynamic shapes (-1)
in the model graph.

`input_types2` (Optional, str or list of str): types of alternative model inputs. A sketch of the dynamic-shape workflow is shown below.
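
A minimal dynamic-shape sketch, reusing the resnet18 example from the Usage section and assuming that a second trace with a different spatial size lets pnnx mark the differing dimensions as dynamic (-1), as described above:

```python
# Minimal sketch: two sets of example inputs that differ only in H/W, so those
# dimensions can be resolved as dynamic (-1).
import torch
import torchvision.models as models
import pnnx

net = models.resnet18(pretrained=True)
x = torch.rand(1, 3, 224, 224)    # primary example input
x2 = torch.rand(1, 3, 320, 320)   # alternative input with a different H/W

opt_net = pnnx.export(net, "resnet18.pt", x, inputs2=x2,
                      input_shapes=[[1, 3, 224, 224]],
                      input_shapes2=[[1, 3, 320, 320]],
                      input_types=["f32"], input_types2=["f32"])
```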

`device` (Optional, str, default="cpu"): device type for the inputs in the TorchScript model, cpu or gpu.

`customop` (Optional, str or list of str): list of Torch extensions (dynamic libraries) for custom operators.
For example, "/home/nihui/.cache/torch_extensions/fused/fused.so" or
["/home/nihui/.cache/torch_extensions/fused/fused.so", ...].

`moduleop` (Optional, str or list of str): list of modules to keep as one big operator.
For example, "models.common.Focus" or ["models.common.Focus", "models.yolo.Detect"].

`optlevel` (Optional, int, default=2): graph optimization level.

| option | optimization level                |
|:--------:|:----------------------------------|
|   0    | do not apply optimization         |
|   1    | do not apply optimization         |
|   2    | optimization more for inference   |

`pnnxparam` (Optional, str, default="*.pnnx.param", where * is the model name): PNNX graph definition file.

`pnnxbin` (Optional, str, default="*.pnnx.bin"): PNNX model weight.

`pnnxpy` (Optional, str, default="*_pnnx.py"): PyTorch script for inference, including model construction 
and weight initialization code.

`pnnxonnx` (Optional, str, default="*.pnnx.onnx"): PNNX model in onnx format.

`ncnnparam` (Optional, str, default="*.ncnn.param"): ncnn graph definition.

`ncnnbin` (Optional, str, default="*.ncnn.bin"): ncnn model weight.

`ncnnpy` (Optional, str, default="*_ncnn.py"): pyncnn script for inference.
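
All of these output files default to names derived from the model name. Below is a minimal sketch of overriding a few of them, continuing the resnet18 example (the file names here are chosen purely for illustration):

```python
# Minimal sketch: override some of the generated output file names; by default
# they are derived from the model name as described above.
import torch
import torchvision.models as models
import pnnx

net = models.resnet18(pretrained=True)
x = torch.rand(1, 3, 224, 224)

opt_net = pnnx.export(net, "resnet18.pt", x,
                      pnnxparam="resnet18.pnnx.param",
                      pnnxbin="resnet18.pnnx.bin",
                      ncnnparam="resnet18.ncnn.param",
                      ncnnbin="resnet18.ncnn.bin",
                      ncnnpy="resnet18_ncnn.py")
```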

2. pnnx.convert

`ptpath` (str): TorchScript model file to be converted.

Other parameters are consistent with `pnnx.export`. A minimal sketch follows.
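
This sketch assumes, as with the pnnx command-line tool, that explicit `input_shapes`/`input_types` can stand in for example tensors when the TorchScript file is already traced:

```python
# Minimal sketch: convert an existing TorchScript file using explicit shapes
# and types instead of example tensors (assumed to be accepted, as with the
# pnnx command-line tool).
import pnnx

opt_net = pnnx.convert("resnet18.pt",
                       input_shapes=[[1, 3, 224, 224]],
                       input_types=["f32"])
```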

            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/Tencent/ncnn/tree/master/tools/pnnx",
    "name": "pnnx",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.7",
    "maintainer_email": null,
    "keywords": null,
    "author": "nihui",
    "author_email": "nihuini@tencent.com",
    "download_url": null,
    "platform": null,
    "description": "# pnnx\npython wrapper of pnnx, only support python 3.7+  now.\n\nInstall from pip\n==================\n\npnnx is available as wheel packages for macOS, Windows and Linux distributions, you can install with pip:\n\n```\npip install pnnx\n```\n\n# Build & Install from source\n\n## Prerequisites\n\n**On Unix (Linux, OS X)**\n\n* A compiler with C++14 support\n* CMake >= 3.4\n\n**On Mac**\n\n* A compiler with C++14 support\n* CMake >= 3.4\n\n**On Windows**\n\n* Visual Studio 2015 or higher\n* CMake >= 3.4\n\n## Build & install\n1. clone ncnn.\n```bash\ngit clone https://github.com/Tencent/ncnn.git\n```\n2. install pytorch \n\ninstall pytorch according to https://pytorch.org/ . Anaconda is strongly recommended for example:\n```bash\nconda install pytorch\n```\n3. install\n```bash\ncd /pathto/ncnntools/pnnx/python\npython setup.py install\n```\n\n> **Note:**\n> If torchvision and pnnx2onnx are needed, you can set the following environment variables before 'python setup.py install' to enable them. e.g. on ubuntu:\n>\n> ```\n> export TORCHVISION_INSTALL_DIR=\"/project/torchvision\"\n> export PROTOBUF_INCLUDE_DIR=\"/project/protobuf/include\"\n> export PROTOBUF_LIBRARIES=\"/project/protobuf/lib64/libprotobuf.a\"\n> export PROTOBUF_PROTOC_EXECUTABLE=\"/project/protobuf/bin/protoc\" \n> ```\n>\n> To do these, you must install Torchvision and Protobuf first.\n\n\n## Tests\n```bash\ncd /pathto/ncnn/tools/pnnx/python\npytest tests\n```\n\n## Usage\n1. export model to pnnx\n```python\nimport torch\nimport torchvision.models as models\nimport pnnx\n\nnet = models.resnet18(pretrained=True)\nx = torch.rand(1, 3, 224, 224)\n\n# You could try disabling checking when torch tracing raises error\n# opt_net = pnnx.export(net, \"resnet18.pt\", x, check_trace=False)\nopt_net = pnnx.export(net, \"resnet18.pt\", x)\n```\n\n2. convert existing model to pnnx\n```python\nimport torch\nimport pnnx\n\nx = torch.rand(1, 3, 224, 224)\nopt_net = pnnx.convert(\"resnet18.pt\", x)\n```\n\n## API Reference\n1. pnnx.export\n\n`model` (torch.nn.Model): model to be exported.\n\n`ptpath` (str): the torchscript name.\n\n`inputs` (torch.Tensor of list of torch.Tensor) expected inputs of the model.\n\n`inputs2` (torch.Tensor of list of torch.Tensor) alternative inputs of the model. Usually, it is used with input_shapes to resolve dynamic shape.\n\n`input_shapes` (Optional, list of int or list of list with int type inside)  shapes of model inputs.\nIt is used to resolve tensor shapes in model graph. for example, [1,3,224,224] for the model with only \n1 input, [[1,3,224,224],[1,3,224,224]] for the model that have 2 inputs. 
\n\n`input_types` (Optional, str or list of str) types of model inputs, it should have the same length with `input_shapes`.\nfor example, \"f32\" for the model with only 1 input, [\"f32\", \"f32\"] for the model that have 2 inputs.\n\n| typename | torch type                      |\n|:--------:|:--------------------------------|\n|   f32    | torch.float32 or torch.float    |\n|   f64    | torch.float64 or torch.double   |\n|   f16    | torch.float16 or torch.half     |\n|    u8    | torch.uint8                     |\n|    i8    | torch.int8                      |\n|   i16    | torch.int16 or torch.short      |\n|   i32    | torch.int32 or torch.int        |\n|   i64    | torch.int64 or torch.long       |\n|   c32    | torch.complex32                 |\n|   c64    | torch.complex64                 |\n|  c128    | torch.complex128                |\n\n`input_shapes2` (Optional, list of int or list of list with int type inside) shapes of alternative model inputs,\nthe format is identical to `input_shapes`. Usually, it is used with input_shapes to resolve dynamic shape (-1)\nin model graph.\n\n`input_types2` (Optional, str or list of str) types of alternative model inputs.\n\n`device` (Optional, str, default=\"cpu\") device type for the input in TorchScript model, cpu or gpu.\n\n`customop` (Optional, str or list of str) list of Torch extensions (dynamic library) for custom operators.\nFor example, \"/home/nihui/.cache/torch_extensions/fused/fused.so\" or \n[\"/home/nihui/.cache/torch_extensions/fused/fused.so\",...].\n\n`moduleop` (Optional, str or list of str)  list of modules to keep as one big operator.\nfor example, \"models.common.Focus\" or [\"models.common.Focus\",\"models.yolo.Detect\"].\n\n`optlevel` (Optional, int, default=2) graph optimization level\n\n| option | optimization level                |\n|:--------:|:----------------------------------|\n|   0    | do not apply optimization         |\n|   1    | do not apply optimization         |\n|   2    | optimization more for inference   |\n\n`pnnxparam` (Optional, str, default=\"*.pnnx.param\", * is the model name): PNNX graph definition file.\n\n`pnnxbin` (Optional, str, default=\"*.pnnx.bin\"): PNNX model weight.\n\n`pnnxpy` (Optional, str, default=\"*_pnnx.py\"): PyTorch script for inference, including model construction \nand weight initialization code.\n\n`pnnxonnx` (Optional, str, default=\"*.pnnx.onnx\"): PNNX model in onnx format.\n\n`ncnnparam` (Optional, str, default=\"*.ncnn.param\"): ncnn graph definition.\n\n`ncnnbin` (Optional, str, default=\"*.ncnn.bin\"): ncnn model weight.\n\n`ncnnpy` (Optional, str, default=\"*_ncnn.py\"): pyncnn script for inference.\n\n2. pnnx.convert\n\n`ptpath` (str): torchscript model to be converted.\n\nOther parameters are consistent with `pnnx.export`\n",
    "bugtrack_url": null,
    "license": "BSD-3",
    "summary": "pnnx is an open standard for PyTorch model interoperability.",
    "version": "20250819",
    "project_urls": {
        "Homepage": "https://github.com/Tencent/ncnn/tree/master/tools/pnnx"
    },
    "split_keywords": [],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "ffe9a2f5d1a31cc9fd6805bed9523af553a9a66e0cce3b653d146494420eff71",
                "md5": "b8b5223ef4c849f3627f848823be2f98",
                "sha256": "27d901103124f1e1c46acc133f5ac2b1e2a55d1358374e24a3367023d309fc7b"
            },
            "downloads": -1,
            "filename": "pnnx-20250819-py3-none-macosx_10_9_universal2.macosx_10_9_x86_64.macosx_11_0_arm64.whl",
            "has_sig": false,
            "md5_digest": "b8b5223ef4c849f3627f848823be2f98",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.7",
            "size": 51351546,
            "upload_time": "2025-08-19T06:19:20",
            "upload_time_iso_8601": "2025-08-19T06:19:20.475860Z",
            "url": "https://files.pythonhosted.org/packages/ff/e9/a2f5d1a31cc9fd6805bed9523af553a9a66e0cce3b653d146494420eff71/pnnx-20250819-py3-none-macosx_10_9_universal2.macosx_10_9_x86_64.macosx_11_0_arm64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "412b413f585c152b39b8a5ac5036d5508dce47e1fd6abf915f51a72cc992e30f",
                "md5": "2817b56d13d5bb2e6bd1b058e763634f",
                "sha256": "882ff90936c617a99a751874b33379c0f0787ce2abb7f05c25aa6fd97d0174d2"
            },
            "downloads": -1,
            "filename": "pnnx-20250819-py3-none-manylinux2014_aarch64.manylinux_2_17_aarch64.whl",
            "has_sig": false,
            "md5_digest": "2817b56d13d5bb2e6bd1b058e763634f",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.7",
            "size": 22820071,
            "upload_time": "2025-08-19T06:19:24",
            "upload_time_iso_8601": "2025-08-19T06:19:24.043942Z",
            "url": "https://files.pythonhosted.org/packages/41/2b/413f585c152b39b8a5ac5036d5508dce47e1fd6abf915f51a72cc992e30f/pnnx-20250819-py3-none-manylinux2014_aarch64.manylinux_2_17_aarch64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "726a8809d697375202387f04ce41bc671bdf9ce46a95c18224694a127ef9e91a",
                "md5": "8a3925520809b1ef88d6d287787e56e0",
                "sha256": "87097b3d4a95166cd884c148b52d69db44e726fc5d23ec7235ff8ac0859756a4"
            },
            "downloads": -1,
            "filename": "pnnx-20250819-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl",
            "has_sig": false,
            "md5_digest": "8a3925520809b1ef88d6d287787e56e0",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.7",
            "size": 27080995,
            "upload_time": "2025-08-19T06:19:26",
            "upload_time_iso_8601": "2025-08-19T06:19:26.480867Z",
            "url": "https://files.pythonhosted.org/packages/72/6a/8809d697375202387f04ce41bc671bdf9ce46a95c18224694a127ef9e91a/pnnx-20250819-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "6b63cc42000e9607b6ca5640306483df558dbc5d0f8b1ee4614891670e3d6601",
                "md5": "31169ea9d3b3de7724a0174a5c021841",
                "sha256": "a8882e66309f0f0cea31ff0dd8fc671538a624cce3bd3ebb3ea60fcc56000cdc"
            },
            "downloads": -1,
            "filename": "pnnx-20250819-py3-none-win_amd64.whl",
            "has_sig": false,
            "md5_digest": "31169ea9d3b3de7724a0174a5c021841",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.7",
            "size": 17476180,
            "upload_time": "2025-08-19T06:19:28",
            "upload_time_iso_8601": "2025-08-19T06:19:28.976658Z",
            "url": "https://files.pythonhosted.org/packages/6b/63/cc42000e9607b6ca5640306483df558dbc5d0f8b1ee4614891670e3d6601/pnnx-20250819-py3-none-win_amd64.whl",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-08-19 06:19:20",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "Tencent",
    "github_project": "ncnn",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "pnnx"
}
        