pnnx


Name: pnnx
Version: 20240410
Home page: https://github.com/Tencent/ncnn/tree/master/tools/pnnx
Summary: pnnx is an open standard for PyTorch model interoperability.
Upload time: 2024-04-10 11:40:51
Author: nihui
Requires Python: >=3.7
License: BSD-3
Requirements: none recorded
# pnnx
Python wrapper of pnnx. Only Python 3.7+ is supported for now.

# Install from pip

pnnx is available as wheel packages for macOS, Windows, and Linux. You can install it with pip:

```
pip install pnnx
```
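
A quick sanity check after installing is to import the package and print where it resolved from:

```python
import pnnx

# If the wheel installed correctly, this prints the package location,
# e.g. somewhere under site-packages.
print(pnnx.__file__)
```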

# Build & Install from source

## Prerequisites

**On Unix (Linux, macOS)**

* A compiler with C++14 support
* CMake >= 3.4

**On Windows**

* Visual Studio 2015 or higher
* CMake >= 3.4

## Build & install
1. Clone ncnn:
```bash
git clone https://github.com/Tencent/ncnn.git
```
2. Install PyTorch

Install PyTorch according to the instructions at https://pytorch.org/. Anaconda is strongly recommended, for example:
```bash
conda install pytorch
```
3. Install pnnx:
```bash
cd /pathto/ncnn/tools/pnnx/python
python setup.py install
```

> **Note:**
> If torchvision and pnnx2onnx support are needed, set the following environment variables before running `python setup.py install` to enable them, e.g. on Ubuntu:
>
> ```
> export TORCHVISION_INSTALL_DIR="/project/torchvision"
> export PROTOBUF_INCLUDE_DIR="/project/protobuf/include"
> export PROTOBUF_LIBRARIES="/project/protobuf/lib64/libprotobuf.a"
> export PROTOBUF_PROTOC_EXECUTABLE="/project/protobuf/bin/protoc" 
> ```
>
> To do this, you must install torchvision and Protobuf first.


## Tests
```bash
cd /pathto/ncnn/tools/pnnx/python
pytest tests
```

## Usage
1. Export a model to pnnx
```python
import torch
import torchvision.models as models
import pnnx

net = models.resnet18(pretrained=True)
x = torch.rand(1, 3, 224, 224)

# You can try disabling trace checking if torch.jit.trace raises an error
# opt_net = pnnx.export(net, "resnet18.pt", x, check_trace=False)
opt_net = pnnx.export(net, "resnet18.pt", x)
```

2. Convert an existing TorchScript model to pnnx
```python
import torch
import pnnx

x = torch.rand(1, 3, 224, 224)
opt_net = pnnx.convert("resnet18.pt", x)
```
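
When the model and example inputs live on the GPU, the documented `device` parameter (see the API reference below) can be set accordingly. This is a hedged sketch, assuming a CUDA-enabled PyTorch build; the `resnet18_gpu.pt` filename is arbitrary:

```python
import torch
import torchvision.models as models
import pnnx

# Trace the model with CUDA tensors and tell pnnx that the inputs of the
# TorchScript model live on the GPU via the `device` option.
net = models.resnet18(pretrained=True).cuda().eval()
x = torch.rand(1, 3, 224, 224).cuda()

opt_net = pnnx.export(net, "resnet18_gpu.pt", x, device="gpu")
```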

## API Reference
1. pnnx.export

`model` (torch.nn.Module): model to be exported.

`ptpath` (str): path where the traced TorchScript model is saved.

`inputs` (torch.Tensor or list of torch.Tensor): example inputs of the model.

`inputs2` (torch.Tensor or list of torch.Tensor): alternative example inputs of the model. It is usually used together with `inputs` to resolve dynamic shapes; see the example after this parameter list.

`input_shapes` (Optional, list of int or list of list of int): shapes of the model inputs, used to resolve tensor shapes in the model graph. For example, [1,3,224,224] for a model with one input, or [[1,3,224,224],[1,3,224,224]] for a model with two inputs.

`input_types` (Optional, str or list of str): types of the model inputs; it should have the same length as `input_shapes`. For example, "f32" for a model with one input, or ["f32", "f32"] for a model with two inputs.

| typename | torch type                      |
|:--------:|:--------------------------------|
|   f32    | torch.float32 or torch.float    |
|   f64    | torch.float64 or torch.double   |
|   f16    | torch.float16 or torch.half     |
|    u8    | torch.uint8                     |
|    i8    | torch.int8                      |
|   i16    | torch.int16 or torch.short      |
|   i32    | torch.int32 or torch.int        |
|   i64    | torch.int64 or torch.long       |
|   c32    | torch.complex32                 |
|   c64    | torch.complex64                 |
|  c128    | torch.complex128                |

`input_shapes2` (Optional, list of int or list of list of int): shapes of the alternative model inputs; the format is identical to `input_shapes`. It is usually used together with `input_shapes` to resolve dynamic shapes (-1) in the model graph.

`input_types2` (Optional, str or list of str): types of the alternative model inputs.

`device` (Optional, str, default="cpu"): device type of the inputs in the TorchScript model, "cpu" or "gpu".

`customop` (Optional, str or list of str): Torch extension (dynamic library) paths for custom operators. For example, "/home/nihui/.cache/torch_extensions/fused/fused.so" or ["/home/nihui/.cache/torch_extensions/fused/fused.so",...].

`moduleop` (Optional, str or list of str): modules to keep as a single big operator. For example, "models.common.Focus" or ["models.common.Focus","models.yolo.Detect"].

`optlevel` (Optional, int, default=2): graph optimization level.

| option | optimization level                |
|:--------:|:----------------------------------|
|   0    | do not apply optimization         |
|   1    | do not apply optimization         |
|   2    | optimization more for inference   |

`pnnxparam` (Optional, str, default="*.pnnx.param", where * is the model name): PNNX graph definition file.

`pnnxbin` (Optional, str, default="*.pnnx.bin"): PNNX model weight.

`pnnxpy` (Optional, str, default="*_pnnx.py"): PyTorch script for inference, including model construction 
and weight initialization code.

`pnnxonnx` (Optional, str, default="*.pnnx.onnx"): PNNX model in ONNX format.

`ncnnparam` (Optional, str, default="*.ncnn.param"): ncnn graph definition.

`ncnnbin` (Optional, str, default="*.ncnn.bin"): ncnn model weight.

`ncnnpy` (Optional, str, default="*_ncnn.py"): pyncnn script for inference.
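
Putting several of these parameters together, here is a minimal sketch of exporting a model whose spatial size should stay dynamic. It assumes, as described for `inputs2` above, that providing a second set of example inputs with a different resolution is enough for pnnx to mark the differing dimensions as dynamic (-1); the 320x320 resolution is arbitrary:

```python
import torch
import torchvision.models as models
import pnnx

net = models.resnet18(pretrained=True)

# Two example input sets with the same rank but different H/W; the dimensions
# that differ between them should become dynamic (-1) in the generated graph.
x = torch.rand(1, 3, 224, 224)
x2 = torch.rand(1, 3, 320, 320)

opt_net = pnnx.export(net, "resnet18.pt", x,
                      inputs2=x2,   # alternative example inputs (see `inputs2`)
                      optlevel=2)   # default graph optimization level
```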

2. pnnx.convert

`ptpath` (str): path of the TorchScript model to be converted.

Other parameters are consistent with `pnnx.export`.
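
Since the other parameters mirror `pnnx.export`, an existing TorchScript file can plausibly be converted by declaring input shapes and types instead of passing example tensors. This is a hedged sketch, assuming `input_shapes` and `input_types` are accepted by `pnnx.convert` exactly as documented for `pnnx.export`:

```python
import pnnx

# Convert an existing TorchScript file, declaring the input shape and type
# explicitly instead of supplying an example tensor.
opt_net = pnnx.convert("resnet18.pt",
                       input_shapes=[[1, 3, 224, 224]],
                       input_types=["f32"])
```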

            
