tfjs-graph-converter

Name: tfjs-graph-converter
Version: 1.6.3
Summary: A tensorflowjs Graph Model Converter
Upload time: 2023-01-22 10:11:48
Requires Python: >=3.6
License: MIT License, Copyright (c) 2019, 2022 Patrick Levin
Keywords: tensorflow, tensorflowjs, converter
# TensorFlow.js Graph Model Converter

![TFJS Graph Converter Logo](https://github.com/patlevin/tfjs-to-tf/raw/master/docs/logo.png)

The purpose of this library is to import TFJS graph models into TensorFlow.
This allows you to use TensorFlow.js models with Python in case you don't
have access to the original formats or the models were created in TFJS.

## Disclaimer

I'm neither a Python developer, nor do I know TensorFlow or TensorFlow.js.
I created this package solely because I ran into an issue when trying to convert
a pretrained TensorFlow.js model into a different format. I didn't have access to
the pretrained original TF model and didn't have the resources to train it myself.
I soon learned that I'm not alone with this [issue](https://github.com/tensorflow/tfjs/issues/1575)
so I sat down and wrote this little library.

If you find any part of the code to be non-idiomatic or know of a simpler way to
achieve certain things, feel free to let me know, since I'm a beginner in both
Python and especially TensorFlow (used it for the very first time in this
very project).

## Prerequisites

* tensorflow 2.1+
* tensorflowjs 1.5.2+

## Compatibility

The converter has been tested with tensorflowjs v3.13.0, tensorflow v2.8.0
and Python 3.9.10.

## Installation

```sh
pip install tfjs-graph-converter
```

## Usage

After the installation, you can run the packaged `tfjs_graph_converter` binary
for quick and easy model conversion.

### Positional Arguments

 | Positional Argument | Description |
 | :--- | :--- |
 | `input_path` | Path to the TFJS Graph Model directory containing the model.json |
 | `output_path` | For output format "tf_saved_model", a SavedModel target directory. For output format "tf_frozen_model", a frozen model file. |

### Options

| Option | Description |
| :--- | :--- |
| `-h`, `--help` | Show help message and exit |
| `--output_format` | Use `tf_frozen_model` (the default) to save a TensorFlow frozen model. `tf_saved_model` exports to a TensorFlow _SavedModel_ instead. |
| `--saved_model_tags` | Specifies the tags of the MetaGraphDef to save, as a comma-separated string. Defaults to "serve". Applicable only if `--output_format` is `tf_saved_model` |
| `-c`, `--compat_mode` | Sets a compatibility mode for the converted model (see below) |
| `-v`, `--version` | Shows the version of the converter and its dependencies. |
| `-s`, `--silent` | Suppresses any output besides error messages. |

### Compatibility Modes

Models are converted to optimised native TensorFlow operators by default.
This can cause problems if the converted model is subsequently converted to
another format (ONNX, TFLite, older TFJS versions, etc.). The `--compat_mode`
option can be used to avoid incompatible native operations such as fused
convolutions. Available modes are:

| Mode     | Description |
| :---     | :---        |
| `none`   | Use all available optimisations and native TF operators |
| `tfjs`   | Harmonise input types for compatibility with older TFJS versions |
| `tflite` | Only use TFLite builtins in the converted model |

### Advanced Options

These options are intended for advanced users who are familiar with the details of TensorFlow and [TensorFlow Serving](https://www.tensorflow.org/tfx/guide/serving).

| Option | Description | Example |
| :--- | :--- | :--- |
| `--outputs` | Specifies the outputs of the MetaGraphDef to save, as a comma-separated string. Applicable only if `--output_format` is `tf_saved_model` | --outputs=Identity |
| `--signature_key` | Specifies the key for the signature of the MetaGraphDef. Applicable only if `--output_format` is `tf_saved_model`. Requires `--outputs` to be set. | --signature_key=serving_autoencode |
| `--method_name` | Specifies the method name for the signature of the MetaGraphDef. Applicable only if `--output_format` is `tf_saved_model`. Requires `--outputs` to be set. | --method_name=tensorflow/serving/classify |
| `--rename` | Specifies a key mapping that changes the keys of outputs and inputs in the signature. The format is comma-separated pairs of _old name:new name_. Applicable only if `--output_format` is `tf_saved_model`. Requires `--outputs` to be set. | --rename Identity:scores,model/dense256/BiasAdd:confidence |

Specifying ``--outputs`` can be useful for multi-head models to select the default
output for the main signature. The CLI only handles the default signature of
the model. Multiple signatures can be created using the [API](https://github.com/patlevin/tfjs-to-tf/blob/master/docs/api.rst).
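The `--rename` mapping format (comma-separated pairs of _old name:new name_) can be sketched in plain Python. `parse_rename_map` is a hypothetical helper for illustration only; it is not part of the converter:

```python
def parse_rename_map(spec: str) -> dict:
    """Parse a --rename-style mapping, e.g. "Identity:scores,sub_2:input"."""
    mapping = {}
    for pair in spec.split(","):
        # Split on the first colon only, so node names may contain slashes etc.
        old, new = pair.split(":", 1)
        mapping[old.strip()] = new.strip()
    return mapping

# The mapping from the option table's example:
print(parse_rename_map("Identity:scores,model/dense256/BiasAdd:confidence"))
```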

The method name must be handled with care: setting the wrong value can
prevent the signature from being valid for use with TensorFlow Serving.
The option exists because the converter only generates
_predict_ signatures. If the model is a regression model or a classifier
with matching outputs, the correct method name can be forced using the
``--method_name`` option.

Alternatively, you can create your own converter programs using the module's API.
The API is required to accomplish more complicated tasks, like packaging multiple
TensorFlow.js models into a single SavedModel.

## Example

To convert a TensorFlow.js graph model to a TensorFlow frozen model (the
most common use case), specify the directory containing the `model.json`,
followed by the path and file name of the frozen model like so:

```sh
tfjs_graph_converter path/to/js/model path/to/frozen/model.pb
```

## Advanced Example

Converting to the [TF SavedModel format](https://www.tensorflow.org/guide/saved_model)
adds a lot of options for tweaking model signatures. The following example
converts a [Posenet](https://github.com/tensorflow/tfjs-models/tree/master/posenet)
model, which is a multi-head model.

We want to select only two of the four possible outputs and rename them in the
model's signature, as follows:

* Input: _input_ (from _sub_2_)
* Outputs: _offsets_ and _heatmaps_ (from _float_short_offsets_ and _float_heatmaps_)

```sh
tfjs_graph_converter \
    ~/models/posenet/model-stride16.json \
    ~/models/posenet_savedmodel \
    --output_format tf_saved_model \
    --outputs float_short_offsets,float_heatmaps \
    --rename float_short_offsets:offsets,float_heatmaps:heatmaps,sub_2:input
```

After the conversion, we can examine the output and verify the new model
signature:

```sh
saved_model_cli show --dir ~/models/posenet_savedmodel --all

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['input'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, -1, -1, 3)
        name: sub_2:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['heatmaps'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, -1, -1, 17)
        name: float_heatmaps:0
    outputs['offsets'] tensor_info:
        dtype: DT_FLOAT
        shape: (1, -1, -1, 34)
        name: float_short_offsets:0
  Method name is: tensorflow/serving/predict
```

## Usage from within Python

The package installs the module `tfjs_graph_converter`, which contains all the
functionality used by the converter script.
You can leverage the API to either load TensorFlow.js graph models directly for
use with your TensorFlow program (e.g. for inference, fine-tuning, or extending),
or use the advanced functionality to combine several TFJS models into a single
`SavedModel`.
The latter is only supported using the API (it's just a single function call,
though, so don't panic 😉).

## Important Note

By default, Python code that includes the library will see CUDA devices
disabled (i.e. not visible in the program). This is done because the library
uses some low-level APIs that don't allow disabling GPUs from Python code.
Unfortunately some GPUs support CUDA but don't have the compute capabilities or
VRAM required to convert certain models. For this reason, CUDA devices are
disabled by default and the converter and scripts using it use CPUs only.

This behaviour can be overridden by calling `enable_cuda()` **before** any
TensorFlow or converter function is called. This re-enables the use of
CUDA-capable devices, but may result in errors during model
loading or conversion, depending on the installed GPU hardware.

[API Documentation](./docs/modules.rst)

            
