TorchFix

Name: TorchFix
Version: 0.3.0
Summary: TorchFix - a linter for PyTorch-using code with autofix support
Upload time: 2024-01-19 00:18:26
License: BSD 3-Clause License (Copyright (c) Facebook, Inc. and its affiliates)
# TorchFix - a linter for PyTorch-using code with autofix support

[![PyPI](https://img.shields.io/pypi/v/torchfix.svg)](https://pypi.org/project/torchfix/)

TorchFix is a Python code static analysis tool - a linter with autofix capabilities -
for users of PyTorch. It can be used to find and fix issues like usage of deprecated
PyTorch functions and non-public symbols, and to adopt PyTorch best practices in general.

TorchFix is built upon https://github.com/Instagram/LibCST - a library to manipulate
Python concrete syntax trees. LibCST enables "codemods" (autofixes) in addition to
reporting issues.

TorchFix can be used as a Flake8 plugin (linting only) or as a standalone
program (with autofix available for a subset of the lint violations).

Currently TorchFix is in a **beta** stage, so there are still rough edges, and many
things can and will change.

## Installation

To install the latest code from GitHub, clone/download
https://github.com/pytorch-labs/torchfix and run `pip install .`
inside the directory.

To install a release version from PyPI, run `pip install torchfix`.

## Usage

After installation, TorchFix is available as a Flake8 plugin, so running
Flake8 normally will also run the TorchFix linter.

To see only TorchFix warnings without the rest of the Flake8 linters, you can run
`flake8 --isolated --select=TOR0,TOR1,TOR2`.

TorchFix can also be run as a standalone program: `torchfix .`
Add the `--fix` parameter to try to autofix some of the issues (note that the files will be overwritten!).
To see additional debug info, add the `--show-stderr` parameter.

Please keep in mind that autofix is a best-effort mechanism. Given the dynamic nature of Python,
and especially TorchFix's beta status, it is very difficult to be certain when
changing code, even for seemingly trivial fixes.

Warnings for issues with codes starting with TOR0, TOR1, and TOR2 are enabled by default.
Warnings with other codes may be too noisy, so they are not enabled by default.
To enable them, use the standard Flake8 configuration options in plugin mode, or run
`torchfix --select=ALL .` in standalone mode.


## Reporting problems

If you encounter a bug or some other problem with TorchFix, please file an issue on
https://github.com/pytorch-labs/torchfix/issues.


## Rules

### TOR001 Use of removed function

#### torch.solve

This function was deprecated in PyTorch 1.9 and has since been removed.

`torch.solve` was deprecated in favor of `torch.linalg.solve`, which has its
arguments reversed and does not return the LU factorization.

To get the LU factorization, see `torch.lu`, which can be used with `torch.lu_solve` or `torch.lu_unpack`.

`X = torch.solve(B, A).solution` should be replaced with `X = torch.linalg.solve(A, B)`.
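
For illustration, a minimal before/after sketch of the migration (the matrices and shapes here are made up for the example):

```python
import torch

torch.manual_seed(0)
A = torch.randn(3, 3) + 3 * torch.eye(3)  # well-conditioned square system matrix
B = torch.randn(3, 2)

# Removed API: X = torch.solve(B, A).solution
# Replacement -- note the reversed argument order:
X = torch.linalg.solve(A, B)

assert torch.allclose(A @ X, B, atol=1e-5)  # X solves A @ X = B
```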

### TOR002 Likely typo `require_grad` in assignment. Did you mean `requires_grad`?

This is a common misspelling that can lead to silent performance issues.
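
A small sketch of why the typo is silent: assigning to `require_grad` just attaches a new, unrelated Python attribute to the tensor, so autograd stays off without any error (the tensor here is made up for the example):

```python
import torch

p = torch.zeros(3)
p.require_grad = True            # typo: silently creates a stray attribute
assert p.requires_grad is False  # autograd is still disabled

p.requires_grad = True           # correct spelling
assert p.requires_grad is True
```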

### TOR003 Please pass `use_reentrant` explicitly to `checkpoint`

The default value of the `use_reentrant` parameter in `torch.utils.checkpoint` is being changed
from `True` to `False`. In the meantime, the value needs to be passed explicitly.

See this [forum post](https://dev-discuss.pytorch.org/t/bc-breaking-update-to-torch-utils-checkpoint-not-passing-in-use-reentrant-flag-will-raise-an-error/1745)
for details.
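
A minimal sketch of passing the flag explicitly (the checkpointed function is a made-up stand-in for an expensive submodule):

```python
import torch
from torch.utils.checkpoint import checkpoint

def block(x):
    # stand-in for an expensive submodule
    return torch.relu(x) * 2

x = torch.randn(4, requires_grad=True)
# Pass use_reentrant explicitly; False is the recommended value going forward.
y = checkpoint(block, x, use_reentrant=False)
y.sum().backward()
assert x.grad is not None  # gradients flow through the checkpointed block
```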

### TOR101 Use of deprecated function

#### torch.nn.utils.weight_norm

This function is deprecated. Use `torch.nn.utils.parametrizations.weight_norm`,
which uses the modern parametrization API. The new `weight_norm` is compatible
with `state_dict`s generated by the old `weight_norm`.

Migration guide:

* The magnitude (`weight_g`) and direction (`weight_v`) are now expressed
    as `parametrizations.weight.original0` and `parametrizations.weight.original1`,
    respectively.

* To remove the weight normalization reparametrization, use
    `torch.nn.utils.parametrize.remove_parametrizations`.

* The weight is no longer recomputed once per module forward; instead, it is
    recomputed on every access. To restore the old behavior, use
    `torch.nn.utils.parametrize.cached` before invoking the module
    in question.

## License
TorchFix is BSD licensed, as found in the LICENSE file.

            
