light-the-torch

Name: light-the-torch
Version: 0.7.5
Summary: Install PyTorch distributions with computation backend auto-detection
Author: Philip Meier
Upload time: 2023-07-17 08:23:40
Requires-Python: >=3.7
License: BSD-3-Clause
Keywords: pytorch, cuda, pip, install
# `light-the-torch`

[![BSD-3-Clause License](https://img.shields.io/github/license/pmeier/light-the-torch)](https://opensource.org/licenses/BSD-3-Clause)
[![Project Status: WIP](https://www.repostatus.org/badges/latest/wip.svg)](https://www.repostatus.org/#wip)
[![Code coverage via codecov.io](https://codecov.io/gh/pmeier/light-the-torch/branch/main/graph/badge.svg)](https://codecov.io/gh/pmeier/light-the-torch)

`light-the-torch` is a small utility that wraps `pip` to ease the installation of
PyTorch distributions like `torch`, `torchvision`, `torchaudio`, and so on, as well as
third-party packages that depend on them. It auto-detects the computation backend
compatible with the local setup and installs the correct PyTorch binaries without user
intervention.

- [Why do I need it?](#why-do-i-need-it)
- [How do I install it?](#how-do-i-install-it)
- [How do I use it?](#how-do-i-use-it)
- [How does it work?](#how-does-it-work)
- [Is it safe?](#is-it-safe)
- [How do I contribute?](#how-do-i-contribute)

## Why do I need it?

PyTorch distributions like `torch`, `torchvision`, `torchaudio`, and so on are fully
`pip install`'able, but PyPI, the default `pip` search index, has some limitations:

1. By default, PyPI only allows binaries up to a size of
   [approximately 60 MB](https://github.com/pypa/packaging-problems/issues/86). One can
   [request a file size limit increase](https://pypi.org/help/#file-size-limit) (and the
   PyTorch team presumably does that for every release), but it is still not enough:
   although PyTorch has pre-built binaries for Windows with CUDA support, they cannot be
   hosted on PyPI due to their size.
2. PyTorch uses local version specifiers to indicate the computation backend a binary
   was compiled for, for example `torch==1.11.0+cpu`. Unfortunately, local specifiers
   are not allowed on PyPI. Thus, only binaries compiled against a single CUDA version
   are uploaded, without any indication of that version. If you do not have a
   CUDA-capable GPU, downloading them only wastes bandwidth and disk space. If, on the
   other hand, your NVIDIA driver simply doesn't support the CUDA version the binary
   was compiled with, you can't use any of the GPU features.
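As a minimal illustration of such a local version specifier, the computation backend is
the PEP 440 local segment after the `+`, which a small sketch can pull apart:

```python
def split_local(spec: str):
    """Split a version like "1.11.0+cpu" into the public version and the
    PEP 440 local segment (the computation backend suffix, if any)."""
    public, _, local = spec.partition("+")
    return public, local or None

print(split_local("1.11.0+cpu"))  # ('1.11.0', 'cpu')
print(split_local("1.11.0"))      # ('1.11.0', None)
```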

To overcome this, PyTorch also hosts _most_[^1] binaries
[on their own package indices](https://download.pytorch.org/whl). To access PyTorch's
package indices, you can still use `pip install`, but some
[additional options](https://pytorch.org/get-started/locally/) are needed:

```shell
pip install torch --extra-index-url https://download.pytorch.org/whl/cu113
```

[^1]:
    Some distributions are not compiled against a specific computation backend and thus
    hosting them on PyPI is sufficient since they work in every environment.

While this is certainly an improvement, it still has a few downsides:

1. You need to know what computation backend, e.g. CUDA 11.3 (`cu113`), is supported on
   your local machine. This can be quite challenging for new users and at least tedious
   for more experienced ones.
2. Besides the stable binaries, PyTorch also offers nightly and test ones. To install
   them, you need a different `--extra-index-url` for each.
3. For the nightly and test channels you also need to supply the `--pre` option. Failing
   to do so will pull the stable binary from PyPI even if the rest of the installation
   command is correct.
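Putting points 2 and 3 together, a manual nightly install looks roughly like this (the
`nightly/cu113` index URL is one example; the exact URL depends on your channel and
backend):

```shell
# nightly channel: a channel-specific index *and* --pre are both required
pip install --pre torch --extra-index-url https://download.pytorch.org/whl/nightly/cu113
```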

If any of these points don't sound appealing to you, and you just want to have the same
user experience as `pip install` for PyTorch distributions, `light-the-torch` was made
for you.

## How do I install it?

Installing `light-the-torch` is as easy as

```shell
pip install light-the-torch
```

Since `light-the-torch` depends on `pip`, which might be upgraded during installation,
[Windows users](https://pip.pypa.io/en/stable/installation/#upgrading-pip) should
install it with

```shell
py -m pip install light-the-torch
```

## How do I use it?

After `light-the-torch` is installed, you can use its CLI `ltt` as a drop-in
replacement for `pip`:

```shell
ltt install torch
```

In fact, `ltt` is `pip` with a few added options:

- By default, `ltt` uses the local NVIDIA driver version to select the correct binary
  for you. You can pass the `--pytorch-computation-backend` option to manually specify
  the computation backend you want to use:

  ```shell
  ltt install --pytorch-computation-backend=cu102 torch torchvision torchaudio
  ```

  Borrowing from the mutex packages that PyTorch provides for `conda` installations,
  `--cpuonly` is available as a shorthand for `--pytorch-computation-backend=cpu`.

  In addition, the computation backend to install can also be set through the
  `LTT_PYTORCH_COMPUTATION_BACKEND` environment variable. It is only honored if no CLI
  option for the computation backend is specified.

- By default, `ltt` installs stable PyTorch binaries. To install binaries from the
  nightly or test channels pass the `--pytorch-channel` option:

  ```shell
  ltt install --pytorch-channel=nightly torch torchvision torchaudio
  ```

  If `--pytorch-channel` is not passed, using `pip`'s builtin `--pre` option will
  install PyTorch test binaries.
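For instance, the environment variable mentioned above could drive the same selection as
the CLI option (which takes precedence when both are given):

```shell
# honored only because no --pytorch-computation-backend option is passed
LTT_PYTORCH_COMPUTATION_BACKEND=cu113 ltt install torch
```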

Of course, you are not limited to installing only PyTorch distributions. Everything
shown above also works when installing packages that depend on PyTorch:

```shell
ltt install --pytorch-computation-backend=cpu --pytorch-channel=nightly pystiche
```

## How does it work?

The authors of `pip` **do not condone** the use of `pip` internals, as they might break
without warning. As a result, `pip` offers no plugin mechanism for hooking into
specific tasks.

`light-the-torch` works by monkey-patching `pip` internals at runtime:

- While searching for a download link for a PyTorch distribution, `light-the-torch`
  replaces the default search index with an official PyTorch download link. This is
  equivalent to calling `pip install` with the `--extra-index-url` option only for
  PyTorch distributions.
- While evaluating possible PyTorch installation candidates, `light-the-torch` culls
  binaries incompatible with the hardware.
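To sketch the candidate culling, the selection boils down to mapping the local NVIDIA
driver version to the CUDA backends it supports. The table and function below are a
hypothetical simplification, not `light-the-torch`'s actual code or compatibility data:

```python
from typing import Optional, Tuple

# Illustrative minimum driver versions per CUDA backend (hypothetical values,
# not light-the-torch's real compatibility table).
MIN_DRIVER_FOR_CUDA = {
    "cu117": (450, 80),
    "cu113": (450, 80),
    "cu102": (440, 33),
}

def select_backend(driver: Optional[Tuple[int, int]]) -> str:
    """Pick the newest CUDA backend the driver supports, falling back to CPU."""
    if driver is None:  # no NVIDIA driver found -> CPU-only binaries
        return "cpu"
    candidates = [cu for cu, minimum in MIN_DRIVER_FOR_CUDA.items() if driver >= minimum]
    return max(candidates) if candidates else "cpu"

print(select_backend((470, 10)))  # cu117
print(select_backend(None))       # cpu
```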

## Is it safe?

A project as large as PyTorch is attractive to malicious actors given its large user
base. For example, in December 2022 PyTorch was hit by a
[supply chain attack](https://pytorch.org/blog/compromised-nightly-dependency/) that
potentially extracted user information. The PyTorch team mitigated the attack as soon as
it was detected by temporarily hosting all third-party dependencies for the nightly
Linux releases on their own indices. With that,
`pip install torch --extra-index-url https://download.pytorch.org/whl/cpu` wouldn't pull
anything from PyPI and thus avoided malicious packages placed there.

However, due to `light-the-torch`'s index patching, this mitigation would have been
completely circumvented, since only PyTorch distributions would have been installed from
the PyTorch indices. Since version `0.7.0`, `light-the-torch` therefore only pulls
third-party dependencies for nightly Linux PyTorch releases from PyPI if they are
explicitly requested and pinned. For example, `ltt install --pytorch-channel=nightly torch`
and `ltt install --pytorch-channel=nightly torch sympy` will install everything from the
PyTorch indices. However, if you pin a third-party dependency, e.g.
`ltt install --pytorch-channel=nightly torch sympy==1.11.1`, it will be pulled from PyPI
regardless of whether the version matches the one on the PyTorch index.

In summary, `light-the-torch` is usually as safe as following the regular PyTorch
installation instructions. However, supply chain attacks can lead to situations where
`light-the-torch` circumvents mitigations put in place by the PyTorch team.
Unfortunately, `light-the-torch` is not officially supported by the PyTorch team and
thus also not tested by them.

## How do I contribute?

Thanks a lot for your interest in contributing to `light-the-torch`! All contributions
are appreciated, code or not. Especially in a project like this, we rely on user
reports for edge cases we didn't anticipate. Please feel free to
[open an issue](https://github.com/pmeier/light-the-torch/issues) if you encounter
anything that you think should be working but doesn't.

If you want to contribute code, check out our [contributing guidelines](CONTRIBUTING.md)
to learn more about the workflow.

            
