Name | torchwrench
Version | 0.7.3
Summary | Collection of functions and modules to help development in PyTorch.
upload_time | 2025-07-18 22:01:46
requires_python | >=3.8
keywords | pytorch, deep-learning
requirements | No requirements were recorded.
license | MIT License

MIT License

Copyright (c) 2025 Labbeti

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
# torchwrench
<center>
<a href="https://www.python.org/">
<img alt="Python" src="https://img.shields.io/badge/-Python 3.8+-blue?style=for-the-badge&logo=python&logoColor=white">
</a>
<a href="https://github.com/Labbeti/torchwrench/actions">
<img alt="Build" src="https://img.shields.io/github/actions/workflow/status/Labbeti/torchwrench/test.yaml?branch=main&style=for-the-badge&logo=github">
</a>
<a href='https://torchwrench.readthedocs.io/'>
<img src='https://readthedocs.org/projects/torchwrench/badge/?version=stable&style=for-the-badge' alt='Documentation Status' />
</a>
<a href="https://pytorch.org/get-started/locally/">
<img alt="PyTorch" src="https://img.shields.io/badge/-PyTorch 1.10+-ee4c2c?style=for-the-badge&logo=pytorch&logoColor=white">
</a>
Collection of functions and modules to help development in PyTorch.
</center>
## Installation
With pip:
```bash
pip install torchwrench
```
With uv:
```bash
uv add torchwrench
```
The main requirement is **[PyTorch](https://pytorch.org/)**.
To check that the package is installed and print its version, run the following command in your terminal:
```bash
torchwrench-info
```
This library has been tested on all Python versions **3.8 - 3.13**, all PyTorch versions **1.10 - 2.6**, and on **Linux, macOS, and Windows** systems.
## Examples
`torchwrench` functions and modules can be used like their `torch` counterparts. The conventional import alias for `torchwrench` is `tw`.
### Label conversions
Supports **multiclass** label conversions between probabilities, class indices, class names, and one-hot encoding.
```python
import torchwrench as tw
probs = tw.as_tensor([[0.9, 0.1], [0.4, 0.6]])
names = tw.probs_to_name(probs, idx_to_name={0: "Cat", 1: "Dog"})
# ["Cat", "Dog"]
```
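For intuition, the conversion above is essentially a per-row argmax followed by a name lookup. A minimal framework-free sketch of that logic (not the actual torchwrench implementation):

```python
def probs_to_name_sketch(probs, idx_to_name):
    """Per-row argmax, then map the winning index to its class name."""
    names = []
    for row in probs:
        argmax = max(range(len(row)), key=lambda i: row[i])
        names.append(idx_to_name[argmax])
    return names

print(probs_to_name_sketch([[0.9, 0.1], [0.4, 0.6]], {0: "Cat", 1: "Dog"}))
# ['Cat', 'Dog']
```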
This package also supports **multilabel** conversions between probabilities, class multi-indices, class multi-names, and multihot encoding.
```python
import torchwrench as tw
multihot = tw.as_tensor([[1, 0, 0], [0, 1, 1], [0, 0, 0]])
indices = tw.multihot_to_indices(multihot)
# [[0], [1, 2], []]
```
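Conceptually, `multihot_to_indices` just lists the positions of the non-zero entries in each row; a plain-Python sketch of that idea:

```python
def multihot_to_indices_sketch(multihot):
    # Collect, for each row, the positions holding a non-zero value.
    return [[i for i, v in enumerate(row) if v] for row in multihot]

print(multihot_to_indices_sketch([[1, 0, 0], [0, 1, 1], [0, 0, 0]]))
# [[0], [1, 2], []]
```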
Finally, this package includes **powerset multilabel** conversions:
```python
import torchwrench as tw
multihot = tw.as_tensor([[1, 0, 0], [0, 1, 1], [0, 0, 0]])
indices = tw.multilabel_to_powerset(multihot, num_classes=3, max_set_size=2)
# tensor([[0, 1, 0, 0, 0, 0, 0],
#         [0, 0, 0, 0, 0, 0, 1],
#         [1, 0, 0, 0, 0, 0, 0]])
```
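The powerset encoding treats each *set* of active labels as a single class. Assuming subsets are enumerated by increasing size with the empty set first (which matches the output above), the mapping can be sketched without torch as:

```python
from itertools import combinations

def powerset_subsets(num_classes, max_set_size):
    # All label subsets of size <= max_set_size, smallest sets first:
    # (), (0,), (1,), (2,), (0, 1), (0, 2), (1, 2) for 3 classes, size <= 2.
    subsets = []
    for size in range(max_set_size + 1):
        subsets.extend(combinations(range(num_classes), size))
    return subsets

def multilabel_to_powerset_sketch(multihot, num_classes, max_set_size):
    subsets = powerset_subsets(num_classes, max_set_size)
    rows = []
    for row in multihot:
        active = tuple(i for i, v in enumerate(row) if v)
        onehot = [0] * len(subsets)
        onehot[subsets.index(active)] = 1  # one class per label *set*
        rows.append(onehot)
    return rows

print(multilabel_to_powerset_sketch([[1, 0, 0], [0, 1, 1], [0, 0, 0]], 3, 2))
# [[0, 1, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 1], [1, 0, 0, 0, 0, 0, 0]]
```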
### Typing
Typing by number of dimensions:
```python
import torchwrench as tw
x1 = tw.as_tensor([1, 2])
print(isinstance(x1, tw.Tensor2D)) # False
x2 = tw.as_tensor([[1, 2], [3, 4]])
print(isinstance(x2, tw.Tensor2D)) # True
```
Typing by tensor dtype:
```python
import torchwrench as tw
x1 = tw.as_tensor([1, 2], dtype=tw.int)
print(isinstance(x1, tw.SignedIntegerTensor)) # True
x2 = tw.as_tensor([1, 2], dtype=tw.long)
print(isinstance(x2, tw.SignedIntegerTensor1D)) # True
x3 = tw.as_tensor([1, 2], dtype=tw.float)
print(isinstance(x3, tw.SignedIntegerTensor)) # False
```
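These checks cannot rely on the ordinary class hierarchy, since the objects are plain tensors. One common way to get this behavior in Python (shown here with a toy class, not torchwrench's actual code) is a metaclass that overrides `__instancecheck__`:

```python
class _NDimMeta(type):
    # Delegate isinstance() to a runtime shape check instead of type lineage.
    def __instancecheck__(cls, obj):
        return getattr(obj, "ndim", None) == cls.ndim

class Array2D(metaclass=_NDimMeta):
    ndim = 2

class FakeTensor:
    """Stand-in object exposing an `ndim` attribute, like a real tensor."""
    def __init__(self, ndim):
        self.ndim = ndim

print(isinstance(FakeTensor(2), Array2D))  # True
print(isinstance(FakeTensor(1), Array2D))  # False
```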
### Padding & cropping
Pad along a specific dimension:
```python
import torchwrench as tw
x = tw.rand(10, 3, 1)
padded = tw.pad_dim(x, target_length=5, dim=1, pad_value=-1)
# padded has shape (10, 5, 1), padded with -1
```
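In the one-dimensional case this kind of padding reduces to appending the pad value until the target length is reached; a tiny list-based sketch (`pad_dim` generalizes this to an arbitrary dimension of a tensor):

```python
def pad_1d_sketch(x, target_length, pad_value):
    # Append pad_value until the sequence reaches target_length.
    return list(x) + [pad_value] * (target_length - len(x))

print(pad_1d_sketch([1, 2, 3], target_length=5, pad_value=-1))
# [1, 2, 3, -1, -1]
```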
Pad a nested list of tensors into a single stacked tensor:
```python
import torchwrench as tw
tensors = [tw.rand(10, 2), [tw.rand(3)] * 5, tw.rand(0, 5)]
padded = tw.pad_and_stack_rec(tensors, pad_value=0)
# padded has shape (3, 10, 5), padded with 0
```
Remove values along a specific dimension:
```python
import torchwrench as tw
x = tw.rand(10, 5, 3)
cropped = tw.crop_dim(x, dim=1, target_length=2)
# cropped has shape (10, 2, 3)
```
### Masking
```python
import torchwrench as tw
x = tw.as_tensor([3, 1, 2])
mask = tw.lengths_to_non_pad_mask(x, max_len=4)
# Row i contains x[i] True values (non-padded positions), then False padding
# tensor([[True, True, True, False],
#         [True, False, False, False],
#         [True, True, False, False]])
```
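The mask above is equivalent to comparing each column index against the row's length; a framework-free sketch of the same logic:

```python
def lengths_to_non_pad_mask_sketch(lengths, max_len):
    # Position j in row i is True while j < lengths[i].
    return [[j < n for j in range(max_len)] for n in lengths]

print(lengths_to_non_pad_mask_sketch([3, 1, 2], max_len=4))
```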
```python
import torchwrench as tw
x = tw.as_tensor([1, 2, 3, 4])
mask = tw.as_tensor([True, True, False, False])
result = tw.masked_mean(x, mask)
# result contains the mean of the values marked as True: 1.5
```
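A masked mean simply averages the values whose mask entry is True; sketched with plain lists:

```python
def masked_mean_sketch(x, mask):
    # Keep only values marked True, then take their arithmetic mean.
    kept = [v for v, m in zip(x, mask) if m]
    return sum(kept) / len(kept)

print(masked_mean_sketch([1, 2, 3, 4], [True, True, False, False]))
# 1.5
```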
### Other tensor manipulations
```python
import torchwrench as tw
x = tw.as_tensor([1, 2, 3, 4])
result = tw.insert_at_indices(x, indices=[0, 2], values=5)
# result contains tensor with inserted values: tensor([5, 1, 2, 5, 3, 4])
```
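Interpreting `indices` as positions in the *original* tensor, the insertion can be sketched with a plain list by inserting from the largest index down, so earlier inserts do not shift the remaining positions:

```python
def insert_at_indices_sketch(x, indices, value):
    out = list(x)
    # Insert from the largest index down so already-inserted values
    # do not shift the remaining insertion positions.
    for i in sorted(indices, reverse=True):
        out.insert(i, value)
    return out

print(insert_at_indices_sketch([1, 2, 3, 4], indices=[0, 2], value=5))
# [5, 1, 2, 5, 3, 4]
```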
```python
import torchwrench as tw
perm = tw.randperm(10)
inv_perm = tw.get_inverse_perm(perm)
x1 = tw.rand(10)
x2 = x1[perm]
x3 = x2[inv_perm]
# inv_perm contains the indices that undo perm, so x1 == x3 here
```
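The inverse permutation satisfies `inv_perm[perm[i]] == i`; a list-based sketch of how it can be computed and why applying it restores the original order:

```python
def get_inverse_perm_sketch(perm):
    # Scatter each position i to slot perm[i]: inv[perm[i]] = i.
    inv = [0] * len(perm)
    for i, p in enumerate(perm):
        inv[p] = i
    return inv

perm = [2, 0, 3, 1]
inv_perm = get_inverse_perm_sketch(perm)  # [1, 3, 0, 2]
x1 = ["a", "b", "c", "d"]
x2 = [x1[p] for p in perm]       # shuffled
x3 = [x2[p] for p in inv_perm]   # shuffled back
print(x3 == x1)  # True
```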
### Extra: pre-compute datasets to HDF files
Here is an example of pre-computing spectrograms for the torchaudio `SPEECHCOMMANDS` dataset using the `pack_to_hdf` function:
```python
from torchaudio.datasets import SPEECHCOMMANDS
from torchaudio.transforms import Spectrogram
from torchwrench import nn
from torchwrench.extras.hdf import pack_to_hdf
speech_commands_root = "path/to/speech_commands"
packed_root = "path/to/packed_dataset.hdf"
dataset = SPEECHCOMMANDS(speech_commands_root, download=True, subset="validation")
# dataset[0] is a tuple, contains waveform and other metadata
class MyTransform(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.spectrogram_extractor = Spectrogram()

    def forward(self, item):
        waveform = item[0]
        spectrogram = self.spectrogram_extractor(waveform)
        return (spectrogram,) + item[1:]
pack_to_hdf(dataset, packed_root, MyTransform())
```
Then you can load the pre-computed dataset using `HDFDataset`:
```python
from torchwrench.extras.hdf import HDFDataset
packed_root = "path/to/packed_dataset.hdf"
packed_dataset = HDFDataset(packed_root)
packed_dataset[0] # == first transformed item, i.e. transform(dataset[0])
```
## Contact
Maintainer:
- [Étienne Labbé](https://labbeti.github.io/) "Labbeti": labbeti.pub@gmail.com