| Field | Value |
| --- | --- |
| Name | hypothesis-torch |
| Version | 0.8.5 |
| Summary | Hypothesis strategies for various Pytorch structures, including tensors and modules. |
| Upload time | 2024-11-18 18:02:05 |
| Home page | None |
| Maintainer | None |
| Docs URL | None |
| Author | None |
| Requires Python | >=3.9 |
| License | MIT License Copyright (c) 2024 Andrew Sansom Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. |
| Keywords | hypothesis, torch, pytorch, testing, property-based testing, deep learning, tensor, neural network, artificial intelligence, machine learning |
| Requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| Coveralls test coverage | No coveralls. |
[![PyPI version](https://img.shields.io/pypi/v/hypothesis-torch.svg)](https://pypi.org/project/hypothesis-torch) ![PyPI - Downloads](https://img.shields.io/pypi/dm/hypothesis-torch)
[![Checked with pyright](https://microsoft.github.io/pyright/img/pyright_badge.svg)](https://microsoft.github.io/pyright/)
[![Checked with mypy](http://www.mypy-lang.org/static/mypy_badge.svg)](http://mypy-lang.org/)
# hypothesis-torch
Hypothesis strategies for various PyTorch structures (including tensors and modules).
[Hypothesis](https://hypothesis.readthedocs.io/en/latest/) is a powerful property-based testing library for Python. It
lacks built-in support for PyTorch tensors and modules, so this library provides strategies for generating them.
## Installation
`hypothesis-torch` can be installed via pip:
```bash
pip install hypothesis-torch
```
Optionally, you can install the `huggingface` extra to also pull in the `transformers` library:
```bash
pip install "hypothesis-torch[huggingface]"
```
Strategies for generating Hugging Face transformer models are provided in the `hypothesis_torch.huggingface` module. If
and only if `transformers` is installed when `hypothesis-torch` is imported, these strategies will be available from
the root `hypothesis_torch` module.
## What you can generate
### Tensors
Tensors can be generated with the `tensor_strategy` function. It takes optional arguments for the shape, dtype,
device, and other properties of the desired tensors; each property can be specified as a fixed value or as a
strategy. For example, to generate a 3x3 tensor with dtype `torch.float32` and values between 0 and 1:
```python
import hypothesis_torch
from hypothesis import strategies as st
import torch
hypothesis_torch.tensor_strategy(dtype=torch.float32, shape=(3, 3), elements=st.floats(0, 1))
```
Note that any of these arguments may itself be a Hypothesis strategy returning the appropriate type; the argument is
then sampled from that strategy while generating the tensor. For example, to draw the dtype from a set of options,
pass a strategy that returns a dtype:
```python
import hypothesis_torch
from hypothesis import strategies as st
import torch
hypothesis_torch.tensor_strategy(dtype=st.sampled_from([torch.float32, torch.float64]), shape=(3, 3), elements=st.floats(0, 1))
```
### Dtypes
Dtypes can be generated with the `dtype_strategy` function. With no arguments, it samples from the set of all
PyTorch dtypes.
```python
import hypothesis_torch
hypothesis_torch.dtype_strategy()
```
If a set of dtypes is provided, the function will sample from that set.
```python
import hypothesis_torch
import torch
hypothesis_torch.dtype_strategy(dtypes={torch.float32, torch.float64})
```
### Devices
Devices can be generated with the `device_strategy` function. With no arguments, it samples from the set of all
available physical devices.
```python
import hypothesis_torch
hypothesis_torch.device_strategy()
```
If a set of devices is provided, the function will sample from that set.
```python
import hypothesis_torch
import torch
hypothesis_torch.device_strategy(devices={torch.device('cuda:0'), torch.device('cpu')})
```
If `allow_meta_device` is set to `True`, the strategy may also return meta devices, i.e. `torch.device('meta')`.
```python
import hypothesis_torch
hypothesis_torch.device_strategy(allow_meta_device=True)
```
### Modules
Various types of PyTorch modules have their own strategies.
#### Activation functions
Activation functions can be generated with the `same_shape_activation_strategy` function.
```python
import hypothesis_torch
hypothesis_torch.same_shape_activation_strategy()
```
#### Fully-connected/feed-forward neural networks
Fully-connected neural networks can be generated with the `linear_network_strategy` function. It takes optional
arguments for the input shape, output shape, hidden layer sizes, and number of hidden layers; each can be
specified as a fixed value or as a strategy. For example, to generate a fully-connected network mapping inputs of
shape (1, 10) to outputs of shape (1, 5) through 3 hidden layers, each with between 5 and 10 units:
```python
import hypothesis_torch
from hypothesis import strategies as st
hypothesis_torch.linear_network_strategy(input_shape=(1, 10), output_shape=(1, 5), hidden_layer_size=st.integers(5, 10), num_hidden_layers=3)
```
#### Hugging Face Transformer Models
Hugging Face transformer models can be generated with the `transformer_strategy` function. It takes any
Hugging Face `PreTrainedModel` subclass (or a strategy that generates `PreTrainedModel` subclasses) and
returns a strategy for instances of that model. For example, to generate an arbitrary Llama2 model:
```python
import hypothesis_torch
import transformers
hypothesis_torch.transformer_strategy(transformers.LlamaForCausalLM)
```
The strategy also accepts `kwargs` to pass to the model constructor. Each can be either a fixed value or a strategy
that generates the corresponding value. For example, to generate an arbitrary Llama2 model with a hidden size between
64 and 128 but a fixed vocabulary size of 1000:
```python
import hypothesis_torch
import transformers
from hypothesis import strategies as st
hypothesis_torch.transformer_strategy(transformers.LlamaForCausalLM, hidden_size=st.integers(64, 128), vocab_size=1000)
```
> [!NOTE]
> Currently, the `transformer_strategy` only accepts `kwargs` that can be passed to the constructor of the model's
> config class. Thus, it cannot currently replicate all the behavior of calling `from_pretrained` on a model class.
## Raw data
{
"_id": null,
"home_page": null,
"name": "hypothesis-torch",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.9",
"maintainer_email": null,
"keywords": "hypothesis, torch, pytorch, testing, property-based testing, deep learning, tensor, neural network, artificial intelligence, machine learning",
"author": null,
"author_email": "\"Andrew P. Sansom\" <andrew@euleriancircuit.com>",
"download_url": "https://files.pythonhosted.org/packages/d3/df/b25815fd473f32e26cf7329283e1e31cc5d8dd870084ebd43478e83c158a/hypothesis_torch-0.8.5.tar.gz",
"platform": null,
"bugtrack_url": null,
"license": "MIT License Copyright (c) 2024 Andrew Sansom Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the \"Software\"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ",
"summary": "Hypothesis strategies for various Pytorch structures, including tensors and modules.",
"version": "0.8.5",
"project_urls": {
"Documentation": "https://hypothesis-torch.readthedocs.io/en/stable/",
"Homepage": "https://github.com/qthequartermasterman/hypothesis-torch",
"Issues": "https://github.com/qthequartermasterman/hypothesis-torch/issues",
"Repository": "https://github.com/qthequartermasterman/hypothesis-torch.git"
},
"split_keywords": [
"hypothesis",
" torch",
" pytorch",
" testing",
" property-based testing",
" deep learning",
" tensor",
" neural network",
" artificial intelligence",
" machine learning"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "7a20983864c2af7aaa55761ca4d80822cee5b642f21515852fb7aa511c698a9e",
"md5": "319f635115f73823fdfa80d1067b3251",
"sha256": "9a51f9a2485fcab49a9cfc318cf33251c1548fc0a6ac5158e622d862edd4032e"
},
"downloads": -1,
"filename": "hypothesis_torch-0.8.5-py3-none-any.whl",
"has_sig": false,
"md5_digest": "319f635115f73823fdfa80d1067b3251",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.9",
"size": 23163,
"upload_time": "2024-11-18T18:02:03",
"upload_time_iso_8601": "2024-11-18T18:02:03.844609Z",
"url": "https://files.pythonhosted.org/packages/7a/20/983864c2af7aaa55761ca4d80822cee5b642f21515852fb7aa511c698a9e/hypothesis_torch-0.8.5-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "d3dfb25815fd473f32e26cf7329283e1e31cc5d8dd870084ebd43478e83c158a",
"md5": "37ec098710f65b9ca2961548dea463f3",
"sha256": "08137f9e5259c343b9b511c138fa07ce2dd86b9f1947ab6d3c531932182ef4f8"
},
"downloads": -1,
"filename": "hypothesis_torch-0.8.5.tar.gz",
"has_sig": false,
"md5_digest": "37ec098710f65b9ca2961548dea463f3",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.9",
"size": 21087,
"upload_time": "2024-11-18T18:02:05",
"upload_time_iso_8601": "2024-11-18T18:02:05.212284Z",
"url": "https://files.pythonhosted.org/packages/d3/df/b25815fd473f32e26cf7329283e1e31cc5d8dd870084ebd43478e83c158a/hypothesis_torch-0.8.5.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-11-18 18:02:05",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "qthequartermasterman",
"github_project": "hypothesis-torch",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "hypothesis-torch"
}