Name: hypothesis-torch
Version: 0.4.3
Summary: Hypothesis strategies for various PyTorch structures, including tensors and modules.
Author: Andrew P. Sansom <andrew@euleriancircuit.com>
Homepage / Repository: https://github.com/qthequartermasterman/hypothesis-torch
Requires Python: >=3.9
License: MIT (Copyright (c) 2024 Andrew Sansom)
Keywords: hypothesis, torch, pytorch, testing, property-based testing
Upload time: 2024-04-28 03:30:00
# hypothesis-torch
Hypothesis strategies for various PyTorch structures (including tensors and modules).

[Hypothesis](https://hypothesis.readthedocs.io/en/latest/) is a powerful property-based testing library for Python. It
lacks built-in support for PyTorch tensors and modules, so this library provides strategies for generating them.

## Installation
`hypothesis-torch` can be installed via pip:
```bash
pip install hypothesis-torch
```

Optionally, you can install the `huggingface` extra to pull in the `transformers` library as well:
```bash
pip install hypothesis-torch[huggingface]
```

Strategies for generating Hugging Face transformer models are provided in the `hypothesis_torch.huggingface` module. If 
and only if `transformers` is installed when `hypothesis-torch` is imported, these strategies will be available from
the root `hypothesis_torch` module.
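
A minimal sketch of what this means in practice (assuming the `huggingface` extra is installed):

```python
import hypothesis_torch

# Because transformers was importable when hypothesis_torch was imported,
# the Hugging Face strategies are re-exported at the package root; without
# the extra installed, this attribute lookup would fail.
strategy = hypothesis_torch.transformer_strategy
```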

## What you can generate

### Tensors

Tensors can be generated with the `tensor_strategy` function. This function takes optional arguments for the shape,
dtype, device, and other properties of the desired tensors. Each property can be specified as a fixed value or as a
strategy. For example, to generate a 3x3 tensor with a dtype of `torch.float32` and values between 0 and 1:

```python
import hypothesis_torch
from hypothesis import strategies as st
import torch
hypothesis_torch.tensor_strategy(dtype=torch.float32, shape=(3, 3), elements=st.floats(0, 1))
```
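
Like any Hypothesis strategy, the result plugs directly into `@given`. A minimal sketch of a property-based test built on the strategy above:

```python
import torch
from hypothesis import given, strategies as st

import hypothesis_torch


# Sketch: the elements argument bounds every entry, so all drawn values
# should land in [0, 1].
@given(t=hypothesis_torch.tensor_strategy(dtype=torch.float32, shape=(3, 3), elements=st.floats(0, 1)))
def test_values_in_range(t: torch.Tensor) -> None:
    assert torch.all((t >= 0) & (t <= 1))
```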

Note that any argument can itself be a Hypothesis strategy that returns the appropriate type; a value is then drawn
from that strategy while generating the tensor. For example, to generate a tensor with a varying dtype, specify a strategy that returns a dtype:

```python
import hypothesis_torch
from hypothesis import strategies as st
import torch
hypothesis_torch.tensor_strategy(dtype=st.sampled_from([torch.float32, torch.float64]), shape=(3, 3), elements=st.floats(0, 1))
```

### Dtypes

Dtypes can be generated with the `dtype_strategy` function. If no arguments are provided, this function defaults to
sampling from the set of all PyTorch dtypes.

```python
import hypothesis_torch
hypothesis_torch.dtype_strategy()
```

If a set of dtypes is provided, the function will sample from that set.

```python
import hypothesis_torch
import torch
hypothesis_torch.dtype_strategy(dtypes={torch.float32, torch.float64})
```
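
Since, as with tensors, arguments may themselves be strategies, a dtype strategy can feed `tensor_strategy` directly; a sketch:

```python
import torch

import hypothesis_torch

# Sketch: tensors whose dtype is drawn from a restricted dtype strategy.
hypothesis_torch.tensor_strategy(
    dtype=hypothesis_torch.dtype_strategy(dtypes={torch.float16, torch.float32}),
    shape=(2, 2),
)
```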

### Devices

Devices can be generated with the `device_strategy` function. If no arguments are provided, this function defaults to
sampling from the set of all available physical devices.

```python
import hypothesis_torch
hypothesis_torch.device_strategy()
```

If a set of devices is provided, the function will sample from that set.

```python
import hypothesis_torch
import torch
hypothesis_torch.device_strategy(devices={torch.device('cuda:0'), torch.device('cpu')})
```

If `allow_meta_device` is set to `True`, the strategy may also return meta devices, i.e. `torch.device('meta')`.

```python
import hypothesis_torch
hypothesis_torch.device_strategy(allow_meta_device=True)
```
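
As with dtypes, a device strategy can be passed straight to `tensor_strategy`'s `device` argument; a sketch:

```python
import torch

import hypothesis_torch

# Sketch: tensors placed on whichever device the strategy draws.
hypothesis_torch.tensor_strategy(
    dtype=torch.float32,
    shape=(3,),
    device=hypothesis_torch.device_strategy(),
)
```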

### Modules

Various types of PyTorch modules have their own strategies.

#### Activation functions

Activation functions can be generated with the `same_shape_activation_strategy` function. 

```python
import hypothesis_torch
hypothesis_torch.same_shape_activation_strategy()
```
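
As the name suggests, these activations preserve the shape of their input; a minimal sketch of a test exercising that property:

```python
import torch
from hypothesis import given

import hypothesis_torch


# Sketch: every drawn activation should map a (4, 4) input to a (4, 4) output.
@given(activation=hypothesis_torch.same_shape_activation_strategy())
def test_activation_preserves_shape(activation: torch.nn.Module) -> None:
    x = torch.rand(4, 4)
    assert activation(x).shape == x.shape
```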

#### Fully-connected/Feed forward neural networks

Fully-connected neural networks can be generated with the `linear_network_strategy` function. This function takes
optional arguments for the input shape, output shape, hidden layer size, and number of hidden layers. Each of these
arguments can be specified as a fixed value or as a strategy. For example, to generate a fully-connected network
mapping inputs of shape (1, 10) to outputs of shape (1, 5) through 3 hidden layers, each with between 5 and 10 units:

```python
import hypothesis_torch
from hypothesis import strategies as st
hypothesis_torch.linear_network_strategy(input_shape=(1, 10), output_shape=(1, 5), hidden_layer_size=st.integers(5, 10), num_hidden_layers=3)
```
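
A sketch of exercising such a generated network in a test (the shape contract follows from the arguments above):

```python
import torch
from hypothesis import given, strategies as st

import hypothesis_torch


# Sketch: networks built with these arguments should map (1, 10) inputs
# to (1, 5) outputs, whatever hidden layer sizes are drawn.
@given(net=hypothesis_torch.linear_network_strategy(
    input_shape=(1, 10),
    output_shape=(1, 5),
    hidden_layer_size=st.integers(5, 10),
    num_hidden_layers=3,
))
def test_output_shape(net: torch.nn.Module) -> None:
    x = torch.rand(1, 10)
    assert net(x).shape == (1, 5)
```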

#### Hugging Face Transformer Models

Hugging Face transformer models can be generated with the `transformer_strategy` function. This function takes any
Hugging Face `PreTrainedModel` subclass (or a strategy that generates `PreTrainedModel` subclasses) and returns a
strategy that generates instances of that model. For example, to generate an arbitrary Llama 2 model:

```python
import hypothesis_torch
import transformers
hypothesis_torch.transformer_strategy(transformers.LlamaForCausalLM)
```

The strategy also accepts `kwargs` to pass to the model constructor. These can be either fixed values or strategies
that generate the corresponding values. For example, to generate an arbitrary Llama 2 model with a hidden size between
64 and 128, but a fixed vocabulary size of 1000:

```python
import hypothesis_torch
import transformers
from hypothesis import strategies as st
hypothesis_torch.transformer_strategy(transformers.LlamaForCausalLM, hidden_size=st.integers(64, 128), vocab_size=1000)
```
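
A sketch of checking that drawn models respect these constraints (note that instantiating even small transformer models inside a test can be slow):

```python
import transformers
from hypothesis import given, strategies as st

import hypothesis_torch


# Sketch: the drawn model's config should reflect the bounds supplied above.
@given(model=hypothesis_torch.transformer_strategy(
    transformers.LlamaForCausalLM,
    hidden_size=st.integers(64, 128),
    vocab_size=1000,
))
def test_config_respects_constraints(model: transformers.PreTrainedModel) -> None:
    assert 64 <= model.config.hidden_size <= 128
    assert model.config.vocab_size == 1000
```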

> [!NOTE]
> Currently, `transformer_strategy` only accepts `kwargs` that can be passed to the constructor of the model's
> config class. Thus, it cannot currently replicate all of the behavior of calling `from_pretrained` on a model class.

            
