| Field | Value |
| --- | --- |
| Name | posteriors |
| Version | 0.0.5 |
| Summary | Uncertainty quantification with PyTorch |
| Upload time | 2024-11-05 18:39:02 |
| Home page | None |
| Maintainer | None |
| Docs URL | None |
| Author | None |
| Requires Python | >=3.9 |
| License | Apache-2.0 |
| Keywords | pytorch, uncertainty |
<div align="center">
<img src="https://storage.googleapis.com/posteriors/logo_with_text.png" alt="logo"></img>
</div>
[**Installation**](#installation)
| [**Quickstart**](#quickstart)
| [**Methods**](#methods)
| [**Friends**](#friends)
| [**Contributing**](#contributing)
| [**Citation**](#citation)
| [**Documentation**](https://normal-computing.github.io/posteriors/)
| [**Paper**](https://arxiv.org/abs/2406.00104)
## What is `posteriors`?
A general-purpose Python library for uncertainty quantification with [`PyTorch`](https://github.com/pytorch/pytorch).
- [x] **Composable**: Use with [`transformers`](https://huggingface.co/docs/transformers/en/index), [`lightning`](https://lightning.ai/), [`torchopt`](https://github.com/metaopt/torchopt), [`torch.distributions`](https://pytorch.org/docs/stable/distributions.html), [`pyro`](https://pyro.ai/) and more!
- [x] **Extensible**: Add new methods! Add new models!
- [x] **Functional**: Easier to test, closer to mathematics!
- [x] **Scalable**: Big model? Big data? No problem!
- [x] **Swappable**: Swap between algorithms with ease!
## Installation
`posteriors` is available on [PyPI](https://pypi.org/project/posteriors/) and can be installed via `pip`:
```bash
pip install posteriors
```
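To verify the install, you can query the installed package metadata (a quick sanity check using the standard library; the version shown is whatever you installed, 0.0.5 at the time of this page):

```python
from importlib.metadata import version

# Prints the installed posteriors version, e.g. "0.0.5"
print(version("posteriors"))
```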
## Quickstart
`posteriors` is functional-first and aims to be easy to use and extend. Let's try it out
by training a simple model with variational inference:
```python
from torchvision.datasets import MNIST
from torchvision.transforms import ToTensor
from torch import nn, utils, func
import torchopt
import posteriors
dataset = MNIST(root="./data", transform=ToTensor(), download=True)
train_loader = utils.data.DataLoader(dataset, batch_size=32, shuffle=True)
num_data = len(dataset)
classifier = nn.Sequential(nn.Linear(28 * 28, 64), nn.ReLU(), nn.Linear(64, 10))
params = dict(classifier.named_parameters())
def log_posterior(params, batch):
images, labels = batch
images = images.view(images.size(0), -1)
output = func.functional_call(classifier, params, images)
log_post_val = (
-nn.functional.cross_entropy(output, labels)
+ posteriors.diag_normal_log_prob(params) / num_data
)
return log_post_val, output
transform = posteriors.vi.diag.build(
log_posterior, torchopt.adam(), temperature=1 / num_data
) # Can swap out for any posteriors algorithm
state = transform.init(params)
for batch in train_loader:
state = transform.update(state, batch)
```
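Once trained, `state` holds the fitted diagonal Gaussian over the parameters. Here's a hedged sketch of using it for Bayesian model averaging; `posteriors.vi.diag.sample(state, sample_shape)` is our reading of the API docs (check there for the exact signature), and the other names reuse the quickstart above:

```python
import torch

# Draw 10 parameter samples from the fitted variational distribution
# (assumed API: posteriors.vi.diag.sample, returning a TensorTree with
# a leading sample dimension of 10).
samples = posteriors.vi.diag.sample(state, (10,))

images, _ = next(iter(train_loader))
images = images.view(images.size(0), -1)

# Average class probabilities over the parameter samples.
probs = torch.stack(
    [
        nn.functional.softmax(
            func.functional_call(
                classifier, {k: v[i] for k, v in samples.items()}, images
            ),
            dim=-1,
        )
        for i in range(10)
    ]
).mean(0)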
Observe that `posteriors` recommends specifying `log_posterior` and `temperature` such that
`log_posterior` remains on the same scale for different batch sizes. `posteriors`
algorithms are designed to be stable as `temperature` goes to zero.
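For intuition, here's an illustrative check (not from the README, and reusing the quickstart names): `cross_entropy` averages over the batch and the prior term is divided by `num_data`, so `log_posterior` is a per-datum quantity whose magnitude does not grow with batch size.

```python
# The value of log_posterior stays O(1) whether the batch has 8 or 512
# images; only its variance as an estimator changes.
small = next(iter(utils.data.DataLoader(dataset, batch_size=8)))
large = next(iter(utils.data.DataLoader(dataset, batch_size=512)))

val_small, _ = log_posterior(params, small)
val_large, _ = log_posterior(params, large)
```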
Further, the output of `log_posterior` is a tuple containing the evaluation
(a single-element Tensor) and a second element (a TensorTree) carrying any
auxiliary information we'd like to retain from the model call, here the model predictions.
If you have no auxiliary information, you can simply return `torch.tensor([])` as
the second element. For more info see [`torch.func.grad`](https://pytorch.org/docs/stable/generated/torch.func.grad.html)
(with `has_aux=True`) or the [documentation](https://normal-computing.github.io/posteriors/log_posteriors).
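For example, a no-aux variant of the quickstart's `log_posterior` might look like this (a sketch reusing `classifier`, `num_data` and the imports from above):

```python
import torch

def log_posterior_no_aux(params, batch):
    images, labels = batch
    images = images.view(images.size(0), -1)
    output = func.functional_call(classifier, params, images)
    log_post_val = (
        -nn.functional.cross_entropy(output, labels)
        + posteriors.diag_normal_log_prob(params) / num_data
    )
    # No auxiliary information to retain: return an empty tensor.
    return log_post_val, torch.tensor([])
```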
Check out the [tutorials](https://normal-computing.github.io/posteriors/tutorials) for more detailed usage!
## Methods
`posteriors` supports a variety of methods for uncertainty quantification, including:
- [**Extended Kalman filter**](posteriors/ekf/)
- [**Laplace approximation**](posteriors/laplace/)
- [**Stochastic gradient MCMC**](posteriors/sgmcmc/)
- [**Variational inference**](posteriors/vi/)
Full details are available in the [API documentation](https://normal-computing.github.io/posteriors/api).
`posteriors` is designed to be easily extensible; if your favorite method is not listed above,
[raise an issue](https://github.com/normal-computing/posteriors/issues) and we'll see what we can do!
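Because every method shares the same `build`/`init`/`update` interface, swapping algorithms in the quickstart is a one-line change. A hedged sketch (the `sgmcmc.sghmc` builder name and its arguments are our reading of the API docs, so verify them there):

```python
# Same log_posterior, different algorithm: stochastic gradient HMC
# in place of diagonal variational inference (argument names assumed).
transform = posteriors.sgmcmc.sghmc.build(
    log_posterior, lr=1e-2, temperature=1 / num_data
)

state = transform.init(params)
for batch in train_loader:
    state = transform.update(state, batch)
```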
## Friends
Interfaces seamlessly with:
- [`torch`](https://github.com/pytorch/pytorch) and in particular [`torch.func`](https://pytorch.org/docs/stable/func.html).
- [`torch.distributions`](https://pytorch.org/docs/stable/distributions.html) for distributions and sampling (note that it's typically required to set `validate_args=False` to conform with the control flows in [`torch.func`](https://pytorch.org/docs/stable/func.html); see the sketch after this list).
- Functional and flexible torch optimizers from [`torchopt`](https://github.com/metaopt/torchopt).
- [`transformers`](https://github.com/huggingface/transformers) for pre-trained models.
- [`lightning`](https://github.com/Lightning-AI/lightning) for convenient training and logging, see [examples/lightning_autoencoder.py](examples/lightning_autoencoder.py).
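The `validate_args` point above in practice (a minimal sketch; `Normal` stands in for any `torch.distributions` object used inside a `torch.func` transform):

```python
import torch
from torch.distributions import Normal

# Argument validation uses data-dependent Python control flow, which can
# break under torch.func transforms such as grad/vmap, so disable it.
def log_prob(x):
    return Normal(0.0, 1.0, validate_args=False).log_prob(x).sum()

grad_fn = torch.func.grad(log_prob)
print(grad_fn(torch.randn(3)))  # gradient of the log density, here -x
```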
The functional transform interface is strongly inspired by frameworks such as
[`optax`](https://github.com/google-deepmind/optax) and [`blackjax`](https://github.com/blackjax-devs/blackjax).
`posteriors` also draws inspiration from other UQ libraries: [`fortuna`](https://github.com/awslabs/fortuna),
[`laplace`](https://github.com/aleximmer/Laplace), [`numpyro`](https://github.com/pyro-ppl/numpyro),
[`pymc`](https://github.com/pymc-devs/pymc) and [`uncertainty-baselines`](https://github.com/google/uncertainty-baselines).
## Contributing
You can report a bug or request a feature by [creating a new issue on GitHub](https://github.com/normal-computing/posteriors/issues).
If you want to contribute code, please check the [contributing guide](https://normal-computing.github.io/posteriors/contributing).
## Citation
If you use `posteriors` in your research, please cite the library using the following BibTeX entry:
```bibtex
@article{duffield2024scalable,
title={Scalable Bayesian Learning with posteriors},
author={Duffield, Samuel and Donatella, Kaelan and Chiu, Johnathan and Klett, Phoebe and Simpson, Daniel},
journal={arXiv preprint arXiv:2406.00104},
year={2024}
}
```
## Raw data

```json
{
"_id": null,
"home_page": null,
"name": "posteriors",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.9",
"maintainer_email": null,
"keywords": "pytorch, uncertainty",
"author": null,
"author_email": "Sam Duffield <sam@normalcomputing.ai>",
"download_url": "https://files.pythonhosted.org/packages/e6/db/2ee6d23a3cb376688d61804219ad5f5d529792b97d07c3021ac2cde66773/posteriors-0.0.5.tar.gz",
"platform": null,
"description": "<div align=\"center\">\n<img src=\"https://storage.googleapis.com/posteriors/logo_with_text.png\" alt=\"logo\"></img>\n</div>\n\n[**Installation**](#installation)\n| [**Quickstart**](#quickstart)\n| [**Methods**](#methods)\n| [**Friends**](#friends)\n| [**Contributing**](#contributing)\n| [**Citation**](#citation)\n| [**Documentation**](https://normal-computing.github.io/posteriors/)\n| [**Paper**](https://arxiv.org/abs/2406.00104)\n\n## What is `posteriors`?\n\nGeneral purpose python library for uncertainty quantification with [`PyTorch`](https://github.com/pytorch/pytorch).\n\n- [x] **Composable**: Use with [`transformers`](https://huggingface.co/docs/transformers/en/index), [`lightning`](https://lightning.ai/), [`torchopt`](https://github.com/metaopt/torchopt), [`torch.distributions`](https://pytorch.org/docs/stable/distributions.html), [`pyro`](https://pyro.ai/) and more!\n- [x] **Extensible**: Add new methods! Add new models!\n- [x] **Functional**: Easier to test, closer to mathematics!\n- [x] **Scalable**: Big model? Big data? No problem!\n- [x] **Swappable**: Swap between algorithms with ease!\n\n\n## Installation\n\n`posteriors` is available on [PyPI](https://pypi.org/project/posteriors/) and can be installed via `pip`:\n\n```bash\npip install posteriors\n```\n\n## Quickstart\n\n`posteriors` is functional first and aims to be easy to use and extend. Let's try it out\nby training a simple model with variational inference:\n```python\nfrom torchvision.datasets import MNIST\nfrom torchvision.transforms import ToTensor\nfrom torch import nn, utils, func\nimport torchopt\nimport posteriors\n\ndataset = MNIST(root=\"./data\", transform=ToTensor())\ntrain_loader = utils.data.DataLoader(dataset, batch_size=32, shuffle=True)\nnum_data = len(dataset)\n\nclassifier = nn.Sequential(nn.Linear(28 * 28, 64), nn.ReLU(), nn.Linear(64, 10))\nparams = dict(classifier.named_parameters())\n\n\ndef log_posterior(params, batch):\n images, labels = batch\n images = images.view(images.size(0), -1)\n output = func.functional_call(classifier, params, images)\n log_post_val = (\n -nn.functional.cross_entropy(output, labels)\n + posteriors.diag_normal_log_prob(params) / num_data\n )\n return log_post_val, output\n\n\ntransform = posteriors.vi.diag.build(\n log_posterior, torchopt.adam(), temperature=1 / num_data\n) # Can swap out for any posteriors algorithm\n\nstate = transform.init(params)\n\nfor batch in train_loader:\n state = transform.update(state, batch)\n\n```\n\nObserve that `posteriors` recommends specifying `log_posterior` and `temperature` such that \n`log_posterior` remains on the same scale for different batch sizes. `posteriors` \nalgorithms are designed to be stable as `temperature` goes to zero.\n\nFurther, the output of `log_posterior` is a tuple containing the evaluation \n(single-element Tensor) and an additional argument (TensorTree) containing any \nauxiliary information we'd like to retain from the model call, here the model predictions.\nIf you have no auxiliary information, you can simply return `torch.tensor([])` as\nthe second element. 
For more info see [`torch.func.grad`](https://pytorch.org/docs/stable/generated/torch.func.grad.html) \n(with `has_aux=True`) or the [documentation](https://normal-computing.github.io/posteriors/log_posteriors).\n\nCheck out the [tutorials](https://normal-computing.github.io/posteriors/tutorials) for more detailed usage!\n\n## Methods\n\n`posteriors` supports a variety of methods for uncertainty quantification, including:\n\n- [**Extended Kalman filter**](posteriors/ekf/)\n- [**Laplace approximation**](posteriors/laplace/)\n- [**Stochastic gradient MCMC**](posteriors/sgmcmc/)\n- [**Variational inference**](posteriors/vi/)\n\nWith full details available in the [API documentation](https://normal-computing.github.io/posteriors/api).\n\n`posteriors` is designed to be easily extensible, if you're favorite method is not listed above,\n[raise an issue]((https://github.com/normal-computing/posteriors/issues)) and we'll see what we can do!\n\n\n## Friends\n\nInterfaces seamlessly with:\n\n- [`torch`](https://github.com/pytorch/pytorch) and in particular [`torch.func`](https://pytorch.org/docs/stable/func.html).\n- [`torch.distributions`](https://pytorch.org/docs/stable/distributions.html) for distributions and sampling, (note that it's typically required to set `validate_args=False` to conform with the control flows in [`torch.func`](https://pytorch.org/docs/stable/func.html)).\n- Functional and flexible torch optimizers from [`torchopt`](https://github.com/metaopt/torchopt).\n- [`transformers`](https://github.com/huggingface/transformers) for pre-trained models.\n- [`lightning`](https://github.com/Lightning-AI/lightning) for convenient training and logging, see [examples/lightning_autoencoder.py](examples/lightning_autoencoder.py).\n\nThe functional transform interface is strongly inspired by frameworks such as \n[`optax`](https://github.com/google-deepmind/optax) and [`blackjax`](https://github.com/blackjax-devs/blackjax).\n\nAs well as other UQ libraries [`fortuna`](https://github.com/awslabs/fortuna),\n[`laplace`](https://github.com/aleximmer/Laplace), [`numpyro`](https://github.com/pyro-ppl/numpyro),\n[`pymc`](https://github.com/pymc-devs/pymc) and [`uncertainty-baselines`](https://github.com/google/uncertainty-baselines).\n\n\n## Contributing\n\nYou can report a bug or request a feature by [creating a new issue on GitHub](https://github.com/normal-computing/posteriors/issues).\n\n\nIf you want to contribute code, please check the [contributing guide](https://normal-computing.github.io/posteriors/contributing).\n\n\n## Citation\n\nIf you use `posteriors` in your research, please cite the library using the following BibTeX entry:\n\n```bibtex\n@article{duffield2024scalable,\n title={Scalable Bayesian Learning with posteriors},\n author={Duffield, Samuel and Donatella, Kaelan and Chiu, Johnathan and Klett, Phoebe and Simpson, Daniel},\n journal={arXiv preprint arXiv:2406.00104},\n year={2024}\n}\n```\n",
"bugtrack_url": null,
"license": "Apache-2.0",
"summary": "Uncertainty quantification with PyTorch",
"version": "0.0.5",
"project_urls": null,
"split_keywords": [
"pytorch",
" uncertainty"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "947d1a9b5f15eb8730d38ce7260682394cc89bbf19f9db66921c71a756ee433a",
"md5": "1dc5dc0aa3f6114b73098faa029649e2",
"sha256": "606b2a49b24b5050e783c7a4f4f49f0581bc1c51b64e585bef3ba5e02244b863"
},
"downloads": -1,
"filename": "posteriors-0.0.5-py3-none-any.whl",
"has_sig": false,
"md5_digest": "1dc5dc0aa3f6114b73098faa029649e2",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.9",
"size": 47716,
"upload_time": "2024-11-05T18:39:00",
"upload_time_iso_8601": "2024-11-05T18:39:00.886480Z",
"url": "https://files.pythonhosted.org/packages/94/7d/1a9b5f15eb8730d38ce7260682394cc89bbf19f9db66921c71a756ee433a/posteriors-0.0.5-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "e6db2ee6d23a3cb376688d61804219ad5f5d529792b97d07c3021ac2cde66773",
"md5": "2a845cc40c5164bd9c235f9e7ec8898f",
"sha256": "6d0a9c418b695a5c8c2bd952e513eb5a5984fbdae5c8a4a004c106ba37facc55"
},
"downloads": -1,
"filename": "posteriors-0.0.5.tar.gz",
"has_sig": false,
"md5_digest": "2a845cc40c5164bd9c235f9e7ec8898f",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.9",
"size": 38513,
"upload_time": "2024-11-05T18:39:02",
"upload_time_iso_8601": "2024-11-05T18:39:02.696789Z",
"url": "https://files.pythonhosted.org/packages/e6/db/2ee6d23a3cb376688d61804219ad5f5d529792b97d07c3021ac2cde66773/posteriors-0.0.5.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-11-05 18:39:02",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "posteriors"
}
```