torchbayesian


Name: torchbayesian
Version: 0.1.0
Summary: One line to turn any PyTorch model into a BNN — effortless uncertainty quantification.
Author: Raphael Brodeur <raphael.brodeur@gmail.com>
Homepage: https://github.com/raphbrodeur/torchbayesian
Upload time: 2025-10-21 14:29:26
Requires Python: >=3.10
Keywords: bayes-by-backprop, bayesian, bnn, deep-learning, neural-network, pytorch, torch, uncertainty
Requirements: torch
# torchbayesian — Bayesian Neural Networks made easy

![Python 3.10+](https://img.shields.io/badge/python-3.10%2B-blue?logo=python)
![PyTorch](https://img.shields.io/badge/PyTorch-1.10+-orange?logo=pytorch)
[![License](https://img.shields.io/badge/license-Apache%202.0-green.svg)](https://opensource.org/licenses/Apache-2.0)

**torchbayesian** is a lightweight [PyTorch](https://pytorch.org/) extension that lets you turn any PyTorch model into a **Bayesian Neural Network** (BNN) with just one line of code.
It makes [Bayes-by-Backprop](https://arxiv.org/abs/1505.05424) and **variational inference** effortless and **compatible with any** `nn.Module`, without you having to rewrite your model using custom layers or change your usual PyTorch workflow.
Its goal is to make **uncertainty-aware** and **Bayesian deep learning** as easy as working with any traditional neural network.

### One line to transform any torch model into a BNN

Simply wrap any `nn.Module` model with `bnn.BayesianModule` to make it a BNN:

```python
from torchbayesian.bnn import BayesianModule

net = BayesianModule(net)  # 'net' is any existing nn.Module; it is now a BNN
```

The resulting model behaves exactly like any standard `nn.Module`. Instead of learning fixed weight values, however, it learns distributions from which weights are sampled during training and inference, allowing it to capture uncertainty in its parameters and predictions.
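
Because weights are re-sampled on every forward pass, repeated predictions on the same input differ, and their spread can serve as an uncertainty estimate. The sketch below is a minimal illustration that assumes only the standard `nn.Module` behavior described above; the toy architecture, input shapes, and number of samples are arbitrary choices made for the example.

```python
import torch
import torch.nn as nn

from torchbayesian.bnn import BayesianModule

# Wrap an ordinary model (any nn.Module would do).
net = BayesianModule(nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1)))

x = torch.randn(8, 16)  # a batch of 8 inputs

# Each forward pass draws a fresh set of weights from the learned distributions,
# so stacking several passes yields a Monte Carlo predictive distribution.
with torch.no_grad():
    samples = torch.stack([net(x) for _ in range(50)])  # shape: (50, 8, 1)

mean = samples.mean(dim=0)  # predictive mean
std = samples.std(dim=0)    # predictive uncertainty
```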

## Key Features

- **One line to "BNN-ize" any model** — Turn any existing PyTorch model into a BNN with a single line of code. No need to rewrite your model, redefine layers, or modify your existing architecture.
- **Truly compatible with all layers** — Unlike other "BNN-izers" that swap specific supported layers for variational versions, torchbayesian converts every trainable parameter in your model into a variational posterior module, actually making the entire model Bayesian, not just parts of it.
- **PyTorch-native design** — Works entirely within the PyTorch framework; training, inference, and evaluation remain unchanged. Fully compatible with other PyTorch-based tools such as [Lightning](https://lightning.ai/docs/pytorch/stable/), [TorchMetrics](https://lightning.ai/docs/torchmetrics/stable/), and [MONAI](https://monai.io/).
- **Custom priors and variational posteriors** — Specify priors and variational posteriors directly as arguments. You can also define your own custom priors and variational posteriors and register them with the API using a simple decorator logic. This allows both plug-and-play use and deep customization without having to touch the core library.
- **KL divergence easily accessible** — Retrieve the model's KL divergence at any point using the `.kl_divergence()` method of `bnn.BayesianModule` (see the training sketch after this list).
- **Flexible KL computation** — When an analytic KL is not available for a given pair of variational posterior and prior, torchbayesian falls back to a Monte Carlo estimate. This ensures generality and support for arbitrary user-defined distributions.
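
As a concrete illustration of how the KL divergence fits into training, here is a minimal Bayes-by-Backprop-style sketch that adds the model's KL term to a data-fit loss. It assumes only what is stated above (the wrapped model trains like a regular `nn.Module` and `.kl_divergence()` returns a scalar tensor usable in the loss); the toy data, architecture, and `kl_weight` scaling are illustrative choices, not part of the library.

```python
import torch
import torch.nn as nn

from torchbayesian.bnn import BayesianModule

# Toy regression data (illustrative only).
x = torch.randn(128, 16)
y = torch.randn(128, 1)

net = BayesianModule(nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1)))
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)
mse = nn.MSELoss()

kl_weight = 1.0 / x.shape[0]  # common heuristic: scale the KL term by dataset size

for epoch in range(100):
    optimizer.zero_grad()
    pred = net(x)  # weights are sampled on this forward pass
    loss = mse(pred, y) + kl_weight * net.kl_divergence()  # data term + complexity term
    loss.backward()
    optimizer.step()
```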

## Requirements

torchbayesian works with Python 3.10+ and has a direct dependency on [PyTorch](https://pytorch.org/get-started/locally/).

## Installation

To install the current release, run:

...
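
Assuming installation from PyPI under the package name shown above, the command would likely be:

```
pip install torchbayesian
```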

## Getting started

- How to use it
- How it works
- KL divergence
- Example .py
- Factories

## Motivation

...

## License

...

## Citation

...

## Contact

...

            
