# torch-uncertainty

Name: torch-uncertainty
Version: 0.3.1
Summary: Uncertainty quantification library in PyTorch
Upload time: 2024-11-18 09:39:36
Home page: None
Maintainer: None
Docs URL: None
Author: None
Requires Python: >=3.10
License: None
Keywords: bayesian-network, ensembles, neural-networks, predictive-uncertainty, reliable-ai, trustworthy-machine-learning, uncertainty, uncertainty-quantification
<div align="center">

![TorchUncertaintyLogo](https://github.com/ENSTA-U2IS-AI/torch-uncertainty/blob/main/docs/source/_static/images/torch_uncertainty.png)

[![pypi](https://img.shields.io/pypi/v/torch_uncertainty.svg)](https://pypi.python.org/pypi/torch_uncertainty)
[![tests](https://github.com/ENSTA-U2IS-AI/torch-uncertainty/actions/workflows/run-tests.yml/badge.svg?branch=main&event=push)](https://github.com/ENSTA-U2IS-AI/torch-uncertainty/actions/workflows/run-tests.yml)
[![Docs](https://github.com/ENSTA-U2IS-AI/torch-uncertainty/actions/workflows/build-docs.yml/badge.svg)](https://torch-uncertainty.github.io/)
[![PRWelcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg)](https://github.com/ENSTA-U2IS-AI/torch-uncertainty/pulls)
[![Ruff](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json)](https://github.com/astral-sh/ruff)
[![Code Coverage](https://codecov.io/github/ENSTA-U2IS-AI/torch-uncertainty/coverage.svg?branch=master)](https://codecov.io/gh/ENSTA-U2IS-AI/torch-uncertainty)
[![Downloads](https://static.pepy.tech/badge/torch-uncertainty)](https://pepy.tech/project/torch-uncertainty)
[![Discord Badge](https://dcbadge.vercel.app/api/server/HMCawt5MJu?compact=true&style=flat)](https://discord.gg/HMCawt5MJu)
</div>

_TorchUncertainty_ is a package designed to help you leverage [uncertainty quantification techniques](https://github.com/ENSTA-U2IS-AI/awesome-uncertainty-deeplearning) and make your deep neural networks more reliable. It aims to be collaborative and to include as many methods as possible, so reach out to add yours!

:construction: _TorchUncertainty_ is in early development :construction: - expect changes, but reach out and contribute if you are interested in the project! **Please raise an issue if you encounter any bugs or difficulties, and join the [Discord server](https://discord.gg/HMCawt5MJu).**

:books: Our webpage and documentation are available here: [torch-uncertainty.github.io](https://torch-uncertainty.github.io). :books:

TorchUncertainty contains the *official implementations* of multiple papers from *major machine-learning and computer vision conferences* and has been featured in tutorials at **[WACV](https://wacv2024.thecvf.com/) 2024**, **[HAICON](https://haicon24.de/) 2024**, and **[ECCV](https://eccv.ecva.net/) 2024**.

---

This package provides a multi-level API, including:

- easy-to-use :zap: Lightning **uncertainty-aware** training & evaluation routines for **4 tasks**: classification, probabilistic and pointwise regression, and segmentation (a minimal training sketch is shown below).
- ready-to-train baselines on research datasets, such as ImageNet and CIFAR.
- [pretrained weights](https://huggingface.co/torch-uncertainty) for these baselines on ImageNet and CIFAR (:construction: work in progress :construction:).
- **layers**, **models**, **metrics**, & **losses** available for use in your own networks.
- scikit-learn style post-processing methods such as Temperature Scaling.

Have a look at the [Reference page](https://torch-uncertainty.github.io/references.html) or the [API reference](https://torch-uncertainty.github.io/api.html) for a more exhaustive list of the implemented methods, datasets, metrics, etc.
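
As an illustration of the routine-based training and evaluation API, here is a minimal classification sketch. The names `TUTrainer`, `ClassificationRoutine`, and `CIFAR10DataModule` follow the quickstart and documentation linked above, but treat the exact signatures as assumptions and check the API reference for your installed version.

```python
import torch.nn as nn
from torchvision.models import resnet18

from torch_uncertainty import TUTrainer
from torch_uncertainty.datamodules import CIFAR10DataModule
from torch_uncertainty.routines import ClassificationRoutine

# Data, model, and the uncertainty-aware classification routine
# (which adds metrics such as accuracy, NLL, and calibration error).
dm = CIFAR10DataModule(root="./data", batch_size=128)
model = resnet18(num_classes=10)  # any nn.Module classifier works here
routine = ClassificationRoutine(model=model, num_classes=10, loss=nn.CrossEntropyLoss())

# Lightning-style training and evaluation.
trainer = TUTrainer(max_epochs=75, accelerator="auto")
trainer.fit(routine, datamodule=dm)
trainer.test(routine, datamodule=dm)
```

The other tasks listed above follow the same routine-plus-trainer pattern.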

## :gear: Installation

TorchUncertainty requires Python 3.10 or greater. Install the desired PyTorch version in your environment.
Then, install the package from PyPI:

```sh
pip install torch-uncertainty
```

The installation procedure for contributors is different: have a look at the [contribution page](https://torch-uncertainty.github.io/contributing.html).

## :racehorse: Quickstart

We make a quickstart available at [torch-uncertainty.github.io/quickstart](https://torch-uncertainty.github.io/quickstart.html).

## :books: Implemented methods

TorchUncertainty currently supports **classification**, **probabilistic** and pointwise **regression**, **segmentation**, and **pixelwise regression** (such as monocular depth estimation). It includes the official implementations of the following papers:

- *LP-BNN: Encoding the latent posterior of Bayesian Neural Networks for uncertainty quantification* - [IEEE TPAMI](https://arxiv.org/abs/2012.02818)
- *Packed-Ensembles for Efficient Uncertainty Estimation* - [ICLR 2023](https://arxiv.org/abs/2210.09184) - [Tutorial](https://torch-uncertainty.github.io/auto_tutorials/tutorial_pe_cifar10.html) (the packing idea is sketched after this list)
- *MUAD: Multiple Uncertainties for Autonomous Driving, a benchmark for multiple uncertainty types and tasks* - [BMVC 2022](https://arxiv.org/abs/2203.01437)
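
As a rough illustration of the Packed-Ensembles idea (not the library's packed layers themselves), the sketch below packs `M` independent ensemble members into a single grouped convolution so that they run in one forward pass:

```python
import torch
import torch.nn as nn

M = 4  # number of ensemble members packed into one layer (illustrative)

# A grouped convolution behaves like M independent convolutions in a single op:
# each group only sees its own channel slice, so the members stay independent
# while sharing one kernel launch.
packed_conv = nn.Conv2d(3 * M, 64 * M, kernel_size=3, padding=1, groups=M)

x = torch.randn(8, 3, 32, 32)              # an ordinary batch of images
x = x.repeat(1, M, 1, 1)                   # replicate channels so every member sees the input
features = packed_conv(x)                  # (8, 64 * M, 32, 32): M stacked feature maps
members = torch.chunk(features, M, dim=1)  # recover the M per-member outputs
```

Averaging the members' predictions at the end of such a network then yields the ensemble estimate.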

We also provide the following methods:

### Baselines

To date, the following deep learning baselines have been implemented; a plain-PyTorch sketch of the underlying ensembling idea follows the list. **Click** :inbox_tray: **on a method's name to open its tutorial**:

- [Deep Ensembles](https://torch-uncertainty.github.io/auto_tutorials/tutorial_from_de_to_pe.html), BatchEnsemble, Masksembles, & MIMO
- [MC-Dropout](https://torch-uncertainty.github.io/auto_tutorials/tutorial_mc_dropout.html)
- [Packed-Ensembles](https://torch-uncertainty.github.io/auto_tutorials/tutorial_from_de_to_pe.html) (see [Blog post](https://medium.com/@adrien.lafage/make-your-neural-networks-more-reliable-with-packed-ensembles-7ad0b737a873))
- [Variational Bayesian Neural Networks](https://torch-uncertainty.github.io/auto_tutorials/tutorial_bayesian.html)
- Checkpoint Ensembles & Snapshot Ensembles
- Stochastic Weight Averaging & Stochastic Weight Averaging Gaussian
- Regression with Beta Gaussian NLL Loss
- [Deep Evidential Classification](https://torch-uncertainty.github.io/auto_tutorials/tutorial_evidential_classification.html) & [Regression](https://torch-uncertainty.github.io/auto_tutorials/tutorial_der_cubic.html)
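
To illustrate the ensembling principle shared by several of these baselines, here is a small self-contained MC-Dropout predictor in plain PyTorch (a sketch of the technique, not the library's wrapper): dropout stays active at inference time and the softmax outputs of several stochastic forward passes are averaged.

```python
import torch
import torch.nn as nn

def mc_dropout_predict(model: nn.Module, x: torch.Tensor, num_samples: int = 16):
    """Average the softmax outputs of several stochastic forward passes."""
    model.eval()
    # Re-enable dropout layers only, keeping e.g. batch norm in eval mode.
    for module in model.modules():
        if isinstance(module, nn.Dropout):
            module.train()
    with torch.no_grad():
        probs = torch.stack([model(x).softmax(dim=-1) for _ in range(num_samples)])
    mean_probs = probs.mean(dim=0)  # predictive distribution, shape (batch, num_classes)
    entropy = -(mean_probs * mean_probs.clamp_min(1e-12).log()).sum(dim=-1)
    return mean_probs, entropy      # prediction plus a simple uncertainty score

# Example with a tiny classifier containing dropout:
net = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Dropout(0.5), nn.Linear(64, 10))
probs, uncertainty = mc_dropout_predict(net, torch.randn(4, 32))
```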

### Augmentation methods

The following data augmentation methods have been implemented (a minimal Mixup sketch follows the list):

- Mixup, MixupIO, RegMixup, WarpingMixup
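
For reference, the sketch below is a standard plain-PyTorch Mixup training step; the library's variants (MixupIO, RegMixup, WarpingMixup) build on this basic recipe. The `model` and `criterion` names in the usage comment are hypothetical placeholders.

```python
import torch

def mixup(x: torch.Tensor, y: torch.Tensor, alpha: float = 0.2):
    """Classic Mixup: convex combination of two examples and their labels."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))   # pair each sample with a random partner
    mixed_x = lam * x + (1.0 - lam) * x[perm]
    return mixed_x, y, y[perm], lam

# Usage inside a training step (model and criterion are placeholders):
# mixed_x, y_a, y_b, lam = mixup(images, labels)
# logits = model(mixed_x)
# loss = lam * criterion(logits, y_a) + (1.0 - lam) * criterion(logits, y_b)
```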

### Post-processing methods

To date, the following post-processing methods have been implemented (a minimal temperature-scaling sketch follows the list):

- [Temperature](https://torch-uncertainty.github.io/auto_tutorials/tutorial_scaler.html), Vector, & Matrix scaling
- [Monte Carlo Batch Normalization](https://torch-uncertainty.github.io/auto_tutorials/tutorial_mc_batch_norm.html)
- Laplace approximation using the [Laplace library](https://github.com/aleximmer/Laplace)
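
To make the calibration idea concrete, here is a minimal plain-PyTorch temperature scaler (an illustration of the technique, not the library's class): a single temperature is tuned on held-out validation logits by minimizing the NLL, then used to rescale logits at test time. The `val_logits`, `val_labels`, and `test_logits` names in the usage comment are placeholders.

```python
import torch
import torch.nn as nn

class TemperatureScaling(nn.Module):
    """Single-parameter calibration: divide logits by a learned temperature T > 0."""

    def __init__(self):
        super().__init__()
        self.log_t = nn.Parameter(torch.zeros(1))  # T = exp(log_t) stays positive

    def forward(self, logits: torch.Tensor) -> torch.Tensor:
        return logits / self.log_t.exp()

    def fit(self, logits: torch.Tensor, labels: torch.Tensor, max_iter: int = 200):
        """Tune T on held-out validation logits by minimizing the NLL."""
        optimizer = torch.optim.LBFGS([self.log_t], lr=0.1, max_iter=max_iter)
        nll = nn.CrossEntropyLoss()

        def closure():
            optimizer.zero_grad()
            loss = nll(self(logits), labels)
            loss.backward()
            return loss

        optimizer.step(closure)
        return self

# Usage (val_logits, val_labels, test_logits are placeholders):
# scaler = TemperatureScaling().fit(val_logits, val_labels)
# test_probs = scaler(test_logits).softmax(dim=-1)
```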

## Tutorials

Check out our tutorials at [torch-uncertainty.github.io/auto_tutorials](https://torch-uncertainty.github.io/auto_tutorials/index.html).

## :telescope: Projects using TorchUncertainty

The following projects use TorchUncertainty:

- *A Symmetry-Aware Exploration of Bayesian Neural Network Posteriors* - [ICLR 2024](https://arxiv.org/abs/2310.08287)

**If you are using TorchUncertainty in your project, please let us know and we will add it to this list!**


            
