# botorch

- **Name**: botorch
- **Version**: 0.11.0
- **Home page**: https://botorch.org
- **Summary**: Bayesian Optimization in PyTorch
- **Upload time**: 2024-05-01 20:33:12
- **Maintainer**: None
- **Docs URL**: None
- **Author**: Meta Platforms, Inc.
- **Requires Python**: >=3.10
- **License**: MIT
- **Keywords**: bayesian optimization, pytorch
- **Requirements**: multipledispatch, scipy, mpmath, torch, pyro-ppl, gpytorch, linear_operator
            <a href="https://botorch.org">
  <img width="350" src="https://botorch.org/img/botorch_logo_lockup.png" alt="BoTorch Logo" />
</a>

<hr/>

[![Support Ukraine](https://img.shields.io/badge/Support-Ukraine-FFD500?style=flat&labelColor=005BBB)](https://opensource.fb.com/support-ukraine)
[![Lint](https://github.com/pytorch/botorch/workflows/Lint/badge.svg)](https://github.com/pytorch/botorch/actions?query=workflow%3ALint)
[![Test](https://github.com/pytorch/botorch/workflows/Test/badge.svg)](https://github.com/pytorch/botorch/actions?query=workflow%3ATest)
[![Docs](https://github.com/pytorch/botorch/workflows/Docs/badge.svg)](https://github.com/pytorch/botorch/actions?query=workflow%3ADocs)
[![Tutorials](https://github.com/pytorch/botorch/workflows/Tutorials/badge.svg)](https://github.com/pytorch/botorch/actions?query=workflow%3ATutorials)
[![Codecov](https://img.shields.io/codecov/c/github/pytorch/botorch.svg)](https://codecov.io/github/pytorch/botorch)

[![Conda](https://img.shields.io/conda/v/pytorch/botorch.svg)](https://anaconda.org/pytorch/botorch)
[![PyPI](https://img.shields.io/pypi/v/botorch.svg)](https://pypi.org/project/botorch)
[![License](https://img.shields.io/badge/license-MIT-green.svg)](LICENSE)


BoTorch is a library for Bayesian Optimization built on PyTorch.

*BoTorch is currently in beta and under active development!*


#### Why BoTorch?
BoTorch:
* Provides a modular and easily extensible interface for composing Bayesian
  optimization primitives, including probabilistic models, acquisition functions,
  and optimizers.
* Harnesses the power of PyTorch, including auto-differentiation, native support
  for highly parallelized modern hardware (e.g. GPUs) using device-agnostic code,
  and a dynamic computation graph.
* Supports Monte Carlo-based acquisition functions via the
  [reparameterization trick](https://arxiv.org/abs/1312.6114), which makes it
  straightforward to implement new ideas without having to impose restrictive
  assumptions about the underlying model (see the sketch after this list).
* Enables seamless integration with deep and/or convolutional architectures in PyTorch.
* Has first-class support for state-of-the-art probabilistic models in
  [GPyTorch](http://www.gpytorch.ai/), including support for multi-task Gaussian
  Processes (GPs), deep kernel learning, deep GPs, and approximate inference.
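
For instance, Monte Carlo acquisition functions such as `qExpectedImprovement` draw (quasi-)random base samples through a sampler and backpropagate through the reparameterized posterior samples. A minimal sketch, assuming a fitted model `gp` and standardized training targets `train_Y` as in the Getting Started section below:

```python
import torch
from botorch.acquisition.monte_carlo import qExpectedImprovement
from botorch.sampling.normal import SobolQMCNormalSampler

# Quasi-Monte Carlo base samples; gradients flow through the reparameterized samples.
sampler = SobolQMCNormalSampler(sample_shape=torch.Size([512]))
qEI = qExpectedImprovement(
    model=gp,              # a fitted BoTorch model, e.g. the SingleTaskGP fit below
    best_f=train_Y.max(),  # best (standardized) objective value observed so far
    sampler=sampler,
)
# qEI(X) can now be evaluated and differentiated on q-batches of candidate points.
```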


#### Target Audience

The primary audience for hands-on use of BoTorch consists of researchers and
sophisticated practitioners in Bayesian Optimization and AI.
We recommend using BoTorch as a low-level API for implementing new algorithms
for [Ax](https://ax.dev). Ax has been designed to be an easy-to-use platform
for end-users, which at the same time is flexible enough for Bayesian
Optimization researchers to plug into for handling of feature transformations,
(meta-)data management, storage, etc.
We recommend that end-users who are not actively doing research on Bayesian
Optimization simply use Ax.


## Installation

**Installation Requirements**
- Python >= 3.10
- PyTorch >= 1.13.1
- gpytorch == 1.11
- linear_operator == 0.5.1
- pyro-ppl >= 1.8.4
- scipy
- multipledispatch

### Prerequisite only for macOS users with Intel processors
Before installing BoTorch, we recommend first manually installing PyTorch, a required dependency of
BoTorch. Installing it according to the [PyTorch installation instructions](https://pytorch.org/get-started/locally/)
ensures that it is properly linked against MKL, a library that optimizes mathematical computation for Intel processors.
This will result in up to an order-of-magnitude speed-up for Bayesian optimization because,
at the moment, installing PyTorch from pip does not link against MKL.

The PyTorch installation instructions currently recommend:
1. Install [Anaconda](https://www.anaconda.com/distribution/#download-section). Note that there are different installers for Intel and M1 Macs.
2. Install PyTorch following the [PyTorch installation instructions](https://pytorch.org/get-started/locally/).
Currently, this suggests running `conda install pytorch torchvision -c pytorch`.

If you want to customize your installation, please follow the [PyTorch installation instructions](https://pytorch.org/get-started/locally/) to build from source.

### Option 1: Installing the latest release

The latest release of BoTorch is easily installed either via
[Anaconda](https://www.anaconda.com/distribution/#download-section) (recommended) or pip.

**To install BoTorch from Anaconda**, run
```bash
conda install botorch -c pytorch -c gpytorch -c conda-forge
```
The above command installs BoTorch and any needed dependencies. The channel flags
`-c pytorch -c gpytorch -c conda-forge` set the channel priority: the PyTorch channel is
preferred first, then the GPyTorch channel, and conda-forge last.
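
If you are unsure which channel a package actually came from, one way to check (this is a generic conda command, not specific to BoTorch) is:

```bash
# The last column of the output lists the channel each matching package was installed from.
conda list botorch
```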

**Alternatively, to install with `pip`**, do
```bash
pip install botorch
```

_Note_: Make sure the `pip` being used is actually the one from the newly created Conda environment. If you're using a Unix-based OS, you can use `which pip` to check.

### Option 2: Installing from the latest main branch

If you would like to try our bleeding edge features (and don't mind potentially
running into the occasional bug here or there), you can install the latest
development version directly from GitHub. If you want to also install the
current `gpytorch` and `linear_operator` development versions, you will need
to ensure that the `ALLOW_LATEST_GPYTORCH_LINOP` environment variable is set:
```bash
pip install --upgrade git+https://github.com/cornellius-gp/linear_operator.git
pip install --upgrade git+https://github.com/cornellius-gp/gpytorch.git
export ALLOW_LATEST_GPYTORCH_LINOP=true
pip install --upgrade git+https://github.com/pytorch/botorch.git
```

### Option 3: Editable/dev install

If you want to [contribute](CONTRIBUTING.md) to BoTorch, you will want an editable install so that you can change files and have the
changes reflected in your local install.

If you want to install the current `gpytorch` and `linear_operator` development versions, as in Option 2, do that
before proceeding.

#### Option 3a: Bare-bones editable install

```bash
git clone https://github.com/pytorch/botorch.git
cd botorch
pip install -e .
```

#### Option 3b: Editable install with development and tutorials dependencies

```bash
git clone https://github.com/pytorch/botorch.git
cd botorch
export ALLOW_BOTORCH_LATEST=true
pip install -e ".[dev, tutorials]"
```

* `dev`: Specifies tools necessary for development
  (testing, linting, docs building; see [Contributing](#contributing) below).
* `tutorials`: Also installs all packages necessary for running the tutorial notebooks.
* You can also install either the dev or tutorials dependencies without installing both, e.g. by changing the last command to `pip install -e ".[dev]"`.

## Getting Started

Here's a quick rundown of the main components of a Bayesian optimization loop.
For more details see our [Documentation](https://botorch.org/docs/introduction) and the
[Tutorials](https://botorch.org/tutorials).

1. Fit a Gaussian Process model to data
  ```python
  import torch
  from botorch.models import SingleTaskGP
  from botorch.fit import fit_gpytorch_mll
  from gpytorch.mlls import ExactMarginalLogLikelihood

  # Double precision is highly recommended for GPs.
  # See https://github.com/pytorch/botorch/discussions/1444
  train_X = torch.rand(10, 2, dtype=torch.double)
  Y = 1 - (train_X - 0.5).norm(dim=-1, keepdim=True)  # explicit output dimension
  Y += 0.1 * torch.rand_like(Y)
  train_Y = (Y - Y.mean()) / Y.std()

  gp = SingleTaskGP(train_X, train_Y)
  mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
  fit_gpytorch_mll(mll)
  ```

2. Construct an acquisition function
  ```python
  from botorch.acquisition import UpperConfidenceBound

  UCB = UpperConfidenceBound(gp, beta=0.1)
  ```

3. Optimize the acquisition function
  ```python
  from botorch.optim import optimize_acqf

  bounds = torch.stack([torch.zeros(2), torch.ones(2)])
  candidate, acq_value = optimize_acqf(
      UCB, bounds=bounds, q=1, num_restarts=5, raw_samples=20,
  )
  ```
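
One way to tie these three steps together is a simple closed loop that refits the model and re-optimizes the acquisition function after each new observation. This is only an illustrative sketch; the `objective` function below is a hypothetical stand-in for your own expensive black-box function:

```python
import torch
from botorch.acquisition import UpperConfidenceBound
from botorch.fit import fit_gpytorch_mll
from botorch.models import SingleTaskGP
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood


def objective(X):
    # Hypothetical expensive black-box function; replace with your own.
    return 1 - (X - 0.5).norm(dim=-1, keepdim=True)


bounds = torch.stack([torch.zeros(2), torch.ones(2)]).to(torch.double)
train_X = torch.rand(10, 2, dtype=torch.double)
train_Y = objective(train_X)

for _ in range(10):
    # Refit the GP on all data collected so far (standardizing the targets).
    Y_std = (train_Y - train_Y.mean()) / train_Y.std()
    gp = SingleTaskGP(train_X, Y_std)
    fit_gpytorch_mll(ExactMarginalLogLikelihood(gp.likelihood, gp))

    # Optimize the acquisition function to pick the next point to evaluate.
    UCB = UpperConfidenceBound(gp, beta=0.1)
    candidate, _ = optimize_acqf(UCB, bounds=bounds, q=1, num_restarts=5, raw_samples=20)

    # Evaluate the objective and append the new observation.
    train_X = torch.cat([train_X, candidate])
    train_Y = torch.cat([train_Y, objective(candidate)])
```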


## Citing BoTorch

If you use BoTorch, please cite the following paper:
> [M. Balandat, B. Karrer, D. R. Jiang, S. Daulton, B. Letham, A. G. Wilson, and E. Bakshy. BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization. Advances in Neural Information Processing Systems 33, 2020.](https://arxiv.org/abs/1910.06403)

```bibtex
@inproceedings{balandat2020botorch,
  title={{BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization}},
  author={Balandat, Maximilian and Karrer, Brian and Jiang, Daniel R. and Daulton, Samuel and Letham, Benjamin and Wilson, Andrew Gordon and Bakshy, Eytan},
  booktitle = {Advances in Neural Information Processing Systems 33},
  year={2020},
  url = {http://arxiv.org/abs/1910.06403}
}
```

See [here](https://botorch.org/docs/papers) for an incomplete selection of peer-reviewed papers that build on BoTorch.


## Contributing
See the [CONTRIBUTING](CONTRIBUTING.md) file for how to help out.


## License
BoTorch is MIT licensed, as found in the [LICENSE](LICENSE) file.

            

## Raw data

{
    "_id": null,
    "home_page": "https://botorch.org",
    "name": "botorch",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.10",
    "maintainer_email": null,
    "keywords": "Bayesian optimization, PyTorch",
    "author": "Meta Platforms, Inc.",
    "author_email": null,
    "download_url": "https://files.pythonhosted.org/packages/ca/54/8ac3edc6aabc633ef581db6eed62c8a583a95e09fd0e43af9a5f78af85c1/botorch-0.11.0.tar.gz",
    "platform": null,
    "description": "<a href=\"https://botorch.org\">\n  <img width=\"350\" src=\"https://botorch.org/img/botorch_logo_lockup.png\" alt=\"BoTorch Logo\" />\n</a>\n\n<hr/>\n\n[![Support Ukraine](https://img.shields.io/badge/Support-Ukraine-FFD500?style=flat&labelColor=005BBB)](https://opensource.fb.com/support-ukraine)\n[![Lint](https://github.com/pytorch/botorch/workflows/Lint/badge.svg)](https://github.com/pytorch/botorch/actions?query=workflow%3ALint)\n[![Test](https://github.com/pytorch/botorch/workflows/Test/badge.svg)](https://github.com/pytorch/botorch/actions?query=workflow%3ATest)\n[![Docs](https://github.com/pytorch/botorch/workflows/Docs/badge.svg)](https://github.com/pytorch/botorch/actions?query=workflow%3ADocs)\n[![Tutorials](https://github.com/pytorch/botorch/workflows/Tutorials/badge.svg)](https://github.com/pytorch/botorch/actions?query=workflow%3ATutorials)\n[![Codecov](https://img.shields.io/codecov/c/github/pytorch/botorch.svg)](https://codecov.io/github/pytorch/botorch)\n\n[![Conda](https://img.shields.io/conda/v/pytorch/botorch.svg)](https://anaconda.org/pytorch/botorch)\n[![PyPI](https://img.shields.io/pypi/v/botorch.svg)](https://pypi.org/project/botorch)\n[![License](https://img.shields.io/badge/license-MIT-green.svg)](LICENSE)\n\n\nBoTorch is a library for Bayesian Optimization built on PyTorch.\n\n*BoTorch is currently in beta and under active development!*\n\n\n#### Why BoTorch ?\nBoTorch\n* Provides a modular and easily extensible interface for composing Bayesian\n  optimization primitives, including probabilistic models, acquisition functions,\n  and optimizers.\n* Harnesses the power of PyTorch, including auto-differentiation, native support\n  for highly parallelized modern hardware (e.g. GPUs) using device-agnostic code,\n  and a dynamic computation graph.\n* Supports Monte Carlo-based acquisition functions via the\n  [reparameterization trick](https://arxiv.org/abs/1312.6114), which makes it\n  straightforward to implement new ideas without having to impose restrictive\n  assumptions about the underlying model.\n* Enables seamless integration with deep and/or convolutional architectures in PyTorch.\n* Has first-class support for state-of-the art probabilistic models in\n  [GPyTorch](http://www.gpytorch.ai/), including support for multi-task Gaussian\n  Processes (GPs) deep kernel learning, deep GPs, and approximate inference.\n\n\n#### Target Audience\n\nThe primary audience for hands-on use of BoTorch are researchers and\nsophisticated practitioners in Bayesian Optimization and AI.\nWe recommend using BoTorch as a low-level API for implementing new algorithms\nfor [Ax](https://ax.dev). Ax has been designed to be an easy-to-use platform\nfor end-users, which at the same time is flexible enough for Bayesian\nOptimization researchers to plug into for handling of feature transformations,\n(meta-)data management, storage, etc.\nWe recommend that end-users who are not actively doing research on Bayesian\nOptimization simply use Ax.\n\n\n## Installation\n\n**Installation Requirements**\n- Python >= 3.10\n- PyTorch >= 1.13.1\n- gpytorch == 1.11\n- linear_operator == 0.5.1\n- pyro-ppl >= 1.8.4\n- scipy\n- multiple-dispatch\n\n### Prerequisite only for MacOS users with Intel processors:\nBefore installing BoTorch, we recommend first manually installing PyTorch, a required dependency of\nBoTorch. 
Installing it according to the [PyTorch installation instructions](https://pytorch.org/get-started/locally/)\nensures that it is properly linked against MKL, a library that optimizes mathematical computation for Intel processors.\nThis will result in up to an order-of-magnitude speed-up for Bayesian optimization, as at the moment,\ninstalling PyTorch from pip does not link against MKL.\n\nThe PyTorch installation instructions currently recommend:\n1. Install [Anaconda](https://www.anaconda.com/distribution/#download-section). Note that there are different installers for Intel and M1 Macs.\n2. Install PyTorch following the [PyTorch installation instructions](https://pytorch.org/get-started/locally/).\nCurrently, this suggests running `conda install pytorch torchvision -c pytorch`.\n\nIf you want to customize your installation, please follow the [PyTorch installation instructions](https://pytorch.org/get-started/locally/) to build from source.\n\n### Option 1: Installing the latest release\n\nThe latest release of BoTorch is easily installed either via\n[Anaconda](https://www.anaconda.com/distribution/#download-section) (recommended) or pip.\n\n**To install BoTorch from Anaconda**, run\n```bash\nconda install botorch -c pytorch -c gpytorch -c conda-forge\n```\nThe above command installs BoTorch and any needed dependencies. ` -c pytorch -c gpytorch -c conda-forge` means that the most preferred source to install from is the PyTorch channel, the next most preferred is the GPyTorch channel,\nand the least preferred is conda-forge.\n\n**Alternatively, to install with `pip`**, do\n```bash\npip install botorch\n```\n\n_Note_: Make sure the `pip` being used is actually the one from the newly created Conda environment. If you're using a Unix-based OS, you can use `which pip` to check.\n\n### Option 2: Installing from latest main branch\n\nIf you would like to try our bleeding edge features (and don't mind potentially\nrunning into the occasional bug here or there), you can install the latest\ndevelopment version directly from GitHub. 
If you want to also install the\ncurrent `gpytorch` and `linear_operator` development versions, you will need\nto ensure that the `ALLOW_LATEST_GPYTORCH_LINOP` environment variable is set:\n```bash\npip install --upgrade git+https://github.com/cornellius-gp/linear_operator.git\npip install --upgrade git+https://github.com/cornellius-gp/gpytorch.git\nexport ALLOW_LATEST_GPYTORCH_LINOP=true\npip install --upgrade git+https://github.com/pytorch/botorch.git\n```\n\n### Option 3: Editable/dev install\n\nIf you want to [contribute](CONTRIBUTING.md) to BoTorch, you will want to install editably so that you can change files and have the\nchanges reflected in your local install.\n\nIf you want to install the current `gpytorch` and `linear_operator` development versions, as in Option 2, do that\nbefore proceeding.\n\n#### Option 3a: Bare-bones editable install\n\n```bash\ngit clone https://github.com/pytorch/botorch.git\ncd botorch\npip install -e .\n```\n\n#### Option 3b: Editable install with development and tutorials dependencies\n\n```bash\ngit clone https://github.com/pytorch/botorch.git\ncd botorch\nexport ALLOW_BOTORCH_LATEST=true\npip install -e \".[dev, tutorials]\"\n```\n\n* `dev`: Specifies tools necessary for development\n  (testing, linting, docs building; see [Contributing](#contributing) below).\n* `tutorials`: Also installs all packages necessary for running the tutorial notebooks.\n* You can also install either the dev or tutorials dependencies without installing both, e.g. by changing the last command to `pip install -e \".[dev]\"`.\n\n## Getting Started\n\nHere's a quick run down of the main components of a Bayesian optimization loop.\nFor more details see our [Documentation](https://botorch.org/docs/introduction) and the\n[Tutorials](https://botorch.org/tutorials).\n\n1. Fit a Gaussian Process model to data\n  ```python\n  import torch\n  from botorch.models import SingleTaskGP\n  from botorch.fit import fit_gpytorch_mll\n  from gpytorch.mlls import ExactMarginalLogLikelihood\n\n  # Double precision is highly recommended for GPs.\n  # See https://github.com/pytorch/botorch/discussions/1444\n  train_X = torch.rand(10, 2, dtype=torch.double)\n  Y = 1 - (train_X - 0.5).norm(dim=-1, keepdim=True)  # explicit output dimension\n  Y += 0.1 * torch.rand_like(Y)\n  train_Y = (Y - Y.mean()) / Y.std()\n\n  gp = SingleTaskGP(train_X, train_Y)\n  mll = ExactMarginalLogLikelihood(gp.likelihood, gp)\n  fit_gpytorch_mll(mll)\n  ```\n\n2. Construct an acquisition function\n  ```python\n  from botorch.acquisition import UpperConfidenceBound\n\n  UCB = UpperConfidenceBound(gp, beta=0.1)\n  ```\n\n3. Optimize the acquisition function\n  ```python\n  from botorch.optim import optimize_acqf\n\n  bounds = torch.stack([torch.zeros(2), torch.ones(2)])\n  candidate, acq_value = optimize_acqf(\n      UCB, bounds=bounds, q=1, num_restarts=5, raw_samples=20,\n  )\n  ```\n\n\n## Citing BoTorch\n\nIf you use BoTorch, please cite the following paper:\n> [M. Balandat, B. Karrer, D. R. Jiang, S. Daulton, B. Letham, A. G. Wilson, and E. Bakshy. BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization. Advances in Neural Information Processing Systems 33, 2020.](https://arxiv.org/abs/1910.06403)\n\n```\n@inproceedings{balandat2020botorch,\n  title={{BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization}},\n  author={Balandat, Maximilian and Karrer, Brian and Jiang, Daniel R. 
and Daulton, Samuel and Letham, Benjamin and Wilson, Andrew Gordon and Bakshy, Eytan},\n  booktitle = {Advances in Neural Information Processing Systems 33},\n  year={2020},\n  url = {http://arxiv.org/abs/1910.06403}\n}\n```\n\nSee [here](https://botorch.org/docs/papers) for an incomplete selection of peer-reviewed papers that build off of BoTorch.\n\n\n## Contributing\nSee the [CONTRIBUTING](CONTRIBUTING.md) file for how to help out.\n\n\n## License\nBoTorch is MIT licensed, as found in the [LICENSE](LICENSE) file.\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "Bayesian Optimization in PyTorch",
    "version": "0.11.0",
    "project_urls": {
        "Documentation": "https://botorch.org",
        "Homepage": "https://botorch.org",
        "Source": "https://github.com/pytorch/botorch",
        "conda": "https://anaconda.org/pytorch/botorch"
    },
    "split_keywords": [
        "bayesian optimization",
        " pytorch"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "6cec51fba883f3da0790f62ade2482c01d4892675e6fabf666f6267a2019b13d",
                "md5": "9250259fe4c10c512a545ffce8fef612",
                "sha256": "3cda6793a644ba46004a193b370e7a017177029e02d73792a3a42707be3457aa"
            },
            "downloads": -1,
            "filename": "botorch-0.11.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "9250259fe4c10c512a545ffce8fef612",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.10",
            "size": 624985,
            "upload_time": "2024-05-01T20:33:09",
            "upload_time_iso_8601": "2024-05-01T20:33:09.809168Z",
            "url": "https://files.pythonhosted.org/packages/6c/ec/51fba883f3da0790f62ade2482c01d4892675e6fabf666f6267a2019b13d/botorch-0.11.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "ca548ac3edc6aabc633ef581db6eed62c8a583a95e09fd0e43af9a5f78af85c1",
                "md5": "498b591f282bb5b214b65aeff6d2f2e5",
                "sha256": "ea4327d1c89f89eb81c4e553eb3508de7d300c99c8f37c76148716e327fd92d6"
            },
            "downloads": -1,
            "filename": "botorch-0.11.0.tar.gz",
            "has_sig": false,
            "md5_digest": "498b591f282bb5b214b65aeff6d2f2e5",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.10",
            "size": 727906,
            "upload_time": "2024-05-01T20:33:12",
            "upload_time_iso_8601": "2024-05-01T20:33:12.115251Z",
            "url": "https://files.pythonhosted.org/packages/ca/54/8ac3edc6aabc633ef581db6eed62c8a583a95e09fd0e43af9a5f78af85c1/botorch-0.11.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-05-01 20:33:12",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "pytorch",
    "github_project": "botorch",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "requirements": [
        {
            "name": "multipledispatch",
            "specs": []
        },
        {
            "name": "scipy",
            "specs": []
        },
        {
            "name": "mpmath",
            "specs": [
                [
                    ">=",
                    "0.19"
                ],
                [
                    "<=",
                    "1.3"
                ]
            ]
        },
        {
            "name": "torch",
            "specs": [
                [
                    ">=",
                    "1.13.1"
                ]
            ]
        },
        {
            "name": "pyro-ppl",
            "specs": [
                [
                    ">=",
                    "1.8.4"
                ]
            ]
        },
        {
            "name": "gpytorch",
            "specs": [
                [
                    "==",
                    "1.11"
                ]
            ]
        },
        {
            "name": "linear_operator",
            "specs": [
                [
                    "==",
                    "0.5.1"
                ]
            ]
        }
    ],
    "lcname": "botorch"
}
        