# bayesian-optimization

- **Version:** 2.0.1 (PyPI)
- **Summary:** Bayesian Optimization package
- **Author:** Fernando Nogueira
- **License:** MIT
- **Requires Python:** >=3.9, <4.0
- **Uploaded:** 2024-12-09 12:00:07
            <div align="center">
  <img src="https://raw.githubusercontent.com/bayesian-optimization/BayesianOptimization/master/docsrc/static/func.png"><br><br>
</div>

# Bayesian Optimization

![tests](https://github.com/bayesian-optimization/BayesianOptimization/actions/workflows/run_tests.yml/badge.svg)
[![docs - stable](https://img.shields.io/badge/docs-stable-blue)](https://bayesian-optimization.github.io/BayesianOptimization/index.html)
[![Codecov](https://codecov.io/github/bayesian-optimization/BayesianOptimization/badge.svg?branch=master&service=github)](https://codecov.io/github/bayesian-optimization/BayesianOptimization?branch=master)
[![Pypi](https://img.shields.io/pypi/v/bayesian-optimization.svg)](https://pypi.python.org/pypi/bayesian-optimization)
![PyPI - Python Version](https://img.shields.io/pypi/pyversions/bayesian-optimization)


Pure Python implementation of Bayesian global optimization with Gaussian
processes.


This is a constrained global optimization package built upon Bayesian inference
and Gaussian processes that attempts to find the maximum value of an unknown
function in as few iterations as possible. This technique is particularly
suited for the optimization of high-cost functions and for situations where the
balance between exploration and exploitation is important.

## Installation

* pip (via PyPI):

```console
$ pip install bayesian-optimization
```

* Conda (via conda-forge):

```console
$ conda install -c conda-forge bayesian-optimization
```

## How does it work?

See the [documentation](https://bayesian-optimization.github.io/BayesianOptimization/) for how to use this package.

Bayesian optimization works by constructing a posterior distribution of functions (a Gaussian process) that best describes the function you want to optimize. As the number of observations grows, the posterior distribution improves, and the algorithm becomes more certain of which regions in parameter space are worth exploring and which are not, as seen in the picture below.

![BayesianOptimization in action](https://raw.githubusercontent.com/bayesian-optimization/BayesianOptimization/master/docsrc/static/bo_example.png)
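
To make this concrete, here is a minimal, self-contained sketch of fitting a Gaussian process posterior to a few observations with scikit-learn (which this package builds on internally). The toy target, the sample points, and the kernel choice are illustrative assumptions, not part of this package's API.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# A handful of observations of the (treated-as-unknown) toy target.
X_obs = np.array([[-2.0], [0.5], [1.5], [3.0]])
y_obs = -X_obs.ravel() ** 2 + 1.0

# Fit a GP posterior; a Matern kernel is a common default for
# Bayesian optimization.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X_obs, y_obs)

# Posterior mean and uncertainty over a grid of candidate points.
X_grid = np.linspace(-3.0, 4.0, 200).reshape(-1, 1)
mu, sigma = gp.predict(X_grid, return_std=True)
```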

As you iterate over and over, the algorithm balances its need for exploration and exploitation, taking into account what it knows about the target function. At each step a Gaussian process is fitted to the known samples (points previously explored), and the posterior distribution, combined with an exploration strategy (such as UCB (Upper Confidence Bound) or EI (Expected Improvement)), is used to determine the next point that should be explored (see the gif below).

![BayesianOptimization in action](https://raw.githubusercontent.com/bayesian-optimization/BayesianOptimization/master/docsrc/static/bayesian_optimization.gif)

This process is designed to minimize the number of steps required to find a combination of parameters that is close to the optimal one. To do so, the method solves a proxy optimization problem (finding the maximum of the acquisition function) that, albeit still a hard problem, is computationally cheaper and can be tackled with common tools. Bayesian optimization is therefore best suited for situations where sampling the function to be optimized is a very expensive endeavor. See the references for a proper discussion of this method.
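
As an illustration of this proxy problem, the self-contained sketch below rebuilds the toy posterior from the previous snippet, scores a grid of candidates with a UCB acquisition function, and picks the next point to probe. The fixed grid and the value of `kappa` are simplifying assumptions for illustration, not how this package actually searches the acquisition function.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Same toy posterior as in the previous sketch.
X_obs = np.array([[-2.0], [0.5], [1.5], [3.0]])
y_obs = -X_obs.ravel() ** 2 + 1.0
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X_obs, y_obs)

# Upper Confidence Bound: posterior mean (exploitation) plus a multiple
# of the posterior standard deviation (exploration).
X_grid = np.linspace(-3.0, 4.0, 200).reshape(-1, 1)
mu, sigma = gp.predict(X_grid, return_std=True)
kappa = 2.576  # larger kappa favours exploration (illustrative value)
ucb = mu + kappa * sigma

# The proxy problem: maximize the cheap acquisition function, rather than
# the expensive target itself, to choose the next point to evaluate.
x_next = X_grid[np.argmax(ucb)]
```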

This project is under active development. If you run into trouble, find a bug, or notice
anything that needs correction, please let us know by filing an issue.


## Basic tour of the Bayesian Optimization package

### 1. Specifying the function to be optimized

This is a function optimization package, so the first and most important ingredient is, of course, the function to be optimized.

**DISCLAIMER:** We know exactly how the output of the function below depends on its parameters. Obviously, this is just an example, and you shouldn't expect to know it in a real scenario. However, it should be clear that you don't need to. All you need in order to use this package (and more generally, this technique) is a function `f` that takes a known set of parameters and outputs a real number.


```python
def black_box_function(x, y):
    """Function with unknown internals we wish to maximize.

    This is just serving as an example, for all intents and
    purposes think of the internals of this function, i.e.: the process
    which generates its output values, as unknown.
    """
    return -x ** 2 - (y - 1) ** 2 + 1
```

### 2. Getting Started

All we need to get started is to instantiate a `BayesianOptimization` object specifying a function to be optimized, `f`, and its parameters with their corresponding bounds, `pbounds`. This is a constrained optimization technique, so you must specify the minimum and maximum values that can be probed for each parameter in order for it to work.


```python
from bayes_opt import BayesianOptimization

# Bounded region of parameter space
pbounds = {'x': (2, 4), 'y': (-3, 3)}

optimizer = BayesianOptimization(
    f=black_box_function,
    pbounds=pbounds,
    random_state=1,
)
```

The `BayesianOptimization` object will work out of the box without much tuning. The main method you should be aware of is `maximize`, which does exactly what you think it does.

There are many parameters you can pass to `maximize`, but the most important ones are:
- `n_iter`: how many steps of Bayesian optimization you want to perform. The more steps you take, the more likely you are to find a good maximum.
- `init_points`: how many steps of **random** exploration you want to perform. Random exploration can help by diversifying the exploration space.


```python
optimizer.maximize(
    init_points=2,
    n_iter=3,
)
```

    |   iter    |  target   |     x     |     y     |
    -------------------------------------------------
    |  1        | -7.135    |  2.834    |  1.322    |
    |  2        | -7.78     |  2.0      | -1.186    |
    |  3        | -19.0     |  4.0      |  3.0      |
    |  4        | -16.3     |  2.378    | -2.413    |
    |  5        | -4.441    |  2.105    | -0.005822 |
    =================================================


The best combination of parameters and target value found can be accessed via the property `optimizer.max`.


```python
print(optimizer.max)
>>> {'target': -4.441293113411222, 'params': {'y': -0.005822117636089974, 'x': 2.104665051994087}}
```


The list of all parameters probed and their corresponding target values is available via the property `optimizer.res`.


```python
for i, res in enumerate(optimizer.res):
    print("Iteration {}: \n\t{}".format(i, res))

>>> Iteration 0:
>>>     {'target': -7.135455292718879, 'params': {'y': 1.3219469606529488, 'x': 2.8340440094051482}}
>>> Iteration 1:
>>>     {'target': -7.779531005607566, 'params': {'y': -1.1860045642089614, 'x': 2.0002287496346898}}
>>> Iteration 2:
>>>     {'target': -19.0, 'params': {'y': 3.0, 'x': 4.0}}
>>> Iteration 3:
>>>     {'target': -16.29839645063864, 'params': {'y': -2.412527795983739, 'x': 2.3776144540856503}}
>>> Iteration 4:
>>>     {'target': -4.441293113411222, 'params': {'y': -0.005822117636089974, 'x': 2.104665051994087}}
```


## Minutiae

### Citation

If you used this package in your research, please cite it:

```
@Misc{nogueira2014bayesopt,
    author = {Fernando Nogueira},
    title = {{Bayesian Optimization}: Open source constrained global optimization tool for {Python}},
    year = {2014--},
    url = {https://github.com/bayesian-optimization/BayesianOptimization}
}
```
If you used any of the advanced functionalities, please additionally cite the corresponding publication:

For the `SequentialDomainReductionTransformer`:
```
@article{stander2002robustness,
    author = {Stander, Nielen and Craig, Kenneth},
    year = {2002},
    month = {06},
    title = {On the robustness of a simple domain reduction scheme for simulation-based optimization},
    volume = {19},
    journal = {International Journal for Computer-Aided Engineering and Software (Eng. Comput.)},
    doi = {10.1108/02644400210430190}
}
```

For constrained optimization:
```
@inproceedings{gardner2014bayesian,
    title={Bayesian optimization with inequality constraints.},
    author={Gardner, Jacob R and Kusner, Matt J and Xu, Zhixiang Eddie and Weinberger, Kilian Q and Cunningham, John P},
    booktitle={ICML},
    volume={2014},
    pages={937--945},
    year={2014}
}
```

            
