# globopt

- **Version**: 1.0.0
- **Summary**: Myopic and Non-Myopic Global Optimization
- **Upload time**: 2024-12-05 13:30:01
- **Requires Python**: >=3.9
- **License**: MIT
- **Keywords**: optimization, derivative-free-optimization, rollout, multi-step optimization, radial-basis-function, inverse-distance-weighting
- **Requirements**: torch, botorch

---
# Global Optimization

**global-optimization** is a Python package for **Glob**al **Opt**imization of expensive
black-box functions via Inverse Distance Weighting (IDW) and Radial Basis Function (RBF)
approximation.

> |                   |                                                                 |
> | ----------------- | --------------------------------------------------------------- |
> | **Download**      | <https://pypi.python.org/pypi/globopt/>                         |
> | **Source code**   | <https://github.com/FilippoAiraldi/global-optimization/>        |
> | **Report issues** | <https://github.com/FilippoAiraldi/global-optimization/issues/> |

[![PyPI version](https://badge.fury.io/py/globopt.svg)](https://badge.fury.io/py/globopt)
[![Source Code License](https://img.shields.io/badge/license-MIT-blueviolet)](https://github.com/FilippoAiraldi/global-optimization/blob/botorch/LICENSE)
![Python 3.9](https://img.shields.io/badge/python->=3.9-green.svg)

[![Tests](https://github.com/FilippoAiraldi/global-optimization/actions/workflows/ci.yml/badge.svg)](https://github.com/FilippoAiraldi/global-optimization/actions/workflows/ci.yml)
[![Downloads](https://static.pepy.tech/badge/globopt)](https://www.pepy.tech/projects/globopt)
[![Maintainability](https://api.codeclimate.com/v1/badges/d1cf537cff6af108508/maintainability)](https://codeclimate.com/github/FilippoAiraldi/global-optimization/maintainability)
[![Test Coverage](https://api.codeclimate.com/v1/badges/d1cf537cff6af1a0808/test_coverage)](https://codeclimate.com/github/FilippoAiraldi/global-optimization/test_coverage)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)

---

## Features

**globopt** builds on top of [BoTorch](https://botorch.org/) [[1]](#1), a powerful
framework for Bayesian optimization (which leverages
[PyTorch](https://pytorch.org/) [[2]](#2) and its computational benefits), and extends
it to the broader setting of global optimization via deterministic IDW [[3]](#3) and
RBF [[4]](#4) surrogate models. This extension is achieved in two ways:

1. appropriate surrogate models based on IDW and RBF are implemented to approximate the
   black-box function; these models support partially fitting new datapoints, without
   retraining the model from scratch on the whole updated dataset

2. acquisition functions are designed to guide the search towards the minimum (or one
   of the minima) of the black-box function. These acquisition functions are available
   both as myopic versions and as nonmyopic formulations that take future evaluations
   and the evolution of the surrogate model into account to predict the best point to
   query next.

The repository of this package also includes the source code for the following paper:

TODO

```bibtex
@article{airaldi2024nonmyopic,
  title = {Nonmyopic Global Optimisation via Approximate Dynamic Programming},
  year = {2024},
  author = {Filippo Airaldi and Bart De Schutter and Azita Dabiri},
}
```

More information is available in the
[Paper](https://github.com/FilippoAiraldi/global-optimization/#paper)
section below.

---

## Installation

### Using `pip`

You can use `pip` to install **globopt** with the command

```bash
pip install globopt
```

**globopt** has the following dependencies:

- Python 3.9 or higher
- [PyTorch](https://pytorch.org/) (`torch>=2.1.2`)
- [BoTorch](https://botorch.org/) (`botorch>=0.9.5`)
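
To quickly verify that the installation succeeded and that the dependencies resolve,
you can try importing the package:

```bash
python -c "import globopt, torch, botorch; print('globopt is ready')"
```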

### Using source code

If you'd like to play around with the source code instead, run

```bash
git clone https://github.com/FilippoAiraldi/global-optimization.git
```

The main branch is `botorch`; the other branches contain previous or experimental
versions of the package. You can then install the package in editable mode with

```bash
pip install -e /path/to/global-optimization
```

---

## Getting started

Here we provide a compact example of how **globopt** can be used to optimize a custom
black-box function. First of all, we need to implement this function as a subclass of
BoTorch's `SyntheticTestFunction`.

```python
from botorch.test_functions.synthetic import SyntheticTestFunction
from torch import Tensor


class CustomProblem(SyntheticTestFunction):
    r"""Custom optimization problem:

        f(x) = (1 + x sin(2x) cos(3x) / (1 + x^2))^2 + x^2 / 12 + x / 10

    x is bounded in [-3, +3], and f has a global minimum at `x_opt = -0.959769`
    with `f_opt = 0.2795`.
    """

    dim = 1
    _optimal_value = 0.279504
    _optimizers = [(-0.959769,)]
    _bounds = [(-3.0, +3.0)]

    def evaluate_true(self, X: Tensor) -> Tensor:
        X2 = X.square()
        return (
            (1 + X * (2 * X).sin() * (3 * X).cos() / (1 + X2)).square()
            + X2 / 12
            + X / 10
        )
```
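
As a quick (purely illustrative) sanity check, we can evaluate the problem at its
known optimizer and verify that the returned value matches `f_opt`:

```python
import torch

problem = CustomProblem()
x_opt = torch.tensor([[-0.959769]])  # the known optimizer from the docstring
print(problem(x_opt))  # prints approximately 0.2795, i.e., `f_opt`
```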

Then, we pick some initial points with which to fit the surrogate model (in this case,
IDW), and define some other constants.

```python
import torch

# instantiate problem and create starting training data
N_ITERS = ...
problem = CustomProblem()
lb, ub = problem._bounds[0]
bounds = torch.as_tensor([[lb], [ub]])
train_X = torch.as_tensor([[-2.62, -1.2, 0.14, 1.1, 2.82]]).T
train_Y = problem(train_X)
c1, c2 = 0.5, 1.0  # hyperparameters of the acquisition function
```
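
Before entering the optimization loop, it can be instructive to fit the surrogate once
on this initial data and query it on a dense grid. The following is a minimal sketch
that assumes `Idw` follows BoTorch's `Model` interface and exposes a `posterior` method
(an assumption on our part; see the examples subdirectory for the exact API):

```python
from globopt import Idw

# fit the IDW surrogate to the initial data
mdl = Idw(train_X, train_Y)
# query the (assumed BoTorch-style) posterior on a grid of 100 points
X_grid = torch.linspace(lb, ub, 100).reshape(-1, 1)
predictions = mdl.posterior(X_grid).mean
```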

Finally, we can loop over the optimization iterations, optimizing the acquisition
function (in this case, the myopic one) and updating the surrogate model with the
newly queried point at each iteration.

```python
from botorch.optim import optimize_acqf
from globopt import Idw, IdwAcquisitionFunction

# run optimization loop
for iteration in range(N_ITERS):
    # instantiate model and acquisition function
    mdl = Idw(train_X, train_Y)
    MAF = IdwAcquisitionFunction(mdl, c1, c2)

    # minimize acquisition function
    X_opt, acq_opt = optimize_acqf(
        MAF, bounds, q=1, num_restarts=8, raw_samples=16, options={"seed": iteration}
    )

    # evaluate objective function at the new point, and append it to training data
    Y_opt = problem(X_opt)
    train_X = torch.cat((train_X, X_opt))
    train_Y = torch.cat((train_Y, Y_opt))
```

Assuming a sufficiently large number of iterations is carried out, the optimization
process will converge to the global minimum of the black-box function. In theory, this
minimum can be retrieved as the last queried point `train_Y[-1]`, but for technical
reasons it is more convenient to retrieve the best-so-far value `train_Y.min()`.
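
For instance, the best-so-far point and value can be extracted from the training data
as follows:

```python
# index of the lowest observed value (train_Y has shape (n, 1), so the flat
# index returned by argmin coincides with the row index)
best = train_Y.argmin()
print(f"x ~ {train_X[best].item():.4f} with f(x) ~ {train_Y[best].item():.4f}")
```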

---

## Examples

Our
[examples](https://github.com/FilippoAiraldi/global-optimization/tree/botorch/examples)
subdirectory contains example applications of this package showing how to build the
IDW and RBF surrogate models, evaluate myopic and nonmyopic acquisition functions, and
use them to optimize custom black-box functions.

---

## Paper

As mentioned above, this repository also contains the source code for the following paper:

TODO

```bibtex
@article{airaldi2024nonmyopic,
  title = {Nonmyopic Global Optimisation via Approximate Dynamic Programming},
  year = {2024},
  author = {Filippo Airaldi and Bart De Schutter and Azita Dabiri},
}
```

Below are the details on how to run the experiments and reproduce the results of the
paper. Note that, while the package is available for Python >= 3.9, the results of the
paper, and thus the commands below, are based on Python 3.11.3.

### Synthetic and real problems

To reproduce the results of the paper on the collection of synthetic and real benchmark
problems, first make sure that the correct Python version and packages are installed

```bash
python --version  # 3.11.3 in our case
pip install -r benchmarking/requirements-benchmarking.txt
```

Then, you can run all the experiments (of which there are many) by executing the
following command

```bash
python benchmarking/run.py --methods myopic ms-mc.1 ms-mc.1.1 ms-mc.1.1.1 ms-mc.1.1.1.1 ms-gh.1 ms-gh.1.1 ms-gh.1.1.1 ms-gh.1.1.1.1 ms-mc.10 ms-mc.10.5 ms-gh.10 ms-gh.10.5 --n-jobs={number-of-jobs} --devices {list-of-available-devices} --csv={filename}
```
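
For instance, a concrete invocation with illustrative values for the placeholders
(explained next), assuming `torch`-style device names such as `cpu` or `cuda:0`, could
look like

```bash
python benchmarking/run.py --methods myopic ms-gh.1.1.1 --n-jobs=4 --devices cpu --csv=results.csv
```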

In the first command above, `{number-of-jobs}`, `{list-of-available-devices}` and
`{filename}` are placeholders that should be replaced with the desired values. Be aware
that this command will take several days to run, depending on the number of jobs and
the devices at your disposal. However, the results are saved incrementally to the CSV
file, so you can stop and restart the script at any time without throwing partial
results away. You can also plot the ongoing results. To fetch the status of the
simulation, run

```bash
python benchmarking/status.py {filename}
```

which prints a dataframe with the number of trials completed so far for each
problem-method pair. Once the results are ready (or partially ready), you can analyze
them by running the `benchmarking/analyze.py` script. Three modes are available:
`summary`, `plot`, and `pgfplotstables`. To get the results reported in the paper and
simulated by us, run

```bash
python benchmarking/analyze.py benchmarking/results.csv --summary --exclude-methods random ei myopic-s
python benchmarking/analyze.py benchmarking/results.csv --plot --exclude-methods random ei myopic-s
python benchmarking/analyze.py benchmarking/results.csv --pgfplotstables --exclude-methods random ei myopic-s
```

In turn, these commands report a textual summary of the results (of which the table of
gaps is of primary interest), plot the results in a crude way, and generate in the
`pgfplotstables/` folder the `.dat` tables with all the data to be later plotted in
LaTeX with PGFPlots.

### Data-driven tuning of an MPC controller

Similarly to the previous experiments (since they are based on the same scripts), to
reproduce the results of the second numerical experiment, the tuning of a Model
Predictive Control (MPC) controller, first install the requirements

```bash
pip install -r mpc-tuning/requirements-mpc-tuning.txt
```

Then, to launch the simulations, run

```bash
python mpc-tuning/tune.py --methods myopic ms-gh.1.1.1 ms-gh.10.5 --n-jobs={number-of-jobs} --devices {list-of-available-devices} --csv={filename} --n-trials=30
```

You can monitor the progress of the simulations with the same `benchmarking/status.py`
script as before. To analyze the results obtained by us, run

```bash
python mpc-tuning/analyze.py mpc-tuning/results.csv {--summary,--plot,--pgfplotstables} --include-methods myopic ms-gh.1.1.1$ ms-gh.10.5
```

---

## License

The repository is provided under the MIT License. See the LICENSE file included with
this repository.

---

## Author

[Filippo Airaldi](https://www.tudelft.nl/staff/f.airaldi/), PhD Candidate
[f.airaldi@tudelft.nl | filippoairaldi@gmail.com]

> [Delft Center for Systems and Control](https://www.tudelft.nl/en/me/about/departments/delft-center-for-systems-and-control/)
> in [Delft University of Technology](https://www.tudelft.nl/en/)

Copyright (c) 2024 Filippo Airaldi.

Copyright notice: Technische Universiteit Delft hereby disclaims all copyright interest
in the program “globopt” (Global Optimization) written by the Author(s). Prof. Dr. Ir.
Fred van Keulen, Dean of ME.

---

## References

<a id="1">[1]</a>
Balandat, M., Karrer, B., Jiang, D. R., Daulton, S., Letham, B., Wilson, A. G., Bakshy, E. (2020).
[BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization](https://proceedings.neurips.cc/paper/2020/hash/f5b1b89d98b7286673128a5fb112cb9a-Abstract.html).
Advances in Neural Information Processing Systems, 33, 21524-21538.

<a id="2">[2]</a>
Paszke, A., Gross, S., Massa, F., Lerer, A., Bradbury, J., Chanan, G., Killeen, T., Lin, Z., Gimelshein, N., Antiga, L., Desmaison, A., Köpf, A., Yang, E., DeVito, Z., Raison, M., Tejani, A., Chilamkurthy, S., Steiner, B., Fang, L., Bai, J., Chintala, S. (2019).
[PyTorch: An Imperative Style, High-Performance Deep Learning Library](https://arxiv.org/abs/1912.01703).
Advances in Neural Information Processing Systems, 32.

<a id="3">[3]</a>
Joseph, V.R., Kang, L. (2011).
[Regression-based inverse distance weighting with applications to computer experiments](https://www.jstor.org/stable/23210401).
Technometrics 53(3), 254–265.

<a id="4">[4]</a>
McDonald, D.B., Grantham, W.J., Tabor, W.L., Murphy, M.J. (2007).
[Global and local optimization using radial basis function response surface models](https://www.sciencedirect.com/science/article/pii/S0307904X06002009).
Applied Mathematical Modelling 31(10), 2095–2110.
