# BA-LORRE
A Python implementation of the **B**ees **A**lgorithm based **L**ocal **O**ptima **R**egion **R**adius **E**xtimator.
Most intelligent numerical optimization algorithms are designed to find only the global optimum of a given function.
In contrast, the BA-LORRE is a swarm computation algorithm designed to find *several* local optima, avoiding already explored regions during the search.
The algorithm exploits the [search properties](https://www.sciencedirect.com/science/article/abs/pii/S2210650220303990?via%3Dihub) of the [Bees Algorithm](https://gitlab.com/luca.baronti/ba_lorre) to identify the position and radius of local optima regions in a non-parametric way. Once a region is identified, its local score is lowered, allowing the algorithm to focus future searches on unexplored regions.
| | |
|:-------------------------:|:-------------------------:|
|<img src="https://gitlab.com/luca.baronti/ba_lorre/-/raw/master/pics/iterations_0.png"> | <img src="https://gitlab.com/luca.baronti/ba_lorre/-/raw/master/pics/iterations_1.png"> |
|<img src="https://gitlab.com/luca.baronti/ba_lorre/-/raw/master/pics/iterations_2.png"> | <img src="https://gitlab.com/luca.baronti/ba_lorre/-/raw/master/pics/iterations_3.png"> |
This library allows an off-the-shelf use of the algorithm to find the best local optima of a user-defined target function.
![solutions](https://gitlab.com/luca.baronti/ba_lorre/-/raw/master/pics/lorre_solutions.png)
## Reference
For details on the inner workings of the BA-LORRE algorithm, how it relates to the Bees Algorithm and how the optima regions are found, please refer to [this work](https://etheses.bham.ac.uk/id/eprint/10094/7/Baronti2020PhD.pdf).
If you use this implementation of the BA-LORRE in your research, feel free to cite it in your paper using the following BibTeX entry:
```bibtex
@article{baronti2020analysis,
  title={An Analysis of the Search Mechanisms of the Bees Algorithm},
  author={Baronti, Luca and Castellani, Marco and Pham, Duc Truong},
  journal={Swarm and Evolutionary Computation},
  volume={59},
  pages={100746},
  year={2020},
  publisher={Elsevier},
  doi={10.1016/j.swevo.2020.100746},
  url={https://doi.org/10.1016/j.swevo.2020.100746}
}
```
# Installation
This module is available on [pip](https://pypi.org/project/ba-lorre) and can be installed as follows:
```sh
$ pip3 install ba_lorre
```
# Usage and Guidelines
As an extension of the original Bees Algorithm, the BA-LORRE shares most of the interface and mirrors its general usage.
The main class, `LORRE`, has the following signature (mandatory parameters in **bold**):
- **score_function**`: Callable[[list], float]` the function that needs to be optimised. It can be either a (lambda) function or a class that overrides the `__call__` method. Either way, `score_function(solution: list) -> float` needs to take a single parameter, a list of *N* `float` representing a position in the search space, and return the score relative to that position;
- **range_min**`: list` list of *N* `float`, indicating the lower bound of the search space;
- **range_max**`: list` list of *N* `float`, indicating the upper bound of the search space;
- *nb*`: int` number of *sites*;
- *nrb*`: int` number of solutions sampled around each site, per iteration;
- *stlim*`: int` stagnation limit, the number of iterations after which a site is abandoned if no local progress is made;
- *initial_ngh*`: list` the neighbourhood used for the local search in a site;
- *shrink_factor*`: float` how much a neighbourhood is shrunk for every iteration in which its centre is not updated;
- *derating_radius_selection_criterion*`: str` the criterion used to identify the optima region radius. Allowed values are *median* and *topological*;
- *derating_type*`: str` the shape of the penalty applied to every found region. Allowed values are *flat* and *linear*;
In this context *N* is the number of dimensions of the score function to be optimized. At each iteration *nb* x *nrb* solutions are evaluated in the search space.
Examples of score functions that can be used as benchmarks can be found in [this library](https://gitlab.com/luca.baronti/python_benchmark_functions).
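As a quick sanity check of the evaluation budget, using the values that appear in the initialization example below (*nb*=5, *nrb*=20):

```python
nb, nrb = 5, 20  # number of sites, and samples per site per iteration
evaluations_per_iteration = nb * nrb
print(evaluations_per_iteration)  # 100 score-function calls per iteration
```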
Please note that for simplicity this algorithm solves a *maximization* problem.
If you need to minimize a function, please provide the *opposite* of the function (i.e. *g(x)=-f(x)*) to the algorithm.
The found solutions won't be affected.
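As a minimal illustration of the negation trick (the function `f` below is a made-up example, not part of the library):

```python
# We want to *minimize* f, which has its minimum 0 at (3, -1).
def f(x):
    return (x[0] - 3) ** 2 + (x[1] + 1) ** 2

# The algorithm maximizes, so we hand it g(x) = -f(x) instead.
g = lambda x: -f(x)

# The maximizer of g sits at the same position as the minimizer of f;
# only the sign of the score changes.
print(g([3, -1]))  # 0, the maximum of g
```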
Please refer to the Bees Algorithm [documentation](https://gitlab.com/luca.baronti/ba_lorre/-/blob/master/README.md) for further details.
## Initialization
The algorithm is first instantiated; as an example, here we use a simple hyperbolic function as the optimization target:
```python
from ba_lorre import LORRE

alg = LORRE(lambda x: -pow(x[0]*x[1], 2),
            range_min=[-100, -100],
            range_max=[100, 100],
            nb=5, nrb=20, stlim=20,
            derating_radius_selection_criterion='median',
            derating_type='linear')
```
## Optimization
Then the optimisation itself can be performed in different ways. The simplest method is calling `performFullOptimisation`, specifying either the maximum number of iterations or the maximum score wanted as the stopping criterion.
```python
>>> alg.performFullOptimisation(max_iteration=50)
(50, -3.4228945713694973e-11)
```
This returns the number of iterations required and the score of the best solution found.
Alternatively, more control over the optimization process can be achieved by performing one iteration at a time:
```python
alg.performSingleStep()
```
For score functions with 2 dimensions it is possible to visualize the iterations of the algorithm by calling
```python
alg.visualize_iteration_steps()
```
This is often used for demonstrations, for debugging and to get better insight into the algorithm's dynamics.
All the figures in this readme have been generated this way.
## Found Optima Handling
The aim of the algorithm is to return the best local optima of a function.
Once the optimization process has terminated, the found optima can be collected with
```python
alg.getFoundOptima()
```
which will return the position of the found optima.
For score functions with 2 dimensions it is possible to visualize the found optima directly on the function's surface with:
```python
alg.showFoundOptima()
```
![solutions](https://gitlab.com/luca.baronti/ba_lorre/-/raw/master/pics/lorre_solutions.png)
Either way, a list of *pruning* criteria can be passed to filter the found optima.
### Pruning Criteria
The algorithm has no prior information on how many local optima the score function is supposed to have.
As a result, spurious and redundant optima can be returned.
For this reason, this library comes with a set of pruning functions that can be passed to `getFoundOptima` or `showFoundOptima` to eliminate solutions that are unlikely to be a good approximation of real optima.
Available pruning classes are:
- **PruningProximity** putative optima that have been generated at the same time and in the same region as an optimum of higher score are removed;
- **PruningAbsScoreCutoff** putative optima with a score lower than the *cutoff* parameter are removed;
- **PruningPercScoreCutoff** putative optima with a score lower than *cutoff*% of the other optima are removed;
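The percentage criterion can be read as keeping only optima whose score is at least a given fraction of the best one. The sketch below illustrates that reading in plain Python; it is *not* the library's implementation, and `perc_score_cutoff` is a made-up name used purely for illustration:

```python
# Hedged sketch of a percentage-based score cutoff, assuming the
# threshold is taken as a fraction of the best (positive) score found.
def perc_score_cutoff(scores, cutoff):
    threshold = cutoff * max(scores)
    return [s for s in scores if s >= threshold]

# With cutoff=0.5 only scores >= 5.0 survive.
print(perc_score_cutoff([10.0, 6.0, 2.0], 0.5))  # [10.0, 6.0]
```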
Any number of pruning criteria can be combined:
```python
from ba_lorre import PruningProximity, PruningPercScoreCutoff
alg.getFoundOptima(pruning_functions=[PruningProximity(), PruningPercScoreCutoff(cutoff=.5)])
```
The `PruningAbsScoreCutoff` and `PruningPercScoreCutoff` criteria require some insight into the optima score distribution to be used properly.
To help the practitioner, a convenience function is provided to visualize the optima score distribution:
```python
alg.showOptimaScoreHistogram()
```
![solutions](https://gitlab.com/luca.baronti/ba_lorre/-/raw/master/pics/sol_distr.png)
# Version History
## v1.0.2
- minor fix to a parameter hint
## v1.0.1
- minor visual fixes for pypi
## v1.0.0
- first release