<p align="center">
<img style="max-width:100%;" src="https://thieu1995.github.io/post/2023-08/evorbf1.png" alt="EvoRBF"/>
</p>
---
[![GitHub release](https://img.shields.io/badge/release-2.0.0-yellow.svg)](https://github.com/thieu1995/evorbf/releases)
[![Wheel](https://img.shields.io/pypi/wheel/gensim.svg)](https://pypi.python.org/pypi/evorbf)
[![PyPI version](https://badge.fury.io/py/evorbf.svg)](https://badge.fury.io/py/evorbf)
![PyPI - Python Version](https://img.shields.io/pypi/pyversions/evorbf.svg)
![PyPI - Status](https://img.shields.io/pypi/status/evorbf.svg)
![PyPI - Downloads](https://img.shields.io/pypi/dm/evorbf.svg)
[![Downloads](https://static.pepy.tech/badge/evorbf)](https://pepy.tech/project/evorbf)
[![Tests & Publishes to PyPI](https://github.com/thieu1995/evorbf/actions/workflows/publish-package.yaml/badge.svg)](https://github.com/thieu1995/evorbf/actions/workflows/publish-package.yaml)
![GitHub Release Date](https://img.shields.io/github/release-date/thieu1995/evorbf.svg)
[![Documentation Status](https://readthedocs.org/projects/evorbf/badge/?version=latest)](https://evorbf.readthedocs.io/en/latest/?badge=latest)
[![Chat](https://img.shields.io/badge/Chat-on%20Telegram-blue)](https://t.me/+fRVCJGuGJg1mNDg1)
![GitHub contributors](https://img.shields.io/github/contributors/thieu1995/evorbf.svg)
[![GitTutorial](https://img.shields.io/badge/PR-Welcome-%23FF8300.svg?)](https://git-scm.com/book/en/v2/GitHub-Contributing-to-a-Project)
[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.11136007.svg)](https://doi.org/10.5281/zenodo.11136007)
[![License: GPL v3](https://img.shields.io/badge/License-GPLv3-blue.svg)](https://www.gnu.org/licenses/gpl-3.0)
| **EvoRBF** | **Evolving Radial Basis Function Network** |
|--------------------------------------|--------------------------------------------------------|
| **Free software** | GNU General Public License (GPL) v3 |
| **Traditional RBF models** | `RbfRegressor`, `RbfClassifier` |
| **Advanced RBF models** | `AdvancedRbfRegressor`, `AdvancedRbfClassifier` |
| **Nature-inspired RBF models** | `NiaRbfRegressor`, `NiaRbfClassifier` |
| **Tuner for traditional RBF models** | `NiaRbfTuner` |
| **Provided total ML models** | \> 400 Models |
| **Supported total metrics** | \>= 67 (47 regression and 20 classification metrics) |
| **Supported loss functions** | \>= 61 (45 regression and 16 classification losses) |
| **Documentation** | https://evorbf.readthedocs.io |
| **Python versions** | \>= 3.8.x |
| **Dependencies** | numpy, scipy, scikit-learn, pandas, mealpy, permetrics |
# Citation Request
```bibtex
@software{thieu_2024_11136008,
author = {Nguyen Van Thieu},
title = {EvoRBF: A Nature-inspired Algorithmic Framework for Evolving Radial Basis Function Networks},
month = may,
year = 2024,
publisher = {Zenodo},
doi = {10.5281/zenodo.11136007},
url = {https://doi.org/10.5281/zenodo.11136007}
}
@article{van2023mealpy,
title={MEALPY: An open-source library for latest meta-heuristic algorithms in Python},
author={Van Thieu, Nguyen and Mirjalili, Seyedali},
journal={Journal of Systems Architecture},
year={2023},
publisher={Elsevier},
doi={10.1016/j.sysarc.2023.102871}
}
```
# Theory
**EvoRBF** is a framework for building and evolving Radial Basis Function (RBF) networks.
Below we explain the key components and describe the types of RBF networks the library provides, several of which you will not find elsewhere.
There are many ways to use nature-inspired algorithms to optimize an RBF network; for background, search Google Scholar
or start with [this paper](https://doi.org/10.1016/B978-0-443-18764-3.00015-1).
Here we walk through the basic concepts and the parameters that matter for this network.
## Structure
1. An RBF network has an input layer, a single hidden layer, and an output layer.
2. Only the hidden-to-output connections carry trainable weights; the output layer is a linear combination of hidden activations.
3. The output of the hidden layer is computed by a radial function (e.g., Gaussian, thin plate spline, ...), as sketched below.
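For intuition, the forward pass can be written in a few lines of NumPy. This is a minimal, self-contained sketch of the idea above with illustrative variable names, not EvoRBF's internal implementation:

```python
import numpy as np

def rbf_forward(X, centers, sigmas, weights):
    """Gaussian RBF forward pass (illustrative sketch, not EvoRBF internals).

    X:       (n_samples, n_features) inputs
    centers: (n_hidden, n_features) RBF centers
    sigmas:  (n_hidden,) width of each Gaussian
    weights: (n_hidden, n_outputs) linear output weights
    """
    # Squared Euclidean distance from every sample to every center
    d2 = np.sum((X[:, None, :] - centers[None, :, :]) ** 2, axis=2)
    # Gaussian activation of the hidden layer
    phi = np.exp(-d2 / (2.0 * sigmas ** 2))
    # The output layer is just a linear combination of hidden activations
    return phi @ weights
```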
## Training algorithm
### Traditional RBF models
In the case of a traditional RBF model, a few parameters need to be identified to get the best model:
```code
1. The number of hidden nodes in the hidden layer
2. The centers and widths (sigmas) of the Gaussian function
3. The output weights
4. The L2 regularization factor (lambda)
```
To train these parameters:
```code
1. Use a hyper-parameter tuning tool such as GridSearchCV or RandomizedSearchCV to find the best number of hidden nodes.
2. The centers can be found at random, by KMeans, or by other unsupervised learning algorithms.
3. The widths (sigmas) can be set by hyper-parameter tuning.
   + The width can be a single value, meaning all hidden nodes share the same Gaussian curve.
   + The width can be a vector of values, one per hidden node.
4. The output weights are computed with the Moore-Penrose inverse (a matrix solve); gradient descent is not used.
5. When L2 regularization is enabled, lambda can also be set by hyper-parameter tuning, as sketched after this list.
```
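Steps 4 and 5 reduce to a single closed-form solve. Here is a minimal NumPy sketch of the (optionally ridge-regularized) least-squares solution, with illustrative names rather than EvoRBF's internal code:

```python
import numpy as np

def solve_output_weights(phi, y, reg_lambda=0.0):
    """Output weights via the (regularized) Moore-Penrose inverse.

    phi: (n_samples, n_hidden) hidden-layer activation matrix
    y:   (n_samples,) or (n_samples, n_outputs) targets
    """
    if reg_lambda > 0:
        # Ridge solution: (Phi^T Phi + lambda * I)^{-1} Phi^T y
        n_hidden = phi.shape[1]
        return np.linalg.solve(phi.T @ phi + reg_lambda * np.eye(n_hidden), phi.T @ y)
    # Plain least squares via the pseudo-inverse
    return np.linalg.pinv(phi) @ y
```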
For example:
```python
from evorbf import RbfRegressor, RbfClassifier

# One shared sigma for the classifier; one sigma per hidden node for the regressor
model = RbfClassifier(size_hidden=10, center_finder="kmeans", sigmas=2.0, reg_lambda=0.1, seed=None)
model = RbfRegressor(size_hidden=4, center_finder="random", sigmas=(1.5, 2, 2, 2.5), reg_lambda=0, seed=42)

model.fit(X=X_train, y=y_train)
y_pred = model.predict(X_test)
y_pred_prob = model.predict_proba(X_test)  # classifiers only
```
### Advanced RBF models
For an advanced RBF model, the user has many more options:
```code
1. Choose among different RBF kernel functions such as Multiquadric (MQ), Inverse Multiquadric (IMQ), Thin Plate Spline (TPS), Exponential, Power, ...
2. Choose among different unsupervised learning algorithms to find the centers, and possibly the number of hidden nodes.
   + With KMeans or random selection, you need to set the number of hidden nodes yourself.
   + With MeanShift or DBSCAN, you do not: they identify the number of clusters (i.e., hidden nodes) automatically, as the sketch below shows.
3. This version may also include a bias in the output layer.
```
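To see why MeanShift and DBSCAN need no preset hidden size, here is a small scikit-learn sketch (illustrative only, not EvoRBF internals) in which the number of clusters found becomes the number of hidden nodes:

```python
import numpy as np
from sklearn.cluster import DBSCAN, MeanShift

X = np.random.default_rng(42).random((200, 2))

# MeanShift exposes the discovered cluster centers directly
ms = MeanShift(bandwidth=0.3).fit(X)
print(ms.cluster_centers_.shape[0], "hidden nodes from MeanShift")

# DBSCAN only labels points; take one center per cluster (label -1 = noise)
db = DBSCAN(eps=0.2).fit(X)
labels = set(db.labels_) - {-1}
centers = np.array([X[db.labels_ == k].mean(axis=0) for k in sorted(labels)])
print(centers.shape[0], "hidden nodes from DBSCAN")
```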
Examples:
```python
from evorbf import AdvancedRbfClassifier, AdvancedRbfRegressor
model = AdvancedRbfClassifier(center_finder="random", finder_params={"n_centers": 15},
rbf_kernel="gaussian", kernel_params={"sigma": 1.5},
reg_lambda=0.1, has_bias=True, seed=42)
model = AdvancedRbfClassifier(center_finder="random", finder_params=None, # Default n_centers = 10
rbf_kernel="gaussian", kernel_params=None, # Default sigma = 1.0
reg_lambda=0.1, has_bias=False, seed=42)
model = AdvancedRbfClassifier(center_finder="kmeans", finder_params={"n_centers": 20},
rbf_kernel="multiquadric", kernel_params=None,
reg_lambda=0.1, has_bias=False, seed=42)
model = AdvancedRbfClassifier(center_finder="meanshift", finder_params={"bandwidth": 0.6}, # Give us 28 hidden nodes
rbf_kernel="inverse_multiquadric", kernel_params={"sigma": 1.5},
reg_lambda=0.5, has_bias=True, seed=42)
model = AdvancedRbfClassifier(center_finder="dbscan", finder_params={"eps": 0.2}, # Give us 42 hidden nodes
rbf_kernel="multiquadric", kernel_params={"sigma": 1.5},
reg_lambda=0.5, has_bias=True, seed=42)
model = AdvancedRbfClassifier(center_finder="dbscan", finder_params={"eps": 0.175}, # Give us 16 hidden nodes
rbf_kernel="multiquadric", kernel_params={"sigma": 1.5},
reg_lambda=None, has_bias=False, seed=42)
model.fit(X=X_train, y=y_train)
y_pred = model.predict(X_test)
y_pred_prob = model.predict_proba(X_test)
```
### Nature-inspired Algorithm-based RBF models
This is the main purpose of this library. In this type of model:
```code
1. A Nature-Inspired Algorithm (NIA) trains the width (sigma) of each hidden node.
2. If regularization is enabled, the NIA also finds the lambda factor automatically.
```
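Conceptually, the NIA searches over a real-valued vector that encodes one sigma per hidden node, plus a final gene for lambda when regularization is enabled, and its fitness is the training objective. Below is a rough sketch of that decode-and-evaluate step; the encoding is an assumption for illustration, not the library's actual internals:

```python
import numpy as np

def fitness(solution, X, y, centers):
    """Illustrative NIA fitness: decode the solution vector, fit, and score."""
    n_hidden = len(centers)
    sigmas = solution[:n_hidden]                # decode: one width per hidden node
    reg_lambda = solution[n_hidden]             # decode: L2 factor (regularization on)
    # Gaussian hidden-layer activations
    d2 = np.sum((X[:, None, :] - centers[None, :, :]) ** 2, axis=2)
    phi = np.exp(-d2 / (2.0 * sigmas ** 2))
    # Closed-form (ridge) output weights, as in the earlier sketch
    w = np.linalg.solve(phi.T @ phi + reg_lambda * np.eye(n_hidden), phi.T @ y)
    return np.mean((phi @ w - y) ** 2)          # MSE, minimized by the optimizer
```

The optimizer (e.g., WOA or GA via mealpy) then evolves `solution` to minimize this value.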
Examples:
```python
from evorbf import NiaRbfRegressor, NiaRbfClassifier
model = NiaRbfClassifier(size_hidden=25, center_finder="kmeans",
                         regularization=False, obj_name="F1S",
                         optim="OriginalWOA",
                         optim_paras={"epoch": 50, "pop_size": 20},
                         verbose=True, seed=42)

model = NiaRbfRegressor(size_hidden=10, center_finder="random",
                        regularization=True, obj_name="MSE",
                        optim="BaseGA",
                        optim_paras={"epoch": 50, "pop_size": 20},
                        verbose=True, seed=42)
model.fit(X=X_train, y=y_train)
y_pred = model.predict(X_test)
y_pred_prob = model.predict_proba(X_test)  # classifiers only
```
### Nature-inspired Algorithm-based hyperparameter RBF tuning model
In this case, the user can use an NIA to tune the hyper-parameters of the traditional RBF models.
```python
from evorbf import NiaRbfTuner, IntegerVar, StringVar, FloatVar
# Design the boundary (for hyper-parameters)
my_bounds = [
IntegerVar(lb=5, ub=21, name="size_hidden"),
StringVar(valid_sets=("kmeans", "random"), name="center_finder"),
FloatVar(lb=(0.01,), ub=(3.0,), name="sigmas"),
FloatVar(lb=(0, ), ub=(1.0, ), name="reg_lambda"),
]
model = NiaRbfTuner(problem_type="classification", bounds=my_bounds, cv=3, scoring="AS",
                    optim="OriginalWOA", optim_paras={"epoch": 10, "pop_size": 20},
                    verbose=True, seed=42)
```
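Assuming `NiaRbfTuner` follows the same fit/predict convention as the other classes in this library, usage would look roughly like the sketch below; the `best_params` attribute is an assumption for illustration, so check the [examples](/examples) folder for the exact API:

```python
model.fit(X_train, y_train)       # run the NIA search with 3-fold CV
print(model.best_params)          # assumed attribute: the best hyper-parameters found
y_pred = model.predict(X_test)    # assumed: predict with the best tuned model
```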
### My notes
1. An RBF network first trains the centers and widths of its Gaussian activation functions (the 1st phase).
2. RBF networks usually use KMeans to find the centers ==> this increases complexity and training time.
   + In that case, the user must define the widths ==> either one shared width, or a different width per hidden node.
   + Alternatively, the centers can be picked at random ==> usually worse at splitting samples into distinct clusters.
3. The network then trains the output weights (the 2nd phase).
4. RBF networks do not use gradient descent for the output weights; they use the Moore-Penrose inverse (a matrix solve, i.e., least squares) ==> so training is faster than for an MLP.
5. The Moore-Penrose inverse finds the exact least-squares solution ==> no gradient descent or approximation algorithm is needed here.
6. If the network overfits ==> add L2 regularization.
7. For a large-scale dataset ==> use more hidden nodes ==> and then increase the L2 regularization parameter.
```code
1. RbfRegressor, RbfClassifier: you need to set 4 kinds of hyper-parameters.
2. AdvancedRbfRegressor, AdvancedRbfClassifier: you need to set 6 kinds of hyper-parameters.
   But you have many options to choose from, and you can design your own RBF model, one that nobody has used before.
   For example, an RBF with a bias in the output layer, or an RBF that uses DBSCAN and an Exponential kernel.
3. NiaRbfRegressor, NiaRbfClassifier: you only need to set the hidden size. These are the best classes in this library.
   + The sigmas are found automatically, one per hidden node.
   + The regularization factor is also tuned automatically to find the best value.
4. NiaRbfTuner: this class is also extremely useful for the traditional RBF models; it can tune the hidden size,
   but only a single sigma value is shared across all hidden nodes.
```
# Usage
* Install the [current PyPI release](https://pypi.python.org/pypi/evorbf):
```sh
$ pip install evorbf
```
After installation, you can check the installed EvoRBF version:
```sh
$ python
>>> import evorbf
>>> evorbf.__version__
```
The sections above show several ways to import and call the provided classes. If you need more detail on how to
use each of them, please check out the [examples](/examples) folder. In this short demonstration, we use the
Whale Optimization Algorithm to optimize the `sigmas` (of the non-linear Gaussian kernel) and the `reg_lambda` of
L2 regularization in an RBF network (a WOA-RBF model) on the diabetes prediction problem.
```python
import numpy as np
from evorbf import Data, NiaRbfRegressor
from sklearn.datasets import load_diabetes
## Load data object
# total samples = 442, total features = 10
X, y = load_diabetes(return_X_y=True)
data = Data(X, y)
## Split train and test
data.split_train_test(test_size=0.2, random_state=2)
print(data.X_train.shape, data.X_test.shape)
## Scaling dataset
data.X_train, scaler_X = data.scale(data.X_train, scaling_methods=("standard", ))
data.X_test = scaler_X.transform(data.X_test)
data.y_train, scaler_y = data.scale(data.y_train, scaling_methods=("standard", ))
data.y_test = scaler_y.transform(np.reshape(data.y_test, (-1, 1)))
## Create model
model = NiaRbfRegressor(size_hidden=25,                  # A sufficiently large hidden size
                        center_finder="kmeans",          # Use KMeans to find the centers
                        regularization=True,             # Use L2 regularization
                        obj_name="MSE",                  # Mean squared error as the fitness function for the NIA
                        optim="OriginalWOA",             # Use the Whale Optimization Algorithm
                        optim_paras={"epoch": 50, "pop_size": 20},  # Parameters of the optimizer
                        verbose=True, seed=42)
## Train the model
model.fit(data.X_train, data.y_train)
## Test the model
y_pred = model.predict(data.X_test)
print(model.optimizer.g_best.solution)  # The best solution (sigmas and lambda) found by the optimizer
## Calculate some metrics
print(model.score(X=data.X_test, y=data.y_test))
print(model.scores(X=data.X_test, y=data.y_test, list_metrics=["R2", "R", "KGE", "MAPE"]))
print(model.evaluate(y_true=data.y_test, y_pred=y_pred, list_metrics=["MSE", "RMSE", "R2S", "NSE", "KGE", "MAPE"]))
```
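Because the targets were standardized above, `y_pred` lives on the scaled axis. Assuming the scaler returned by `data.scale` exposes scikit-learn's `inverse_transform` (it is used with `transform` above), predictions can be mapped back to the original units like this:

```python
## Map predictions back to the original target units (sketch)
y_pred_unscaled = scaler_y.inverse_transform(np.reshape(y_pred, (-1, 1)))
y_test_unscaled = scaler_y.inverse_transform(data.y_test)
print(y_pred_unscaled[:5].ravel())
```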
Please go check out the [examples](/examples) folder. You'll be surprised by what this library can do for your problem.
You can also read the [documentation](https://evorbf.readthedocs.io/) for more detailed installation
instructions, explanations, and examples.
### Official Links (Get support for questions and answers)
* [Official source code repository](https://github.com/thieu1995/evorbf)
* [Official document](https://evorbf.readthedocs.io/)
* [Download releases](https://pypi.org/project/evorbf/)
* [Issue tracker](https://github.com/thieu1995/evorbf/issues)
* [Notable changes log](/ChangeLog.md)
* [Official discussion group](https://t.me/+fRVCJGuGJg1mNDg1)
---
Developed by: [Thieu](mailto:nguyenthieu2102@gmail.com?Subject=EvoRBF_QUESTIONS) @ 2024