<p align="center">
<img style="width:100%;" src="https://thieu1995.github.io/post/2023-08/metaperceptron1.png" alt="MetaPerceptron"/>
</p>


---

[![GitHub release](https://img.shields.io/badge/release-1.1.0-yellow.svg)](https://github.com/thieu1995/MetaPerceptron/releases)
[![Wheel](https://img.shields.io/pypi/wheel/metaperceptron.svg)](https://pypi.python.org/pypi/metaperceptron) 
[![PyPI version](https://badge.fury.io/py/metaperceptron.svg)](https://badge.fury.io/py/metaperceptron)
![PyPI - Python Version](https://img.shields.io/pypi/pyversions/metaperceptron.svg)
![PyPI - Status](https://img.shields.io/pypi/status/metaperceptron.svg)
![PyPI - Downloads](https://img.shields.io/pypi/dm/metaperceptron.svg)
[![Downloads](https://pepy.tech/badge/metaperceptron)](https://pepy.tech/project/metaperceptron)
[![Tests & Publishes to PyPI](https://github.com/thieu1995/metaperceptron/actions/workflows/publish-package.yaml/badge.svg)](https://github.com/thieu1995/metaperceptron/actions/workflows/publish-package.yaml)
![GitHub Release Date](https://img.shields.io/github/release-date/thieu1995/metaperceptron.svg)
[![Documentation Status](https://readthedocs.org/projects/metaperceptron/badge/?version=latest)](https://metaperceptron.readthedocs.io/en/latest/?badge=latest)
[![Chat](https://img.shields.io/badge/Chat-on%20Telegram-blue)](https://t.me/+fRVCJGuGJg1mNDg1)
![GitHub contributors](https://img.shields.io/github/contributors/thieu1995/metaperceptron.svg)
[![GitTutorial](https://img.shields.io/badge/PR-Welcome-%23FF8300.svg?)](https://git-scm.com/book/en/v2/GitHub-Contributing-to-a-Project)
[![DOI](https://zenodo.org/badge/676088001.svg)](https://zenodo.org/doi/10.5281/zenodo.10251021)
[![License: GPL v3](https://img.shields.io/badge/License-GPLv3-blue.svg)](https://www.gnu.org/licenses/gpl-3.0)


MetaPerceptron (Metaheuristic-optimized Multi-Layer Perceptron) is a Python library that implements both the 
traditional Multi-Layer Perceptron and its variants. These include Metaheuristic-optimized MLP models (GA, PSO, WOA, TLO, DE, ...) 
and Gradient Descent-optimized MLP models (SGD, Adam, Adelta, Adagrad, ...). It provides a comprehensive list of 
optimizers for training MLP models and is compatible with the Scikit-Learn library, so you can perform searches and 
hyperparameter tuning using the features provided by Scikit-Learn (see the sketch after the feature list below).

* **Free software:** GNU General Public License (GPL) V3
* **Provided Estimators**: MlpRegressor, MlpClassifier, MhaMlpRegressor, MhaMlpClassifier
* **Total Metaheuristic-based MLP Regressor**: > 200 Models 
* **Total Metaheuristic-based MLP Classifier**: > 200 Models
* **Total Gradient Descent-based MLP Regressor**: 12 Models
* **Total Gradient Descent-based MLP Classifier**: 12 Models
* **Supported performance metrics**: >= 67 (47 regressions and 20 classifications)
* **Supported objective functions (as fitness functions or loss functions)**: >= 67 (47 regressions and 20 classifications)
* **Documentation:** https://metaperceptron.readthedocs.io
* **Python versions:** >= 3.8.x
* **Dependencies:** numpy, scipy, scikit-learn, pandas, mealpy, permetrics, torch, skorch
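
As a quick illustration of the Scikit-Learn compatibility mentioned above, the sketch below tunes a couple of `MlpRegressor` hyperparameters with `GridSearchCV`. It is a minimal sketch only, assuming the estimator exposes the standard `get_params`/`set_params` API; the diabetes dataset, the grid values, and the reduced `max_epochs` are illustrative choices, not recommendations.

```python
# Minimal sketch: hyperparameter tuning of MlpRegressor with scikit-learn's GridSearchCV.
# Assumes the estimator follows the standard scikit-learn API; the dataset and grid values
# below are illustrative only.
from sklearn.datasets import load_diabetes
from sklearn.model_selection import GridSearchCV, train_test_split
from metaperceptron import MlpRegressor

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = MlpRegressor(hidden_size=50, act1_name="tanh", act2_name="sigmoid", obj_name="MSE",
                     max_epochs=100, batch_size=32, optimizer="SGD", optimizer_paras=None, verbose=False)

# Search a small grid of network sizes and gradient-descent optimizers.
param_grid = {
    "hidden_size": [30, 50],
    "optimizer": ["SGD", "Adam"],
}
searcher = GridSearchCV(model, param_grid, cv=3, scoring="neg_mean_squared_error")
searcher.fit(X_train, y_train)

print(searcher.best_params_)
print(searcher.best_score_)
```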


# Citation Request 

If you want to understand how metaheuristic algorithms are applied to the Multi-Layer Perceptron, please read the paper 
titled **"Let a biogeography-based optimizer train your Multi-Layer Perceptron"**. 
The paper can be accessed at the following [link](https://doi.org/10.1016/j.ins.2014.01.038).


Please include these citations if you plan to use this library:

```bibtex

@software{nguyen_van_thieu_2023_10251022,
  author       = {Nguyen Van Thieu},
  title        = {MetaPerceptron: Unleashing the Power of Metaheuristic-optimized Multi-Layer Perceptron - A Python Library},
  month        = dec,
  year         = 2023,
  publisher    = {Zenodo},
  doi          = {10.5281/zenodo.10251021},
  url          = {https://github.com/thieu1995/MetaPerceptron}
}

@article{van2023mealpy,
  title={MEALPY: An open-source library for latest meta-heuristic algorithms in Python},
  author={Van Thieu, Nguyen and Mirjalili, Seyedali},
  journal={Journal of Systems Architecture},
  year={2023},
  publisher={Elsevier},
  doi={10.1016/j.sysarc.2023.102871}
}

@article{van2023groundwater,
  title={Groundwater level modeling using Augmented Artificial Ecosystem Optimization},
  author={Van Thieu, Nguyen and Barma, Surajit Deb and Van Lam, To and Kisi, Ozgur and Mahesha, Amai},
  journal={Journal of Hydrology},
  volume={617},
  pages={129034},
  year={2023},
  publisher={Elsevier}
}

@article{thieu2019efficient,
  title={Efficient time-series forecasting using neural network and opposition-based coral reefs optimization},
  author={Nguyen, Thieu and Nguyen, Tu and Nguyen, Binh Minh and Nguyen, Giang},
  journal={International Journal of Computational Intelligence Systems},
  volume={12},
  number={2},
  pages={1144--1161},
  year={2019}
}

```

# Installation

* Install the [current PyPI release](https://pypi.python.org/pypi/metaperceptron):
```sh 
$ pip install metaperceptron==1.1.0
```

* Install directly from the source code:
```sh 
$ git clone https://github.com/thieu1995/MetaPerceptron.git
$ cd MetaPerceptron
$ python setup.py install
```

* In case you want to install the development version from GitHub:
```sh 
$ pip install git+https://github.com/thieu1995/MetaPerceptron 
```

After installation, you can import MetaPerceptron like any other Python module:

```sh
$ python
>>> import metaperceptron
>>> metaperceptron.__version__
```

### Examples

Please check all use cases and examples in the [examples](examples) folder.

1) MetaPerceptron provides these useful classes:

```python
from metaperceptron import DataTransformer, Data
from metaperceptron import MlpRegressor, MlpClassifier
from metaperceptron import MhaMlpRegressor, MhaMlpClassifier
```

2) What you can do with the `DataTransformer` class

We provide many scaler classes that you can select and combine to transform your data via the 
`DataTransformer` class. For example:

2.1) I want to scale data by `Loge` and then `Sqrt` and then `MinMax`:

```python
from metaperceptron import DataTransformer
import pandas as pd
from sklearn.model_selection import train_test_split

dataset = pd.read_csv('Position_Salaries.csv')
X = dataset.iloc[:, 1:5].values
y = dataset.iloc[:, 5].values
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

dt = DataTransformer(scaling_methods=("loge", "sqrt", "minmax"))
X_train_scaled = dt.fit_transform(X_train)
X_test_scaled = dt.transform(X_test)
```

2.2) I want to scale data by `YeoJohnson` and then `Standard`:

```python
from metaperceptron import DataTransformer
import pandas as pd
from sklearn.model_selection import train_test_split

dataset = pd.read_csv('Position_Salaries.csv')
X = dataset.iloc[:, 1:5].values
y = dataset.iloc[:, 5].values
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

dt = DataTransformer(scaling_methods=("yeo-johnson", "standard"))
X_train_scaled = dt.fit_transform(X_train)
X_test_scaled = dt.transform(X_test)
```

3) What you can do with the `Data` class
+ You can load your dataset into the Data class
+ You can split the dataset into train and test sets
+ You can scale the dataset without using the DataTransformer class
+ You can encode labels using LabelEncoder

```python
from metaperceptron import Data
import pandas as pd

dataset = pd.read_csv('Position_Salaries.csv')
X = dataset.iloc[:, 1:5].values
y = dataset.iloc[:, 5].values

data = Data(X, y, name="position_salaries")

#### Split dataset into train and test set
data.split_train_test(test_size=0.2, shuffle=True, random_state=100, inplace=True)

#### Feature Scaling
data.X_train, scaler_X = data.scale(data.X_train, scaling_methods=("standard", "sqrt", "minmax"))
data.X_test = scaler_X.transform(data.X_test)

data.y_train, scaler_y = data.encode_label(data.y_train)  # This is for classification problem only
data.y_test = scaler_y.transform(data.y_test)
```

4) What you can do with all model classes
+ Define the model
+ Use the provided functions to train, predict, and evaluate the model

```python
from metaperceptron import MlpRegressor, MlpClassifier, MhaMlpRegressor, MhaMlpClassifier

## Use standard MLP model for regression problem
regressor = MlpRegressor(hidden_size=50, act1_name="tanh", act2_name="sigmoid", obj_name="MSE",
                 max_epochs=1000, batch_size=32, optimizer="SGD", optimizer_paras=None, verbose=False)

## Use standard MLP model for classification problem 
classifier = MlpClassifier(hidden_size=50, act1_name="tanh", act2_name="sigmoid", obj_name="NLLL",
                 max_epochs=1000, batch_size=32, optimizer="SGD", optimizer_paras=None, verbose=False)

## Use Metaheuristic-optimized MLP model for regression problem
print(MhaMlpRegressor.SUPPORTED_OPTIMIZERS)
print(MhaMlpRegressor.SUPPORTED_REG_OBJECTIVES)

opt_paras = {"name": "WOA", "epoch": 100, "pop_size": 30}
regressor = MhaMlpRegressor(hidden_size=50, act1_name="tanh", act2_name="sigmoid",
                 obj_name="MSE", optimizer="OriginalWOA", optimizer_paras=opt_paras, verbose=True)

## Use Metaheuristic-optimized MLP model for classification problem
print(MhaMlpClassifier.SUPPORTED_OPTIMIZERS)
print(MhaMlpClassifier.SUPPORTED_CLS_OBJECTIVES)

opt_paras = {"name": "WOA", "epoch": 100, "pop_size": 30}
classifier = MhaMlpClassifier(hidden_size=50, act1_name="tanh", act2_name="softmax",
                 obj_name="CEL", optimizer="OriginalWOA", optimizer_paras=opt_paras, verbose=True)
```
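
For context, here is a minimal end-to-end sketch that trains the metaheuristic-optimized classifier defined above on a small toy dataset, reusing only the calls shown elsewhere in this README; the iris data, the scaling choices, and the reduced `epoch`/`pop_size` values are illustrative assumptions, not library defaults.

```python
# Minimal end-to-end sketch: train a metaheuristic-optimized MLP classifier on iris.
# The dataset choice and the small epoch/pop_size values are illustrative only.
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from metaperceptron import Data, MhaMlpClassifier

X, y = load_iris(return_X_y=True)
data = Data(X, y, name="iris")
data.split_train_test(test_size=0.2, shuffle=True, random_state=42, inplace=True)

# Scale features and encode labels, as in the Data example above.
data.X_train, scaler_X = data.scale(data.X_train, scaling_methods=("standard", "minmax"))
data.X_test = scaler_X.transform(data.X_test)
data.y_train, scaler_y = data.encode_label(data.y_train)
data.y_test = scaler_y.transform(data.y_test)

opt_paras = {"name": "WOA", "epoch": 20, "pop_size": 20}
model = MhaMlpClassifier(hidden_size=30, act1_name="tanh", act2_name="softmax",
                         obj_name="CEL", optimizer="OriginalWOA", optimizer_paras=opt_paras, verbose=False)

model.fit(data.X_train, data.y_train)
y_pred = model.predict(data.X_test)
print(accuracy_score(data.y_test, y_pred))
```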

5) What you can do with a model object

```python
from metaperceptron import MlpRegressor, Data 

data = Data()       # Assuming you have prepared this object as shown above

model = MlpRegressor(hidden_size=50, act1_name="tanh", act2_name="sigmoid", obj_name="MSE",
                 max_epochs=1000, batch_size=32, optimizer="SGD", optimizer_paras=None, verbose=False)

## Train the model
model.fit(data.X_train, data.y_train)

## Predicting a new result
y_pred = model.predict(data.X_test)

## Calculate metrics using score or scores functions.
print(model.score(data.X_test, data.y_test, method="MAE"))
print(model.scores(data.X_test, data.y_test, list_methods=["MAPE", "NNSE", "KGE", "MASE", "R2", "R", "R2S"]))

## Calculate metrics using evaluate function
print(model.evaluate(data.y_test, y_pred, list_metrics=("MSE", "RMSE", "MAPE", "NSE")))

## Save performance metrics to csv file
model.save_evaluation_metrics(data.y_test, y_pred, list_metrics=("RMSE", "MAE"), save_path="history", filename="metrics.csv")

## Save training loss to csv file
model.save_training_loss(save_path="history", filename="loss.csv")

## Save predicted label
model.save_y_predicted(X=data.X_test, y_true=data.y_test, save_path="history", filename="y_predicted.csv")

## Save model
model.save_model(save_path="history", filename="traditional_mlp.pkl")

## Load model 
trained_model = MlpRegressor.load_model(load_path="history", filename="traditional_mlp.pkl")
```
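
Because the estimators are advertised as Scikit-Learn compatible, they can in principle also be dropped into standard scikit-learn tooling. The sketch below cross-validates an `MlpRegressor` inside a `Pipeline`; it is a minimal sketch assuming the estimator supports cloning via `get_params`/`set_params`, and the dataset, scaler, and optimizer choices are illustrative only.

```python
# Minimal sketch: using MlpRegressor inside a scikit-learn Pipeline with cross-validation.
# Assumes the estimator supports cloning via the standard scikit-learn API.
from sklearn.datasets import load_diabetes
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from metaperceptron import MlpRegressor

X, y = load_diabetes(return_X_y=True)

pipe = Pipeline([
    ("scaler", StandardScaler()),
    ("mlp", MlpRegressor(hidden_size=50, act1_name="tanh", act2_name="sigmoid", obj_name="MSE",
                         max_epochs=100, batch_size=32, optimizer="Adam", optimizer_paras=None, verbose=False)),
])

# 3-fold cross-validation with negated mean squared error as the score.
scores = cross_val_score(pipe, X, y, cv=3, scoring="neg_mean_squared_error")
print(scores.mean())
```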

# Support (questions, problems)

### Official Links 

* Official source code repo: https://github.com/thieu1995/MetaPerceptron
* Official document: https://metaperceptron.readthedocs.io/
* Download releases: https://pypi.org/project/metaperceptron/
* Issue tracker: https://github.com/thieu1995/MetaPerceptron/issues
* Notable changes log: https://github.com/thieu1995/MetaPerceptron/blob/master/ChangeLog.md
* Official chat group: https://t.me/+fRVCJGuGJg1mNDg1

* This project is also related to our other projects on "optimization" and "machine learning"; check them out here:
    * https://github.com/thieu1995/mealpy
    * https://github.com/thieu1995/metaheuristics
    * https://github.com/thieu1995/opfunu
    * https://github.com/thieu1995/enoppy
    * https://github.com/thieu1995/permetrics
    * https://github.com/thieu1995/MetaCluster
    * https://github.com/thieu1995/pfevaluator
    * https://github.com/thieu1995/IntelELM
    * https://github.com/thieu1995/reflame
    * https://github.com/aiir-team

            
