mafese

* Version: 0.2.0
* Home page: https://github.com/thieu1995/mafese
* Summary: Feature Selection using Metaheuristics Made Easy: Open Source MAFESE Library in Python
* Upload time: 2024-06-12 15:47:19
* Author: Thieu
* Requires Python: >=3.7
* License: GPLv3
* Keywords: engineering optimization problems, mathematical optimization, feature selection, classification problem, feature selector, dimensionality reduction, subset selection, wrapper methods, embedded methods, mutual information, correlation-based feature selection, recursive feature selection, principal component analysis, pca, lasso regularization, ridge regularization, genetic algorithm (ga), particle swarm optimization (pso), ant colony optimization (aco), differential evolution (de), simulated annealing, grey wolf optimizer (gwo), whale optimization algorithm (woa), confusion matrix, recall, precision, accuracy, k-nearest neighbors, random forest, support vector machine, pearson correlation coefficient (pcc), spearman correlation coefficient (scc), relief, relief-f, multi-objectives optimization problems, stochastic optimization, global optimization, convergence analysis, search space exploration, local search, computational intelligence, robust optimization, performance analysis, intelligent optimization, simulations
            
<p align="center">
<img style="max-width:100%;" 
src="https://thieu1995.github.io/post/2023-08/mafese-02.png" 
alt="MAFESE"/>
</p>

---

[![GitHub release](https://img.shields.io/badge/release-0.2.0-yellow.svg)](https://github.com/thieu1995/mafese/releases)
[![Wheel](https://img.shields.io/pypi/wheel/mafese.svg)](https://pypi.python.org/pypi/mafese) 
[![PyPI version](https://badge.fury.io/py/mafese.svg)](https://badge.fury.io/py/mafese)
![PyPI - Python Version](https://img.shields.io/pypi/pyversions/mafese.svg)
![PyPI - Status](https://img.shields.io/pypi/status/mafese.svg)
![PyPI - Downloads](https://img.shields.io/pypi/dm/mafese.svg)
[![Downloads](https://static.pepy.tech/badge/mafese)](https://pepy.tech/project/mafese)
[![Tests & Publishes to PyPI](https://github.com/thieu1995/mafese/actions/workflows/publish-package.yaml/badge.svg)](https://github.com/thieu1995/mafese/actions/workflows/publish-package.yaml)
![GitHub Release Date](https://img.shields.io/github/release-date/thieu1995/mafese.svg)
[![Documentation Status](https://readthedocs.org/projects/mafese/badge/?version=latest)](https://mafese.readthedocs.io/en/latest/?badge=latest)
[![Chat](https://img.shields.io/badge/Chat-on%20Telegram-blue)](https://t.me/+fRVCJGuGJg1mNDg1)
![GitHub contributors](https://img.shields.io/github/contributors/thieu1995/mafese.svg)
[![GitTutorial](https://img.shields.io/badge/PR-Welcome-%23FF8300.svg?)](https://git-scm.com/book/en/v2/GitHub-Contributing-to-a-Project)
[![DOI](https://zenodo.org/badge/545209353.svg)](https://doi.org/10.5281/zenodo.7969042)
[![License: GPL v3](https://img.shields.io/badge/License-GPLv3-blue.svg)](https://www.gnu.org/licenses/gpl-3.0)


MAFESE (Metaheuristic Algorithms for FEature SElection) is the largest Python library for solving the feature selection (FS) 
problem with metaheuristic algorithms.

* **🆓 Free software:** GNU General Public License (GPL) V3 license
* **🔄 Total Wrapper-based (Metaheuristic Algorithms):** > 200 methods
* **📊 Total Filter-based (Statistical-based):** > 15 methods
* **🌳 Total Embedded-based (Tree and Lasso):** > 10 methods
* **🔍 Total Unsupervised-based:** ≥ 4 methods
* **📂 Total datasets:** ≥ 54 (47 classification and 7 regression)
* **📈 Total performance metrics:** ≥ 61 (45 regressions and 16 classifications)
* **⚙️ Total objective functions (as fitness functions):** ≥ 61 (45 regressions and 16 classifications)
* **📖 Documentation:** [https://mafese.readthedocs.io/en/latest/](https://mafese.readthedocs.io/en/latest/)
* **🐍 Python versions:** ≥ 3.7.x
* **📦 Dependencies:** `numpy`, `scipy`, `scikit-learn`, `pandas`, `mealpy`, `permetrics`, `plotly`, `kaleido`


## Citation Request

Please include these citations if you plan to use this incredible library:


```bibtex

@article{van2024feature,
  title={Feature selection using metaheuristics made easy: Open source MAFESE library in Python},
  author={Van Thieu, Nguyen and Nguyen, Ngoc Hung and Heidari, Ali Asghar},
  journal={Future Generation Computer Systems},
  year={2024},
  publisher={Elsevier},
  doi={10.1016/j.future.2024.06.006},
  url={https://doi.org/10.1016/j.future.2024.06.006},
}

@article{van2023mealpy,
  title={MEALPY: An open-source library for latest meta-heuristic algorithms in Python},
  author={Van Thieu, Nguyen and Mirjalili, Seyedali},
  journal={Journal of Systems Architecture},
  year={2023},
  publisher={Elsevier},
  doi={10.1016/j.sysarc.2023.102871}
}

```

# Usage

## Goals

- **Our library provides a comprehensive set of state-of-the-art feature selection methods**:
  + Unsupervised-based FS
  + Filter-based FS
  + Embedded-based FS
    + Regularization (Lasso-based)
    + Tree-based methods
  + Wrapper-based FS
    + Sequential-based: forward and backward
    + Recursive-based
    + MHA-based: Metaheuristic Algorithms

## Installation

* Install the [current PyPI release](https://pypi.python.org/pypi/mafese):
```sh 
$ pip install mafese
```
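
* Upgrade an existing installation to the latest release:
```sh 
$ pip install --upgrade mafese
```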

After installation, you can import MAFESE and check its installed version:

```sh
$ python
>>> import mafese
>>> mafese.__version__
```



<details><summary><h2>Lib's structure</h2></summary>

```code
docs
examples
mafese
    data/
        cls/
            aggregation.csv
            Arrhythmia.csv
            ...
        reg/
            boston-housing.csv
            diabetes.csv
            ...
    wrapper/
        mha.py
        recursive.py
        sequential.py
    embedded/
        lasso.py
        tree.py
    filter.py
    unsupervised.py
    utils/
        correlation.py
        data_loader.py
        encoder.py
        estimator.py
        mealpy_util.py
        transfer.py
        validator.py
    __init__.py
    selector.py
README.md
setup.py
```

</details>


## Examples

Let's go through some examples.

### 1. First, load a dataset. You can use one of the datasets bundled with MAFESE:

```python
# Load available dataset from MAFESE
from mafese import get_dataset

# Try an unknown dataset name
get_dataset("unknown")
# Enter: 1      -> this will list all available datasets

data = get_dataset("Arrhythmia")
```

* Or you can load your own dataset 

```python
import pandas as pd
from mafese import Data

# load X and y
# NOTE mafese accepts numpy arrays only, hence the .values attribute
dataset = pd.read_csv('examples/dataset.csv', index_col=0).values
X, y = dataset[:, 0:-1], dataset[:, -1]
data = Data(X, y)
```

### 2. Next, prepare your dataset


#### 2.1 Split dataset into train and test set

```python
data.split_train_test(test_size=0.2, inplace=True)
print(data.X_train[:2].shape)
print(data.y_train[:2].shape)
```

#### 2.2 Feature Scaling

```python
data.X_train, scaler_X = data.scale(data.X_train, scaling_methods=("standard", "minmax"))
data.X_test = scaler_X.transform(data.X_test)

data.y_train, scaler_y = data.encode_label(data.y_train)   # This is for classification problem only
data.y_test = scaler_y.transform(data.y_test)
```

### 3. Next, choose the Selector you want to use by importing it:

```python
## First way (recommended)
from mafese import UnsupervisedSelector, FilterSelector, LassoSelector, TreeSelector
from mafese import SequentialSelector, RecursiveSelector, MhaSelector, MultiMhaSelector

## Second way
from mafese.unsupervised import UnsupervisedSelector
from mafese.filter import FilterSelector
from mafese.embedded.lasso import LassoSelector
from mafese.embedded.tree import TreeSelector
from mafese.wrapper.sequential import SequentialSelector
from mafese.wrapper.recursive import RecursiveSelector
from mafese.wrapper.mha import MhaSelector, MultiMhaSelector
```

### 4. Next, create an instance of the Selector class you want to use:

```python
feat_selector = UnsupervisedSelector(problem='classification', method='DR', n_features=5)

feat_selector = FilterSelector(problem='classification', method='SPEARMAN', n_features=5)

feat_selector = LassoSelector(problem="classification", estimator="lasso", estimator_paras={"alpha": 0.1})

feat_selector = TreeSelector(problem="classification", estimator="tree")

feat_selector = SequentialSelector(problem="classification", estimator="knn", n_features=3, direction="forward")

feat_selector = RecursiveSelector(problem="classification", estimator="rf", n_features=5)

feat_selector = MhaSelector(problem="classification", estimator="knn",
                            optimizer="BaseGA", optimizer_paras=None,
                            transfer_func="vstf_01", obj_name="AS")

list_optimizers = ("OriginalWOA", "OriginalGWO", "OriginalTLO", "OriginalGSKA")
list_paras = [{"epoch": 10, "pop_size": 30}, ]*4
feat_selector = MultiMhaSelector(problem="classification", estimator="knn",
                            list_optimizers=list_optimizers, list_optimizer_paras=list_paras,
                            transfer_func="vstf_01", obj_name="AS")
```

### 5. Fit the model to X_train and y_train

```python
feat_selector.fit(data.X_train, data.y_train)
```

### 6. Get the information

```python
# check selected features - True (or 1) is selected, False (or 0) is not selected
print(feat_selector.selected_feature_masks)
print(feat_selector.selected_feature_solution)

# check the index of selected features
print(feat_selector.selected_feature_indexes)
```

### 7. Call transform() on the X that you want to filter down to the selected features

```python
X_train_selected = feat_selector.transform(data.X_train)
X_test_selected = feat_selector.transform(data.X_test)
```

### 8. You can build your own evaluation method or use ours

If you use our method, don't transform the data.
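
If you build your own evaluation instead, you can train any estimator on the transformed features from step 7. Here is a minimal sketch using scikit-learn; the `KNeighborsClassifier` and `accuracy_score` used below are scikit-learn APIs chosen for illustration, not part of MAFESE:

```python
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Train any estimator on the selected features produced by transform() in step 7
model = KNeighborsClassifier(n_neighbors=5)
model.fit(X_train_selected, data.y_train)

# Score it on the transformed test set
y_pred = model.predict(X_test_selected)
print(accuracy_score(data.y_test, y_pred))
```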

#### 8.1 You can use a different estimator than the one used in the feature selection process
```python
feat_selector.evaluate(estimator="svm", data=data, metrics=["AS", "PS", "RS"])

## Here, we pass the data object that was loaded above, so it contains both the train and the test set.
## The results will look like this:
## {'AS_train': 0.77176, 'PS_train': 0.54177, 'RS_train': 0.6205, 'AS_test': 0.72636, 'PS_test': 0.34628, 'RS_test': 0.52747}
```

#### 8.2 You can use the same estimator as in the feature selection process
```python
# Passing estimator=None re-uses the estimator from the feature selection step
feat_selector.evaluate(estimator=None, data=data, metrics=["AS", "PS", "RS"])
```
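
Putting the steps above together, here is a minimal end-to-end sketch. It re-uses only the calls shown in this README; the `optimizer_paras` values and the assumption that `evaluate()` returns the metrics dictionary shown in 8.1 are illustrative:

```python
from mafese import get_dataset, MhaSelector

# 1) Load a bundled dataset and split it
data = get_dataset("Arrhythmia")
data.split_train_test(test_size=0.2, inplace=True)

# 2) Scale features and encode labels (classification only)
data.X_train, scaler_X = data.scale(data.X_train, scaling_methods=("standard", "minmax"))
data.X_test = scaler_X.transform(data.X_test)
data.y_train, scaler_y = data.encode_label(data.y_train)
data.y_test = scaler_y.transform(data.y_test)

# 3) Select features with a metaheuristic wrapper
feat_selector = MhaSelector(problem="classification", estimator="knn",
                            optimizer="BaseGA", optimizer_paras={"epoch": 10, "pop_size": 30},
                            transfer_func="vstf_01", obj_name="AS")
feat_selector.fit(data.X_train, data.y_train)
print(feat_selector.selected_feature_indexes)

# 4) Evaluate the selected features with a different estimator
print(feat_selector.evaluate(estimator="svm", data=data, metrics=["AS", "PS", "RS"]))
```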

For more usage examples, please look at the [examples](/examples) folder.


# Support

## Some popular questions

1. Where do I find the supported metrics, such as ["AS", "PS", "RS"] above? What are they?

You can find them here: https://github.com/thieu1995/permetrics or use this:

```python
from mafese import MhaSelector 

print(MhaSelector.SUPPORTED_REGRESSION_METRICS)
print(MhaSelector.SUPPORTED_CLASSIFICATION_METRICS)
```

2. How do I know which estimators and methods my Selector supports?

```python
print(feat_selector.SUPPORT) 
```
Or, better, read the documentation at: https://mafese.readthedocs.io/en/latest/

3. I got this type of error. How do I solve it?

```python
raise ValueError("Existed at least one new label in y_pred.")
ValueError: Existed at least one new label in y_pred.
```

> This occurs only when you are working on a classification problem with a small dataset that has many classes. For 
  instance, the "Zoo" dataset contains only 101 samples, but it has 7 classes. If you split the dataset into a 
  training and testing set with a ratio of around 80% - 20%, there is a chance that one or more classes may appear 
  in the testing set but not in the training set. As a result, when you calculate the performance metrics, you may 
  encounter this error. You cannot predict or assign new data to a new label because you have no knowledge about the 
  new label. There are several solutions to this problem.


+ 1st: Use the SMOTE method to address imbalanced data and ensure that all classes have the same number of samples.

```python
from imblearn.over_sampling import SMOTE
import pandas as pd
from mafese import Data

dataset = pd.read_csv('examples/dataset.csv', index_col=0).values
X, y = dataset[:, 0:-1], dataset[:, -1]

X_new, y_new = SMOTE().fit_resample(X, y)
data = Data(X_new, y_new)
```

+ 2nd: Use a different random_state value in the split_train_test() function.
```python
import pandas as pd 
from mafese import Data 

dataset = pd.read_csv('examples/dataset.csv', index_col=0).values
X, y = dataset[:, 0:-1], dataset[:, -1]
data = Data(X, y)
data.split_train_test(test_size=0.2, random_state=10)   # Try different random_state value 
```
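
+ 3rd: Split the data yourself with a stratified split so that every class appears in both sets. The sketch below uses scikit-learn's `train_test_split` with `stratify=y` (not a MAFESE API); you can then fit the selector on the resulting arrays directly:

```python
import pandas as pd
from sklearn.model_selection import train_test_split

dataset = pd.read_csv('examples/dataset.csv', index_col=0).values
X, y = dataset[:, 0:-1], dataset[:, -1]

# stratify=y keeps the class proportions in both splits, so no class ends up only in the
# test set (this requires every class to have at least two samples)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, stratify=y, random_state=42)

# Then fit the selector on these arrays directly, e.g. feat_selector.fit(X_train, y_train)
```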


<details><summary><h2>Official Links</h2></summary>

* Official source code repository: https://github.com/thieu1995/mafese
* Official document: https://mafese.readthedocs.io/
* Download releases: https://pypi.org/project/mafese/
* Issue tracker: https://github.com/thieu1995/mafese/issues
* Notable changes log: https://github.com/thieu1995/mafese/blob/master/ChangeLog.md
* Examples with different mealpy version: https://github.com/thieu1995/mafese/blob/master/examples.md
* Official chat group: https://t.me/+fRVCJGuGJg1mNDg1

* This project is also related to our other projects on optimization and machine learning; check them out here:
    * https://github.com/thieu1995/mealpy
    * https://github.com/thieu1995/metaheuristics
    * https://github.com/thieu1995/opfunu
    * https://github.com/thieu1995/enoppy
    * https://github.com/thieu1995/permetrics
    * https://github.com/thieu1995/MetaCluster
    * https://github.com/thieu1995/pfevaluator
    * https://github.com/aiir-team

</details>



<details><summary><h2>Related Documents</h2></summary>

1. https://neptune.ai/blog/feature-selection-methods
2. https://www.blog.trainindata.com/feature-selection-machine-learning-with-python/
3. https://github.com/LBBSoft/FeatureSelect
4. https://bmcbioinformatics.biomedcentral.com/articles/10.1186/s12859-019-2754-0
5. https://github.com/scikit-learn-contrib/boruta_py
6. https://elki-project.github.io/
7. https://sci2s.ugr.es/keel/index.php
8. https://archive.ics.uci.edu/datasets
9. https://python-charts.com/distribution/box-plot-plotly/
10. https://plotly.com/python/box-plots/?_ga=2.50659434.2126348639.1688086416-114197406.1688086416#box-plot-styling-mean--standard-deviation

</details>

            
