mlms


Name: mlms
Version: 0.11.0
Home page: https://github.com/HigherHoopern/ML_ModelSelection
Summary: This package is to facilitate model selection in Machine Learning.
Upload time: 2023-04-22 15:48:48
Docs URL: None
Author: Jason Lu
Requires Python: >=3.7,<4.0
License: MIT
Keywords: machine learning, model selection
Requirements: No requirements were recorded.
Travis-CI: No Travis.
Coveralls test coverage: No coveralls.
# Machine Learning Model Selection

This package aims to facilitate model selection in machine learning. ML practitioners often struggle to decide on the most appropriate model before optimization, because tuning hyperparameters is time-consuming and computationally demanding. To simplify the process, this package trains several machine learning models with their default hyperparameters and compares their performance, helping users determine the most suitable model.
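
For context, the kind of comparison this package automates looks roughly like the plain scikit-learn loop below. This is an illustrative sketch only, not the package's internal implementation; the candidate models and metric are arbitrary examples.

```python
# Illustrative sketch of the comparison this package automates
# (not the package's internal implementation).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Compare a few candidate models with their default hyperparameters.
candidates = {
    "LGR": LogisticRegression(max_iter=1000),
    "CART": DecisionTreeClassifier(),
}
for name, model in candidates.items():
    scores = cross_val_score(model, X, y, scoring="accuracy", cv=10)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```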

# Usage

`pip install mlms -U`

[pypi package](https://pypi.org/project/mlms/)

Then import the selection functions and use them like this:

`from mlms.ModelSelection import Select_Regressor, Select_Classifier`

Select the models to compare by passing a list of model abbreviations (see the lists below), for example:

`MODELS = ['LGR', 'AB', 'CART', 'GBC', 'XGBC', 'RFC', 'ETC', 'KNN', 'NB', 'SVC', 'MLP', 'SGDC', 'GPC', 'PAC']`

`df_performance, fitted_classifiers = Select_Classifier('accuracy', 10, X_train, X_test, y_train, y_test, MODELS)`

`df_performance, fitted_regressors = Select_Regressor('neg_mean_squared_error', 10, X_train, X_test, y_train, y_test)`
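
A minimal end-to-end sketch of classifier selection, assuming the `Select_Classifier` call shape shown above; the second argument (`10`) is assumed to be the number of cross-validation folds, and the return values are assumed to be a performance DataFrame and the fitted estimators.

```python
# Hypothetical end-to-end example; the argument order follows the calls above,
# and the second argument (10) is assumed to be the number of CV folds.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

from mlms.ModelSelection import Select_Classifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

MODELS = ['LGR', 'CART', 'RFC', 'KNN', 'SVC']  # a subset of the abbreviations below

df_performance, fitted_classifiers = Select_Classifier(
    'accuracy', 10, X_train, X_test, y_train, y_test, MODELS
)
print(df_performance)
```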

For classifiers, the performance metric can be set to `accuracy`, `f1_score`, `precision`, `recall`, `roc_auc`, `balanced_accuracy_score`, and so on. The available classifiers are listed below:

* `('LGR', LogisticRegression(n_jobs=-1))`
* `('AB', AdaBoostClassifier())`
* `('CART', DecisionTreeClassifier())`
* `('GBC', GradientBoostingClassifier())`
* `('XGBC', XGBClassifier())`
* `('RFC', RandomForestClassifier())`
* `('ETC', ExtraTreeClassifier())`
* `('KNN', KNeighborsClassifier(n_jobs=-1))`
* `('NB', GaussianNB())`
* `('SVC', SVC())`
* `('MLP', MLPClassifier())`
* `('SGDC', SGDClassifier(n_jobs=-1))`
* `('GPC', GaussianProcessClassifier(n_jobs=-1))`
* `('PAC', PassiveAggressiveClassifier(n_jobs=-1))`

(The charts below show a classifier selection example on the Iris dataset.)

![1679444303986](image/README/1679444303986.png)

![1679443565646](image/README/1679443565646.png)

![1679443664816](image/README/1679443664816.png)

For regressors, the performance metric can be set to `r2_score`, `neg_mean_squared_error`, and so on. The available regressors are listed below; a usage sketch follows the chart.

- `('KNN', KNeighborsRegressor())`
- `('CART', DecisionTreeRegressor())`
- `('SVR', SVR())`
- `('MLP', MLPRegressor())`
- `('ABR', AdaBoostRegressor())`
- `('GBR', GradientBoostingRegressor())`
- `('XGB', XGBRegressor())`
- `('RFR', RandomForestRegressor())`
- `('ETR', ExtraTreesRegressor())`

![1679487197758](image/README/1679487197758.png)
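
A corresponding sketch for regression, assuming `Select_Regressor` mirrors the `Select_Classifier` call shape shown earlier (only the call in the Usage section is documented, so this is an assumption):

```python
# Hypothetical regression example; assumes Select_Regressor mirrors the
# Select_Classifier call shape shown earlier.
from sklearn.datasets import load_diabetes
from sklearn.model_selection import train_test_split

from mlms.ModelSelection import Select_Regressor

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Omitting the model list, as in the earlier example call, is assumed to
# compare all available regressors.
df_performance, fitted_regressors = Select_Regressor(
    'neg_mean_squared_error', 10, X_train, X_test, y_train, y_test
)
print(df_performance)
```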

Additionally, this package also allows users to plot ROC curves:

`from mlms.plot_roc_curve import Multiclass_ROC_Curve, Binary_ROC_Curve`

`Multiclass_ROC_Curve(X_test, y_test, fitted_model, chart_title: str)`

![1679785680784](image/README/1679785680784.png)

`Binary_ROC_Curve(y_true, y_pred, chart_name: str)`
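
A hedged sketch of a binary ROC plot, assuming `Binary_ROC_Curve` accepts predicted scores for the positive class (the expected form of `y_pred` is not documented above; if it expects hard labels, pass `model.predict(X_test)` instead):

```python
# Hypothetical binary ROC example; assumes Binary_ROC_Curve accepts scores
# for the positive class.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

from mlms.plot_roc_curve import Binary_ROC_Curve

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Fit a simple scaled logistic regression and score the test set.
model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X_train, y_train)
y_score = model.predict_proba(X_test)[:, 1]

Binary_ROC_Curve(y_test, y_score, 'Logistic Regression ROC')
```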

[GitHub](https://github.com/HigherHoopern/ML_ModelSelection)


            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/HigherHoopern/ML_ModelSelection",
    "name": "mlms",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.7,<4.0",
    "maintainer_email": "",
    "keywords": "Machine Learning,Model Selection",
    "author": "Jason Lu",
    "author_email": "luzhenxian@hotmail.com",
    "download_url": "https://files.pythonhosted.org/packages/79/a4/eb645d548787f4ce44e60b00e177e4543db7e0948c7b5a43b104319687dd/mlms-0.11.0.tar.gz",
    "platform": null,
    "description": "# Machine Learning Model Selection\n\nThis package aims to facilitate model selection in Machine Learning. It is a common issue that ML practitioners often struggle to decide on the most appropriate model prior to optimization, as tuning hyperparameters can be time-consuming and computationally demanding. To simplify the process, this package enables users to train several machine learning models using their default hyperparameters and compare their performance, helping them determine the most suitable model to select.\n\n# Usage\n\n`pip install mlms -U`\n\n[pypi package](https://pypi.org/project/mlms/)\n\nThen instantiate and use it like this:\n\n`from mlms.ModelSelection import Select_Regressor, Select_Classifier`\n\nSelect some models to tune, this list should be the abbreviation of models as below, for example\n\n`MODELS = ['LGR', 'AB', 'CART', 'GBC', 'XGBC', 'RFC', 'ETC', 'KNN', 'NB', 'SVC', 'MLP', 'SGDC', 'GPC', 'PAC']`\n\n`df_performance, fitted_classifiers = Select_Classifier('accuracy', 10, X_train, X_test, y_train, y_test, MODELS)`\n\n`df_performance, fitted_regressors = Select_Classifier('neg_mean_squared_erro', 10, X_train, X_test, y_train, y_test)`\n\nFor classifiers, the performance can set as `accuracy` , `'f1_score` , `precision`, `recall`, `roc_auc` , `balanced_accuracy_score` and so on. Available classifiers are below\n\n* `('LGR', LogisticRegression(n_jobs=-1))`,\n* `('AB', AdaBoostClassifier())`,\n* `('CART', DecisionTreeClassifier())`,\n* `('GBC', GradientBoostingClassifier())`,\n* `('XGBC', XGBClassifier())`,\n* `('RFC', RandomForestClassifier())`,\n* `('ETC', ExtraTreeClassifier())`,\n* `('KNN', KNeighborsClassifier(n_jobs=-1))`,\n* `('NB', GaussianNB())`,\n* `('SVC', SVC())`,\n* `('MLP', MLPClassifier()),`\n* `('SGDC', SGDClassifier(n_jobs=-1)),`\n* `('GPC', GaussianProcessClassifier(n_jobs=-1)),`\n* `('PAC', PassiveAggressiveClassifier(n_jobs=-1))`\n\n(The charts is an classifier selection example using Iris dataset)\n\n![1679444303986](image/README/1679444303986.png)\n\n![1679443565646](image/README/1679443565646.png)\n\n![1679443664816](image/README/1679443664816.png)\n\nFor regressors, the performance can set as `r2_score`, `neg_mean_squared_error` and so on. Available regressors are below:\n\n- `('KNN', KNeighborsRegressor())`,\n- `('CART', DecisionTreeRegressor())`,\n- `('SVR', SVR()),`\n- `('MLP', MLPRegressor())`,\n- `('ABR', AdaBoostRegressor())`,\n- `('GBR', GradientBoostingRegressor())`,\n- `('XGB', XGBRegressor())`,\n- `('RFR', RandomForestRegressor())`,\n- `('ETR', ExtraTreesRegressor())`\n\n![1679487197758](image/README/1679487197758.png)\n\nAdditonally, this package also alow users to plot ROC_Curve\n\n`from mlms.plot_roc_curve import Multiclass_ROC_Curve, Binary_ROC_Curve`\n\n`Multiclass_ROC_Curve(X_test, y_test, fitted_model, chart_title:str)`\n\n![1679785680784](image/README/1679785680784.png)\n\n`Binary_ROC_Curve(y_true, y_pred,chart_name:str)`\n\n[GitHub](https://github.com/HigherHoopern/ML_ModelSelection)\n\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "This package is to facilitate model selection in Machine Learning.",
    "version": "0.11.0",
    "split_keywords": [
        "machine learning",
        "model selection"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "e1b0880bd8403a43ce0800938831ae3d72bc7b4b0388098b1bb81afae0cbe363",
                "md5": "42424258b2d54231bd643762109f84eb",
                "sha256": "6bebe6891cbcd6b5000e766efc9e26880b53e02fd56b342b8037a7383a8033ac"
            },
            "downloads": -1,
            "filename": "mlms-0.11.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "42424258b2d54231bd643762109f84eb",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.7,<4.0",
            "size": 8914,
            "upload_time": "2023-04-22T15:48:46",
            "upload_time_iso_8601": "2023-04-22T15:48:46.612138Z",
            "url": "https://files.pythonhosted.org/packages/e1/b0/880bd8403a43ce0800938831ae3d72bc7b4b0388098b1bb81afae0cbe363/mlms-0.11.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "79a4eb645d548787f4ce44e60b00e177e4543db7e0948c7b5a43b104319687dd",
                "md5": "de5316a5cee81f40254d9958ed55ec6d",
                "sha256": "e793953d333e9d7ed61c84c5470893705d200fd998eba91a2fd75486152b1039"
            },
            "downloads": -1,
            "filename": "mlms-0.11.0.tar.gz",
            "has_sig": false,
            "md5_digest": "de5316a5cee81f40254d9958ed55ec6d",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.7,<4.0",
            "size": 6442,
            "upload_time": "2023-04-22T15:48:48",
            "upload_time_iso_8601": "2023-04-22T15:48:48.250941Z",
            "url": "https://files.pythonhosted.org/packages/79/a4/eb645d548787f4ce44e60b00e177e4543db7e0948c7b5a43b104319687dd/mlms-0.11.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2023-04-22 15:48:48",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "github_user": "HigherHoopern",
    "github_project": "ML_ModelSelection",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "lcname": "mlms"
}
        