modelLab
========

:Name: modelLab
:Version: 0.3.4
:Summary: A library that automates model training by finding the best model for your data
:Author: Abhishek Kaddipudi
:Homepage: https://github.com/Abhishekkaddipudi/modelLab
:Keywords: automl, model, modelbuilder, modelLab
:Upload time: 2023-07-05 17:32:54
.. -*- mode: rst -*-

|Version|_ |PythonVersion|_

.. _Linkedin: https://www.linkedin.com/in/abhishek-kaddipudi-0b5183253
.. _GitHub: https://github.com/Abhishekkaddipudi


.. |PythonVersion| image:: https://img.shields.io/badge/python-3.8%20%7C%203.9%20%7C%203.10-blue
.. _PythonVersion: https://pypi.org/project/modelLab/

.. |Version| image:: https://img.shields.io/badge/Version-V0.3.4-blue
.. _Version: https://github.com/Abhishekkaddipudi/modelLab

.. |Unit_Test| image:: https://github.com/Abhishekkaddipudi/modelLab/actions/workflows/main.yml/badge.svg
.. _Unit_Test: https://github.com/Abhishekkaddipudi/modelLab

.. _Mail: mailto:abhishekkaddipudi123@gmail.com

**modelLab** is a library of machine learning models designed to
facilitate regression and classification tasks on a given dataset. It
covers a diverse range of models, evaluates each one, and returns the
resulting performance metrics in a Python dictionary.

PURPOSE OF THE PACKAGE
======================

-  The primary objective of the package is to offer a curated
   collection of well-known scikit-learn models, so that users can
   train and evaluate all of them with a single function call.
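
In practice, that single call amounts to fitting each estimator in a
dictionary and recording its score. The sketch below only illustrates
the idea; ``evaluate_all`` is a hypothetical helper, not the library's
actual implementation.

.. code:: python

   from collections import defaultdict

   from sklearn.datasets import make_classification
   from sklearn.linear_model import LogisticRegression
   from sklearn.metrics import accuracy_score
   from sklearn.model_selection import train_test_split
   from sklearn.tree import DecisionTreeClassifier

   def evaluate_all(X, y, models):
       """Fit each model on one train/test split and record a metric per model."""
       X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
       results = defaultdict(dict)
       for name, model in models.items():
           model.fit(X_train, y_train)
           results[name]['Accuracy'] = accuracy_score(y_test, model.predict(X_test))
       return results

   X, y = make_classification(n_samples=200, n_features=10, random_state=42)
   models = {'LogisticRegression': LogisticRegression(),
             'DecisionTreeClassifier': DecisionTreeClassifier()}
   results = evaluate_all(X, y, models)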

FEATURES
========

-  A collection of machine learning models

   -  **Classification Models**

      -  'LinearSVC'
      -  'SGDClassifier'
      -  'MLPClassifier'
      -  'Perceptron'
      -  'LogisticRegression'
      -  'LogisticRegressionCV'
      -  'SVC'
      -  'CalibratedClassifierCV'
      -  'PassiveAggressiveClassifier'
      -  'LabelPropagation'
      -  'LabelSpreading'
      -  'RandomForestClassifier'
      -  'GradientBoostingClassifier'
      -  'QuadraticDiscriminantAnalysis'
      -  'HistGradientBoostingClassifier'
      -  'RidgeClassifierCV'
      -  'RidgeClassifier'
      -  'AdaBoostClassifier'
      -  'ExtraTreesClassifier'
      -  'KNeighborsClassifier'
      -  'BaggingClassifier'
      -  'BernoulliNB'
      -  'LinearDiscriminantAnalysis'
      -  'GaussianNB'
      -  'NuSVC'
      -  'DecisionTreeClassifier'
      -  'NearestCentroid'
      -  'ExtraTreeClassifier'
      -  'DummyClassifier'

   -  **Regression Models**

      -  'SVR'
      -  'RandomForestRegressor'
      -  'ExtraTreesRegressor'
      -  'AdaBoostRegressor'
      -  'NuSVR'
      -  'GradientBoostingRegressor'
      -  'KNeighborsRegressor'
      -  'HuberRegressor'
      -  'RidgeCV'
      -  'BayesianRidge'
      -  'Ridge'
      -  'LinearRegression'
      -  'LarsCV'
      -  'MLPRegressor'
      -  'XGBRegressor'
      -  'CatBoostRegressor'
      -  'LGBMRegressor'

-  Can also be used with custom models.

GETTING STARTED 
===============

This package is available on PyPI and can be installed directly from the PyPI repository.

Dependencies
============

-  scikit-learn
-  xgboost
-  catboost
-  lightgbm

INSTALLATION
============

If you already have scikit-learn installed, the easiest way to install
modelLab is with ``pip``:

.. code:: bash

   pip install modelLab

USAGE
=====

.. code:: python

   >>> from modelLab import regressors, classifier
   >>> regressors(X, y, models=models, verbose=False, rets=True)   # X, y: feature matrix and target
   >>> classifier(X, y, models=models, verbose=False, rets=True)   # models: optional dict of estimators

Examples
========

-  Regression Problem

.. code:: python

   >>> from modelLab import regressors
   >>> from sklearn.datasets import fetch_california_housing
   >>> X, y = fetch_california_housing(return_X_y=True)
   >>> regressors(X, y, verbose=True)
   Model: SVR
   Adjusted R^2: -0.0249
   R^2: -0.0229
   MSE: 1.3768
   RMSE: 1.1734
   MAE: 0.8698

   Model: RandomForestRegressor
   Adjusted R^2: 0.8034
   R^2: 0.8038
   MSE: 0.2641
   RMSE: 0.5139
   MAE: 0.3364

   Model: ExtraTreesRegressor
   Adjusted R^2: 0.8102
   R^2: 0.8105
   MSE: 0.2550
   RMSE: 0.5050
   MAE: 0.3333

   Model: AdaBoostRegressor
   Adjusted R^2: 0.4563
   R^2: 0.4574
   MSE: 0.7304
   RMSE: 0.8546
   MAE: 0.7296

   Model: NuSVR
   Adjusted R^2: 0.0069
   R^2: 0.0088
   MSE: 1.3342
   RMSE: 1.1551
   MAE: 0.8803

   Model: GradientBoostingRegressor
   Adjusted R^2: 0.7753
   R^2: 0.7757
   MSE: 0.3019
   RMSE: 0.5494
   MAE: 0.3789

   Model: KNeighborsRegressor
   Adjusted R^2: 0.1435
   R^2: 0.1451
   MSE: 1.1506
   RMSE: 1.0727
   MAE: 0.8183

   Model: HuberRegressor
   Adjusted R^2: 0.3702
   R^2: 0.3714
   MSE: 0.8461
   RMSE: 0.9198
   MAE: 0.5800

   Model: RidgeCV
   Adjusted R^2: 0.5868
   R^2: 0.5876
   MSE: 0.5551
   RMSE: 0.7450
   MAE: 0.5423

   Model: BayesianRidge
   Adjusted R^2: 0.5868
   R^2: 0.5876
   MSE: 0.5551
   RMSE: 0.7451
   MAE: 0.5422

   Model: Ridge
   Adjusted R^2: 0.5867
   R^2: 0.5875
   MSE: 0.5552
   RMSE: 0.7451
   MAE: 0.5422

   Model: LinearRegression
   Adjusted R^2: 0.5867
   R^2: 0.5875
   MSE: 0.5552
   RMSE: 0.7451
   MAE: 0.5422

   Model: LarsCV
   Adjusted R^2: 0.5211
   R^2: 0.5220
   MSE: 0.6433
   RMSE: 0.8021
   MAE: 0.5524

   Model: MLPRegressor
   Adjusted R^2: -3.5120
   R^2: -3.5032
   MSE: 6.0613
   RMSE: 2.4620
   MAE: 1.7951

   Model: XGBRegressor
   Adjusted R^2: 0.8269
   R^2: 0.8272
   MSE: 0.2326
   RMSE: 0.4822
   MAE: 0.3195

   Model: CatBoostRegressor
   Adjusted R^2: 0.8461
   R^2: 0.8464
   MSE: 0.2068
   RMSE: 0.4547
   MAE: 0.3005

   Model: LGBMRegressor
   Adjusted R^2: 0.8319
   R^2: 0.8322
   MSE: 0.2259
   RMSE: 0.4753
   MAE: 0.3185
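
The Adjusted R^2 values in the listing above penalize R^2 for the
number of predictors. The library's internal computation is not shown
here, but the standard formula, for ``n`` samples and ``p`` features,
can be sketched as:

.. code:: python

   def adjusted_r2(r2, n_samples, n_features):
       """Standard adjusted R^2: 1 - (1 - R^2) * (n - 1) / (n - p - 1)."""
       return 1 - (1 - r2) * (n_samples - 1) / (n_samples - n_features - 1)

   # A perfect fit (R^2 = 1.0) is unaffected; any imperfect fit is
   # adjusted slightly downward as more features are added.
   score = adjusted_r2(0.5, 100, 10)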

-  Classification Problem

.. code:: python

   >>> from modelLab import classifier
   >>> from sklearn.datasets import load_iris
   >>> X, y = load_iris(return_X_y=True)
   >>> import warnings
   >>> warnings.filterwarnings('ignore')
   >>> classifier(X, y, verbose=True)
   Model: LinearSVC
   Accuracy: 0.9667
   Precision: 0.9694
   Recall: 0.9667
   F1 Score: 0.9667

   Model: SGDClassifier
   Accuracy: 0.9667
   Precision: 0.9694
   Recall: 0.9667
   F1 Score: 0.9661

   Model: MLPClassifier
   Accuracy: 1.0000
   Precision: 1.0000
   Recall: 1.0000
   F1 Score: 1.0000

   Model: Perceptron
   Accuracy: 0.8667
   Precision: 0.9022
   Recall: 0.8667
   F1 Score: 0.8626

   Model: LogisticRegression
   Accuracy: 0.9667
   Precision: 0.9694
   Recall: 0.9667
   F1 Score: 0.9667

   Model: SVC
   Accuracy: 0.9667
   Precision: 0.9694
   Recall: 0.9667
   F1 Score: 0.9667

   Model: CalibratedClassifierCV
   Accuracy: 0.9667
   Precision: 0.9694
   Recall: 0.9667
   F1 Score: 0.9667

   Model: PassiveAggressiveClassifier
   Accuracy: 0.9667
   Precision: 0.9694
   Recall: 0.9667
   F1 Score: 0.9667

   Model: LabelPropagation
   Accuracy: 0.9667
   Precision: 0.9694
   Recall: 0.9667
   F1 Score: 0.9667

   Model: LabelSpreading
   Accuracy: 0.9667
   Precision: 0.9694
   Recall: 0.9667
   F1 Score: 0.9667

   Model: RandomForestClassifier
   Accuracy: 0.9667
   Precision: 0.9694
   Recall: 0.9667
   F1 Score: 0.9667

   Model: GradientBoostingClassifier
   Accuracy: 0.9333
   Precision: 0.9436
   Recall: 0.9333
   F1 Score: 0.9331

   Model: QuadraticDiscriminantAnalysis
   Accuracy: 1.0000
   Precision: 1.0000
   Recall: 1.0000
   F1 Score: 1.0000

   Model: HistGradientBoostingClassifier
   Accuracy: 0.9000
   Precision: 0.9214
   Recall: 0.9000
   F1 Score: 0.8989

   Model: RidgeClassifierCV
   Accuracy: 0.8667
   Precision: 0.8754
   Recall: 0.8667
   F1 Score: 0.8662

   Model: RidgeClassifier
   Accuracy: 0.8667
   Precision: 0.8754
   Recall: 0.8667
   F1 Score: 0.8662

   Model: AdaBoostClassifier
   Accuracy: 0.9333
   Precision: 0.9436
   Recall: 0.9333
   F1 Score: 0.9331

   Model: ExtraTreesClassifier
   Accuracy: 0.9667
   Precision: 0.9694
   Recall: 0.9667
   F1 Score: 0.9667

   Model: KNeighborsClassifier
   Accuracy: 0.9667
   Precision: 0.9694
   Recall: 0.9667
   F1 Score: 0.9667

   Model: BaggingClassifier
   Accuracy: 0.9333
   Precision: 0.9436
   Recall: 0.9333
   F1 Score: 0.9331

   Model: BernoulliNB
   Accuracy: 0.2333
   Precision: 0.0544
   Recall: 0.2333
   F1 Score: 0.0883

   Model: LinearDiscriminantAnalysis
   Accuracy: 1.0000
   Precision: 1.0000
   Recall: 1.0000
   F1 Score: 1.0000

   Model: GaussianNB
   Accuracy: 0.9333
   Precision: 0.9333
   Recall: 0.9333
   F1 Score: 0.9333

   Model: NuSVC
   Accuracy: 0.9667
   Precision: 0.9694
   Recall: 0.9667
   F1 Score: 0.9667

   Model: DecisionTreeClassifier
   Accuracy: 0.9333
   Precision: 0.9436
   Recall: 0.9333
   F1 Score: 0.9331

   Model: NearestCentroid
   Accuracy: 0.9000
   Precision: 0.9025
   Recall: 0.9000
   F1 Score: 0.9000

   Model: ExtraTreeClassifier
   Accuracy: 0.9667
   Precision: 0.9694
   Recall: 0.9667
   F1 Score: 0.9667

   Model: DummyClassifier
   Accuracy: 0.2333
   Precision: 0.0544
   Recall: 0.2333
   F1 Score: 0.0883

-  Using Custom Models

.. code:: python

   >>> from sklearn.datasets import make_regression
   >>> from sklearn.linear_model import LinearRegression
   >>> from modelLab import regressors
   >>> X, y = make_regression(n_samples=100, n_features=10, random_state=42)
   >>> models = {'Linear Regression': LinearRegression()}
   >>> regressors(X, y, models=models, verbose=False, rets=True)
   defaultdict(<class 'dict'>, {'Linear Regression': {'Adjusted R^2': 1.0, 'R^2': 1.0, 'MSE': 3.097635893749451e-26, 'RMSE': 1.7600101970583725e-13, 'MAE': 1.4992451724538115e-13}})

.. code:: python

   >>> from sklearn.datasets import make_regression, make_classification
   >>> from sklearn.linear_model import LogisticRegression
   >>> from modelLab import classifier
   >>> X, y = make_classification(n_samples=100, n_features=10, random_state=42)
   >>> models = {'Logistic Regression': LogisticRegression()}  
   >>> classifier(X, y, models=models, verbose=False, rets=True)
   defaultdict(<class 'dict'>, {'Logistic Regression': {'Accuracy': 0.95, 'Precision': 0.9545454545454545, 'Recall': 0.95, 'F1 Score': 0.949874686716792}})
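
Because ``rets=True`` returns the metrics as a plain dictionary, the
best-performing model can be selected programmatically. A small sketch
against the structure shown above (the dictionary here is hard-coded
for illustration):

.. code:: python

   # Metrics dict in the same shape as the modelLab output above.
   results = {
       'Logistic Regression': {'Accuracy': 0.95, 'F1 Score': 0.9499},
       'DummyClassifier': {'Accuracy': 0.2333, 'F1 Score': 0.0883},
   }

   # Name of the model with the highest F1 score.
   best_name = max(results, key=lambda name: results[name]['F1 Score'])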


Contributor and Author
======================
   **Abhishek Kaddipudi**

   `Mail`_

   `Linkedin`_

   `GitHub`_
            
