lazypredict

- Name: lazypredict
- Version: 0.2.13
- Home page: https://github.com/shankarpandala/lazypredict
- Summary: Lazy Predict helps build a lot of basic models without much code and helps understand which models work better without any parameter tuning.
- Upload time: 2024-11-01 20:12:58
- Author: Shankar Rao Pandala
- Requires Python: >=3.8
- License: MIT license
- Keywords: lazypredict
- Requirements: click, scikit-learn, pandas, tqdm, joblib, lightgbm, xgboost, pytest-runner
            # Lazy Predict

[![image](https://img.shields.io/pypi/v/lazypredict.svg)](https://pypi.python.org/pypi/lazypredict)
[![Build Status](https://app.travis-ci.com/shankarpandala/lazypredict.svg)](https://app.travis-ci.com/shankarpandala/lazypredict)
[![Documentation Status](https://readthedocs.org/projects/lazypredict/badge/?version=latest)](https://lazypredict.readthedocs.io/en/latest/?badge=latest)
[![Downloads](https://pepy.tech/badge/lazypredict)](https://pepy.tech/project/lazypredict)
[![CodeFactor](https://www.codefactor.io/repository/github/shankarpandala/lazypredict/badge)](https://www.codefactor.io/repository/github/shankarpandala/lazypredict)

Lazy Predict helps build a lot of basic models without much code and
helps understand which models work better without any parameter tuning.

-   Free software: MIT license
-   Documentation: <https://lazypredict.readthedocs.io>.

# Installation

To install Lazy Predict:

    pip install lazypredict

# Usage

To use Lazy Predict in a project:

    import lazypredict

# Classification

Example:

    from lazypredict.Supervised import LazyClassifier
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split

    data = load_breast_cancer()
    X = data.data
    y = data.target

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=123)

    clf = LazyClassifier(verbose=0, ignore_warnings=True, custom_metric=None)
    models, predictions = clf.fit(X_train, X_test, y_train, y_test)

    print(models)


    | Model                          |   Accuracy |   Balanced Accuracy |   ROC AUC |   F1 Score |   Time Taken |
    |:-------------------------------|-----------:|--------------------:|----------:|-----------:|-------------:|
    | LinearSVC                      |   0.989474 |            0.987544 |  0.987544 |   0.989462 |    0.0150008 |
    | SGDClassifier                  |   0.989474 |            0.987544 |  0.987544 |   0.989462 |    0.0109992 |
    | MLPClassifier                  |   0.985965 |            0.986904 |  0.986904 |   0.985994 |    0.426     |
    | Perceptron                     |   0.985965 |            0.984797 |  0.984797 |   0.985965 |    0.0120046 |
    | LogisticRegression             |   0.985965 |            0.98269  |  0.98269  |   0.985934 |    0.0200036 |
    | LogisticRegressionCV           |   0.985965 |            0.98269  |  0.98269  |   0.985934 |    0.262997  |
    | SVC                            |   0.982456 |            0.979942 |  0.979942 |   0.982437 |    0.0140011 |
    | CalibratedClassifierCV         |   0.982456 |            0.975728 |  0.975728 |   0.982357 |    0.0350015 |
    | PassiveAggressiveClassifier    |   0.975439 |            0.974448 |  0.974448 |   0.975464 |    0.0130005 |
    | LabelPropagation               |   0.975439 |            0.974448 |  0.974448 |   0.975464 |    0.0429988 |
    | LabelSpreading                 |   0.975439 |            0.974448 |  0.974448 |   0.975464 |    0.0310006 |
    | RandomForestClassifier         |   0.97193  |            0.969594 |  0.969594 |   0.97193  |    0.033     |
    | GradientBoostingClassifier     |   0.97193  |            0.967486 |  0.967486 |   0.971869 |    0.166998  |
    | QuadraticDiscriminantAnalysis  |   0.964912 |            0.966206 |  0.966206 |   0.965052 |    0.0119994 |
    | HistGradientBoostingClassifier |   0.968421 |            0.964739 |  0.964739 |   0.968387 |    0.682003  |
    | RidgeClassifierCV              |   0.97193  |            0.963272 |  0.963272 |   0.971736 |    0.0130029 |
    | RidgeClassifier                |   0.968421 |            0.960525 |  0.960525 |   0.968242 |    0.0119977 |
    | AdaBoostClassifier             |   0.961404 |            0.959245 |  0.959245 |   0.961444 |    0.204998  |
    | ExtraTreesClassifier           |   0.961404 |            0.957138 |  0.957138 |   0.961362 |    0.0270066 |
    | KNeighborsClassifier           |   0.961404 |            0.95503  |  0.95503  |   0.961276 |    0.0560005 |
    | BaggingClassifier              |   0.947368 |            0.954577 |  0.954577 |   0.947882 |    0.0559971 |
    | BernoulliNB                    |   0.950877 |            0.951003 |  0.951003 |   0.951072 |    0.0169988 |
    | LinearDiscriminantAnalysis     |   0.961404 |            0.950816 |  0.950816 |   0.961089 |    0.0199995 |
    | GaussianNB                     |   0.954386 |            0.949536 |  0.949536 |   0.954337 |    0.0139935 |
    | NuSVC                          |   0.954386 |            0.943215 |  0.943215 |   0.954014 |    0.019989  |
    | DecisionTreeClassifier         |   0.936842 |            0.933693 |  0.933693 |   0.936971 |    0.0170023 |
    | NearestCentroid                |   0.947368 |            0.933506 |  0.933506 |   0.946801 |    0.0160074 |
    | ExtraTreeClassifier            |   0.922807 |            0.912168 |  0.912168 |   0.922462 |    0.0109999 |
    | CheckingClassifier             |   0.361404 |            0.5      |  0.5      |   0.191879 |    0.0170043 |
    | DummyClassifier                |   0.512281 |            0.489598 |  0.489598 |   0.518924 |    0.0119965 |
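
The `custom_metric` argument accepts a callable with the usual scikit-learn `(y_true, y_pred)` signature and evaluates it for every fitted model. A minimal sketch (not part of the original README; `matthews_corrcoef` is used here purely as an illustration):

    from lazypredict.Supervised import LazyClassifier
    from sklearn.datasets import load_breast_cancer
    from sklearn.metrics import matthews_corrcoef
    from sklearn.model_selection import train_test_split

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=123)

    # Any callable with the (y_true, y_pred) signature can be plugged in here;
    # its value is expected to show up as an extra column in the leaderboard.
    clf = LazyClassifier(verbose=0, ignore_warnings=True, custom_metric=matthews_corrcoef)
    models, predictions = clf.fit(X_train, X_test, y_train, y_test)

    # models is a pandas DataFrame indexed by model name, so it can be sliced
    # and sorted like any other DataFrame.
    print(models.head())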

# Regression

Example:

    from lazypredict.Supervised import LazyRegressor
    from sklearn import datasets
    from sklearn.utils import shuffle
    import numpy as np

    # Note: load_boston was removed in scikit-learn 1.2; on recent releases,
    # substitute another dataset (see the sketch after the results table below).
    boston = datasets.load_boston()
    X, y = shuffle(boston.data, boston.target, random_state=13)
    X = X.astype(np.float32)

    offset = int(X.shape[0] * 0.9)

    X_train, y_train = X[:offset], y[:offset]
    X_test, y_test = X[offset:], y[offset:]

    reg = LazyRegressor(verbose=0, ignore_warnings=False, custom_metric=None)
    models, predictions = reg.fit(X_train, X_test, y_train, y_test)

    print(models)


    | Model                         | Adjusted R-Squared | R-Squared |  RMSE | Time Taken |
    |:------------------------------|-------------------:|----------:|------:|-----------:|
    | SVR                           |               0.83 |      0.88 |  2.62 |       0.01 |
    | BaggingRegressor              |               0.83 |      0.88 |  2.63 |       0.03 |
    | NuSVR                         |               0.82 |      0.86 |  2.76 |       0.03 |
    | RandomForestRegressor         |               0.81 |      0.86 |  2.78 |       0.21 |
    | XGBRegressor                  |               0.81 |      0.86 |  2.79 |       0.06 |
    | GradientBoostingRegressor     |               0.81 |      0.86 |  2.84 |       0.11 |
    | ExtraTreesRegressor           |               0.79 |      0.84 |  2.98 |       0.12 |
    | AdaBoostRegressor             |               0.78 |      0.83 |  3.04 |       0.07 |
    | HistGradientBoostingRegressor |               0.77 |      0.83 |  3.06 |       0.17 |
    | PoissonRegressor              |               0.77 |      0.83 |  3.11 |       0.01 |
    | LGBMRegressor                 |               0.77 |      0.83 |  3.11 |       0.07 |
    | KNeighborsRegressor           |               0.77 |      0.83 |  3.12 |       0.01 |
    | DecisionTreeRegressor         |               0.65 |      0.74 |  3.79 |       0.01 |
    | MLPRegressor                  |               0.65 |      0.74 |  3.80 |       1.63 |
    | HuberRegressor                |               0.64 |      0.74 |  3.84 |       0.01 |
    | GammaRegressor                |               0.64 |      0.73 |  3.88 |       0.01 |
    | LinearSVR                     |               0.62 |      0.72 |  3.96 |       0.01 |
    | RidgeCV                       |               0.62 |      0.72 |  3.97 |       0.01 |
    | BayesianRidge                 |               0.62 |      0.72 |  3.97 |       0.01 |
    | Ridge                         |               0.62 |      0.72 |  3.97 |       0.01 |
    | TransformedTargetRegressor    |               0.62 |      0.72 |  3.97 |       0.01 |
    | LinearRegression              |               0.62 |      0.72 |  3.97 |       0.01 |
    | ElasticNetCV                  |               0.62 |      0.72 |  3.98 |       0.04 |
    | LassoCV                       |               0.62 |      0.72 |  3.98 |       0.06 |
    | LassoLarsIC                   |               0.62 |      0.72 |  3.98 |       0.01 |
    | LassoLarsCV                   |               0.62 |      0.72 |  3.98 |       0.02 |
    | Lars                          |               0.61 |      0.72 |  3.99 |       0.01 |
    | LarsCV                        |               0.61 |      0.71 |  4.02 |       0.04 |
    | SGDRegressor                  |               0.60 |      0.70 |  4.07 |       0.01 |
    | TweedieRegressor              |               0.59 |      0.70 |  4.12 |       0.01 |
    | GeneralizedLinearRegressor    |               0.59 |      0.70 |  4.12 |       0.01 |
    | ElasticNet                    |               0.58 |      0.69 |  4.16 |       0.01 |
    | Lasso                         |               0.54 |      0.66 |  4.35 |       0.02 |
    | RANSACRegressor               |               0.53 |      0.65 |  4.41 |       0.04 |
    | OrthogonalMatchingPursuitCV   |               0.45 |      0.59 |  4.78 |       0.02 |
    | PassiveAggressiveRegressor    |               0.37 |      0.54 |  5.09 |       0.01 |
    | GaussianProcessRegressor      |               0.23 |      0.43 |  5.65 |       0.03 |
    | OrthogonalMatchingPursuit     |               0.16 |      0.38 |  5.89 |       0.01 |
    | ExtraTreeRegressor            |               0.08 |      0.32 |  6.17 |       0.01 |
    | DummyRegressor                |              -0.38 |     -0.02 |  7.56 |       0.01 |
    | LassoLars                     |              -0.38 |     -0.02 |  7.56 |       0.01 |
    | KernelRidge                   |             -11.50 |     -8.25 | 22.74 |       0.01 |
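
Note that `load_boston` was removed in scikit-learn 1.2, so the example above only runs on older scikit-learn releases. A minimal sketch of the same workflow on the bundled diabetes dataset (chosen here purely for illustration; its scores will differ from the table above):

    from lazypredict.Supervised import LazyRegressor
    from sklearn.datasets import load_diabetes
    from sklearn.model_selection import train_test_split

    # load_diabetes ships with scikit-learn, so no external download or
    # deprecated loader is needed.
    X, y = load_diabetes(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    reg = LazyRegressor(verbose=0, ignore_warnings=True, custom_metric=None)
    models, predictions = reg.fit(X_train, X_test, y_train, y_test)

    print(models)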




# History

# 0.2.11 (2022-02-06)

-   Updated the default Python version to 3.9

# 0.2.10 (2022-02-06)

-   Fixed issue with older versions of scikit-learn
-   Reduced dependencies strictly to a few

# 0.2.8 (2021-02-06)

-   Removed StackingRegressor and CheckingClassifier.
-   Added provided_models method.
-   Added adjusted R-squared metric.
-   Added cardinality check to split categorical columns into low and
    high cardinality features.
-   Added different transformation pipeline for low and high cardinality
    features.
-   Included all numeric dtypes as inputs.
-   Fixed dependencies.
-   Improved documentation.

# 0.2.7 (2020-07-09)

-   Removed catboost regressor and classifier

# 0.2.6 (2020-01-22)

-   Added xgboost, lightgbm, catboost regressors and classifiers

# 0.2.5 (2020-01-20)

-   Removed troublesome regressors from list of CLASSIFIERS

# 0.2.4 (2020-01-19)

-   Removed troublesome regressors from list of REGRESSORS
-   Added feature to input custom metric for evaluation
-   Added feature to return predictions as dataframe
-   Added model training time for each model

# 0.2.3 (2019-11-22)

-   Removed TheilSenRegressor from list of REGRESSORS
-   Removed GaussianProcessClassifier from list of CLASSIFIERS

# 0.2.2 (2019-11-18)

-   Fixed automatic deployment issue.

# 0.2.1 (2019-11-18)

-   Release of Regression feature.

# 0.2.0 (2019-11-17)

-   Release of Classification feature.

# 0.1.0 (2019-11-16)

-   First release on PyPI.

            
