lazypredict-nightly

Name: lazypredict-nightly
Version: 0.3.2
Home page: https://github.com/nityansuman/lazypredict-nightly
Summary: [Updated] Lazy Predict helps you build a lot of basic models without much code and understand which models work better without any parameter tuning
Upload time: 2024-03-25 08:35:42
Author: Kumar Nityan Suman
Requires Python: >=3.6
License: MIT license
Keywords: lazypredict, lazypredict-nightly
# Lazy Predict

[Nightly Updated] Lazy Predict 2.0 helps you benchmark models without much code and understand what works better without any hyper-parameter tuning.

# Coming soon

- [ ] LLM based task benchmarking
    - [ ] Text Classification
    - [ ] Token Classification
    - [ ] Text Summarization
    - [ ] Text Similarity
- [ ] Stats model benchmarking

# Getting started

To install Lazy Predict Nightly:

    pip install lazypredict-nightly

To use Lazy Predict in a project:

    import lazypredict

## Classification

```python
from lazypredict import LazyClassifier

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

# Load the breast cancer dataset and split it 50/50 into train and test sets.
data = load_breast_cancer()
X = data.data
y = data.target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=123
)

# Fit every available classifier with default hyper-parameters and score it.
clf = LazyClassifier(verbose=0, ignore_warnings=True, custom_metric=None)
models, predictions = clf.fit(X_train, X_test, y_train, y_test)

print(models)
```


| Model                          |   Accuracy |   Balanced Accuracy |   ROC AUC |   F1 Score |   Time Taken |
|:-------------------------------|-----------:|--------------------:|----------:|-----------:|-------------:|
| LinearSVC                      |   0.989474 |            0.987544 |  0.987544 |   0.989462 |    0.0150008 |
| SGDClassifier                  |   0.989474 |            0.987544 |  0.987544 |   0.989462 |    0.0109992 |
| MLPClassifier                  |   0.985965 |            0.986904 |  0.986904 |   0.985994 |    0.426     |
| Perceptron                     |   0.985965 |            0.984797 |  0.984797 |   0.985965 |    0.0120046 |
| LogisticRegression             |   0.985965 |            0.98269  |  0.98269  |   0.985934 |    0.0200036 |
| LogisticRegressionCV           |   0.985965 |            0.98269  |  0.98269  |   0.985934 |    0.262997  |
| SVC                            |   0.982456 |            0.979942 |  0.979942 |   0.982437 |    0.0140011 |
| CalibratedClassifierCV         |   0.982456 |            0.975728 |  0.975728 |   0.982357 |    0.0350015 |
| PassiveAggressiveClassifier    |   0.975439 |            0.974448 |  0.974448 |   0.975464 |    0.0130005 |
| LabelPropagation               |   0.975439 |            0.974448 |  0.974448 |   0.975464 |    0.0429988 |
| LabelSpreading                 |   0.975439 |            0.974448 |  0.974448 |   0.975464 |    0.0310006 |
| RandomForestClassifier         |   0.97193  |            0.969594 |  0.969594 |   0.97193  |    0.033     |
| GradientBoostingClassifier     |   0.97193  |            0.967486 |  0.967486 |   0.971869 |    0.166998  |
| QuadraticDiscriminantAnalysis  |   0.964912 |            0.966206 |  0.966206 |   0.965052 |    0.0119994 |
| HistGradientBoostingClassifier |   0.968421 |            0.964739 |  0.964739 |   0.968387 |    0.682003  |
| RidgeClassifierCV              |   0.97193  |            0.963272 |  0.963272 |   0.971736 |    0.0130029 |
| RidgeClassifier                |   0.968421 |            0.960525 |  0.960525 |   0.968242 |    0.0119977 |
| AdaBoostClassifier             |   0.961404 |            0.959245 |  0.959245 |   0.961444 |    0.204998  |
| ExtraTreesClassifier           |   0.961404 |            0.957138 |  0.957138 |   0.961362 |    0.0270066 |
| KNeighborsClassifier           |   0.961404 |            0.95503  |  0.95503  |   0.961276 |    0.0560005 |
| BaggingClassifier              |   0.947368 |            0.954577 |  0.954577 |   0.947882 |    0.0559971 |
| BernoulliNB                    |   0.950877 |            0.951003 |  0.951003 |   0.951072 |    0.0169988 |
| LinearDiscriminantAnalysis     |   0.961404 |            0.950816 |  0.950816 |   0.961089 |    0.0199995 |
| GaussianNB                     |   0.954386 |            0.949536 |  0.949536 |   0.954337 |    0.0139935 |
| NuSVC                          |   0.954386 |            0.943215 |  0.943215 |   0.954014 |    0.019989  |
| DecisionTreeClassifier         |   0.936842 |            0.933693 |  0.933693 |   0.936971 |    0.0170023 |
| NearestCentroid                |   0.947368 |            0.933506 |  0.933506 |   0.946801 |    0.0160074 |
| ExtraTreeClassifier            |   0.922807 |            0.912168 |  0.912168 |   0.922462 |    0.0109999 |
| CheckingClassifier             |   0.361404 |            0.5      |  0.5      |   0.191879 |    0.0170043 |
| DummyClassifier                |   0.512281 |            0.489598 |  0.489598 |   0.518924 |    0.0119965 |
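
The `custom_metric` argument shown above lets you add your own scoring column to the results table. A minimal sketch, assuming `custom_metric` accepts a callable of the form `metric(y_true, y_pred)` returning a single number (check your installed version for the exact contract):

```python
from sklearn.metrics import fbeta_score

# Hypothetical custom metric: F2 score (recall weighted higher than precision).
# Assumed contract: LazyClassifier calls metric(y_true, y_pred) and appends the
# result as an extra column in the returned `models` dataframe.
def f2_score(y_true, y_pred):
    return fbeta_score(y_true, y_pred, beta=2)

clf = LazyClassifier(verbose=0, ignore_warnings=True, custom_metric=f2_score)
models, predictions = clf.fit(X_train, X_test, y_train, y_test)
print(models)
```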

## Regression

```python
from lazypredict import LazyRegressor

from sklearn import datasets
from sklearn.utils import shuffle
import numpy as np

# Note: load_boston was deprecated in scikit-learn 1.0 and removed in 1.2;
# on newer versions, substitute another regression dataset such as
# sklearn.datasets.load_diabetes or fetch_california_housing.
boston = datasets.load_boston()
X, y = shuffle(boston.data, boston.target, random_state=13)
X = X.astype(np.float32)

# Hold out the last 10% of the shuffled data as the test set.
offset = int(X.shape[0] * 0.9)
X_train, y_train = X[:offset], y[:offset]
X_test, y_test = X[offset:], y[offset:]

reg = LazyRegressor(verbose=0, ignore_warnings=False, custom_metric=None)
models, predictions = reg.fit(X_train, X_test, y_train, y_test)

print(models)
```


| Model                         | Adjusted R-Squared | R-Squared |  RMSE | Time Taken |
|:------------------------------|-------------------:|----------:|------:|-----------:|
| SVR                           |               0.83 |      0.88 |  2.62 |       0.01 |
| BaggingRegressor              |               0.83 |      0.88 |  2.63 |       0.03 |
| NuSVR                         |               0.82 |      0.86 |  2.76 |       0.03 |
| RandomForestRegressor         |               0.81 |      0.86 |  2.78 |       0.21 |
| XGBRegressor                  |               0.81 |      0.86 |  2.79 |       0.06 |
| GradientBoostingRegressor     |               0.81 |      0.86 |  2.84 |       0.11 |
| ExtraTreesRegressor           |               0.79 |      0.84 |  2.98 |       0.12 |
| AdaBoostRegressor             |               0.78 |      0.83 |  3.04 |       0.07 |
| HistGradientBoostingRegressor |               0.77 |      0.83 |  3.06 |       0.17 |
| PoissonRegressor              |               0.77 |      0.83 |  3.11 |       0.01 |
| LGBMRegressor                 |               0.77 |      0.83 |  3.11 |       0.07 |
| KNeighborsRegressor           |               0.77 |      0.83 |  3.12 |       0.01 |
| DecisionTreeRegressor         |               0.65 |      0.74 |  3.79 |       0.01 |
| MLPRegressor                  |               0.65 |      0.74 |  3.80 |       1.63 |
| HuberRegressor                |               0.64 |      0.74 |  3.84 |       0.01 |
| GammaRegressor                |               0.64 |      0.73 |  3.88 |       0.01 |
| LinearSVR                     |               0.62 |      0.72 |  3.96 |       0.01 |
| RidgeCV                       |               0.62 |      0.72 |  3.97 |       0.01 |
| BayesianRidge                 |               0.62 |      0.72 |  3.97 |       0.01 |
| Ridge                         |               0.62 |      0.72 |  3.97 |       0.01 |
| TransformedTargetRegressor    |               0.62 |      0.72 |  3.97 |       0.01 |
| LinearRegression              |               0.62 |      0.72 |  3.97 |       0.01 |
| ElasticNetCV                  |               0.62 |      0.72 |  3.98 |       0.04 |
| LassoCV                       |               0.62 |      0.72 |  3.98 |       0.06 |
| LassoLarsIC                   |               0.62 |      0.72 |  3.98 |       0.01 |
| LassoLarsCV                   |               0.62 |      0.72 |  3.98 |       0.02 |
| Lars                          |               0.61 |      0.72 |  3.99 |       0.01 |
| LarsCV                        |               0.61 |      0.71 |  4.02 |       0.04 |
| SGDRegressor                  |               0.60 |      0.70 |  4.07 |       0.01 |
| TweedieRegressor              |               0.59 |      0.70 |  4.12 |       0.01 |
| GeneralizedLinearRegressor    |               0.59 |      0.70 |  4.12 |       0.01 |
| ElasticNet                    |               0.58 |      0.69 |  4.16 |       0.01 |
| Lasso                         |               0.54 |      0.66 |  4.35 |       0.02 |
| RANSACRegressor               |               0.53 |      0.65 |  4.41 |       0.04 |
| OrthogonalMatchingPursuitCV   |               0.45 |      0.59 |  4.78 |       0.02 |
| PassiveAggressiveRegressor    |               0.37 |      0.54 |  5.09 |       0.01 |
| GaussianProcessRegressor      |               0.23 |      0.43 |  5.65 |       0.03 |
| OrthogonalMatchingPursuit     |               0.16 |      0.38 |  5.89 |       0.01 |
| ExtraTreeRegressor            |               0.08 |      0.32 |  6.17 |       0.01 |
| DummyRegressor                |              -0.38 |     -0.02 |  7.56 |       0.01 |
| LassoLars                     |              -0.38 |     -0.02 |  7.56 |       0.01 |
| KernelRidge                   |             -11.50 |     -8.25 | 22.74 |       0.01 |
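
Both examples return `models` as a pandas DataFrame indexed by model name, so the benchmark results can be filtered and sorted like any other DataFrame. A small sketch under that assumption, using the column names from the regression table above:

```python
# Keep only models with out-of-sample R-Squared above 0.80 and rank them by RMSE.
# Assumes `models` is the results DataFrame returned by reg.fit(...) above.
good_models = models[models["R-Squared"] > 0.80].sort_values("RMSE")
print(good_models[["R-Squared", "RMSE", "Time Taken"]])
```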


# History

# 0.3.2 (2024-03-25)

-   Major import bug fix
-   Cleanup

# 0.3.1 (2024-03-03)

-   Minor cleanups

# 0.3.0 (2024-03-03)

-   Fixed OneHotEncoder Bug

# 0.2.11 (2022-02-06)

-   Updated the default Python version to 3.9

# 0.2.10 (2022-02-06)

-   Fixed an issue with older versions of scikit-learn
-   Reduced dependencies to a strict minimum

# 0.2.8 (2021-02-06)

-   Removed StackingRegressor and CheckingClassifier.
-   Added provided_models method.
-   Added adjusted R-squared metric.
-   Added cardinality check to split categorical columns into low- and
    high-cardinality features.
-   Added separate transformation pipelines for low- and high-cardinality
    features.
-   Included all numeric dtypes as inputs.
-   Fixed dependencies.
-   Improved documentation.

# 0.2.7 (2020-07-09)

-   Removed catboost regressor and classifier

# 0.2.6 (2020-01-22)

-   Added xgboost, lightgbm, catboost regressors and classifiers

# 0.2.5 (2020-01-20)

-   Removed troublesome regressors from the list of CLASSIFIERS

# 0.2.4 (2020-01-19)

-   Removed troublesome regressors from the list of REGRESSORS
-   Added feature to input custom metric for evaluation
-   Added feature to return predictions as dataframe
-   Added model training time for each model

# 0.2.3 (2019-11-22)

-   Removed TheilSenRegressor from the list of REGRESSORS
-   Removed GaussianProcessClassifier from the list of CLASSIFIERS

# 0.2.2 (2019-11-18)

-   Fixed automatic deployment issue.

# 0.2.1 (2019-11-18)

-   Release of Regression feature.

# 0.2.0 (2019-11-17)

-   Release of Classification feature.

# 0.1.0 (2019-11-16)

-   First release on PyPI.

            
