LazyProphet

Name: LazyProphet
Version: 0.3.9
Home page: https://github.com/tblume1992/LazyProphet
Summary: Time series forecasting with LightGBM
Author: Tyler Blume
Upload time: 2023-08-14 16:11:12
Keywords: forecasting, time series, lightgbm
Requirements: No requirements were recorded.
# LazyProphet v0.3.8

## Recent Changes

v0.3.8 brings a fully-fledged Optuna optimizer for simple (no exogenous) regression problems. Classification support is still to-do.

A quick example of the new functionality:

```
from LazyProphet import LazyProphet as lp
from sklearn.datasets import fetch_openml
import matplotlib.pyplot as plt
import numpy as np

bike_sharing = fetch_openml("Bike_Sharing_Demand", version=2, as_frame=True)
y = bike_sharing.frame['count']
y = y[-400:].values

lp_model = lp.LazyProphet.Optimize(y,
                                seasonal_period=[24, 168],
                                n_folds=2, # must be greater than 1
                                n_trials=20, # number of optimization runs, default is 100
                                test_size=48 # size of the holdout set to test against
                                )
fitted = lp_model.fit(y)
predicted = lp_model.predict(100)

plt.plot(y)
plt.plot(np.append(fitted, predicted))
plt.axvline(400)
plt.show()
```

## Introduction

[A decent intro can be found here.](https://medium.com/p/3745bafe5ce5)

LazyProphet is a time series forecasting model that uses LightGBM to forecast single time series.

Several niceties have been added, such as recursive forecasting when using lagged target values (for example, using the last 4 values to predict the 5th).
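The recursive scheme can be sketched as follows. This is an illustration of the general technique, not LazyProphet's internal code; the toy "model" below just averages its lag window.

```python
import numpy as np

def recursive_forecast(predict_one, history, n_lags, horizon):
    # Each prediction is appended to the lag window and fed back in
    # to produce the next step -- the essence of recursive forecasting.
    window = list(history[-n_lags:])
    preds = []
    for _ in range(horizon):
        next_val = predict_one(np.array(window))
        preds.append(next_val)
        window = window[1:] + [next_val]  # slide the lag window forward
    return preds

# Toy "model": predict the mean of the last 4 values
preds = recursive_forecast(lambda w: float(w.mean()),
                           [1.0, 2.0, 3.0, 4.0],
                           n_lags=4, horizon=3)
```

Because later steps consume earlier predictions, errors can compound over long horizons, which is why out-of-sample testing matters for these models.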

Additionally, Fourier basis functions and penalized weighted piecewise linear basis functions are available as well!
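For intuition, a Fourier basis for one seasonal period can be built like this. This is a sketch of the standard construction, not necessarily LazyProphet's exact internals:

```python
import numpy as np

def fourier_basis(n, seasonal_period, fourier_order):
    # Pairs of sin/cos columns at increasing harmonics of the
    # seasonal period; these become features for the booster.
    t = np.arange(n)
    cols = []
    for k in range(1, fourier_order + 1):
        cols.append(np.sin(2 * np.pi * k * t / seasonal_period))
        cols.append(np.cos(2 * np.pi * k * t / seasonal_period))
    return np.column_stack(cols)

# fourier_order=10 yields 20 columns (one sin + one cos per harmonic)
X = fourier_basis(400, seasonal_period=24, fourier_order=10)
```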

Never judge these models by their in-sample fit, as they fit the training data quite snugly.

## Quickstart

```
pip install LazyProphet
```

A simple example using a scikit-learn dataset; just give it the hyperparameters and an array:

```
from LazyProphet import LazyProphet as lp
from sklearn.datasets import fetch_openml
import matplotlib.pyplot as plt
import numpy as np

bike_sharing = fetch_openml("Bike_Sharing_Demand", version=2, as_frame=True)
y = bike_sharing.frame['count']
y = y[-400:].values

lp_model = lp.LazyProphet(seasonal_period=[24, 168], #list means we use both seasonal periods
                          n_basis=4, #weighted piecewise basis functions
                          fourier_order=10,
                          ar=list(range(1,25)),
                          decay=.99 #the 'penalized' in penalized weighted piecewise linear basis functions
                          )
fitted = lp_model.fit(y)
predicted = lp_model.predict(100)

plt.plot(y)
plt.plot(np.append(fitted, predicted))
plt.axvline(400)
plt.show()
```
![alt text](https://github.com/tblume1992/LazyProphet/blob/main/LazyProphet/static/example_output.png "Output 1")

If you are working with less data, you will probably want to pass custom LightGBM parameters via boosting_params when creating the LazyProphet object.

The default params are:

```
boosting_params = {
                        "objective": "regression",
                        "metric": "rmse",
                        "verbosity": -1,
                        "boosting_type": "gbdt",
                        "seed": 42,
                        'linear_tree': False,
                        'learning_rate': .15,
                        'min_child_samples': 5,
                        'num_leaves': 31,
                        'num_iterations': 50
                    }
```
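For example, you might override the defaults like this. The specific values are illustrative, not recommendations; the boosting_params keyword is the documented way to pass them in:

```python
# Illustrative override for a smaller dataset: fewer leaves and a
# lower learning rate to rein in model complexity.
small_data_params = {
    "objective": "regression",
    "metric": "rmse",
    "verbosity": -1,
    "boosting_type": "gbdt",
    "seed": 42,
    "linear_tree": False,
    "learning_rate": 0.05,
    "min_child_samples": 5,
    "num_leaves": 8,
    "num_iterations": 50,
}

# lp_model = lp.LazyProphet(seasonal_period=[24, 168],
#                           boosting_params=small_data_params)
```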
**WARNING:** Passing linear_tree=True can be extremely unstable, especially with the ar and n_basis arguments. We test for linearity and will de-trend if necessary.

Most importantly, control model complexity through num_leaves and learning_rate when you have less data.

Alternatively, you could try out the method:

```
tree_optimize(y, exogenous=None, cv_splits=3, test_size=None)
```
in place of the fit method. This runs cv_splits rounds of time-series cross-validation to optimize the tree using Optuna. This method showed somewhat degraded performance in testing, but it may be better for auto-forecasting across various data sizes.
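To make the cross-validation idea concrete, here is a sketch of expanding-window fold boundaries. This illustrates the usual time-series CV layout, not tree_optimize's exact internals, and the fold sizes are assumptions:

```python
def expanding_window_splits(n, cv_splits, test_size):
    # Each fold trains on everything before its test window, and the
    # test windows march toward the end of the series -- no future
    # data ever leaks into a training set.
    folds = []
    for i in range(cv_splits, 0, -1):
        test_end = n - (i - 1) * test_size
        test_start = test_end - test_size
        folds.append((range(0, test_start), range(test_start, test_end)))
    return folds

folds = expanding_window_splits(n=400, cv_splits=3, test_size=48)
```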

            
