MFLES

Name: MFLES
Version: 0.2.4
Summary: Gradient boosted time series forecasting.
Home page: https://github.com/tblume1992/MFLES
Author: Tyler Blume
Upload time: 2024-02-10 15:15:09
Keywords: forecasting, time series, seasonality, trend
# MFLES v0.2.2
![MFLES logo](https://github.com/tblume1992/MFLES/blob/main/static/mfles_logo.png?raw=true "logo")

A specific implementation from ThymeBoost, written with the help of Numba.

Here is a quick introduction and demonstration of methods such as conformal prediction intervals and seasonality decomposition:

https://github.com/tblume1992/MFLES/blob/main/examples/MFLES_Intro.ipynb
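
The notebook above walks through the library's own interval methods; as a rough illustration of the general idea, here is a minimal split-conformal sketch built only on the `fit`/`predict` calls shown in the Quick Start below. The `calibration_size` split and the 95% residual quantile are illustrative choices, not MFLES's implementation.
```
import numpy as np
import pandas as pd
from MFLES.Forecaster import MFLES

df = pd.read_csv('https://raw.githubusercontent.com/jbrownlee/Datasets/master/airline-passengers.csv')
y = df['Passengers'].values

horizon = 12
calibration_size = 24  # illustrative: hold out the last 24 points for calibration

# fit on the training slice and forecast the held-out calibration window
mfles = MFLES()
mfles.fit(y[:-calibration_size], seasonal_period=12)
cal_forecast = mfles.predict(calibration_size)

# conformal score: absolute calibration error; a high quantile gives the band width
width = np.quantile(np.abs(y[-calibration_size:] - cal_forecast), 0.95)

# refit on the full series and wrap the point forecast with the band
mfles = MFLES()
mfles.fit(y, seasonal_period=12)
point = mfles.predict(horizon)
lower, upper = point - width, point + width
```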


Here is a quick benchmark vs. AutoETS on the M4 dataset:
![M4 benchmark vs AutoETS](https://github.com/tblume1992/MFLES/blob/main/static/mfles_benchmark.PNG?raw=true "benchmark")
## Quick Start:
### Install via pip
```
pip install MFLES
```

### Import MFLES class
```
from MFLES.Forecaster import MFLES
```
### Import data
```
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt

df = pd.read_csv(r'https://raw.githubusercontent.com/jbrownlee/Datasets/master/airline-passengers.csv')
```
### Fit and predict!
```
mfles = MFLES()
fitted = mfles.fit(df['Passengers'].values, seasonal_period=12)
predicted = mfles.predict(12)

plt.plot(np.append(fitted, predicted))
plt.plot(df['Passengers'].values)
plt.show()
```
![example forecast](https://github.com/tblume1992/MFLES/blob/main/static/mfles_example.png?raw=true "example")

### Or Optimize
```
mfles = MFLES()
opt_params = mfles.optimize(df['Passengers'].values,
                          seasonal_period=12,
                          test_size=6,
                          n_steps=3, # number of train/test splits to make
                          step_size=6, # the number of periods to move each step
                          metric='mse' # should support smape, mse, mae, mape
                          )
fitted = mfles.fit(df['Passengers'].values, **opt_params)
predicted = mfles.predict(12)

plt.plot(np.append(fitted, predicted))
plt.plot(df['Passengers'].values)
plt.show()
```
### Fitting from a dataframe
```
from MFLES.Forecaster import fit_from_df

output = fit_from_df(df,
                      forecast_horizon=24,
                      freq='M',
                      seasonal_period=12,
                      id_column='unique_id',
                      time_column='ds',
                      value_column='y',
                      floor=0)
```
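
The parameter names above imply a long-format dataframe with one row per (series, timestamp) pair and columns named `unique_id`, `ds`, and `y`. As an illustrative sketch (the id value and the reuse of the airline data here are just for demonstration), such a frame can be built like this:
```
import pandas as pd

air = pd.read_csv('https://raw.githubusercontent.com/jbrownlee/Datasets/master/airline-passengers.csv')
df = pd.DataFrame({
    'unique_id': 'airline',              # one id per series; multiple series just stack rows
    'ds': pd.to_datetime(air['Month']),  # time column, monthly to match freq='M'
    'y': air['Passengers'].values,       # value column
})
```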
### Optimizing from a dataframe
```
from MFLES.Forecaster import optimize_from_df

output = optimize_from_df(df,
                          forecast_horizon=4,
                          test_size=4,
                          n_steps=3,
                          step_size=1,
                          metric='mse',
                          seasonal_period=12,
                          freq='M')
```
## Gradient Boosted Time Series Decomposition Theory
The idea is pretty simple: take a process like decomposition and view it as
a type of 'pseudo' gradient boosting, since we are passing residuals around
much like standard gradient boosting does. Then apply gradient boosting
approaches, such as iterating with a global mechanism to control the process,
and introduce learning rates for each component in the process, such as trend,
seasonality, or exogenous features. By doing this we graduate from the 'pseudo'
approach to full-blown gradient boosting.
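
A toy sketch of that idea might look like the following (this is not MFLES's actual internals; the simple linear trend, the seasonal-mean estimator, and all names here are purely illustrative):
```
import numpy as np

def boosted_decompose(y, seasonal_period, rounds=10, trend_lr=0.3, seasonal_lr=0.3):
    """Repeatedly fit simple trend/seasonal components to the residuals,
    adding a learning-rate-shrunken update each round."""
    n = len(y)
    t = np.arange(n)
    fit = np.zeros(n)
    for _ in range(rounds):
        # trend step: least-squares line on the current residuals, shrunken by trend_lr
        resid = y - fit
        slope, intercept = np.polyfit(t, resid, 1)
        fit += trend_lr * (slope * t + intercept)

        # seasonal step: per-period means of the new residuals, shrunken by seasonal_lr
        resid = y - fit
        means = np.array([resid[i::seasonal_period].mean() for i in range(seasonal_period)])
        means -= means.mean()  # center so the seasonal part does not absorb the level
        fit += seasonal_lr * means[t % seasonal_period]
    return fit

approx_fit = boosted_decompose(df['Passengers'].values, seasonal_period=12)
```
The actual library generalizes this pattern with more component types (e.g. exogenous features), a global mechanism controlling the iterations, and a learning rate per component, as described above.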

This process allows us to fit pretty exotic models and optimize each learning
rate so the components work well together. It also enables online learning,
since the framework is built around residuals, and it opens up changepoint
detection via segmentation schemes, although that is out of scope for this library.

## Citing
```
@software{mfles,
  author = {Blume, Tyler},
  license = {MIT License},
  title = {{MFLES}},
  url = {https://github.com/tblume1992/MFLES},
  version = {0.2.2},
  year = {2024}
}
```

            
