SCALECAST

Name: SCALECAST
Version: 0.19.10
Home page: https://github.com/mikekeith52/scalecast
Summary: The practitioner's time series forecasting library
Upload time: 2024-10-14 22:58:10
Author: Michael Keith
License: MIT
Keywords: forecasting, deep learning, time series, machine learning, easy
            
# Scalecast

<p align="center">
  <img src="_static/logo2.png" alt="Scalecast Logo"/>
</p>

## About

Scalecast helps you forecast time series. Here is how to initialize its main object:
```python
from scalecast.Forecaster import Forecaster

f = Forecaster(
    y = array_of_values,
    current_dates = array_of_dates,
    future_dates=fcst_horizon_length,
    test_length = 0, # do you want to test all models? if so, on how many or what percent of observations?
    cis = False, # evaluate conformal confidence intervals for all models?
    metrics = ['rmse','mape','mae','r2'], # what metrics to evaluate over the validation/test sets?
)
```
Uniform ML modeling (with models from a diverse set of libraries, including scikit-learn, statsmodels, and tensorflow), reporting, and data visualizations are offered through the `Forecaster` and `MVForecaster` interfaces. Data storage and processing become easy because all applicable data, predictions, and many derived metrics are contained in a few objects, with extensive customization available through different modules. [Feature requests and issue reporting](https://github.com/mikekeith52/scalecast/issues/new) are welcome! Don't forget to leave a star!⭐  
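
The constructor arguments above are placeholders. A minimal sketch of wiring in real data, assuming a hypothetical CSV with `Date` and `Sales` columns loaded through pandas:
```python
import pandas as pd
from scalecast.Forecaster import Forecaster

# hypothetical file and column names, for illustration only
df = pd.read_csv('sales.csv', parse_dates=['Date'])

f = Forecaster(
    y=df['Sales'],             # observed values
    current_dates=df['Date'],  # dates aligned with y
    future_dates=12,           # forecast horizon: 12 periods
    test_length=0.15,          # hold out 15% of observations for testing
    cis=True,                  # evaluate conformal intervals
    metrics=['rmse', 'mape'],
)
```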

## Documentation  
- [Read the Docs](https://scalecast.readthedocs.io/en/latest/)  
- [Introductory Notebook](https://scalecast-examples.readthedocs.io/en/latest/misc/introduction/Introduction2.html)  
- [Change Log](https://scalecast.readthedocs.io/en/latest/change_log.html)  
 
## Popular Features
1. **Easy LSTM Modeling:** Setting up an LSTM model for time series with tensorflow is hard; with scalecast, it's easy. Many tutorials and Kaggle notebooks aimed at those getting to know the model use scalecast (see the [article](https://medium.com/towards-data-science/exploring-the-lstm-neural-network-model-for-time-series-8b7685aa8cf)).
```python
f.set_estimator('lstm')
f.manual_forecast(
    lags=36,
    batch_size=32,
    epochs=15,
    validation_split=.2,
    activation='tanh',
    optimizer='Adam',
    learning_rate=0.001,
    lstm_layer_sizes=(100,)*3,
    dropout=(0,)*3,
)
```
2. **Auto lag, trend, and seasonality selection:**
```python
f.auto_Xvar_select( # iterate through different combinations of covariates
    estimator = 'lasso', # what estimator?
    alpha = .2, # estimator hyperparams?
    monitor = 'ValidationMetricValue', # what metric to monitor to make decisions?
    cross_validate = True, # cross validate
    cvkwargs = {'k':3}, # 3 folds
)
```
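After the search finishes, it can help to confirm which regressors ended up in the object. A minimal check, assuming `get_regressor_names()` returns the currently added Xvars:
```python
# list the lags, trend, and seasonal terms chosen by auto_Xvar_select()
print(f.get_regressor_names())
```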
3. **Hyperparameter tuning using grid search and time series cross validation:**
```python
from scalecast import GridGenerator

GridGenerator.get_example_grids()
models = ['ridge','lasso','xgboost','lightgbm','knn']
f.tune_test_forecast(
    models,
    limit_grid_size = .2, # randomized grid search on 20% of original grid sizes
    feature_importance = True, # save pfi feature importance for each model?
    cross_validate = True, # cross validate? if False, uses a separate validation set that the user can specify
    rolling = True, # rolling time series cross validation?
    k = 3, # how many folds?
)
```
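Once the loop completes, the evaluated models can be compared side by side. A hedged sketch, assuming `export()` accepts the `'model_summaries'` frame name and that the summary frame carries the columns referenced below:
```python
# pull a summary DataFrame of every evaluated model, best model first
summaries = f.export('model_summaries', determine_best_by='TestSetRMSE')
print(summaries[['ModelNickname', 'TestSetRMSE', 'TestSetMAPE', 'HyperParams']])
```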
4. **Plotting results:** plot test predictions, forecasts, fitted values, and more.
```python
import matplotlib.pyplot as plt

fig, ax = plt.subplots(2,1, figsize = (12,6))
f.plot_test_set(models=models,order_by='TestSetRMSE',ax=ax[0])
f.plot(models=models,order_by='TestSetRMSE',ax=ax[1])
plt.show()
```
5. **Pipelines that include transformations, reverting, and backtesting:**
```python
from scalecast import GridGenerator
from scalecast.Pipeline import Transformer, Reverter, Pipeline
from scalecast.util import find_optimal_transformation, backtest_metrics

def forecaster(f):
    models = ['ridge','lasso','xgboost','lightgbm','knn']
    f.tune_test_forecast(
        models,
        limit_grid_size = .2, # randomized grid search on 20% of original grid sizes
        feature_importance = True, # save pfi feature importance for each model?
        cross_validate = True, # cross validate? if False, uses a separate validation set that the user can specify
        rolling = True, # rolling time series cross validation?
        k = 3, # how many folds?
    )

transformer, reverter = find_optimal_transformation(f) # just one of several ways to select transformations for your series

pipeline = Pipeline(
    steps = [
        ('Transform',transformer),
        ('Forecast',forecaster),
        ('Revert',reverter),
    ]
)

f = pipeline.fit_predict(f)
backtest_results = pipeline.backtest(f)
metrics = backtest_metrics(backtest_results)
```
6. **Model stacking:** There are two ways to stack models with scalecast, with the [`StackingRegressor`](https://medium.com/towards-data-science/expand-your-time-series-arsenal-with-these-models-10c807d37558) from scikit-learn or using [its own stacking procedure](https://medium.com/p/7977c6667d29).
```python
from scalecast.auxmodels import auto_arima

f.set_estimator('lstm')
f.manual_forecast(
    lags=36,
    batch_size=32,
    epochs=15,
    validation_split=.2,
    activation='tanh',
    optimizer='Adam',
    learning_rate=0.001,
    lstm_layer_sizes=(100,)*3,
    dropout=(0,)*3,
)

f.set_estimator('prophet')
f.manual_forecast()

auto_arima(f)

# stack previously evaluated models
f.add_signals(['lstm','prophet','arima'])
f.set_estimator('catboost')
f.manual_forecast()
```
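The snippet above uses scalecast's own stacking procedure (signals from previously evaluated models feeding a final estimator). A hedged sketch of the scikit-learn route, assuming `add_sklearn_estimator()` registers the class and `manual_forecast()` forwards keyword arguments to its constructor:
```python
from sklearn.ensemble import StackingRegressor
from sklearn.linear_model import ElasticNet
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import LinearSVR

# register StackingRegressor as a custom sklearn estimator named 'stacking'
f.add_sklearn_estimator(StackingRegressor, called='stacking')
f.set_estimator('stacking')
f.manual_forecast(
    estimators=[
        ('elasticnet', ElasticNet()),
        ('knn', KNeighborsRegressor()),
    ],
    final_estimator=LinearSVR(),
    lags=12,
)
```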
7. **Multivariate modeling and multivariate pipelines:**
```python
from scalecast.MVForecaster import MVForecaster
from scalecast.Pipeline import MVPipeline
from scalecast.util import find_optimal_transformation, backtest_metrics
from scalecast import GridGenerator

GridGenerator.get_mv_grids()

def mvforecaster(mvf):
    models = ['ridge','lasso','xgboost','lightgbm','knn']
    mvf.tune_test_forecast(
        models,
        limit_grid_size = .2, # randomized grid search on 20% of original grid sizes
        cross_validate = True, # cross validate? if False, uses a separate validation set that the user can specify
        rolling = True, # rolling time series cross validation?
        k = 3, # how many folds?
    )

mvf = MVForecaster(f1,f2,f3) # can take N Forecaster objects

transformer1, reverter1 = find_optimal_transformation(f1)
transformer2, reverter2 = find_optimal_transformation(f2)
transformer3, reverter3 = find_optimal_transformation(f3)

pipeline = MVPipeline(
    steps = [
        ('Transform',[transformer1,transformer2,transformer3]),
        ('Forecast',mvforecaster),
        ('Revert',[reverter1,reverter2,reverter3])
    ]
)

f1, f2, f3 = pipeline.fit_predict(f1, f2, f3)
backtest_results = pipeline.backtest(f1, f2, f3)
metrics = backtest_metrics(backtest_results)
```
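The `f1`, `f2`, and `f3` objects above are univariate `Forecaster` objects built ahead of time. A hedged sketch of how they could be constructed, assuming three series sharing a date index in a hypothetical DataFrame `df`:
```python
from scalecast.Forecaster import Forecaster

# hypothetical column names; each series gets its own Forecaster object
f1 = Forecaster(y=df['series1'], current_dates=df['Date'], future_dates=12)
f2 = Forecaster(y=df['series2'], current_dates=df['Date'], future_dates=12)
f3 = Forecaster(y=df['series3'], current_dates=df['Date'], future_dates=12)
```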
8. **Transfer Learning (new with 0.19.0):** Train a model in one `Forecaster` object and use that model to make predictions on the data in a separate `Forecaster` object.
```python
from scalecast.util import infer_apply_Xvar_selection # assumed to live in scalecast.util

f = Forecaster(...)
f.auto_Xvar_select()
f.set_estimator('xgboost')
f.cross_validate()
f.auto_forecast()

f_new = Forecaster(...) # different series than f
f_new = infer_apply_Xvar_selection(infer_from=f,apply_to=f_new)
f_new.transfer_predict(transfer_from=f,model='xgboost') # transfers the xgboost model from f to f_new
```
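From there, the transferred forecast can be inspected like any other evaluated model. A minimal follow-up, assuming `transfer_predict()` stores its results in `f_new`'s history and that `export()` accepts the `'lvl_fcsts'` frame name:
```python
f_new.plot(models='xgboost')       # visualize the transferred forecast on the new series
preds = f_new.export('lvl_fcsts')  # point forecasts as a DataFrame
print(preds.head())
```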

## Installation
- Only the base package is needed to get started (see the import check after this list):  
  - `pip install --upgrade scalecast`  
- Optional add-ons:  
  - `pip install tensorflow` (for RNN/LSTM on Windows) or `pip install tensorflow-macos` (for Mac/M1)  
  - `pip install darts`  
  - `pip install prophet`  
  - `pip install greykite` (for the silverkite model)  
  - `pip install kats` (changepoint detection)  
  - `pip install pmdarima` (auto arima)  
  - `pip install tqdm` (progress bar for notebook)  
  - `pip install ipython` (widgets for notebook)  
  - `pip install ipywidgets` (widgets for notebook)  
  - `jupyter nbextension enable --py widgetsnbextension` (widgets for notebook)  
  - `jupyter labextension install @jupyter-widgets/jupyterlab-manager` (widgets for Lab)  
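
A quick sanity check after installing, hedged in case `__version__` is not defined in a given release:
```python
import scalecast

# prints the installed version if the attribute exists, otherwise a fallback message
print(getattr(scalecast, '__version__', 'scalecast imported successfully'))
```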

## Papers that use scalecast
- [Post-covid customer service behavior forecasting using machine learning techniques](https://digital-library.theiet.org/content/conferences/10.1049/icp.2023.0947)
- [Application of ANN and traditional ML algorithms in modelling compost production under different climatic conditions](https://link.springer.com/article/10.1007/s00521-023-08404-4)
- [Reservoir Computing Solutions for Streamflow Modeling and Prediction in Real World Scenarios](https://scholarsarchive.byu.edu/studentpub_uht/273/)
- [LSTM-based recurrent neural network provides effective short term flu forecasting](https://link.springer.com/article/10.1186/s12889-023-16720-6)
- [IMPLEMENTING AN ENERGY TRADING STRATEGY USING FORECASTING OF ENERGY PRICES AND PRODUCTION](https://webthesis.biblio.polito.it/29175/)
- [Modelamiento predictivo del número de visitantes en un centro comercial](https://repository.eafit.edu.co/items/ffd6e4fe-9a68-49d5-b74b-47311db99f04) (Predictive modeling of visitor counts at a shopping mall)

## Udemy Course
[Scalecast: Machine Learning & Deep Learning](https://www.udemy.com/course/uniform-ml-dl/?couponCode=LETSLEARNNOWPP)

## Blog posts and notebooks

### [Forecasting with Different Model Types](https://scalecast.readthedocs.io/en/latest/Forecaster/_forecast.html)
- Sklearn Univariate
  - [Expand your Time Series Arsenal with These Models](https://towardsdatascience.com/expand-your-time-series-arsenal-with-these-models-10c807d37558)
  - [Notebook](https://scalecast-examples.readthedocs.io/en/latest/sklearn/sklearn.html)
- Sklearn Multivariate
  - [Multiple Series? Forecast Them together with any Sklearn Model](https://towardsdatascience.com/multiple-series-forecast-them-together-with-any-sklearn-model-96319d46269)
  - [Notebook 1](https://scalecast-examples.readthedocs.io/en/latest/multivariate/multivariate.html)
  - [Notebook 2](https://scalecast-examples.readthedocs.io/en/latest/multivariate-beyond/mv.html)  
- RNN 
  - [Exploring the LSTM Neural Network Model for Time Series](https://towardsdatascience.com/exploring-the-lstm-neural-network-model-for-time-series-8b7685aa8cf)
  - [LSTM Notebook](https://scalecast-examples.readthedocs.io/en/latest/lstm/lstm.html)
  - [RNN Notebook](https://scalecast-examples.readthedocs.io/en/latest/rnn/rnn.html)
- ARIMA
  - [Forecast with ARIMA in Python More Easily with Scalecast](https://towardsdatascience.com/forecast-with-arima-in-python-more-easily-with-scalecast-35125fc7dc2e)
  - [Notebook](https://scalecast-examples.readthedocs.io/en/latest/arima/arima.html)
- Theta
  - [Easily Employ A Theta Model For Time Series](https://medium.com/towards-data-science/easily-employ-a-theta-model-for-time-series-b94465099a00)
  - [Notebook](https://scalecast-examples.readthedocs.io/en/latest/theta/theta.html)
- VECM
  - [Employ a VECM to predict FANG Stocks with an ML Framework](https://medium.com/p/52f170ec68e6)
  - [Notebook](https://scalecast-examples.readthedocs.io/en/latest/vecm/vecm.html)
- Stacking
   - [Stacking Time Series Models to Improve Accuracy](https://medium.com/towards-data-science/stacking-time-series-models-to-improve-accuracy-7977c6667d29)
   - [Notebook](https://scalecast-examples.readthedocs.io/en/latest/misc/stacking/custom_stacking.html)
- Other Notebooks
  - [Prophet](https://scalecast-examples.readthedocs.io/en/latest/prophet/prophet.html)
  - [Combo](https://scalecast-examples.readthedocs.io/en/latest/combo/combo.html)
  - [Holt-Winters Exponential Smoothing](https://scalecast-examples.readthedocs.io/en/latest/hwes/hwes.html)
  - [Silverkite](https://scalecast-examples.readthedocs.io/en/latest/silverkite/silverkite.html)

### [Transforming and Reverting](https://scalecast.readthedocs.io/en/latest/Forecaster/SeriesTransformer.html)
- [Time Series Transformations (and Reverting) Made Easy](https://medium.com/towards-data-science/time-series-transformations-and-reverting-made-easy-f4f768c18f63)
- [Notebook](https://scalecast-examples.readthedocs.io/en/latest/transforming/medium_code.html)  
  
### Confidence Intervals
- [Easy Distribution-Free Conformal Intervals for Time Series](https://medium.com/towards-data-science/easy-distribution-free-conformal-intervals-for-time-series-665137e4d907)  
- [Dynamic Conformal Intervals for any Time Series Model](https://towardsdatascience.com/dynamic-conformal-intervals-for-any-time-series-model-d1638aa48527)
- [Notebook 1](https://scalecast-examples.readthedocs.io/en/latest/misc/cis/cis.html)  
- [Notebook 2](https://scalecast-examples.readthedocs.io/en/latest/misc/cis-bt/cis-bt.html)

### Dynamic Validation
- [How Not to be Fooled by Time Series Models](https://towardsdatascience.com/how-not-to-be-fooled-by-time-series-forecasting-8044f5838de3)
- [Model Validation Techniques for Time Series](https://towardsdatascience.com/model-validation-techniques-for-time-series-3518269bd5b3)
- [Notebook](https://scalecast-examples.readthedocs.io/en/latest/misc/validation/validation.html)

### Model Input Selection
- [Variable Reduction Techniques for Time Series](https://medium.com/towards-data-science/variable-reduction-techniques-for-time-series-646743f726d4)
- [Auto Model Specification with ML Techniques for Time Series](https://mikekeith52.medium.com/auto-model-specification-with-ml-techniques-for-time-series-e7b9a90ae9d7)
- [Notebook 1](https://scalecast-examples.readthedocs.io/en/latest/misc/feature-selection/feature_selection.html)
- [Notebook 2](https://scalecast-examples.readthedocs.io/en/latest/misc/auto_Xvar/auto_Xvar.html)

### Scaled Forecasting on Many Series
- [May the Forecasts Be with You](https://towardsdatascience.com/may-the-forecasts-be-with-you-introducing-scalecast-pt-2-692f3f7f0be5)
- [Introductory Notebook Section](https://scalecast-examples.readthedocs.io/en/latest/misc/introduction/Introduction2.html#Scaled-Automated-Forecasting)

### Transfer Learning
- [Notebook 1](https://scalecast-examples.readthedocs.io/en/latest/transfer_learning/transfer_learning.html)
- [Notebook 2](https://scalecast-examples.readthedocs.io/en/latest/transfer_learning/transfer_learning_tf.html)

### Anomaly Detection
- [Anomaly Detection for Time Series with Monte Carlo Simulations](https://towardsdatascience.com/anomaly-detection-for-time-series-with-monte-carlo-simulations-e43c77ba53c?source=email-85177a9cbd35-1658325190052-activity.collection_post_approved)
- [Notebook 1](https://scalecast-examples.readthedocs.io/en/latest/misc/anomalies/anomalies.html)
- [Notebook 2](https://github.com/mikekeith52/scalecast-examples/blob/main/misc/anomalies/monte%20carlo/monte%20carlo.ipynb)

## Contributing
- [Contributing.md](https://github.com/mikekeith52/scalecast/blob/main/Contributing.md)
- Want something that's not listed? Open an [issue](https://github.com/mikekeith52/scalecast/issues/new)!  

## How to cite scalecast
```
@misc{scalecast,
  title = {{scalecast}},
  author = {Michael Keith},
  year = {2024},
  version = {<your version>},
  url = {https://scalecast.readthedocs.io/en/latest/},
}
```

            
