# Nowcasting Forecast
[![codecov](https://codecov.io/gh/openclimatefix/nowcasting_forecast/branch/main/graph/badge.svg?token=J9281APVDM)](https://codecov.io/gh/openclimatefix/nowcasting_forecast)
Making live forecasts for the nowcasting project.
The aim is to produce Grid Supply Point (GSP) solar generation forecasts.
This is done by loading live PV results, Numerical Weather Predictions (NWPs) and satellite images,
and running this data through various ML models.
You can run the application locally with:
```bash
python nowcasting_forecast/app.py --db-url='sqlite:///test.db'
```
## Installation with conda
```shell
mamba env create -f environment.yml
conda activate nowcasting_forecast
# Installing fastai inside environment.yml is currently broken. So install separately:
mamba install -c fastchan fastai
# You need to manually install nowcasting_dataset, nowcasting_datamodel,
# nowcasting_dataloader, and power_perceiver. This can be done via PyPI (for all except
# power_perceiver) or by git cloning each repo and using:
pip install -e <path_to_directory>
# Install nowcasting_forecast
pip install -e .
# You may also want to install dev tools:
mamba install pytest flake8 black pre-commit pydocstyle isort mypy
pre-commit install
```
## Directories and files
The following shows the main files:
```
+-- nowcasting_forecast
|   +-- config
|   |   +-- mvp_v0.yaml
|   |   +-- mvp_v1.yaml
|   +-- models
|   |   +-- nwp_solar_simple.py
|   |   +-- nwp_solar_simple_trained
|   |   |   +-- model.py
|   |   +-- cnn
|   |   |   +-- cnn.py
|   |   |   +-- dataloader.py
|   |   |   +-- model.py
|   +-- app.py
|   +-- batch.py
|   +-- dataloader.py
+-- scripts
+-- tests
```
### ☀️ nowcasting_forecast
`app.py` is the main entry point for running the forecast module. It accepts the following arguments:
- `--db-url`: the database URL that the forecasts will be saved to
- `--fake`: create fake forecasts, mainly used for testing
The app has three main steps:
1. Make batched data
2. Run the forecasting model to make forecasts
3. Save the forecasts to the database
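The sketch below outlines these three steps as plain Python. The helper functions are illustrative placeholders, not the actual code in `app.py`.
```python
"""Illustrative outline of the app's three steps (placeholder code, not the real app.py)."""
from typing import Any, List


def make_batches() -> List[Any]:
    """Step 1 (placeholder): load live PV, NWP and satellite data into batches."""
    return []


def run_model(batches: List[Any], fake: bool = False) -> List[dict]:
    """Step 2 (placeholder): run the ML model; with fake=True, return dummy forecasts."""
    if fake:
        return [{"gsp_id": 0, "forecast_mw": 0.0}]
    return [{"gsp_id": 0, "forecast_mw": 1.2} for _ in batches]


def save_forecasts(forecasts: List[dict], db_url: str) -> None:
    """Step 3 (placeholder): persist forecasts to the database at db_url."""
    print(f"Saving {len(forecasts)} forecasts to {db_url}")


if __name__ == "__main__":
    forecasts = run_model(make_batches(), fake=True)
    save_forecasts(forecasts, db_url="sqlite:///test.db")
```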
`batch.py` is used to load the data and create batches of data. This mainly uses [ManagerLive](https://github.com/openclimatefix/nowcasting_dataset/blob/main/nowcasting_dataset/manager/manager_live.py#L29).
`dataloader.py` is used to load the batched data in an efficient way. This is currently a stripped-down version of this [dataloader](https://github.com/openclimatefix/nowcasting_dataloader).
The `config` directory contains configurations that are used to load different data sources.
The configuration is used by `batch.py` to convert large amounts of data into batched data, which is then ready for ML models.
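As a minimal sketch, you can read one of these configs as plain YAML to see which data sources it enables; this is only for inspection, since `batch.py` does the real loading with its own configuration handling.
```python
import yaml

# Inspect a configuration file (e.g. mvp_v0.yaml) to see which data sources it describes.
with open("nowcasting_forecast/config/mvp_v0.yaml") as f:
    config = yaml.safe_load(f)
print(list(config.keys()))
```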
The `database` directory contains database models and functions to interact with the database. See [below](#data-model) for more information.
The `model` directory contains ML models used to make forecasts. See [below](#models) for more information.
## Models
### NWP Simple
This takes the average 'dswrf' for each example and then divides it by 10 to give a rough estimate of MW per GSP.
Configuration: mvp_v0.yaml
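A toy illustration of this rule; the array values are made up, only the divide-by-10 scaling comes from the description above.
```python
import numpy as np

# "NWP Simple" rule: average the 'dswrf' values in an example and divide by 10
# to get a rough MW estimate per GSP. The numbers below are made up.
dswrf = np.array([120.0, 250.0, 310.0, 180.0])  # W/m^2 over the example's window
forecast_mw = dswrf.mean() / 10
print(forecast_mw)  # 21.5
```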
### NWP Simple trained
A CNN model over the 'dswrf' channel of the NWP data: 6 convolutional layers followed by 4 fully connected layers.
![Diagram](nowcasting_forecast/models/nwp_simple_trained/diagram.png)
Training run: https://app.neptune.ai/o/OpenClimateFix/org/predict-pv-yield/e/PRED-951/charts
Configuration: mvp_v1.yaml
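A rough PyTorch sketch of that shape of network is below; the channel widths, kernel sizes and input size are assumptions, not the actual trained architecture in this repo.
```python
import torch
from torch import nn


class SimpleTrainedCNN(nn.Module):
    """Sketch of a 6-conv / 4-fully-connected network over the NWP 'dswrf' channel.

    Channel widths, kernel sizes and the input size are illustrative guesses.
    """

    def __init__(self, image_size: int = 16):
        super().__init__()
        widths = [1, 16, 32, 32, 64, 64, 64]  # 6 conv layers (assumed widths)
        conv_layers = []
        for in_c, out_c in zip(widths[:-1], widths[1:]):
            conv_layers += [nn.Conv2d(in_c, out_c, kernel_size=3, padding=1), nn.ReLU()]
        self.conv = nn.Sequential(*conv_layers)
        flat = widths[-1] * image_size * image_size
        self.fc = nn.Sequential(  # 4 fully connected layers
            nn.Linear(flat, 256), nn.ReLU(),
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 1),  # one forecast value per example
        )

    def forward(self, dswrf: torch.Tensor) -> torch.Tensor:
        # dswrf: (batch, 1, height, width)
        return self.fc(self.conv(dswrf).flatten(start_dim=1))


model = SimpleTrainedCNN()
print(model(torch.randn(2, 1, 16, 16)).shape)  # torch.Size([2, 1])
```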
### CNN
This model takes both satellite and NWP video data and puts them through
separate 3D convolutional neural networks. These are then connected with
a few fully connected layers, joined with some simple input data like
historic PV data.
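A minimal PyTorch sketch of this two-branch idea is below; all layer sizes and tensor shapes are assumptions rather than the real architecture in `models/cnn/`.
```python
import torch
from torch import nn


class FusionCNN(nn.Module):
    """Sketch: separate 3D CNN encoders for satellite and NWP video, fused with
    historic PV features through fully connected layers. Sizes are assumptions."""

    def __init__(self, sat_channels: int = 1, nwp_channels: int = 1, pv_features: int = 12):
        super().__init__()

        def encoder(in_channels: int) -> nn.Sequential:
            return nn.Sequential(
                nn.Conv3d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool3d(1), nn.Flatten(),  # -> (batch, 32)
            )

        self.sat_encoder = encoder(sat_channels)
        self.nwp_encoder = encoder(nwp_channels)
        self.head = nn.Sequential(
            nn.Linear(32 + 32 + pv_features, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, sat, nwp, pv_history):
        # sat/nwp: (batch, channels, time, height, width); pv_history: (batch, pv_features)
        features = torch.cat([self.sat_encoder(sat), self.nwp_encoder(nwp), pv_history], dim=1)
        return self.head(features)


model = FusionCNN()
out = model(torch.randn(2, 1, 4, 16, 16), torch.randn(2, 1, 4, 16, 16), torch.randn(2, 12))
print(out.shape)  # torch.Size([2, 1])
```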
## 🩺 Testing
Tests are run with `pytest`.
The tests set up `postgres` in a Docker container.
This slightly more complicated testing setup is needed (compared to plain `pytest` with `sqlite`)
because some queries cannot be fully tested on a `sqlite` database.
## 🛠️ Infrastructure
`.github/workflows` contains a number of CI actions:
1. linters.yaml: runs linting checks on the code
2. release.yaml: builds and pushes Docker images on a new code release
3. test-docker.yaml: runs tests on every push
The Dockerfile is in the folder `infrastructure/docker/`.
The version is bumped automatically for any push to `main`.
## Environment Variables
- DB_URL: the database URL that the forecasts will be saved to
- DB_URL_PV: the database URL for the PV data
- NWP_ZARR_PATH: override the NWP data path. Useful when running locally, to point the app at data in the cloud.
- SATELLITE_ZARR_PATH: override the satellite data path. Useful when running locally, to point the app at data in the cloud.
- FAKE: option to make fake/dummy forecasts
- MODEL_NAME: which model to use, either 'nwp_simple' or 'nwp_simple_trained'
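For example, a local run might set these before calling `app.py`; all values below are placeholders.
```python
import os

# Placeholder values: point these at your own databases and Zarr stores.
os.environ["DB_URL"] = "sqlite:///test.db"
os.environ["DB_URL_PV"] = "sqlite:///pv.db"
os.environ["NWP_ZARR_PATH"] = "s3://my-bucket/nwp/latest.zarr"       # example path
os.environ["SATELLITE_ZARR_PATH"] = "s3://my-bucket/satellite.zarr"  # example path
os.environ["MODEL_NAME"] = "nwp_simple"
```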