esnpy

Name: esnpy
Version: 0.4.0 (PyPI)
Summary: Out-of-the-box framework for Echo State Networks
Author email: Théo BL <biasutto.t@gmail.com>
Project URL: https://github.com/NiziL/esnpy
Upload time: 2023-06-07 09:57:37
Requires Python: >=3.9
License: The MIT License (MIT)
Keywords: machine learning, neural networks, echo state network
            # esnpy

`esnpy` is an out-of-the-box framework to experiment with ESNs and DeepESNs.  
Models are implemented in pure NumPy/SciPy, so there is no need for a powerful GPU or any esoteric requirements.

Right now, the focus is on batch training, and feedback loops have not been taken into account.  
But feel free to open a ticket to discuss anything you need, features you want, or even to help!

The documentation for the latest release is available on [the GitHub page](https://nizil.github.io/esnpy).

Note from the author: *`esnpy` is a small project I started during my master's internship and have recently cleaned up. I might keep working on it for fun, but if you want or need a more robust framework, [ReservoirPy](https://github.com/reservoirpy/reservoirpy) might be the one you're searching for ;)*

## Getting Started

### Installation

**From PyPI**
```bash
pip install esnpy
```

**From source**
```bash
pip install git+https://github.com/NiziL/esnpy#egg=esnpy
```
Use `github.com/NiziL/esnpy@<tag or branch>#egg=esnpy` to install from a specific branch or tag instead of main.

### Quickstart

```python
import esnpy

# createBuilder, createTrainer, loadData and compute_err are placeholders
# for your own code
reservoir_builder = createBuilder()
trainer = createTrainer()
warmup, data, target = loadData()

# create the echo state network
esn = esnpy.ESN(reservoir_builder.build(), trainer)
# train it
esn.fit(warmup, data, target)
# test it
predictions = esn.transform(data)
print(f"error: {compute_err(target, predictions)}")
```

#### `ESN` and `DeepESN`

You can create your ESN with `esnpy.ESN`. 
The constructor needs an `esnpy.Reservoir` and an implementation of `esnpy.train.Trainer`. 

`esnpy.DeepESN` doesn't differ much: it just expects a list of `Reservoir` instances and takes an optional `mask` parameter to specify which reservoirs the `Trainer` should learn from. `mask` and the list of reservoirs must have the same length. 

Then simply call the `fit` function, passing some warm-up and training data with the related targets.  
Once trained, run predictions using `transform`.
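For intuition, the recurrence an ESN reservoir performs while `fit` and `transform` feed it data can be sketched in pure NumPy. This is the classic leaky-integrator update under assumed notation, not `esnpy`'s actual internals:

```python
import numpy as np

# Illustrative leaky-integrator reservoir update (not esnpy's code):
# x(t+1) = (1 - a) * x(t) + a * tanh(Win @ u(t) + W @ x(t))
rng = np.random.default_rng(42)

input_size, size, leaky = 3, 50, 0.3
w_in = rng.uniform(-0.5, 0.5, (size, input_size))  # input weights
w = rng.uniform(-0.5, 0.5, (size, size))           # internal weights

def step(state, u):
    """Advance the reservoir state by one time step for input u."""
    return (1 - leaky) * state + leaky * np.tanh(w_in @ u + w @ state)

state = np.zeros(size)
for u in rng.normal(size=(10, input_size)):  # warm up on 10 random inputs
    state = step(state, u)
```

A trainer then fits a linear readout on such states; warming up first lets the reservoir wash out its arbitrary initial state before training begins.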

#### `Reservoir` and `ReservoirBuilder`

A `Reservoir` can easily be initialized using the `ReservoirBuilder` dataclass.  
For convenience, the configuration class is also a builder, exposing a `build()` method.
This method takes an optional `seed` parameter for deterministic initialization, which eases the comparison of two identical reservoirs.
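The value of a fixed seed is easy to demonstrate with a toy initializer (a hypothetical helper for illustration, not `esnpy`'s builder): the same seed reproduces bit-identical weights, so two runs are directly comparable.

```python
import numpy as np

def init_weights(size, seed=None):
    """Toy weight initializer: uniform weights from a seeded generator."""
    rng = np.random.default_rng(seed)
    return rng.uniform(-1.0, 1.0, (size, size))

# Same seed => identical matrices, so experiments are reproducible
w1 = init_weights(100, seed=7)
w2 = init_weights(100, seed=7)
print(np.array_equal(w1, w2))  # True
```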

| Parameter     | Type                     | Description                                   | Default   |
|---------------|--------------------------|-----------------------------------------------|-----------|
| input_size    | `int`                    | Size of the input vectors                     |           |
| size          | `int`                    | Number of units in the reservoir              |           |
| leaky         | `float`                  | Leaky parameter of the reservoir              |           |
| fn            | `Callable`               | Activation function of the reservoir          | `np.tanh` |
| input_bias    | `bool`                   | Enable the use of a bias in the input         | `True`    |
| input_init    | `esnpy.init.Initializer` | Define how to initialize the input weights    |           |
| input_tuners  | `list[esnpy.tune.Tuner]` | Define how to tune the input weights          | `[]`      |
| intern_init   | `esnpy.init.Initializer` | Define how to initialize the internal weights |           |
| intern_tuners | `list[esnpy.tune.Tuner]` | Define how to tune the internal weights       | `[]`      |

#### `Initializer` and `Tuner` 

`esnpy.init.Initializer` and `esnpy.tune.Tuner` are the abstract base classes used to set up the input and internal weights of a reservoir.

`Initializer` is defined by an `init() -> Matrix` function. 
`esnpy` provides initializer implementations for both uniform and Gaussian weight distributions, and for both dense and sparse matrices.

`Tuner` is defined by an `init(matrix: Matrix) -> Matrix` function, which can be used to modify the weights after initialization.
For example, `esnpy` provides a `SpectralRadiusTuner` to change the spectral radius of a weight matrix.
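As a sketch of what such a tuner computes (the underlying math, not `SpectralRadiusTuner`'s actual code): rescaling a matrix by `target / current_radius` moves its spectral radius to the target value, since eigenvalues scale linearly with the matrix.

```python
import numpy as np

def tune_spectral_radius(matrix, target=0.9):
    """Rescale a weight matrix so its spectral radius equals `target`."""
    radius = np.max(np.abs(np.linalg.eigvals(matrix)))
    return matrix * (target / radius)

rng = np.random.default_rng(0)
w = tune_spectral_radius(rng.uniform(-1.0, 1.0, (50, 50)), target=0.9)
print(np.max(np.abs(np.linalg.eigvals(w))))  # ~0.9, up to float error
```

A spectral radius below 1 is the usual rule of thumb to preserve the echo state property.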

#### `Trainer` and `Reader`

`esnpy.train.Trainer` is responsible for creating the output weight matrix from the training data and targets.  
It is defined by a `train(inputs: Matrix, data: Matrix, target: Matrix) -> Reader` function.

The `Reader` is then responsible for computing the final result from the reservoir activations through `__call__`.

`esnpy` provides a `RidgeTrainer` to compute the output weights using ridge regression, and a `SklearnTrainer` (more on this one later). 
`RidgeTrainer` has three parameters: a float `alpha`, the weight of the regularization term, and two optional booleans (defaulting to `True`), `use_bias` and `use_input`, which control whether a bias term and the raw input are used to compute the readout weights.

`SklearnTrainer` is an adapter wrapping a scikit-learn model, relying on its `fit` and `predict` methods. This feature is still experimental; chaos might ensue. 
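The ridge regression behind such a trainer can be sketched as follows (the standard closed form under assumed notation, not `RidgeTrainer`'s exact code): with reservoir states stacked column-wise in `S` and targets in `Y`, the readout is `W_out = Y Sᵀ (S Sᵀ + αI)⁻¹`.

```python
import numpy as np

def ridge_readout(states, targets, alpha=1e-6):
    """Closed-form ridge regression: W_out = Y S^T (S S^T + alpha I)^-1."""
    n = states.shape[0]
    gram = states @ states.T + alpha * np.eye(n)
    # Solve (S S^T + alpha I) W^T = S Y^T rather than inverting explicitly
    return np.linalg.solve(gram, states @ targets.T).T

rng = np.random.default_rng(1)
states = rng.normal(size=(50, 200))   # 50 reservoir units over 200 steps
w_true = rng.normal(size=(2, 50))
targets = w_true @ states             # targets that are linearly readable
w_out = ridge_readout(states, targets, alpha=1e-8)
print(np.allclose(w_out @ states, targets, atol=1e-4))  # True
```

With a tiny `alpha` the fit is essentially exact here; in practice `alpha` trades training error against robustness to noise.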

## Code Examples 

Want to see some code in action? Take a look at the `examples/` directory:
- `mackey_glass.py` demonstrates how to learn to predict a time series,
- `trajectory_classification/` demonstrates how to learn to classify 2D trajectories.

## Bibliography

Based on:
- *The "echo state" approach to analysing and training recurrent neural networks* by Herbert Jaeger ([pdf](https://www.ai.rug.nl/minds/uploads/EchoStatesTechRep.pdf)),
- *A practical guide to applying Echo State Networks* by Mantas Lukoševičius ([pdf](https://www.ai.rug.nl/minds/uploads/PracticalESN.pdf)),
- *Design of deep echo state networks* by Claudio Gallicchio et al. ([link](https://www.sciencedirect.com/science/article/pii/S0893608018302223)),
- *Deep echo state network (DeepESN): A brief survey* by Claudio Gallicchio and Alessio Micheli ([pdf](https://arxiv.org/pdf/1712.04323.pdf)).

Special thanks to Mantas Lukoševičius for his [minimal ESN example](https://mantas.info/wp/wp-content/uploads/simple_esn/minimalESN.py), which greatly helped me to get started with reservoir computing.

            
