| Field | Value |
| --- | --- |
| Name | norse |
| Version | 1.1.0 |
| Summary | A library for deep learning with spiking neural networks |
| Upload time | 2024-03-18 22:39:51 |
| Home page | |
| Author | |
| Maintainer | |
| Docs URL | None |
| Requires Python | >=3.8 |
| License | GNU Lesser General Public License v3 (LGPLv3); the full license text is reproduced in the raw metadata below |
| Keywords | spiking neural networks, deep learning, neural networks, machine learning |
| VCS | |
| bugtrack_url | |
| Requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| Coveralls test coverage | |

<p align="center">
<img src="https://raw.githubusercontent.com/norse/norse/master/logo.png">
</p>
A [deep learning](https://en.wikipedia.org/wiki/Deep_learning) library for [spiking neural networks](https://en.wikipedia.org/wiki/Spiking_neural_network).
<p align="center">
<a href="https://github.com/norse/norse/actions">
<img src="https://github.com/norse/norse/workflows/Build%20Python/badge.svg" alt="Test status"></a>
<a href="https://pypi.org/project/norse/" alt="PyPi">
<img src="https://img.shields.io/pypi/v/norse" />
</a>
<a href="https://pypi.org/project/aestream/" alt="PyPi">
<img src="https://img.shields.io/pypi/dm/aestream" />
</a>
<a href="https://github.com/norse/norse/pulse" alt="Activity">
<img src="https://img.shields.io/github/last-commit/norse/norse" />
</a>
<a href="https://discord.gg/7fGN359">
<img src="https://img.shields.io/discord/723215296399147089"
alt="chat on Discord"></a>
<a href="https://www.codacy.com/gh/norse/norse?utm_source=github.com&utm_medium=referral&utm_content=norse/norse&utm_campaign=Badge_Grade"><img src="https://app.codacy.com/project/badge/Grade/a9ab846fc6114afda4320badcb8a69c2"/></a>
<a href="https://codecov.io/gh/norse/norse"><img src="https://codecov.io/gh/norse/norse/branch/master/graph/badge.svg" /></a>
<a href="https://doi.org/10.5281/zenodo.4422025"><img src="https://zenodo.org/badge/DOI/10.5281/zenodo.4422025.svg" alt="DOI"></a>
</p>
Norse aims to exploit the advantages of bio-inspired neural components, which are sparse and event-driven, a fundamental difference from artificial neural networks.
Norse expands [PyTorch](https://pytorch.org/) with primitives for bio-inspired neural components,
bringing you two advantages: a modern and proven infrastructure based on PyTorch, and deep learning-compatible spiking neural network components.
**Documentation**: [norse.github.io/norse/](https://norse.github.io/norse/)
## 1. Getting started
The fastest way to try Norse is via the [Jupyter notebooks on Google Colab](https://github.com/norse/notebooks/tree/master/).
Alternatively, [you can install Norse locally](#installation) and run one of the [included tasks](https://norse.github.io/norse/tasks.html) such as [MNIST](https://en.wikipedia.org/wiki/MNIST_database):
```bash
python -m norse.task.mnist
```
## 2. Using Norse
Norse presents plug-and-play components for deep learning with spiking neural networks.
Here, we describe how to install Norse and start to apply it in your own work.
[Read more in our documentation](https://norse.github.io/norse/working.html).
### 2.1. Installation
<a name="installation"></a>
We assume you are using **Python version 3.8+** and have **installed PyTorch version 1.9 or higher**.
[Read more about the prerequisites in our documentation](https://norse.github.io/norse/installing.html).
<table>
<thead>
<tr>
<th>Method</th><th>Instructions</th><th>Prerequisites</th>
</tr>
</thead>
<tr>
<td>From PyPi</td><td><div class="highlight highlight-source-shell"><pre>
pip install norse
</pre></div></td><td><a href="https://pypi.org/" title="PyPi">Pip</a></td>
</tr>
<tr>
<td>From source</td><td><div class="highlight highlight-source-shell"><pre>
pip install -qU git+https://github.com/norse/norse
</pre></div></td><td><a href="https://pypi.org/" title="PyPi">Pip</a>, <a href="https://pytorch.org/get-started/locally/" title="PyTorch">PyTorch</a></td>
</tr>
<tr>
<td>With Docker</td><td><div class="highlight highlight-source-shell"><pre>
docker pull quay.io/norse/norse
</pre></div></td><td><a href="https://www.docker.com/get-started" title="Docker">Docker</a></td>
</tr>
<tr>
<td>From Conda</td><td> <div class="highlight highlight-source-shell"><pre>
conda install -c norse norse
</pre></div></td><td><a href="https://docs.anaconda.com/anaconda/install/" title="Anaconda">Anaconda</a> or <a href="https://docs.conda.io/en/latest/miniconda.html" title="Miniconda">Miniconda</a></td>
</tr>
</table>
For troubleshooting, please refer to our [installation guide](https://norse.github.io/norse/pages/installing.html#installation-troubleshooting), [create an issue on GitHub](https://github.com/norse/norse/issues) or [write us on Discord](https://discord.gg/7fGN359).
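Once installed, a quick way to confirm that both Norse and PyTorch are importable and to check their versions is the small sanity check below (our suggestion for this page, not part of the bundled tasks or official documentation):

```python
# Post-install sanity check (illustrative only).
import importlib.metadata

import torch
import norse  # verifies that the package imports cleanly

print("norse:", importlib.metadata.version("norse"))  # e.g. 1.1.0
print("torch:", torch.__version__)
```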
### 2.2. Running examples
Norse is bundled with a number of example tasks, serving as short, self-contained, correct examples ([SSCCE](http://www.sscce.org/)).
They can be run by invoking the `norse` module from the base directory.
More information and tasks are available [in our documentation](https://norse.github.io/norse/tasks.html) and in your console by typing: `python -m norse.task.<task> --help`, where `<task>` is one of the task names.
- To train an MNIST classification network, invoke
```bash
python -m norse.task.mnist
```
- To train a CIFAR classification network, invoke
```bash
python -m norse.task.cifar10
```
- To train on the cartpole balancing task with policy gradients, invoke
```bash
python -m norse.task.cartpole
```
Norse is compatible with [PyTorch Lightning](https://pytorch-lightning.readthedocs.io/en/stable/),
as demonstrated in the [PyTorch Lightning MNIST task variant](https://github.com/norse/norse/blob/master/norse/task/mnist_pl.py) (requires PyTorch Lightning):
```bash
python -m norse.task.mnist_pl --gpus=4
```
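For orientation, the sketch below shows the general pattern of wrapping a stateful Norse model in a PyTorch Lightning module. This is our illustration of the idea, not the code of the bundled `mnist_pl` task; the `SpikingClassifier` name and the layer sizes are made up for the example.

```python
import torch
import torch.nn as nn
import pytorch_lightning as pl
from norse.torch import LIFCell, LICell, SequentialState


class SpikingClassifier(pl.LightningModule):
    """Hypothetical example: a stateful Norse model trained with PyTorch Lightning."""

    def __init__(self):
        super().__init__()
        self.model = SequentialState(
            nn.Flatten(),
            nn.Linear(28 * 28, 100),
            LIFCell(),   # spiking hidden layer
            nn.Linear(100, 10),
            LICell(),    # non-spiking readout whose output we treat as logits
        )

    def training_step(self, batch, batch_idx):
        x, y = batch
        logits, _state = self.model(x)  # SequentialState returns (output, state)
        loss = nn.functional.cross_entropy(logits, y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)
```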
### 2.3. Example: Spiking convolutional classifier
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/norse/notebooks/blob/master/mnist_classifiers.ipynb)
This classifier is taken from our [tutorial on training a spiking MNIST classifier](https://github.com/norse/notebooks#level-intermediate) and achieves >99% accuracy.
```python
import torch, torch.nn as nn
from norse.torch import LICell # Leaky integrator
from norse.torch import LIFCell # Leaky integrate-and-fire
from norse.torch import SequentialState # Stateful sequential layers
model = SequentialState(
    nn.Conv2d(1, 20, 5, 1),  # Convolve from 1 -> 20 channels
    LIFCell(),               # Spiking activation layer
    nn.MaxPool2d(2, 2),
    nn.Conv2d(20, 50, 5, 1), # Convolve from 20 -> 50 channels
    LIFCell(),
    nn.MaxPool2d(2, 2),
    nn.Flatten(),            # Flatten to 800 units
    nn.Linear(800, 10),
    LICell(),                # Non-spiking integrator layer
)
data = torch.randn(8, 1, 28, 28) # 8 batches, 1 channel, 28x28 pixels
output, state = model(data) # Provides a tuple (tensor (8, 10), neuron state)
```
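Training such a stateful model uses an ordinary PyTorch loop. As a rough, illustrative sketch (not taken from the tutorial), the output of the final `LICell` can be treated as class logits and optimized with a standard loss, with gradients flowing back through the spiking layers:

```python
# Illustrative single training step for the `model` defined above.
import torch
import torch.nn.functional as F

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

images = torch.randn(8, 1, 28, 28)   # stand-in for a batch of MNIST images
labels = torch.randint(0, 10, (8,))  # stand-in labels

optimizer.zero_grad()
logits, _state = model(images)       # (8, 10) membrane voltages from the LICell
loss = F.cross_entropy(logits, labels)
loss.backward()                      # backpropagates through the spiking layers
optimizer.step()
```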
### 2.4. Example: Long short-term spiking neural networks
The long short-term spiking neural networks from the paper by [G. Bellec, D. Salaj, A. Subramoney, R. Legenstein, and W. Maass (2018)](https://arxiv.org/abs/1803.09574) are another interesting way to apply Norse:
```python
import torch
from norse.torch import LSNNRecurrent
# Recurrent LSNN network with 2 input neurons and 10 output neurons
layer = LSNNRecurrent(2, 10)
# Generate data: 20 timesteps with 8 datapoints per batch for 2 neurons
data = torch.zeros(20, 8, 2)
# Tuple of (output spikes of shape (20, 8, 2), layer state)
output, new_state = layer(data)
```
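To drive the layer with actual spikes rather than zeros, any binary tensor in the same (time, batch, features) layout works. The sketch below is our illustration; it also assumes, based on the `(output, state)` pattern above, that the returned state can be passed back in to continue a longer sequence in chunks:

```python
import torch
from norse.torch import LSNNRecurrent

layer = LSNNRecurrent(2, 10)

# Random Bernoulli spike train: 20 timesteps, batch of 8, 2 input neurons
spikes = (torch.rand(20, 8, 2) < 0.3).float()
out1, state = layer(spikes)  # out1 has shape (20, 8, 10)

# Assumed usage: continue the same sequence by passing the state back in
more_spikes = (torch.rand(20, 8, 2) < 0.3).float()
out2, state = layer(more_spikes, state)
```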
## 3. Why Norse?
Norse was created for two reasons: 1) to apply findings from decades of research in practical settings,
and 2) to accelerate our own research within bio-inspired learning.
We are passionate about Norse: we strive to follow best practices and promise to maintain this library for the
simple reason that we depend on it ourselves.
We have implemented a number of neuron models, synapse dynamics, encoding and decoding algorithms,
dataset integrations, tasks, and examples.
Combined with the PyTorch infrastructure and our high coding standards, we have found Norse to be an excellent tool for modelling *scalable* experiments, and [Norse is actively being used in research](https://norse.github.io/norse/papers.html).
Finally, we are working to keep Norse as performant as possible.
Preliminary benchmarks suggest that Norse [achieves excellent performance on small networks of up to ~5000 neurons per layer](https://github.com/norse/norse/tree/master/norse/benchmark).
Aided by the preexisting investment in scalable training and inference with PyTorch, Norse scales from a single laptop to several nodes on an HPC cluster with little effort, as illustrated by our [PyTorch Lightning example task](https://norse.github.io/norse/tasks.html#mnist-in-pytorch-lightning).
[Read more about Norse in our documentation](https://norse.github.io/norse/about.html).
## 4. Similar work
We refer to the [Neuromorphic Software Guide](https://open-neuromorphic.org/neuromorphic-computing/software/) for a comprehensive list of software for neuromorphic computing.
## 5. Contributing
Contributions are warmly encouraged and always welcome. However, we also have high expectations for the code base, so if you wish to contribute, please refer to our [contribution guidelines](contributing.md).
## 6. Credits
Norse is created by
* [Christian Pehle](https://www.kip.uni-heidelberg.de/people/10110) (@GitHub [cpehle](https://github.com/cpehle/)), PostDoc at University of Heidelberg, Germany.
* [Jens E. Pedersen](https://www.kth.se/profile/jeped) (@GitHub [jegp](https://github.com/jegp/)), doctoral student at KTH Royal Institute of Technology, Sweden.
More information about Norse can be found [in our documentation](https://norse.github.io/norse/about.html). The research has received funding from the EC Horizon 2020 Framework Programme under Grant Agreements 785907 and 945539 (HBP) and by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy EXC 2181/1 - 390900948 (the Heidelberg STRUCTURES Excellence Cluster).
## 7. Citation
If you use Norse in your work, please cite it as follows:
```BibTex
@software{norse2021,
  author    = {Pehle, Christian and
               Pedersen, Jens Egholm},
  title     = {{Norse - A deep learning library for spiking
                neural networks}},
  month     = jan,
  year      = 2021,
  note      = {Documentation: https://norse.ai/docs/},
  publisher = {Zenodo},
  version   = {0.0.7},
  doi       = {10.5281/zenodo.4422025},
  url       = {https://doi.org/10.5281/zenodo.4422025}
}
```
Norse is actively applied and cited in the literature. We refer to [Google Scholar](https://scholar.google.com/citations?view_op=view_citation&hl=de&user=A4VwxccAAAAJ&citation_for_view=A4VwxccAAAAJ:2osOgNQ5qMEC) or [Semantic Scholar](https://www.semanticscholar.org/paper/Norse-A-deep-learning-library-for-spiking-neural-Pehle-Pedersen/bdd21dfe8c4a503365a49bfdb099e63c74823c7c) for a list of citations.
## 8. License
LGPLv3. See [LICENSE](LICENSE) for license details.
Raw data
{
"_id": null,
"home_page": "",
"name": "norse",
"maintainer": "",
"docs_url": null,
"requires_python": ">=3.8",
"maintainer_email": "\"Jens E. Pedersen\" <jens@jepedersen.dk>",
"keywords": "spiking neural networks,deep learning,neural networks,machine learning",
"author": "",
"author_email": "\"Jens E. Pedersen\" <jens@jepedersen.dk>, Christian Pehle <christian.pehle@gmail.com>",
"download_url": "https://files.pythonhosted.org/packages/be/33/b804ae9fe4f141abf31035a2c9553737729e82b8f7df8e7fc9ad7c1d600c/norse-1.1.0.tar.gz",
"platform": null,
"description": "<p align=\"center\">\n<img src=\"https://raw.githubusercontent.com/norse/norse/master/logo.png\">\n</p>\n\nA [deep learning](https://en.wikipedia.org/wiki/Deep_learning) library for [spiking neural networks](https://en.wikipedia.org/wiki/Spiking_neural_network).\n\n<p align=\"center\">\n <a href=\"https://github.com/norse/norse/actions\">\n <img src=\"https://github.com/norse/norse/workflows/Build%20Python/badge.svg\" alt=\"Test status\"></a>\n <a href=\"https://pypi.org/project/norse/\" alt=\"PyPi\">\n <img src=\"https://img.shields.io/pypi/v/norse\" />\n </a>\n <a href=\"https://pypi.org/project/aestream/\" alt=\"PyPi\">\n <img src=\"https://img.shields.io/pypi/dm/aestream\" />\n </a>\n <a href=\"https://github.com/norse/norse/pulse\" alt=\"Activity\">\n <img src=\"https://img.shields.io/github/last-commit/norse/norse\" />\n </a>\n <a href=\"https://discord.gg/7fGN359\">\n <img src=\"https://img.shields.io/discord/723215296399147089\"\n alt=\"chat on Discord\"></a>\n <a href=\"https://www.codacy.com/gh/norse/norse?utm_source=github.com&utm_medium=referral&utm_content=norse/norse&utm_campaign=Badge_Grade\"><img src=\"https://app.codacy.com/project/badge/Grade/a9ab846fc6114afda4320badcb8a69c2\"/></a>\n <a href=\"https://codecov.io/gh/norse/norse\"><img src=\"https://codecov.io/gh/norse/norse/branch/master/graph/badge.svg\" /></a>\n <a href=\"https://doi.org/10.5281/zenodo.4422025\"><img src=\"https://zenodo.org/badge/DOI/10.5281/zenodo.4422025.svg\" alt=\"DOI\"></a>\n</p>\n\nNorse aims to exploit the advantages of bio-inspired neural components, which are sparse and event-driven - a fundamental difference from artificial neural networks.\nNorse expands [PyTorch](https://pytorch.org/) with primitives for bio-inspired neural components, \nbringing you two advantages: a modern and proven infrastructure based on PyTorch and deep learning-compatible spiking neural network components.\n\n**Documentation**: [norse.github.io/norse/](https://norse.github.io/norse/)\n\n## 1. Getting started\n\nThe fastest way to try Norse is via the [jupyter notebooks on Google collab](https://github.com/norse/notebooks/tree/master/).\n\nAlternatively, [you can install Norse locally](#installation) and run one of the [included tasks](https://norse.github.io/norse/tasks.html) such as [MNIST](https://en.wikipedia.org/wiki/MNIST_database):\n```bash\npython -m norse.task.mnist\n```\n\n## 2. Using Norse\n\nNorse presents plug-and-play components for deep learning with spiking neural networks.\nHere, we describe how to install Norse and start to apply it in your own work.\n[Read more in our documentation](https://norse.github.io/norse/working.html).\n\n### 2.1. Installation\n<a name=\"installation\"></a>\n\nWe assume you are using **Python version 3.8+** and have **installed PyTorch version 1.9 or higher**. 
\n[Read more about the prerequisites in our documentation](https://norse.github.io/norse/installing.html).\n\n<table>\n<thead>\n<tr>\n<th>Method</th><th>Instructions</th><th>Prerequisites</th>\n</tr>\n</thead>\n\n<tr>\n<td>From PyPi</td><td><div class=\"highlight highlight-source-shell\"><pre>\npip install norse\n</pre></div></td><td><a href=\"https://pypi.org/\" title=\"PyPi\">Pip</a></td>\n</tr>\n<tr>\n<td>From source</td><td><div class=\"highlight highlight-source-shell\"><pre>\npip install -qU git+https://github.com/norse/norse\n</pre></div></td><td><a href=\"https://pypi.org/\" title=\"PyPi\">Pip</a>, <a href=\"https://pytorch.org/get-started/locally/\" title=\"PyTorch\">PyTorch</a></td>\n</tr>\n<tr>\n<td>With Docker</td><td><div class=\"highlight highlight-source-shell\"><pre>\ndocker pull quay.io/norse/norse\n</pre></div></td><td><a href=\"https://www.docker.com/get-started\" title=\"Docker\">Docker</a></td>\n</tr>\n<tr>\n<td>From Conda</td><td> <div class=\"highlight highlight-source-shell\"><pre>\nconda install -c norse norse\n</pre></div></td><td><a href=\"https://docs.anaconda.com/anaconda/install/\" title=\"Anaconda\">Anaconda</a> or <a href=\"https://docs.conda.io/en/latest/miniconda.html\" title=\"Miniconda\">Miniconda</a></td>\n</tr>\n</table>\n\nFor troubleshooting, please refer to our [installation guide](https://norse.github.io/norse/pages/installing.html#installation-troubleshooting), [create an issue on GitHub](https://github.com/norse/norse/issues) or [write us on Discord](https://discord.gg/7fGN359).\n\n### 2.2. Running examples\n\nNorse is bundled with a number of example tasks, serving as short, self contained, correct examples ([SSCCE](http://www.sscce.org/)).\nThey can be run by invoking the `norse` module from the base directory.\nMore information and tasks are available [in our documentation](https://norse.github.io/norse/tasks.html) and in your console by typing: `python -m norse.task.<task> --help`, where `<task>` is one of the task names.\n\n- To train an MNIST classification network, invoke\n ```bash\n python -m norse.task.mnist\n ```\n- To train a CIFAR classification network, invoke\n ```bash\n python -m norse.task.cifar10\n ```\n- To train the cartpole balancing task with Policy gradient, invoke\n ```bash\n python -m norse.task.cartpole\n ```\n\nNorse is compatible with [PyTorch Lightning](https://pytorch-lightning.readthedocs.io/en/stable/),\nas demonstrated in the [PyTorch Lightning MNIST task variant](https://github.com/norse/norse/blob/master/norse/task/mnist_pl.py) (requires PyTorch lightning):\n\n```bash\npython -m norse.task.mnist_pl --gpus=4\n```\n\n### 2.3. 
Example: Spiking convolutional classifier \n\n[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/norse/notebooks/blob/master/mnist_classifiers.ipynb)\n\nThis classifier is taken from our [tutorial on training a spiking MNIST classifier](https://github.com/norse/notebooks#level-intermediate) and achieves >99% accuracy.\n\n```python\nimport torch, torch.nn as nn\nfrom norse.torch import LICell # Leaky integrator\nfrom norse.torch import LIFCell # Leaky integrate-and-fire\nfrom norse.torch import SequentialState # Stateful sequential layers\n\nmodel = SequentialState(\n nn.Conv2d(1, 20, 5, 1), # Convolve from 1 -> 20 channels\n LIFCell(), # Spiking activation layer\n nn.MaxPool2d(2, 2),\n nn.Conv2d(20, 50, 5, 1), # Convolve from 20 -> 50 channels\n LIFCell(),\n nn.MaxPool2d(2, 2),\n nn.Flatten(), # Flatten to 800 units\n nn.Linear(800, 10),\n LICell(), # Non-spiking integrator layer\n)\n\ndata = torch.randn(8, 1, 28, 28) # 8 batches, 1 channel, 28x28 pixels\noutput, state = model(data) # Provides a tuple (tensor (8, 10), neuron state)\n```\n\n### 2.4. Example: Long short-term spiking neural networks\nThe long short-term spiking neural networks from the paper by [G. Bellec, D. Salaj, A. Subramoney, R. Legenstein, and W. Maass (2018)](https://arxiv.org/abs/1803.09574) is another interesting way to apply norse: \n```python\nimport torch\nfrom norse.torch import LSNNRecurrent\n# Recurrent LSNN network with 2 input neurons and 10 output neurons\nlayer = LSNNRecurrent(2, 10)\n# Generate data: 20 timesteps with 8 datapoints per batch for 2 neurons\ndata = torch.zeros(20, 8, 2)\n# Tuple of (output spikes of shape (20, 8, 2), layer state)\noutput, new_state = layer(data)\n```\n\n## 3. Why Norse?\n\nNorse was created for two reasons: to 1) apply findings from decades of research in practical settings\nand to 2) accelerate our own research within bio-inspired learning.\n\nWe are passionate about Norse: we strive to follow best practices and promise to maintain this library for the\nsimple reason that we depend on it ourselves.\nWe have implemented a number of neuron models, synapse dynamics, encoding and decoding algorithms, \ndataset integrations, tasks, and examples.\nCombined with the PyTorch infrastructure and our high coding standards, we have found Norse to be an excellent tool for modelling *scaleable* experiments and [Norse is actively being used in research](https://norse.github.io/norse/papers.html).\n\nFinally, we are working to keep Norse as performant as possible. \nPreliminary benchmarks suggest that Norse [achieves excellent performance on small networks of up to ~5000 neurons per layer](https://github.com/norse/norse/tree/master/norse/benchmark). \nAided by the preexisting investment in scalable training and inference with PyTorch, Norse scales from a single laptop to several nodes on an HPC cluster with little effort.\nAs illustrated by our [PyTorch Lightning example task](https://norse.github.io/norse/tasks.html#mnist-in-pytorch-lightning).\n\n\n[Read more about Norse in our documentation](https://norse.github.io/norse/about.html).\n\n## 4. Similar work\n\nWe refer to the [Neuromorphic Software Guide](https://open-neuromorphic.org/neuromorphic-computing/software/) for a comprehensive list of software for neuromorphic computing.\n\n\n## 5. Contributing\n\nContributions are warmly encouraged and always welcome. 
However, we also have high expectations around the code base so if you wish to contribute, please refer to our [contribution guidelines](contributing.md).\n\n## 6. Credits\n\nNorse is created by\n* [Christian Pehle](https://www.kip.uni-heidelberg.de/people/10110) (@GitHub [cpehle](https://github.com/cpehle/)), PostDoc at University of Heidelberg, Germany.\n* [Jens E. Pedersen](https://www.kth.se/profile/jeped) (@GitHub [jegp](https://github.com/jegp/)), doctoral student at KTH Royal Institute of Technology, Sweden.\n\nMore information about Norse can be found [in our documentation](https://norse.github.io/norse/about.html). The research has received funding from the EC Horizon 2020 Framework Programme under Grant Agreements 785907 and 945539 (HBP) and by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy EXC 2181/1 - 390900948 (the Heidelberg STRUCTURES Excellence Cluster).\n\n## 7. Citation\n\nIf you use Norse in your work, please cite it as follows:\n\n```BibTex\n@software{norse2021,\n author = {Pehle, Christian and\n Pedersen, Jens Egholm},\n title = {{Norse - A deep learning library for spiking \n neural networks}},\n month = jan,\n year = 2021,\n note = {Documentation: https://norse.ai/docs/},\n publisher = {Zenodo},\n version = {0.0.7},\n doi = {10.5281/zenodo.4422025},\n url = {https://doi.org/10.5281/zenodo.4422025}\n}\n```\n\nNorse is actively applied and cited in the literature. We refer to [Google Scholar](https://scholar.google.com/citations?view_op=view_citation&hl=de&user=A4VwxccAAAAJ&citation_for_view=A4VwxccAAAAJ:2osOgNQ5qMEC) or [Semantic Scholar](https://www.semanticscholar.org/paper/Norse-A-deep-learning-library-for-spiking-neural-Pehle-Pedersen/bdd21dfe8c4a503365a49bfdb099e63c74823c7c) for a list of citations.\n\n## 8. License\n\nLGPLv3. See [LICENSE](LICENSE) for license details.\n",
"bugtrack_url": null,
"license": "GNU LESSER GENERAL PUBLIC LICENSE Version 3, 29 June 2007 Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/> Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. This version of the GNU Lesser General Public License incorporates the terms and conditions of version 3 of the GNU General Public License, supplemented by the additional permissions listed below. 0. Additional Definitions. As used herein, \"this License\" refers to version 3 of the GNU Lesser General Public License, and the \"GNU GPL\" refers to version 3 of the GNU General Public License. \"The Library\" refers to a covered work governed by this License, other than an Application or a Combined Work as defined below. An \"Application\" is any work that makes use of an interface provided by the Library, but which is not otherwise based on the Library. Defining a subclass of a class defined by the Library is deemed a mode of using an interface provided by the Library. A \"Combined Work\" is a work produced by combining or linking an Application with the Library. The particular version of the Library with which the Combined Work was made is also called the \"Linked Version\". The \"Minimal Corresponding Source\" for a Combined Work means the Corresponding Source for the Combined Work, excluding any source code for portions of the Combined Work that, considered in isolation, are based on the Application, and not on the Linked Version. The \"Corresponding Application Code\" for a Combined Work means the object code and/or source code for the Application, including any data and utility programs needed for reproducing the Combined Work from the Application, but excluding the System Libraries of the Combined Work. 1. Exception to Section 3 of the GNU GPL. You may convey a covered work under sections 3 and 4 of this License without being bound by section 3 of the GNU GPL. 2. Conveying Modified Versions. If you modify a copy of the Library, and, in your modifications, a facility refers to a function or data to be supplied by an Application that uses the facility (other than as an argument passed when the facility is invoked), then you may convey a copy of the modified version: a) under this License, provided that you make a good faith effort to ensure that, in the event an Application does not supply the function or data, the facility still operates, and performs whatever part of its purpose remains meaningful, or b) under the GNU GPL, with none of the additional permissions of this License applicable to that copy. 3. Object Code Incorporating Material from Library Header Files. The object code form of an Application may incorporate material from a header file that is part of the Library. You may convey such object code under terms of your choice, provided that, if the incorporated material is not limited to numerical parameters, data structure layouts and accessors, or small macros, inline functions and templates (ten or fewer lines in length), you do both of the following: a) Give prominent notice with each copy of the object code that the Library is used in it and that the Library and its use are covered by this License. b) Accompany the object code with a copy of the GNU GPL and this license document. 4. Combined Works. 
You may convey a Combined Work under terms of your choice that, taken together, effectively do not restrict modification of the portions of the Library contained in the Combined Work and reverse engineering for debugging such modifications, if you also do each of the following: a) Give prominent notice with each copy of the Combined Work that the Library is used in it and that the Library and its use are covered by this License. b) Accompany the Combined Work with a copy of the GNU GPL and this license document. c) For a Combined Work that displays copyright notices during execution, include the copyright notice for the Library among these notices, as well as a reference directing the user to the copies of the GNU GPL and this license document. d) Do one of the following: 0) Convey the Minimal Corresponding Source under the terms of this License, and the Corresponding Application Code in a form suitable for, and under terms that permit, the user to recombine or relink the Application with a modified version of the Linked Version to produce a modified Combined Work, in the manner specified by section 6 of the GNU GPL for conveying Corresponding Source. 1) Use a suitable shared library mechanism for linking with the Library. A suitable mechanism is one that (a) uses at run time a copy of the Library already present on the user's computer system, and (b) will operate properly with a modified version of the Library that is interface-compatible with the Linked Version. e) Provide Installation Information, but only if you would otherwise be required to provide such information under section 6 of the GNU GPL, and only to the extent that such information is necessary to install and execute a modified version of the Combined Work produced by recombining or relinking the Application with a modified version of the Linked Version. (If you use option 4d0, the Installation Information must accompany the Minimal Corresponding Source and Corresponding Application Code. If you use option 4d1, you must provide the Installation Information in the manner specified by section 6 of the GNU GPL for conveying Corresponding Source.) 5. Combined Libraries. You may place library facilities that are a work based on the Library side by side in a single library together with other library facilities that are not Applications and are not covered by this License, and convey such a combined library under terms of your choice, if you do both of the following: a) Accompany the combined library with a copy of the same work based on the Library, uncombined with any other library facilities, conveyed under the terms of this License. b) Give prominent notice with the combined library that part of it is a work based on the Library, and explaining where to find the accompanying uncombined form of the same work. 6. Revised Versions of the GNU Lesser General Public License. The Free Software Foundation may publish revised and/or new versions of the GNU Lesser General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Library as you received it specifies that a certain numbered version of the GNU Lesser General Public License \"or any later version\" applies to it, you have the option of following the terms and conditions either of that published version or of any later version published by the Free Software Foundation. 
If the Library as you received it does not specify a version number of the GNU Lesser General Public License, you may choose any version of the GNU Lesser General Public License ever published by the Free Software Foundation. If the Library as you received it specifies that a proxy can decide whether future versions of the GNU Lesser General Public License shall apply, that proxy's public statement of acceptance of any version is permanent authorization for you to choose that version for the Library.",
"summary": "A library for deep learning with spiking neural networks",
"version": "1.1.0",
"project_urls": {
"Documentation": "https://norse.github.io/norse",
"Homepage": "https://github.com/norse/norse",
"Issues": "https://github.com/norse/norse/issues"
},
"split_keywords": [
"spiking neural networks",
"deep learning",
"neural networks",
"machine learning"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "be33b804ae9fe4f141abf31035a2c9553737729e82b8f7df8e7fc9ad7c1d600c",
"md5": "b2b4628a55cd75d3f8ff27c5b890acc2",
"sha256": "3f70b8579251316761f7a950680136029fcd9d92bbcc53531366d088a8aaf0c8"
},
"downloads": -1,
"filename": "norse-1.1.0.tar.gz",
"has_sig": false,
"md5_digest": "b2b4628a55cd75d3f8ff27c5b890acc2",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.8",
"size": 1501802,
"upload_time": "2024-03-18T22:39:51",
"upload_time_iso_8601": "2024-03-18T22:39:51.089283Z",
"url": "https://files.pythonhosted.org/packages/be/33/b804ae9fe4f141abf31035a2c9553737729e82b8f7df8e7fc9ad7c1d600c/norse-1.1.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-03-18 22:39:51",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "norse",
"github_project": "norse",
"travis_ci": false,
"coveralls": true,
"github_actions": true,
"lcname": "norse"
}