A deep learning library for spiking neural networks.

Norse aims to exploit the advantages of bio-inspired neural components, which are sparse and event-driven, a fundamental difference from artificial neural networks. Norse extends PyTorch with primitives for bio-inspired neural components, bringing you two advantages: a modern and proven infrastructure based on PyTorch, and deep learning-compatible spiking neural network components.

Documentation: norse.github.io/norse/

1. Getting started

The fastest way to try Norse is via the Jupyter notebooks on Google Colab.

Alternatively, you can install Norse locally and run one of the included tasks such as MNIST:

python -m norse.task.mnist

2. Using Norse

Norse presents plug-and-play components for deep learning with spiking neural networks. Here, we describe how to install Norse and start to apply it in your own work. Read more in our documentation.

2.1. Installation

We assume you are using Python version 3.8+ and have installed PyTorch version 1.9 or higher. Read more about the prerequisites in our documentation.

Method       | Instructions                                        | Prerequisites
------------ | --------------------------------------------------- | ----------------------
From PyPI    | pip install norse                                   | pip
From source  | pip install -qU git+https://github.com/norse/norse  | pip, PyTorch
With Docker  | docker pull quay.io/norse/norse                     | Docker
From Conda   | conda install -c norse norse                        | Anaconda or Miniconda

For troubleshooting, please refer to our installation guide, create an issue on GitHub, or write to us on Discord.
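
Once installed, a quick sanity check verifies that Norse and PyTorch work together. The following is a minimal sketch using the LIFCell module from norse.torch; the import alias and tensor sizes are arbitrary choices for illustration:

import torch
import norse.torch as snn

cell = snn.LIFCell()                    # Leaky integrate-and-fire neurons with default parameters
spikes, state = cell(torch.ones(1, 4))  # One timestep for a batch of 1 with 4 neurons
print(spikes.shape)                     # Expected: torch.Size([1, 4])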

2.2. Running examples

Norse is bundled with a number of example tasks, serving as short, self-contained, correct examples (SSCCE). They can be run by invoking the norse module from the base directory. More information and tasks are available in our documentation and in your console by typing: python -m norse.task.<task> --help, where <task> is one of the task names.

  • To train an MNIST classification network, invoke
    python -m norse.task.mnist
    
  • To train a CIFAR classification network, invoke
    python -m norse.task.cifar10
    
  • To train the cartpole balancing task with policy gradients, invoke
    python -m norse.task.cartpole
    

Norse is compatible with PyTorch Lightning, as demonstrated in the PyTorch Lightning MNIST task variant (requires PyTorch Lightning):

python -m norse.task.mnist_pl --gpus=4

2.3. Example: Spiking convolutional classifier

This classifier is taken from our tutorial on training a spiking MNIST classifier and achieves >99% accuracy.

import torch, torch.nn as nn
from norse.torch import LICell             # Leaky integrator
from norse.torch import LIFCell            # Leaky integrate-and-fire
from norse.torch import SequentialState    # Stateful sequential layers

model = SequentialState(
    nn.Conv2d(1, 20, 5, 1),      # Convolve from 1 -> 20 channels
    LIFCell(),                   # Spiking activation layer
    nn.MaxPool2d(2, 2),
    nn.Conv2d(20, 50, 5, 1),     # Convolve from 20 -> 50 channels
    LIFCell(),
    nn.MaxPool2d(2, 2),
    nn.Flatten(),                # Flatten to 800 units
    nn.Linear(800, 10),
    LICell(),                    # Non-spiking integrator layer
)

data = torch.randn(8, 1, 28, 28) # A batch of 8 examples, 1 channel, 28x28 pixels
output, state = model(data)      # Provides a tuple (tensor (8, 10), neuron state)
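
In practice, spiking networks are simulated over several timesteps. Below is a minimal sketch (not the tutorial's exact training code) that unrolls the same model over time, carries the neuron state between steps, and decodes the class from the maximum readout voltage; the timestep count T is an arbitrary choice:

T = 32                               # Number of simulation timesteps (arbitrary choice)
state = None                         # Let the model initialise its neuron state on the first call
readouts = []
for _ in range(T):
    out, state = model(data, state)  # Feed the same static input, carry the evolving state
    readouts.append(out)
# Decode: maximum readout voltage over time, then the most active class per example
prediction = torch.stack(readouts).max(dim=0).values.argmax(dim=-1)
print(prediction.shape)              # Expected: torch.Size([8]), one class index per example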

2.4. Example: Long short-term spiking neural networks

The long short-term spiking neural network (LSNN) from the paper by G. Bellec, D. Salaj, A. Subramoney, R. Legenstein, and W. Maass (2018) is another interesting way to apply Norse:

import torch
from norse.torch import LSNNRecurrent
# Recurrent LSNN network with 2 input neurons and 10 output neurons
layer = LSNNRecurrent(2, 10)
# Generate data: 20 timesteps, batch size of 8, 2 input neurons
data  = torch.zeros(20, 8, 2)
# Tuple of (output spikes of shape (20, 8, 10), layer state)
output, new_state = layer(data)
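
Real-valued inputs are typically spike-encoded before reaching such a layer. The sketch below uses a rate (Poisson) encoder; the PoissonEncoder module and its seq_length/f_max parameters are assumptions based on the Norse documentation, so please verify them against the API reference for your version:

from norse.torch import PoissonEncoder

encoder = PoissonEncoder(seq_length=20, f_max=100)  # 20 timesteps, peak firing rate of 100 Hz (assumed API)
analog = torch.rand(8, 2)                           # 8 batch entries with 2 analogue values each
spikes = encoder(analog)                            # Binary spike tensor of shape (20, 8, 2)
output, new_state = layer(spikes)                   # Same recurrent LSNN layer as above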

3. Why Norse?

Norse was created for two reasons: 1) to apply findings from decades of research in practical settings, and 2) to accelerate our own research within bio-inspired learning.

We are passionate about Norse: we strive to follow best practices and promise to maintain this library for the simple reason that we depend on it ourselves. We have implemented a number of neuron models, synapse dynamics, encoding and decoding algorithms, dataset integrations, tasks, and examples. Combined with the PyTorch infrastructure and our high coding standards, we have found Norse to be an excellent tool for modelling scalable experiments, and Norse is actively being used in research.

Finally, we are working to keep Norse as performant as possible. Preliminary benchmarks suggest that Norse achieves excellent performance on small networks of up to ~5000 neurons per layer. Aided by the preexisting investment in scalable training and inference with PyTorch, Norse scales from a single laptop to several nodes on an HPC cluster with little effort, as illustrated by our PyTorch Lightning example task.

Read more about Norse in our documentation.

4. Similar work

We refer to the Neuromorphic Software Guide for a comprehensive list of software for neuromorphic computing.

5. Contributing

Contributions are warmly encouraged and always welcome. However, we also have high expectations around the code base, so if you wish to contribute, please refer to our contribution guidelines.

6. Credits

Norse is created by Christian Pehle and Jens Egholm Pedersen.

More information about Norse can be found in our documentation. The research has received funding from the EC Horizon 2020 Framework Programme under Grant Agreements 785907 and 945539 (HBP) and by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy EXC 2181/1 - 390900948 (the Heidelberg STRUCTURES Excellence Cluster).

7. Citation

If you use Norse in your work, please cite it as follows:

@software{norse2021,
  author       = {Pehle, Christian and
                  Pedersen, Jens Egholm},
  title        = {{Norse -  A deep learning library for spiking 
                   neural networks}},
  month        = jan,
  year         = 2021,
  note         = {Documentation: https://norse.ai/docs/},
  publisher    = {Zenodo},
  version      = {0.0.7},
  doi          = {10.5281/zenodo.4422025},
  url          = {https://doi.org/10.5281/zenodo.4422025}
}

Norse is actively applied and cited in the literature. We refer to Google Scholar or Semantic Scholar for a list of citations.

8. License

LGPLv3. See LICENSE for license details.
