We study the learning problem associated with spiking neural networks. Specifically, we focus on spiking neural networks composed of simple spiking neurons having only positive synaptic weights, equipped with an affine encoder and decoder. These neural networks are shown to depend continuously on their parameters, which facilitates classical covering number-based generalization statements and supports stable gradient-based training. We demonstrate that the positivity of the weights enables a wide range of expressivity results, including rate-optimal approximation of smooth functions and dimension-independent approximation of Barron regular functions. In particular, we show in theory and simulations that affine spiking neural networks are capable of approximating shallow ReLU neural networks. Furthermore, we apply these neural networks to standard machine learning benchmarks and achieve competitive results. Finally, we observe that from a generalization perspective, contrary to feedforward neural networks or previous results for general spiking neural networks, the depth has little to no adverse effect on the generalization capabilities.
Code
We provide implementations and experimental setups for affine SNNs, as reported in the corresponding research article.
The source code is located in affineSNN/; the neuron model is implemented in affineSNN/layers/LinearAffine.py.
To use our code, install the provided affineSNN package via pip by running
pip install -e .
from the folder containing setup.py.
Notebooks with the experiments presented in the publication can be found in the Notebooks folder.
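As a quick orientation, below is a minimal usage sketch of the neuron layer, assuming the file above defines a class named LinearAffine that follows the usual torch.nn.Module convention. The constructor arguments and tensor shapes are illustrative assumptions and may differ from the actual API; please refer to the example notebooks for the exact usage.

# Minimal usage sketch. The class name follows the file path above; the
# constructor arguments (input/output sizes) and the torch.nn.Module-style
# forward call are assumptions for illustration, not the documented API.
import torch
from affineSNN.layers.LinearAffine import LinearAffine

layer = LinearAffine(784, 128)   # hypothetical: 784 input features, 128 outputs
x = torch.randn(32, 784)         # a batch of 32 example inputs
out = layer(x)                   # forward pass through the affine SNN layer
print(out.shape)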
Citation
If you use the provided code or find it helpful for your own work, please cite:
@article{neuman2024stable,
  title={Stable learning using spiking neural networks equipped with affine encoders and decoders},
  author={Neuman, A Martina and Dold, Dominik and Petersen, Philipp Christian},
  journal={arXiv preprint arXiv:2404.04549},
  year={2024}
}
About
This small package implements the affine SNN model and provides several example notebooks for training affine SNNs.