R.I.P.
👻
Ghosted
Uncovering the Representation of Spiking Neural Networks Trained with Surrogate Gradient
April 25, 2023 · Entered Twilight · Trans. Mach. Learn. Res.
Repo contents: .gitignore, LICENSE, README.md, data_loaders.py, functions.py, main_ann.py, main_cka.py, main_snn.py, models
Authors
Yuhang Li, Youngeun Kim, Hyoungseob Park, Priyadarshini Panda
arXiv ID
2304.13098
Category
cs.LG: Machine Learning
Cross-listed
cs.NE
Citations
19
Venue
Trans. Mach. Learn. Res.
Repository
https://github.com/Intelligent-Computing-Lab-Yale/SNNCKA
⭐ 14
Last Checked
2 months ago
Abstract
Spiking Neural Networks (SNNs) are recognized as a candidate for the next generation of neural networks due to their bio-plausibility and energy efficiency. Recently, researchers have demonstrated that SNNs can achieve nearly state-of-the-art performance on image recognition tasks using surrogate gradient training. However, some essential questions about SNNs remain little studied: Do SNNs trained with surrogate gradients learn different representations from traditional Artificial Neural Networks (ANNs)? Does the time dimension in SNNs provide unique representation power? In this paper, we aim to answer these questions by conducting a representation similarity analysis between SNNs and ANNs using Centered Kernel Alignment (CKA). We start by analyzing the spatial dimension of the networks, including both width and depth. Furthermore, our analysis of residual connections shows that SNNs learn a periodic pattern, which rectifies the representations in SNNs to be ANN-like. We additionally investigate the effect of the time dimension on SNN representations, finding that deeper layers encourage more dynamics along the time dimension. We also investigate the impact of input data such as event-stream data and adversarial attacks. Our work uncovers a host of new findings about representations in SNNs. We hope this work will inspire future research to fully comprehend the representation power of SNNs. Code is released at https://github.com/Intelligent-Computing-Lab-Yale/SNNCKA.
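The linear variant of CKA used for this kind of representation similarity analysis can be sketched in a few lines of NumPy. This is a generic sketch of linear CKA on two feature matrices (examples × features), not necessarily the exact implementation in the repo's `main_cka.py`; the function name `linear_cka` is an assumption for illustration.

```python
import numpy as np

def linear_cka(X, Y):
    """Linear Centered Kernel Alignment between two representations.

    X, Y: arrays of shape (n_examples, n_features); the two networks'
    activations for the same n_examples inputs (feature widths may differ).
    Returns a similarity in [0, 1]; 1 means identical up to an
    orthogonal transform and isotropic scaling.
    """
    # Center each feature (column) so CKA compares covariance structure.
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    # HSIC-based formula: ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    numerator = np.linalg.norm(Y.T @ X, ord='fro') ** 2
    denominator = (np.linalg.norm(X.T @ X, ord='fro')
                   * np.linalg.norm(Y.T @ Y, ord='fro'))
    return numerator / denominator
```

In practice one would stack activations from a layer of the SNN (e.g. averaged over time steps) and the corresponding ANN layer over a batch of images, then evaluate `linear_cka` for every layer pair to produce a similarity heatmap.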
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · Machine Learning
XGBoost: A Scalable Tree Boosting System
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Semi-Supervised Classification with Graph Convolutional Networks
Proximal Policy Optimization Algorithms