DOI: 10.1145/2851613.2851724

Learning to be efficient: algorithms for training low-latency, low-compute deep spiking neural networks

Published: 04 April 2016

Abstract

Recent advances have allowed Deep Spiking Neural Networks (SNNs) to perform at the same accuracy levels as Artificial Neural Networks (ANNs), but have also highlighted a unique property of SNNs: whereas in ANNs, every neuron needs to update once before an output can be created, the computational effort in an SNN depends on the number of spikes created in the network. While higher spike rates and longer computing times typically improve classification performance, very good results can already be achieved earlier. Here we investigate how Deep SNNs can be optimized to reach desired high accuracy levels as quickly as possible. Different approaches are compared which either minimize the number of spikes created, or aim at rapid classification by enforcing the learning of feature detectors that respond to few input spikes. A variety of networks with different optimization approaches are trained on the MNIST benchmark to perform at an accuracy level of at least 98%, while monitoring the average number of input spikes and spikes created within the network to reach this level of accuracy. The majority of SNNs required significantly fewer computations than frame-based ANN approaches. The most efficient SNN achieves an answer in less than 42% of the computational steps necessary for the ANN, and the fastest SNN requires only 25% of the original number of input spikes to achieve equal classification accuracy. Our results suggest that SNNs can be optimized to dramatically decrease the latency as well as the computation requirements for Deep Neural Networks, making them particularly attractive for applications like robotics, where real-time restrictions to produce outputs and low energy budgets are common.
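
The bookkeeping the abstract describes can be made concrete with a small simulation. The sketch below is not the paper's code: the layer sizes, firing threshold, Bernoulli spike encoding, and random placeholder weights are all assumptions made for illustration (in the paper the weights would come from a trained ANN). It runs a converted-style integrate-and-fire network over discrete time steps, counts input spikes and spikes created inside the network, and records the running classification at every step.

```python
# Minimal sketch (assumptions, not the paper's implementation) of counting
# input spikes and network spikes while tracking the running classification.
import numpy as np

rng = np.random.default_rng(0)

sizes = [784, 1200, 1200, 10]                      # hypothetical MNIST-style MLP
weights = [rng.normal(0.0, 0.05, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]

def run_snn(x, weights, n_steps=100, max_rate=0.5, threshold=1.0):
    """Simulate non-leaky integrate-and-fire layers on spike-encoded input x.

    Returns per-step predictions plus cumulative input and network spike
    counts, i.e. the quantities the abstract says were monitored.
    """
    potentials = [np.zeros(w.shape[1]) for w in weights]  # membrane potentials
    out_counts = np.zeros(weights[-1].shape[1])           # accumulated output spikes
    input_spikes, network_spikes = 0.0, 0.0
    predictions = []

    for _ in range(n_steps):
        # Bernoulli spike encoding of pixel intensities (x in [0, 1]).
        spikes = (rng.random(x.shape[0]) < max_rate * x).astype(float)
        input_spikes += spikes.sum()

        for w, v in zip(weights, potentials):
            v += spikes @ w                        # integrate weighted input spikes
            spikes = (v >= threshold).astype(float)
            v[spikes > 0] -= threshold             # reset by subtraction
            network_spikes += spikes.sum()

        out_counts += spikes                       # spikes of the last (output) layer
        predictions.append(int(out_counts.argmax()))

    return predictions, input_spikes, network_spikes

# Toy input standing in for a normalized MNIST digit.
x = rng.random(784)
preds, n_in, n_net = run_snn(x, weights)
print(f"prediction {preds[-1]}, input spikes {n_in:.0f}, network spikes {n_net:.0f}")
```

From the per-step predictions one can read off how many input and network spikes were consumed before the answer stopped changing, which is the latency/compute trade-off the paper optimizes; an approach that minimizes spikes would, for example, penalize such counts (or the corresponding ANN activations before conversion) during training.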



Published In

SAC '16: Proceedings of the 31st Annual ACM Symposium on Applied Computing
April 2016
2360 pages
ISBN:9781450337397
DOI:10.1145/2851613
This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 04 April 2016


Author Tags

  1. neural network efficiency
  2. spiking deep neural networks
  3. spiking neural network latency

Qualifiers

  • Research-article

Conference

SAC 2016: Symposium on Applied Computing
April 4 - 8, 2016
Pisa, Italy

Acceptance Rates

SAC '16 Paper Acceptance Rate 252 of 1,047 submissions, 24%;
Overall Acceptance Rate 1,650 of 6,669 submissions, 25%


Article Metrics

  • Downloads (Last 12 months): 61
  • Downloads (Last 6 weeks): 6
Reflects downloads up to 14 Feb 2025

