On Spiking Neural Networks

Vladimir Evgrafov, Eugene Ilyushin

Abstract


Over the past few years, deep learning methods have made significant progress and become widespread tools for solving various cognitive tasks. To leverage the power of deep learning everywhere, it must be deployed not only on large-scale computing systems but also on peripheral (edge) devices. However, the ever-growing complexity of deep neural networks, coupled with a dramatic increase in the amount of data processed, places significant energy demands on modern computing platforms. The neuromorphic computing model assumes that computations are performed in a biologically plausible way. Spiking neural networks, a part of neuromorphic computing, are among the leading candidates for overcoming the limitations of conventional neural computing and for applying machine learning algorithms effectively in real-world settings. This paper discusses the biological foundations of spiking neural networks, methods for training and constructing them, as well as software and hardware platforms for their use.
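To make the "biologically plausible" computation concrete: unlike a conventional artificial neuron, a spiking neuron integrates input over time and communicates only through discrete spikes. The sketch below is a minimal leaky integrate-and-fire (LIF) neuron, the simplest model commonly used in spiking networks; the function name, parameter values, and time discretization are illustrative choices, not taken from the paper.

```python
def lif_simulate(input_current, v_rest=0.0, v_thresh=1.0, tau=20.0, dt=1.0):
    """Simulate a leaky integrate-and-fire neuron on a discrete current trace.

    Returns the membrane-potential trace and the spike times (step indices).
    """
    v = v_rest
    potentials, spikes = [], []
    for t, i_in in enumerate(input_current):
        # Leaky integration: the potential decays toward the resting level
        # while accumulating the injected current, with time constant tau.
        v += dt * (-(v - v_rest) + i_in) / tau
        if v >= v_thresh:      # threshold crossing -> emit a spike
            spikes.append(t)
            v = v_rest         # reset the membrane after the spike
        potentials.append(v)
    return potentials, spikes

# A constant suprathreshold current makes the neuron fire periodically.
trace, spike_times = lif_simulate([1.5] * 200)
```

With a constant input the neuron charges up, fires, resets, and repeats, so information is carried by spike timing rather than by a continuous activation value; this event-driven behavior is what neuromorphic hardware exploits to save energy.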


Full Text:

PDF (Russian)





International Journal of Open Information Technologies. ISSN: 2307-8162. Vol. 9, no. 7, 2021.