
CLC number: TP183

On-line Access: 2018-03-10

Received: 2017-10-31

Revision Accepted: 2018-01-22

Crosschecked: 2018-01-25


ORCID: Peng-ju Ren, http://orcid.org/0000-0003-1163-2014


Frontiers of Information Technology & Electronic Engineering  2018 Vol.19 No.1 P.139-150

http://doi.org/10.1631/FITEE.1700714


A novel spiking neural network of receptive field encoding with groups of neurons decision


Author(s):  Yong-qiang Ma, Zi-ru Wang, Si-yu Yu, Ba-dong Chen, Nan-ning Zheng, Peng-ju Ren

Affiliation(s):  Institute of Artificial Intelligence and Robotics, Xi'an Jiaotong University, Xi'an 710049, China

Corresponding email(s):   musaqiang@stu.xjtu.edu.cn, nnzheng@mail.xjtu.edu.cn, pengjuren@mail.xjtu.edu.cn

Key Words:  Tempotron, Receptive field, Difference of Gaussian (DoG), Flip invariance, Rotation invariance



Yong-qiang Ma, Zi-ru Wang, Si-yu Yu, Ba-dong Chen, Nan-ning Zheng, Peng-ju Ren. A novel spiking neural network of receptive field encoding with groups of neurons decision[J]. Frontiers of Information Technology & Electronic Engineering, 2018, 19(1): 139-150.

@article{Ma2018,
title="A novel spiking neural network of receptive field encoding with groups of neurons decision",
author="Yong-qiang Ma, Zi-ru Wang, Si-yu Yu, Ba-dong Chen, Nan-ning Zheng, Peng-ju Ren",
journal="Frontiers of Information Technology & Electronic Engineering",
volume="19",
number="1",
pages="139-150",
year="2018",
publisher="Zhejiang University Press & Springer",
doi="10.1631/FITEE.1700714"
}

%0 Journal Article
%T A novel spiking neural network of receptive field encoding with groups of neurons decision
%A Yong-qiang Ma
%A Zi-ru Wang
%A Si-yu Yu
%A Ba-dong Chen
%A Nan-ning Zheng
%A Peng-ju Ren
%J Frontiers of Information Technology & Electronic Engineering
%V 19
%N 1
%P 139-150
%@ 2095-9184
%D 2018
%I Zhejiang University Press & Springer
%DOI 10.1631/FITEE.1700714

TY - JOUR
T1 - A novel spiking neural network of receptive field encoding with groups of neurons decision
A1 - Yong-qiang Ma
A1 - Zi-ru Wang
A1 - Si-yu Yu
A1 - Ba-dong Chen
A1 - Nan-ning Zheng
A1 - Peng-ju Ren
JO - Frontiers of Information Technology & Electronic Engineering
VL - 19
IS - 1
SP - 139
EP - 150
SN - 2095-9184
Y1 - 2018
PB - Zhejiang University Press & Springer
DO - 10.1631/FITEE.1700714
ER -


Abstract: 
Human information processing depends mainly on the billions of neurons that constitute a complex neural network, and information is transmitted in the form of neural spikes. In this paper, we propose a spiking neural network (SNN), named MD-SNN, with three key features: (1) using receptive fields to encode images into spike trains; (2) randomly selecting a subset of spikes as the input of each neuron, to model the absolute refractory period of the neuron; (3) using groups of neurons to make decisions. We test MD-SNN on the MNIST data set of handwritten digits, and the results demonstrate that: (1) receptive fields of different sizes significantly influence classification results; (2) considering the neuronal refractory period in the SNN model, increasing the number of neurons in the learning layer greatly reduces the training time, effectively reduces the probability of over-fitting, and improves accuracy by 8.77%; (3) compared with other SNN methods, MD-SNN achieves better classification performance; compared with a convolutional neural network (CNN), MD-SNN maintains flip and rotation invariance (accuracy remains at 90.44% on the test set) and is better suited to small-sample learning (accuracy reaches 80.15% with 1000 training samples, 7.8 times that of the CNN).
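Feature (1) combines the receptive field and difference-of-Gaussians (DoG) ideas listed in the key words. As a rough illustration of how such an encoder can turn an image into spike latencies (a sketch under assumed parameters, not the paper's exact implementation; the kernel size, time window, and latency mapping here are our own choices):

```python
import numpy as np

def dog_kernel(size, sigma_c, sigma_s):
    """Difference-of-Gaussians (DoG) receptive-field kernel:
    a narrow excitatory centre minus a wider inhibitory surround."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx**2 + yy**2
    center = np.exp(-r2 / (2 * sigma_c**2)) / (2 * np.pi * sigma_c**2)
    surround = np.exp(-r2 / (2 * sigma_s**2)) / (2 * np.pi * sigma_s**2)
    return center - surround

def encode_image(img, size=7, sigma_c=1.0, sigma_s=2.0, t_max=100.0):
    """Convolve the image with a DoG kernel and convert each pixel's
    response into a spike latency: stronger response -> earlier spike.
    Pixels with non-positive response stay silent (latency = inf)."""
    k = dog_kernel(size, sigma_c, sigma_s)
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    resp = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            resp[i, j] = np.sum(padded[i:i + size, j:j + size] * k)
    resp = np.maximum(resp, 0.0)          # rectify: only excitatory responses spike
    spikes = np.full(resp.shape, np.inf)
    active = resp > 0
    if not np.any(active):
        return spikes
    spikes[active] = t_max * (1.0 - resp[active] / resp.max())
    return spikes
```

Because each spike time depends only on the local DoG response, flipping or rotating the image permutes the spike trains rather than destroying them, which is consistent with the invariance the abstract reports.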

Chinese title: A multi-neuron-decision spiking neural network based on receptive-field encoding

Summary: Human information processing relies mainly on a complex neural network made up of billions of neurons, with information transmitted through the electrical spikes that neurons emit. This paper proposes a spiking neural network model named MD-SNN with three main features: (1) a receptive-field model encodes images into the corresponding spike trains; (2) a random subset of the spikes in each train is taken as the input of each neuron, simulating the absolute refractory period of biological neurons; (3) groups of neurons jointly decide the output. We test MD-SNN on the MNIST handwritten-digit data set, and the results show that: (1) receptive fields of different sizes significantly affect image classification results; (2) because MD-SNN incorporates the absolute refractory period of biological neurons and the added learning-layer neurons greatly shorten training time, it effectively reduces the probability of over-fitting and improves classification accuracy by 8.77%; (3) MD-SNN classifies images more effectively than other SNN methods; compared with a convolutional neural network (CNN), MD-SNN remains effective when images are flipped or rotated (classification accuracy stays at 90.44% on the test set) and is better suited to small-sample learning (accuracy reaches 80.15% with 1000 training samples, 7.8 times that of the CNN).

Key words: Tempotron; neuron model; receptive field; difference of Gaussians (DoG); image flipping; image rotation
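The Tempotron named in the key words is a supervised rule that trains a leaky integrate-and-fire neuron on spike timing (Gütig and Sompolinsky, 2006). A minimal sketch of one training update, with illustrative time constants and threshold rather than the values used in the paper:

```python
import numpy as np

TAU_M, TAU_S, THETA = 15.0, 3.75, 1.0   # illustrative membrane/synaptic constants, threshold

# Normalise the PSP kernel so its peak value equals 1.
_t_peak = TAU_M * TAU_S / (TAU_M - TAU_S) * np.log(TAU_M / TAU_S)
_V0 = 1.0 / (np.exp(-_t_peak / TAU_M) - np.exp(-_t_peak / TAU_S))

def K(dt):
    """Post-synaptic potential kernel: difference of two exponentials,
    zero for dt <= 0 (causality)."""
    dt = np.asarray(dt, dtype=float)
    out = _V0 * (np.exp(-dt / TAU_M) - np.exp(-dt / TAU_S))
    return np.where(dt > 0, out, 0.0)

def potential(w, spikes, t):
    """Membrane potential V(t) = sum_i w_i * sum_{t_i < t} K(t - t_i)."""
    return sum(wi * K(t - np.asarray(ti)).sum() for wi, ti in zip(w, spikes))

def tempotron_step(w, spikes, target, lr=0.01, t_grid=np.arange(0.0, 100.0, 1.0)):
    """One Tempotron update: locate the time of maximal potential; on an
    error, shift each weight by the summed kernel value at that time
    (potentiate on a miss, depress on a false alarm)."""
    v = np.array([potential(w, spikes, t) for t in t_grid])
    t_max = t_grid[np.argmax(v)]
    fired = bool(v.max() >= THETA)
    if fired == target:
        return w                          # correct decision: no change
    sign = 1.0 if target else -1.0
    grad = np.array([K(t_max - np.asarray(ti)).sum() for ti in spikes])
    return w + lr * sign * grad
```

In MD-SNN's third feature, several such neurons would be trained per class and their outputs pooled into a group decision; the pooling scheme itself is described in the paper, not here.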



