ABSTRACT
Self-organization is an important feature of the human perceptual system. It is an unsupervised learning process that does not require labeled data. In this paper, we design a novel mixed-signal architecture for training a self-organizing system. A memristor crossbar is utilized for high synaptic weight density and parallel analog operation. The system implements the winner-take-all learning algorithm, and a novel neuron circuit is designed for winning-neuron detection and lateral inhibition. Our experimental results show that the proposed system can self-organize from unlabeled training data.
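The winner-take-all learning loop the abstract describes can be sketched in software: the crossbar's parallel analog multiply-add becomes a matrix-vector product over a conductance (weight) matrix, and the neuron circuit's winner detection becomes an argmax with the losing neurons inhibited (left unchanged). This is a minimal illustrative sketch, not the paper's circuit; the array sizes, learning rate, and pattern choices are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_neurons = 16, 4
# Conductance-like weight matrix, one column per neuron (assumed sizes).
weights = rng.uniform(0.1, 0.9, size=(n_inputs, n_neurons))
lr = 0.2  # illustrative learning rate

def train_step(x, weights, lr=0.2):
    """One winner-take-all update for input vector x."""
    currents = x @ weights              # crossbar: parallel multiply-add
    winner = int(np.argmax(currents))   # neuron circuit: winner detection
    # Hebbian-style update moves only the winner toward the input;
    # lateral inhibition leaves the losing columns unchanged.
    weights[:, winner] += lr * (x - weights[:, winner])
    return winner

# Two distinct unlabeled patterns; with repeated presentation,
# different neurons specialize to each pattern.
p0 = np.concatenate([np.ones(8), np.zeros(8)])
p1 = np.concatenate([np.zeros(8), np.ones(8)])
for _ in range(50):
    train_step(p0, weights, lr)
    train_step(p1, weights, lr)
```

After training, presenting the same pattern repeatedly yields a stable winner, since each win only pulls that neuron's weights closer to the pattern while the inhibited neurons stay put.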
REFERENCES
- A. G. Hashmi and M. H. Lipasti, "Cortical columns: Building blocks for intelligent systems," in Proc. IEEE Symposium on Computational Intelligence for Multimedia Signal and Vision Processing (CIMSVP '09), pp. 21--28, 2009.
- T. Kohonen, "Physiological interpretation of the self-organizing map algorithm," Neural Networks, vol. 6, no. 7, pp. 895--905, 1993.
- P. A. Merolla, J. V. Arthur, R. Alvarez-Icaza, A. S. Cassidy, J. Sawada, F. Akopyan, B. L. Jackson, N. Imam, C. Guo, Y. Nakamura, B. Brezzo, I. Vo, S. K. Esser, R. Appuswamy, B. Taba, A. Amir, M. D. Flickner, W. P. Risk, R. Manohar, and D. S. Modha, "A million spiking-neuron integrated circuit with a scalable communication network and interface," Science, vol. 345, no. 6197, pp. 668--673, 2014.
- Y. Chen, T. Luo, S. Liu, S. Zhang, L. He, J. Wang, L. Li, T. Chen, Z. Xu, N. Sun, and O. Temam, "DaDianNao: A machine-learning supercomputer," in Proc. 47th Annual IEEE/ACM International Symposium on Microarchitecture (MICRO-47), 2014.
- D. Liu, T. Chen, S. Liu, J. Zhou, S. Zhou, O. Temam, X. Feng, X. Zhou, and Y. Chen, "PuDianNao: A polyvalent machine learning accelerator," in Proc. 20th International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS '15), pp. 369--381, 2015.
- L. O. Chua, "Memristor---the missing circuit element," IEEE Transactions on Circuit Theory, vol. 18, no. 5, pp. 507--519, 1971.
- W. Lu, K.-H. Kim, T. Chang, and S. Gaba, "Two-terminal resistive switches (memristors) for memory and logic applications," in Proc. 16th Asia and South Pacific Design Automation Conference (ASP-DAC), 2011.
- F. Alibart, E. Zamanidoost, and D. B. Strukov, "Pattern classification by memristive crossbar circuits using ex situ and in situ training," Nature Communications, 2013.
- M. Prezioso, F. Merrikh-Bayat, B. D. Hoskins, G. C. Adam, K. K. Likharev, and D. B. Strukov, "Training and operation of an integrated neuromorphic network based on metal-oxide memristors," Nature, vol. 521, no. 7550, pp. 61--64, 2015.
- D. Soudry, D. Di Castro, A. Gal, A. Kolodny, and S. Kvatinsky, "Memristor-based multilayer neural networks with online gradient descent training," IEEE Transactions on Neural Networks and Learning Systems, 2015.
- S. Choi, P. Sheridan, and W. D. Lu, "Data clustering using memristor networks," Scientific Reports, vol. 5, 2015.
- R. Długosz, T. Talaśka, W. Pedrycz, and R. Wojtyna, "Realization of the conscience mechanism in CMOS implementation of winner-takes-all self-organizing neural networks," IEEE Transactions on Neural Networks, vol. 21, no. 6, pp. 961--971, 2010.
- T. Kohonen, "The self-organizing map," Proceedings of the IEEE, vol. 78, no. 9, pp. 1464--1480, 1990.
- M. Lemmon and B. V. K. Vijaya Kumar, "Competitive learning with generalized winner-take-all activation," IEEE Transactions on Neural Networks, vol. 3, no. 2, pp. 167--175, 1992.
- S. Yu, "Orientation classification by a winner-take-all network with oxide RRAM based synaptic devices," in Proc. 2014 IEEE International Symposium on Circuits and Systems (ISCAS), Melbourne, VIC, 2014.
- S. N. Truong, K. Van Pham, W. Yang, K. S. Min, Y. Abbas, C. J. Kang, and K. Pedrotti, "Ta2O5-memristor synaptic array with winner-take-all method for neuromorphic pattern matching," Journal of the Korean Physical Society, vol. 69, no. 4, pp. 640--646.
- S. Yu, Y. Wu, and H.-S. P. Wong, "Investigating the switching dynamics and multilevel capability of bipolar metal oxide resistive switching memory," Applied Physics Letters, vol. 98, 103514, 2011.
- C. Yakopcic, T. M. Taha, G. Subramanyam, and R. E. Pino, "Memristor SPICE model and crossbar simulation based on devices with nanosecond switching time," in Proc. IEEE International Joint Conference on Neural Networks (IJCNN), 2013.