Research article
DOI: 10.1145/2542050.2542087

Initializing reservoirs with exhibitory and inhibitory signals using unsupervised learning techniques

Published: 05 December 2013

Abstract

Reservoir Computing (RC) has been gaining prominence in the Neural Computation community since the early 2000s. An RC model has at least two well-differentiated structures. One is a recurrent part called the reservoir, which expands the input data and its history into a high-dimensional space; this projection is carried out to enhance the linear separability of the input data. The other is a memory-less structure designed to make the learning process robust and fast. RC models are an alternative to Turing Machines and Recurrent Neural Networks for modeling cognitive processing in the neural system. Additionally, they are interesting Machine Learning tools for Time Series Modeling and Forecasting.
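
To make the two-part structure concrete, here is a minimal Python sketch of a generic echo-state-style reservoir with a linear readout trained by ridge regression. It illustrates the RC idea described above, not the ESQN model studied in the paper; the dimensions, the 0.9 spectral-radius scaling, the sine-wave task, and the names W_in, W_res, and W_out are assumptions made only for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not taken from the paper).
n_in, n_res, T = 1, 100, 500

# Fixed random input and recurrent weights; the recurrent matrix is rescaled
# so that its spectral radius is below 1, a common echo-state heuristic.
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W_res = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))

# Toy one-step-ahead prediction task on a sine wave.
u = np.sin(0.2 * np.arange(T + 1)).reshape(-1, 1)

# Reservoir: nonlinear expansion of the input and its history into a
# high-dimensional state space.
X = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W_in @ u[t] + W_res @ x)
    X[t] = x

# Memory-less readout: a linear map fitted by ridge regression, the fast and
# robust learning stage of an RC model.
y = u[1:T + 1]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
print("training MSE:", np.mean((X @ W_out - y) ** 2))
```

Only W_out is learned; the reservoir weights stay fixed, which is why the quality of their initialization matters.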
Recently, a new RC model was introduced under the name of Echo State Queueing Networks (ESQN). In this model the reservoir is a dynamical system that arises from Queueing Theory. The initialization of the reservoir parameters may influence the model's performance. Recently, some unsupervised techniques were used to improve the performance of one specific RC method. In this paper, we apply these techniques to set the reservoir parameters of the ESQN model. In particular, we study the initialization of the ESQN model using Self-Organizing Maps. Additionally, we test the model's performance when the reservoir is initialized with Hebbian rules. We present an empirical comparison of these reservoir initializations on a range of time series benchmarks.
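
The two unsupervised initialization strategies mentioned above can be sketched as follows, again only as a hedged illustration: the one-dimensional SOM lattice, the decay schedules, the delay-embedded toy series, the learning rate eta, and the names som_w, W_in_som, and W_in_hebb are hypothetical choices for this example and do not reproduce the authors' exact procedure, which targets the queueing-based ESQN reservoir.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: a reservoir of 100 units fed by a 3-step delay
# embedding of a scalar time series.
n_res, n_in = 100, 3
series = np.sin(0.2 * np.arange(2003)) + 0.1 * rng.normal(size=2003)
inputs = np.column_stack([series[i:i + 2000] for i in range(n_in)])

# Self-Organizing Map initialization (sketch): each reservoir unit owns one
# codebook vector on a 1-D lattice; after training, the codebook rows serve
# as that unit's input weights, so units become tuned to regions of the
# input space.
som_w = rng.uniform(-0.1, 0.1, size=(n_res, n_in))
for epoch in range(5):
    lr = 0.5 * (1 - epoch / 5)                       # decaying learning rate
    sigma = max(1.0, 25.0 * (1 - epoch / 5))         # decaying neighbourhood width
    for u in inputs:
        bmu = np.argmin(np.linalg.norm(som_w - u, axis=1))  # best-matching unit
        dist = np.abs(np.arange(n_res) - bmu)                # lattice distance
        h = np.exp(-dist ** 2 / (2 * sigma ** 2))            # neighbourhood function
        som_w += lr * h[:, None] * (u - som_w)
W_in_som = som_w

# Hebbian initialization (sketch) using Oja's normalized Hebbian rule, which
# drives each unit's input-weight vector toward the leading principal
# direction of the inputs while keeping its norm bounded.
W_in_hebb = rng.uniform(-0.1, 0.1, size=(n_res, n_in))
eta = 0.01
for u in inputs:
    y = W_in_hebb @ u                                        # unit activations
    W_in_hebb += eta * (np.outer(y, u) - (y ** 2)[:, None] * W_in_hebb)
```

Either W_in_som or W_in_hebb would then replace a purely random input-weight matrix before the usual supervised training of the readout.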



Published In

SoICT '13: Proceedings of the 4th Symposium on Information and Communication Technology
December 2013
345 pages
ISBN:9781450324540
DOI:10.1145/2542050

Sponsors

  • SOICT: School of Information and Communication Technology - HUST
  • NAFOSTED: The National Foundation for Science and Technology Development
  • ACM Vietnam Chapter
  • Danang University of Technology

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. neural computation
  2. reservoir computing
  3. time series benchmarks
  4. unsupervised learning

Qualifiers

  • Research-article

Funding Sources

  • European Regional Development Fund in the IT4Innovations Centre of Excellence project
  • SGS in VSB-Technical University of Ostrava, Czech Republic
  • European Social Fund

Conference

SoICT '13
Sponsor:
  • SOICT
  • NAFOSTED
  • ACM Vietnam Chapter
  • Danang Univ. of Technol.

Acceptance Rates

SoICT '13 Paper Acceptance Rate: 40 of 80 submissions, 50%.
Overall Acceptance Rate: 147 of 318 submissions, 46%.


Cited By

  • Cluster-Based Input Weight Initialization for Echo State Networks. IEEE Transactions on Neural Networks and Learning Systems, 34(10): 7648-7659, October 2023. DOI: 10.1109/TNNLS.2022.3145565
  • Non-Standard Echo State Networks for Video Door State Monitoring. 2023 International Joint Conference on Neural Networks (IJCNN), pages 1-8, June 2023. DOI: 10.1109/IJCNN54540.2023.10191096
  • Exploring unsupervised pre-training for echo state networks. Neural Computing and Applications, 35(34): 24225-24242, September 2023. DOI: 10.1007/s00521-023-08988-x
  • An experimental analysis of the Echo State Network initialization using the Particle Swarm Optimization. 2014 Sixth World Congress on Nature and Biologically Inspired Computing (NaBIC 2014), pages 214-219, July 2014. DOI: 10.1109/NaBIC.2014.6921880
  • Bagging Technique Using Temporal Expansion Functions. Proceedings of the Fifth International Conference on Innovations in Bio-Inspired Computing and Applications (IBICA 2014), pages 395-404, 2014. DOI: 10.1007/978-3-319-08156-4_39
  • Solar Irradiance Estimation Using the Echo State Network and the Flexible Neural Tree. Intelligent Data Analysis and its Applications, Volume I, pages 475-484, 2014. DOI: 10.1007/978-3-319-07776-5_49
