Echo State Networks and Reservoir Computing

History and impact

Echo state networks (ESNs) provide an architecture and supervised learning principle for recurrent neural networks. The main idea is (i) to drive a random, large, fixed recurrent neural network with the input signal, thereby inducing in each neuron within this “reservoir” network a nonlinear response signal, and (ii) to combine a desired output signal from all of these response signals by a trainable linear combination.
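
Only step (ii) involves learning: the reservoir weights stay fixed, and the output weights are typically obtained by linear (ridge) regression on the recorded reservoir states. The following is a minimal sketch of this principle in Python/NumPy; all names, sizes, and weight scalings are our own illustrative choices, not prescriptions from the literature.

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_res = 1, 300

    # (i) A large, random, FIXED recurrent reservoir driven by the input.
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))   # input weights, never trained
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))     # recurrent weights, never trained
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # spectral radius < 1, a common
                                                   # recipe for the echo state property

    def reservoir_responses(inputs):
        """Collect each neuron's nonlinear response signal to the input sequence."""
        x = np.zeros(n_res)
        states = []
        for u in inputs:
            x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
            states.append(x.copy())
        return np.asarray(states)

    # (ii) Only the linear readout is trained, e.g. by ridge regression,
    # where X stacks the reservoir states and y is the desired output signal.
    def train_readout(X, y, lam=1e-6):
        return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

Scaling the recurrent weights to a spectral radius below 1 is the customary heuristic for obtaining the echo state property; the practical guide by Lukoševičius (2012) listed below gives detailed tuning advice.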

This basic idea had been proposed several times independently in the neurosciences and AI in the early and mid-1990s (brief overview in our Scholarpedia article). However, the idea realized its practical potential only after it was recast in a machine learning context by Herbert Jaeger around the year 2000. A Science article (Jaeger & Haas, 2004) demonstrated that chaotic time series prediction tasks could be learned by ESNs with up to five orders of magnitude higher precision than with any previous method, and that a nonlinear satellite communication benchmark task was solved with a signal-to-noise ratio two orders of magnitude better than with previous methods.

Figure: Predictions made by ESNs for some chaos benchmarks. Prediction starts at time 0. Green: analytical solution; blue: ESN prediction. Prediction accuracy per time step is on the order of machine precision.


Simultaneously with ESNs, Liquid State Machines (LSMs) were introduced in computational neuroscience by Wolfgang Maass around 2000. Today, ESNs, LSMs, and some spinoff approaches are often collectively referred to as Reservoir Computing (RC). RC approaches have become widely adopted

  • in (computational) neuroscience, as a model for a number of processing circuits in biological brains,
  • in machine learning, as a computationally cheap yet competitive alternative to Deep Learning methods in signal processing and control applications,
  • in machine learning education, as an easy-to-program architecture for demonstrating principles of dynamic systems modeling,
  • in unconventional microchip research (optical, analog neuromorphic, nano-mechanical, and chemical devices).

The MINDS group is active in all of these fields.

Core references

Patent Note

The Fraunhofer Institute for Intelligent Analysis and Information Systems (IAIS) holds international patents for the ESN method.

Starter Papers

  • A Scholarpedia article for a first impression.
  • A comprehensive, introductory set of slides from the early years of ESN research (created around 2004), still useful for a quick visual impression of what this is all about
  • Highlight paper: H. Jaeger and H. Haas (2004), Harnessing Nonlinearity: Predicting Chaotic Systems and Saving Energy in Wireless Communication. Science 304, 2 April, 78-80 (preprint pdf, Matlab code)
  • Extensive survey: M. Lukoševičius and H. Jaeger (2009), Reservoir Computing Approaches to Recurrent Neural Network Training. Computer Science Review 3(3), 127-149 (preprint pdf)
  • Hands-on guidelines for practical implementations: M. Lukoševičius (2012): A Practical Guide to Applying Echo State Networks. In: G. Montavon, G. B. Orr, and K.-R. Müller (eds.) Neural Networks: Tricks of the Trade, 2nd ed. Springer LNCS 7700, 659-686 (preprint pdf)

Tutorial Code
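
As a starting point, here is a self-contained Python/NumPy sketch (our own illustration, not the original Matlab tutorial code) of the free-running prediction mode behind the chaos benchmarks above: the ESN is first trained as a one-step predictor with teacher forcing, then runs autonomously on its own output. The toy signal and all hyperparameters are arbitrary choices for demonstration.

    import numpy as np

    rng = np.random.default_rng(1)
    n_res = 200
    W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= 0.8 / max(abs(np.linalg.eigvals(W)))      # spectral radius scaling

    # Teacher signal: a smooth quasi-periodic wave (a stand-in for a chaotic series).
    t = np.arange(3000)
    s = np.sin(0.2 * t) * np.cos(0.031 * t)

    # Teacher forcing: drive with s[t], target s[t+1]; collect states after washout.
    x, states = np.zeros(n_res), []
    for u in s[:-1]:
        x = np.tanh(W @ x + W_in[:, 0] * u)
        states.append(x.copy())
    X, Y = np.array(states)[200:], s[201:]
    W_out = np.linalg.solve(X.T @ X + 1e-8 * np.eye(n_res), X.T @ Y)

    # Free run: feed the ESN's own prediction back as the next input.
    y, preds = s[-1], []
    for _ in range(100):
        x = np.tanh(W @ x + W_in[:, 0] * y)
        y = W_out @ x
        preds.append(y)
    print("first free-run predictions:", np.round(preds[:5], 3))

For genuinely chaotic systems such as Mackey-Glass, further refinements described in the papers above are needed; the survey and the practical guide in the Starter Papers list are the recommended entry points.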