Colloquium Artificial Intelligence - Prof. Dr. H. Jaeger (University of Groningen)
When: Tu 10-09-2019, 15:00 - 16:00
Where: 5161.0289 (Bernoulliborg)
Title: An Introduction to Reservoir Computing, and a Glimpse Beyond
Abstract:
Recurrent neural networks (RNNs) are general approximators for nonlinear dynamical systems and have recently become widely used in the “deep learning” field of machine learning, especially for speech and language processing tasks.
However, deep-learning set-ups for RNN training are computationally expensive, require very large volumes of training data, and need high-precision numerical processing. For these reasons, deep-learning variants of RNNs are problematic in fields where training data are scarce, where fast and cheap algorithms are desired, or where noisy or low-precision hardware is to be used. This is often the case in nonlinear signal processing, control, brain-machine interfacing, and biomedical signal processing.
Reservoir Computing (RC) is an alternative machine learning approach for RNNs which is in many respects complementary to deep learning. In RC, a large, random, possibly low-precision and noisy RNN is used as a nonlinear excitable medium - called the "reservoir" - which is driven by an input signal. The reservoir itself is not adapted or trained. Instead, only a "readout" mechanism is trained, which assembles the desired output signal from the large variety of random, excited signals within the reservoir. This readout training is cheap - typically just a linear regression. RC has become a popular approach in research that aims at useful computations on the basis of unconventional hardware (nondigital, noisy, low-precision).
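To make the training scheme concrete, the following is a minimal sketch of an echo-state-network style RC model in Python/NumPy: a fixed random reservoir is driven by an input, and only a linear (ridge-regression) readout is fitted. All dimensions, scaling constants, and the sine-wave prediction task are illustrative assumptions, not the speaker's specific setup.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the talk)
n_in, n_res, n_out = 1, 200, 1
washout, T = 100, 1000

# Fixed random reservoir: input and recurrent weights are never trained
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

# Toy task: one-step-ahead prediction of a sine wave (placeholder data)
u = np.sin(0.2 * np.arange(T + 1)).reshape(-1, 1)
inputs, targets = u[:-1], u[1:]

# Drive the reservoir with the input and collect its excited states
x = np.zeros(n_res)
states = np.zeros((T, n_res))
for t in range(T):
    x = np.tanh(W_in @ inputs[t] + W @ x)
    states[t] = x

# Train only the readout: ridge regression from states to targets,
# discarding an initial washout period
X, Y = states[washout:], targets[washout:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y).T

# Report the one-step prediction error on the training data
pred = states @ W_out.T
print("train RMSE:", np.sqrt(np.mean((pred[washout:] - Y) ** 2)))

The key design point this illustrates is that all the expensive, gradient-based adaptation of recurrent weights is skipped; learning reduces to a single linear solve over the collected reservoir states.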
The talk gives a quick introduction to the basic principles of RC and then turns to recent extensions of the paradigm.