Regularization and stability in reservoir networks with output feedback

Abstract

Output feedback is crucial for autonomous and parameterized pattern generation with reservoir networks. Read-out learning affects the output feedback loop and can lead to error amplification. Regularization is therefore important for both generalization and reduction of error amplification. We show that regularization of the reservoir and the read-out…
DOI: 10.1016/j.neucom.2012.01.032
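
The abstract refers to read-out learning and regularization in a reservoir network whose output is fed back into the reservoir. Since only the truncated abstract is available here, the following is a minimal sketch of the standard recipe this describes, not the authors' specific method: an echo state network trained by teacher forcing, with a ridge-regularized (Tikhonov) read-out and output feedback closing the loop during autonomous generation. All dimensions, parameter values, and the sine-wave target are illustrative assumptions.

# Minimal sketch (assumed, not the paper's implementation): echo state network
# with output feedback and a ridge-regularized read-out.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions and hyper-parameters (assumptions)
n_res = 200            # reservoir size
spectral_radius = 0.9  # scaling of the recurrent weight matrix
fb_scale = 0.5         # output-feedback strength
ridge = 1e-4           # read-out regularization strength

# Random reservoir and feedback weights; reservoir rescaled to the desired spectral radius
W = rng.standard_normal((n_res, n_res))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
W_fb = fb_scale * rng.uniform(-1, 1, size=(n_res, 1))

# Target signal: a simple periodic pattern the network should regenerate autonomously
T = 2000
y_target = np.sin(2 * np.pi * np.arange(T) / 50).reshape(-1, 1)

# Teacher forcing: drive the reservoir with the desired output during training
states = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(1, T):
    x = np.tanh(W @ x + W_fb @ y_target[t - 1])
    states[t] = x

# Ridge-regularized read-out, fit after discarding an initial washout period
washout = 200
X, Y = states[washout:], y_target[washout:]
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)

# Autonomous generation: the network's own output now closes the feedback loop
x = states[-1].copy()
y = Y[-1].copy()
free_run = []
for _ in range(200):
    x = np.tanh(W @ x + W_fb @ y)
    y = x @ W_out
    free_run.append(y.item())

print("first autonomous outputs:", np.round(free_run[:5], 3))

In this setting, the ridge parameter plays the dual role the abstract points to: it controls generalization of the read-out and, because the read-out output is fed back into the reservoir, it also limits how strongly read-out errors are amplified around the feedback loop.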
