Training Echo State Networks with Regularization Through Dimensionality Reduction
Permanent link: https://hdl.handle.net/10037/13086
Date: 2017
Type: Journal article (peer reviewed)
Abstract
In this paper, we introduce a new framework to train a class of recurrent neural networks, called Echo State Networks, to predict real-valued time series and to provide a visualization of the modeled system dynamics. The method consists of projecting the output of the internal layer of the network onto a lower-dimensional space before training the output layer to learn the target task. Notably, we enforce a regularization constraint that leads to better generalization capabilities. We evaluate the performance of our approach on several benchmark tests, using different techniques to train the readout of the network, and achieve superior predictive performance with the proposed framework. Finally, we provide insight into the effectiveness of the implemented mechanism through a visualization of the trajectory in phase space, relying on methodologies from nonlinear time-series analysis. By applying our method to well-known chaotic systems, we provide evidence that the lower-dimensional embedding retains the dynamical properties of the underlying system better than the full-dimensional internal states of the network.
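To illustrate the kind of pipeline the abstract describes, the sketch below collects the reservoir (internal-layer) states of a simple Echo State Network, projects them onto a lower-dimensional space, and only then trains a linear readout. It is a minimal sketch under assumptions: PCA and ridge regression are used as illustrative choices for the projection and the readout training, and all sizes and hyperparameters (reservoir size, spectral radius, reduced dimension d, regularization lam) are placeholders, not the configuration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Echo State Network reservoir (illustrative sizes) ------------------
n_in, n_res = 1, 300
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # rescale spectral radius below 1

def run_reservoir(u, washout=100):
    """Drive the reservoir with input sequence u and return its states after a washout."""
    x = np.zeros(n_res)
    states = []
    for t in range(len(u)):
        x = np.tanh(W_in @ u[t:t + 1] + W @ x)
        states.append(x.copy())
    return np.array(states)[washout:], washout

# --- Example task: one-step-ahead prediction of a noisy sinusoid --------
T = 2000
u = np.sin(0.2 * np.arange(T)) + 0.1 * rng.standard_normal(T)
y = u[1:]                                  # target: next value of the series
X, washout = run_reservoir(u[:-1])
y = y[washout:]

# --- Project reservoir states onto a lower-dimensional space (PCA via SVD)
d = 20                                     # assumed reduced dimension
X_mean = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - X_mean, full_matrices=False)
P = Vt[:d].T                               # projection onto the first d principal components
Z = (X - X_mean) @ P                       # low-dimensional embedding of the states

# --- Train the readout on the reduced states (ridge regression) ---------
lam = 1e-6
W_out = np.linalg.solve(Z.T @ Z + lam * np.eye(d), Z.T @ y)

pred = Z @ W_out
print("train MSE:", np.mean((pred - y) ** 2))
```

The key design point reflected here is that the readout never sees the full reservoir state: it is fit on the reduced embedding Z, which acts as an implicit regularizer and is also what can be visualized in phase space.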
Description
This is a pre-print of an article published in Cognitive Computation. The final authenticated version is available online at: https://doi.org/10.1007/s12559-017-9450-z.