Showing results 1-6 of 6
The deep kernelized autoencoder
(Journal article; Peer reviewed, 2018-07-18)
Autoencoders learn data representations (codes) in such a way that the input is reproduced at the output of the network. However, it is not always clear what kind of properties of the input data need to be captured by the codes. Kernel machines have experienced great success by operating via inner-products in a theoretically well-defined reproducing kernel Hilbert space, hence capturing topological ...
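The reconstruction principle described in this abstract can be sketched minimally as follows. This is an illustrative toy example, not the kernelized method from the paper: a one-layer linear autoencoder trained by gradient descent on the mean-squared reconstruction loss (all names, dimensions, and learning-rate values are assumptions chosen for the sketch).

```python
# Minimal sketch of the autoencoder idea: learn a low-dimensional code
# such that the input is reproduced at the output of the network.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))        # toy data: 100 samples, 8 features

d_code = 3                           # bottleneck ("code") dimension
W_enc = rng.normal(scale=0.1, size=(8, d_code))   # encoder weights
W_dec = rng.normal(scale=0.1, size=(d_code, 8))   # decoder weights

lr = 0.01
losses = []
for _ in range(200):
    code = X @ W_enc                 # encode input into the code space
    X_hat = code @ W_dec             # decode the code back to input space
    err = X_hat - X                  # reconstruction error
    losses.append(float(np.mean(err ** 2)))
    # gradients of the mean-squared reconstruction loss
    g_dec = code.T @ err / len(X)
    g_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

print(losses[0] > losses[-1])        # reconstruction loss decreases
```

The deep kernelized variant in the paper replaces this plain reconstruction objective with an additional loss that aligns the codes' inner products with a user-specified kernel matrix; the sketch above only shows the baseline reconstruction behaviour that both share.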
Multiplex visibility graphs to investigate recurrent neural network dynamics
(Journal article; Peer reviewed, 2017-03-10)
A recurrent neural network (RNN) is a universal approximator of dynamical systems, whose performance often depends on sensitive hyperparameters. Tuning them properly may be difficult and is typically done by trial and error. In this work, we adopt a graph-based framework to interpret and characterize internal dynamics of a class of RNNs called echo state networks (ESNs). We design principled ...
Deep divergence-based approach to clustering
(Journal article; Peer reviewed, 2019-02-08)
A promising direction in deep learning research consists in learning representations and simultaneously discovering cluster structure in unlabeled data by optimizing a discriminative loss function. As opposed to supervised deep learning, this line of research is in its infancy, and how to design and optimize suitable loss functions to train deep neural networks for clustering is still an open question. ...
Learning representations of multivariate time series with missing data
(Journal article; Peer reviewed, 2019-07-19)
Learning compressed representations of multivariate time series (MTS) facilitates data analysis in the presence of noise and redundant information, and for a large number of variates and time steps. However, classical dimensionality reduction approaches are designed for vectorial data and cannot deal explicitly with missing values. In this work, we propose a novel autoencoder architecture based on ...
Deep kernelized autoencoders
(Book chapter; Peer reviewed, 2017-05-19)
In this paper we introduce the deep kernelized autoencoder, a neural network model that allows an explicit approximation of (i) the mapping from an input space to an arbitrary, user-specified kernel space and (ii) the back-projection from such a kernel space to input space. The proposed method is based on traditional autoencoders and is trained through a new unsupervised loss function. ...
Critical echo state network dynamics by means of Fisher information maximization
(Book chapter, 2017-07-03)
The computational capability of an Echo State Network (ESN), expressed in terms of low prediction error and high short-term memory capacity, is maximized on the so-called “edge of criticality”. In this paper we present a novel, unsupervised approach to identify this edge and, accordingly, we determine the hyperparameter configuration that maximizes network performance. The proposed method is ...