Corpus ID: 14543766

Autoencoding time series for visualisation

@article{Gianniotis2015AutoencodingTS,
  title={Autoencoding time series for visualisation},
  author={Nikolaos Gianniotis and Sven Dennis K{\"u}gler and Peter Tiňo and Kai Lars Polsterer and Ranjeev Misra},
  journal={ArXiv},
  year={2015},
  volume={abs/1505.00936}
}
We present an algorithm for the visualisation of time series. To that end we employ echo state networks to convert time series into a suitable vector representation which is capable of capturing the latent dynamics of the time series. Subsequently, the obtained vector representations are put through an autoencoder and the visualisation is constructed using the activations of the bottleneck. The crux of the work lies with defining an objective function that quantifies the reconstruction error of… 


Model-coupled autoencoder for time series visualisation
Time Series Classification in Reservoir- and Model-Space: A Comparison
TLDR: This work presents a systematic comparison of time series classification in the model space and the classical, discriminative approach with ESNs, evaluated on 43 univariate and 18 multivariate time series.
An explorative approach for inspecting Kepler data
The Kepler survey has provided a wealth of astrophysical knowledge by continuously monitoring over 150,000 stars. The resulting database contains thousands of examples of known variability types and…
Machine Learning in Astronomy: a practical overview
TLDR: This document summarizes the topics of supervised and unsupervised learning algorithms presented during the IAC Winter School 2018, and provides practical information on the application of such tools to astronomical datasets.
A Framework for Automated Collaborative Fault Detection in Large-Scale Vehicle Networks
TLDR: This research presents a novel framework for automated fault detection in cyber-physical systems, with specific focus on large-scale vehicle networks, and applies it to a detailed vehicle cooling system model to demonstrate its efficacy.

References

Showing 1–10 of 11 references
Minimum Complexity Echo State Network
TLDR: It is shown that a simple deterministically constructed cycle reservoir is comparable to the standard echo state network methodology, and that the (short-term) memory capacity of linear cyclic reservoirs can be made arbitrarily close to the proved optimal value.
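The cycle reservoir summarised above needs no random initialisation at all: the units form a ring and every connection shares a single weight. A minimal sketch (the size and weight value here are arbitrary examples):

```python
import numpy as np

def cycle_reservoir(n, r=0.5):
    """Simple Cycle Reservoir (SCR): n units connected in a ring, every
    connection carrying the same weight r, so the reservoir matrix is
    fully deterministic."""
    W = np.zeros((n, n))
    for i in range(n):
        W[(i + 1) % n, i] = r       # unit i feeds unit i+1
    return W

W = cycle_reservoir(5, r=0.5)
# For a uniform-weight cycle the spectral radius is exactly |r|,
# so stability is controlled by the single parameter r.
print(np.max(np.abs(np.linalg.eigvals(W))))   # 0.5
```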
Model-based kernel for efficient time series analysis
TLDR: Novel, efficient, model-based kernels for time series data rooted in the reservoir computation framework are presented, and it is shown how the model distances used in the kernel can be calculated analytically or efficiently estimated.
Nonlinear principal component analysis using autoassociative neural networks
TLDR: The NLPCA method is demonstrated using time-dependent, simulated batch reaction data, and it successfully reduces dimensionality and produces a feature-space map resembling the actual distribution of the underlying system parameters.
Visualizing Data using t-SNE
TLDR: A new technique called t-SNE is presented that visualizes high-dimensional data by giving each datapoint a location in a two- or three-dimensional map; it is a variation of Stochastic Neighbor Embedding that is much easier to optimize and produces significantly better visualizations by reducing the tendency to crowd points together in the center of the map.
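The crowding reduction mentioned in this summary comes from t-SNE's use of a heavy-tailed Student-t kernel for similarities in the low-dimensional map. A small sketch of just that ingredient (the map points are made-up examples):

```python
import numpy as np

def tsne_similarities(Y):
    """Low-dimensional similarities q_ij used by t-SNE: a Student-t kernel
    with one degree of freedom, normalised over all pairs. Its heavy tails
    let moderately dissimilar points sit far apart in the map, which is
    what alleviates crowding relative to plain SNE's Gaussian."""
    D = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    num = 1.0 / (1.0 + D)           # (1 + ||y_i - y_j||^2)^-1
    np.fill_diagonal(num, 0.0)      # no self-similarity
    return num / num.sum()

Y = np.array([[0.0, 0.0], [1.0, 0.0], [10.0, 0.0]])
Q = tsne_similarities(Y)
# nearby map points receive far more similarity mass than distant ones
print(Q[0, 1] > Q[0, 2])            # True
```

The full algorithm then minimises the KL divergence between these q_ij and Gaussian-based high-dimensional affinities p_ij by gradient descent.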
ON MACHINE-LEARNED CLASSIFICATION OF VARIABLE STARS WITH SPARSE AND NOISY TIME-SERIES DATA
TLDR: A methodology for variable-star classification is presented, drawing from modern machine-learning techniques, which is effective for identifying samples of specific science classes; it constitutes the first astronomical use of hierarchical classification methods to incorporate a known class taxonomy in the classifier.
The "echo state" approach to analysing and training recurrent neural networks
The report introduces a constructive learning algorithm for recurrent neural networks which modifies only the weights to the output units in order to achieve the learning task.
Nonlinear time series analysis of the light curves from the black hole system GRS1915+105
TLDR: It is shown that nearly half of the 12 temporal states exhibit deviation from randomness and that their complex temporal behavior could be approximated by a few (three or four) coupled ordinary nonlinear differential equations.
Probability Product Kernels
TLDR: The advantages of discriminative learning algorithms and kernel machines are combined with generative modeling using a novel kernel between distributions, exploiting the properties, metrics and invariances of the generative models the authors infer from each datum.
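For Gaussian models, the probability product kernel of this reference has a closed form: with exponent rho = 1 (the expected likelihood kernel), the integral of the product of two Gaussian densities N(m1, s1^2) and N(m2, s2^2) evaluates to a Gaussian in the means, N(m1; m2, s1^2 + s2^2). A one-dimensional sketch:

```python
import numpy as np

def expected_likelihood_kernel(m1, s1, m2, s2):
    """Probability product kernel with rho = 1 between two 1-D Gaussians:
    integral of N(x; m1, s1^2) * N(x; m2, s2^2) dx = N(m1; m2, s1^2 + s2^2)."""
    var = s1 ** 2 + s2 ** 2
    return np.exp(-0.5 * (m1 - m2) ** 2 / var) / np.sqrt(2 * np.pi * var)

# identical models maximise the kernel; separating the means shrinks it
print(expected_likelihood_kernel(0, 1, 0, 1) >
      expected_likelihood_kernel(0, 1, 3, 1))   # True
```

The same idea, applied to distributions fitted per time series, is what turns a generative model into a similarity measure usable by a kernel machine.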
Exploiting Generative Models in Discriminative Classifiers
TLDR: A natural way of achieving this combination is developed by deriving kernel functions, for use in discriminative methods such as support vector machines, from generative probability models.
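The kernel derived in this reference is the Fisher kernel: each datum is mapped to the gradient of its log-likelihood under a fitted generative model, and the kernel is an inner product of these scores. A minimal 1-D Gaussian sketch (the inverse Fisher information weighting of the full kernel is omitted here for brevity):

```python
import numpy as np

def fisher_score(x, mu, sigma):
    """Fisher score of a 1-D Gaussian: gradient of log N(x; mu, sigma^2)
    with respect to the parameters (mu, sigma)."""
    d_mu = (x - mu) / sigma ** 2
    d_sigma = ((x - mu) ** 2 - sigma ** 2) / sigma ** 3
    return np.array([d_mu, d_sigma])

def fisher_kernel(x1, x2, mu=0.0, sigma=1.0):
    # Simplified Fisher kernel: plain inner product of the two scores,
    # i.e. the identity matrix stands in for the inverse Fisher information.
    return fisher_score(x1, mu, sigma) @ fisher_score(x2, mu, sigma)

# points on the same side of the mean align in score space
print(fisher_kernel(1.0, 2.0) > fisher_kernel(1.0, -2.0))   # True
```

The resulting kernel can be fed directly into a discriminative classifier such as an SVM, which is the combination the summary describes.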
Stochastic Models That Separate Fractal Dimension and the Hurst Effect
TLDR: Stochastic models that allow for any combination of fractal dimension and Hurst coefficient are introduced, and a test for self-affinity that assesses coupling and decoupling of local and global behavior is suggested.