Corpus ID: 231839801

Nonlinear Independent Component Analysis for Continuous-Time Signals

@article{Oberhauser2021NonlinearIC,
  title={Nonlinear Independent Component Analysis for Continuous-Time Signals},
  author={Harald Oberhauser and Alexander Schell},
  journal={ArXiv},
  year={2021},
  volume={abs/2102.02876}
}
We study the classical problem of recovering a multidimensional source process from observations of nonlinear mixtures of this process. Assuming statistical independence of the coordinate processes of the source, we show that this recovery is possible for many popular models of stochastic processes (up to order and monotone scaling of their coordinates) if the mixture is given by a sufficiently differentiable, invertible function. Key to our approach is the combination of tools from stochastic… 
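The recovery guarantee in the abstract is stated "up to order and monotone scaling of the coordinates". A minimal sketch (plain NumPy on toy discrete samples, not part of the paper) of why this indeterminacy class is unavoidable: permuting the coordinates and applying componentwise monotone maps preserves their statistical independence, so no method that relies on independence alone can undo these transformations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent toy sources (discrete-time stand-ins for the
# coordinate processes of the source; all data here is made up).
s1 = rng.normal(size=10_000)
s2 = rng.exponential(size=10_000)

# Reorder the coordinates and apply componentwise monotone maps.
# Functions of independent variables stay independent, so this
# ambiguity cannot be resolved from the mixture alone.
t1 = np.tanh(s2)   # monotone reparametrisation of the (reordered) second source
t2 = s1 ** 3       # monotone reparametrisation of the first source

# The transformed coordinates remain (statistically) independent:
print(abs(np.corrcoef(t1, t2)[0, 1]))  # close to 0
```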


Disentangling Identifiable Features from Noisy Data with Structured Nonlinear ICA
TLDR: A new general identifiable framework for principled disentanglement, referred to as Structured Nonlinear Independent Component Analysis (SNICA), is introduced, and it is established that identifiability for this framework holds even in the presence of noise of unknown distribution.
Path classification by stochastic linear recurrent neural networks
TLDR: It is argued that these RNNs, modelled in a simplified setting as continuous-time stochastic recurrent neural networks with the identity activation function, are easy to train and robust, and a trade-off phenomenon between accuracy and robustness is shown.
Proper Scoring Rules, Gradients, Divergences, and Entropies for Paths and Time Series
TLDR: This work combines the statistical framework of proper scoring rules with classical mathematical results to derive a principled approach to forecasting and decision making for paths and time series, yielding gradients, entropies, and divergences that are tailor-made to respect the underlying non-Euclidean structure.

References

SHOWING 1-10 OF 53 REFERENCES
Nonlinear ICA of Temporally Dependent Stationary Sources
TLDR: It is proved that the method estimates the sources for general smooth mixing nonlinearities, assuming the sources have sufficiently strong temporal dependencies and that these dependencies differ in a certain way from those found in Gaussian processes.
Fast and robust fixed-point algorithms for independent component analysis
  • A. Hyvärinen
  • Computer Science, Mathematics
    IEEE Trans. Neural Networks
  • 1999
TLDR: Using maximum entropy approximations of differential entropy, a family of new contrast (objective) functions for ICA is derived; these enable both the estimation of the whole decomposition by minimizing mutual information and the estimation of individual independent components as projection pursuit directions.
Unsupervised Feature Extraction by Time-Contrastive Learning and Nonlinear ICA
TLDR: This work proposes a new intuitive principle of unsupervised deep learning from time series which uses the nonstationary structure of the data, and shows how TCL can be related to a nonlinear ICA model when ICA is redefined to include temporal nonstationarities.
Kernel independent component analysis
  • F. Bach, Michael I. Jordan
  • Computer Science
    2003 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2003. Proceedings. (ICASSP '03).
  • 2003
TLDR: A class of algorithms for independent component analysis which use contrast functions based on canonical correlations in a reproducing kernel Hilbert space is presented, showing that these algorithms outperform many of the presently known algorithms.
Independent component analysis via nonparametric maximum likelihood estimation
Independent Component Analysis (ICA) models are very popular semiparametric models in which we observe independent copies of a random vector $X = AS$, where $A$ is a non-singular matrix and $S$ has independent components.
Nonlinear ICA Using Auxiliary Variables and Generalized Contrastive Learning
TLDR: This work provides a comprehensive proof of the identifiability of the model as well as the consistency of the estimation method, and proposes to learn nonlinear ICA by discriminating between true augmented data and data in which the auxiliary variable has been randomized.
Maximum likelihood for blind separation and deconvolution of noisy signals using mixture models
TLDR: An approximate maximum likelihood method for blind source separation and deconvolution of noisy signals is proposed, which is able to capture some salient features of the input signal distribution and generally performs much better than third-order or fourth-order cumulant-based techniques.
MISEP -- Linear and Nonlinear ICA Based on Mutual Information
  • L. Almeida
  • Computer Science
    J. Mach. Learn. Res.
  • 2003
TLDR: MISEP is an ICA technique for linear and nonlinear mixtures based on minimizing the mutual information of the estimated components; it optimizes a network with a specialized architecture using a single objective function, the output entropy.
Independent component analysis: recent advances
  • A. Hyvärinen
  • Computer Science
    Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
  • 2013
TLDR: An overview of some recent developments in the theory of independent component analysis is provided, including analysis of causal relations, testing independent components, analysing multiple datasets (three-way data), modelling dependencies between the components, and improved methods for estimating the basic model.
Independent component analysis, a new concept?
  • P. Comon
  • Computer Science
    Signal Process.
  • 1994
...