Latent common manifold learning with alternating diffusion: analysis and applications

@article{Talmon2019LatentCM,
  title   = {Latent common manifold learning with alternating diffusion: analysis and applications},
  author  = {Ronen Talmon and Hau-Tieng Wu},
  journal = {arXiv preprint arXiv:1602.00078},
  year    = {2019}
}
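To make the construction in the title concrete: alternating diffusion composes the diffusion operators built from each sensor separately, so that the composite random walk averages out sensor-specific variability while retaining the common latent variable. A minimal numerical sketch (toy data and arbitrary bandwidths of my own choosing, not the paper's implementation):

```python
import numpy as np

def diffusion_operator(X, eps):
    """Row-stochastic diffusion operator from a Gaussian kernel on samples X."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    K = np.exp(-d2 / eps)
    return K / K.sum(axis=1, keepdims=True)              # row-normalize

rng = np.random.default_rng(0)
common = rng.uniform(0, 2 * np.pi, 200)              # latent common variable
n1, n2 = rng.normal(size=200), rng.normal(size=200)  # sensor-specific nuisances

# Two sensors observe the common variable through different nuisance channels.
s1 = np.stack([np.cos(common), np.sin(common), n1], axis=1)
s2 = np.stack([np.cos(common), np.sin(common), n2], axis=1)

# Alternating diffusion: compose the two single-sensor diffusion operators.
A = diffusion_operator(s2, 1.0) @ diffusion_operator(s1, 1.0)

# A is again row-stochastic, so it defines a Markov chain on the samples;
# its leading non-trivial eigenvectors parametrize the common variable.
assert np.allclose(A.sum(axis=1), 1.0)
```

The key point is that the product of two row-stochastic matrices is row-stochastic, so the composite operator is itself a diffusion operator and can be analyzed with the usual spectral machinery.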

Citations

Spectral Flow on the Manifold of SPD Matrices for Multimodal Data Processing

TLDR
This paper focuses on a scenario in which the measurements share mutual sources of variability but might also be contaminated by other measurement-specific sources such as interferences or noise, and combines manifold learning with the well-known Riemannian geometry of symmetric and positive-definite (SPD) matrices.
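The "Riemannian geometry of SPD matrices" invoked here usually refers to the affine-invariant metric, whose geodesic distance has a closed form; a small sketch of that distance (my own illustration under the affine-invariant assumption, not the paper's code):

```python
import numpy as np

def spd_sqrt_inv(P):
    """Inverse square root of an SPD matrix via its eigendecomposition."""
    w, V = np.linalg.eigh(P)
    return (V / np.sqrt(w)) @ V.T   # V diag(w^{-1/2}) V^T

def spd_distance(P1, P2):
    """Affine-invariant Riemannian distance:
    d(P1, P2) = || log(P1^{-1/2} P2 P1^{-1/2}) ||_F."""
    S = spd_sqrt_inv(P1)
    w = np.linalg.eigvalsh(S @ P2 @ S)       # eigenvalues of the whitened matrix
    return np.sqrt((np.log(w) ** 2).sum())

# Distance between I and 2I in two dimensions is sqrt(2) * log(2).
d = spd_distance(np.eye(2), 2 * np.eye(2))
```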

Alternating diffusion maps for multimodal data fusion

Recovering Hidden Components in Multimodal Data with Composite Diffusion Operators

TLDR
This paper proposes two new diffusion operators that make it possible to isolate, enhance, and attenuate the hidden components of multimodal data in a data-driven manner; these components characterize the common structures and the differences between the manifolds underlying the different modalities.

Spatiotemporal Analysis Using Riemannian Composition of Diffusion Operators

Multivariate time-series have become abundant in recent years, as many data-acquisition systems record information through multiple sensors simultaneously. In this paper, we assume the variables ...

Joint Geometric and Topological Analysis of Hierarchical Datasets

TLDR
This paper focuses on high-dimensional data that are organized into several hierarchical datasets and presents a method for building an informative representation of hierarchical datasets using topological data analysis (TDA) and geometric manifold learning.

Optimal recovery of precision matrix for Mahalanobis distance from high-dimensional noisy observations in manifold learning

TLDR
This work demonstrates the sensitivity of Mahalanobis distance and the associated precision matrix to measurement noise, and proposes an asymptotically optimal shrinker, which is shown to be beneficial over the classical implementation of the MD, both analytically and in simulations.

Nonlinear Filtering With Variable Bandwidth Exponential Kernels

TLDR
This work focuses on representation learning approaches that consider the measurements as the nodes of a weighted graph, with edge weights computed by a given kernel, to learn representations that accurately parametrize the phenomenon of interest, while reducing variations due to other sources of variability.
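A common way to realize such a graph kernel with a variable bandwidth is the self-tuning construction, in which each point's bandwidth is its distance to its k-th nearest neighbour; the sketch below assumes that construction (and an arbitrary k = 7), which is not necessarily the paper's exact kernel:

```python
import numpy as np

def variable_bandwidth_kernel(X, k=7):
    """Gaussian kernel with per-point bandwidths (self-tuning style):
    W_ij = exp(-||x_i - x_j||^2 / (eps_i * eps_j)),
    where eps_i is the distance from x_i to its k-th nearest neighbour."""
    d = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    eps = np.sort(d, axis=1)[:, k]   # column 0 is the zero self-distance
    return np.exp(-d ** 2 / np.outer(eps, eps))

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
W = variable_bandwidth_kernel(X)

# The kernel stays symmetric, and each point is fully similar to itself.
assert np.allclose(W, W.T) and np.allclose(np.diag(W), 1.0)
```

Dividing by eps_i * eps_j rather than a single global bandwidth lets dense and sparse regions of the graph be weighted comparably.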

Diffusion-based nonlinear filtering for multimodal data fusion with application to sleep stage assessment

TLDR
This paper proposes a nonlinear filtering scheme, which extracts the hidden sources of variability captured by two or more sensors that are independent of the sensor-specific components and demonstrates the technique on real measured data for the purpose of sleep stage assessment based on multiple, multimodal sensor measurements.

Time Coupled Diffusion Maps

References

Showing 1-10 of 66 references

Learning Coupled Embedding Using MultiView Diffusion Maps

TLDR
This study defines a cross-view model, in which an implied Random Walk process between objects is restrained to hop between the different views, and defines new diffusion distances and analyzes the spectra of the implied kernels.
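One way to restrain a random walk so that it hops between views is a block kernel whose diagonal blocks are zero; the following toy construction is my own illustration of the idea, not the paper's actual model:

```python
import numpy as np

def view_kernel(X, eps=1.0):
    """Gaussian affinity matrix within one view."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / eps)

rng = np.random.default_rng(2)
X1 = rng.normal(size=(30, 2))   # view 1 of 30 objects
X2 = rng.normal(size=(30, 2))   # view 2 of the same objects
K1, K2 = view_kernel(X1), view_kernel(X2)

# Zero diagonal blocks force every step of the walk to switch views.
n = K1.shape[0]
K = np.block([[np.zeros((n, n)), K1 @ K2],
              [K2 @ K1, np.zeros((n, n))]])
P = K / K.sum(axis=1, keepdims=True)   # row-stochastic cross-view walk
```

Because a walker can never stay inside one view, the spectrum of P mixes information from both kernels, which is what the cross-view diffusion distances are built on.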

Multimodal diffusion geometry by joint diagonalization of Laplacians

TLDR
An extension of diffusion geometry to multiple modalities through joint approximate diagonalization of Laplacian matrices is constructed, demonstrating that the joint diffusion geometry frequently better captures the inherent structure of multi-modal data.

MultiView Diffusion Maps


Empirical Intrinsic Modeling of Signals and Information Geometry

TLDR
This paper proposes a graph-based method for revealing the low-dimensional manifold and inferring the underlying process and shows that the learned intrinsic nonlinear model is invariant under different observation and instrumental modalities and is noise resilient.

Joint Manifolds for Data Fusion

TLDR
It is shown that joint manifold structure can lead to improved performance for a variety of signal processing algorithms for applications including classification and manifold learning and to formulate a scalable and universal dimensionality reduction scheme that efficiently fuses the data from all sensors.

Alternating diffusion for common manifold learning with application to sleep stage assessment

TLDR
It is demonstrated that, indeed, through alternating-diffusion, the sleep information hidden inside multimodal respiratory signals can be better captured compared to single-modal methods.

Two Manifold Problems with Applications to Nonlinear System Identification

TLDR
A class of algorithms for two-manifold problems is proposed, based on spectral decomposition of cross-covariance operators in Hilbert space, and it is demonstrated that solving a two- manifold problem can aid in learning a nonlinear dynamical system from limited data.
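In finite dimensions, the cross-covariance idea reduces to an SVD of the empirical cross-covariance matrix between the two views; a toy sketch with synthetic data (my own construction, illustrating the principle rather than the paper's algorithm):

```python
import numpy as np

rng = np.random.default_rng(4)
z = rng.normal(size=(500, 1))     # shared latent variable driving both views
X1 = z @ rng.normal(size=(1, 5)) + 0.1 * rng.normal(size=(500, 5))
X2 = z @ rng.normal(size=(1, 4)) + 0.1 * rng.normal(size=(500, 4))

# Empirical cross-covariance between the centered views.
C12 = (X1 - X1.mean(0)).T @ (X2 - X2.mean(0)) / len(z)
U, s, Vt = np.linalg.svd(C12)

# One latent drives both views, so the first singular value dominates:
# its singular vectors pick out the directions correlated across views.
assert s[0] > 3 * s[1]
```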

Laplacian Eigenmaps for Dimensionality Reduction and Data Representation

TLDR
This work proposes a geometrically motivated algorithm for representing the high-dimensional data that provides a computationally efficient approach to nonlinear dimensionality reduction that has locality-preserving properties and a natural connection to clustering.
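The algorithm is compact enough to sketch end-to-end; this simplified version uses a dense heat kernel instead of a k-NN graph and solves the generalized eigenproblem L f = lambda D f through the symmetric normalized Laplacian (bandwidth and toy data are arbitrary choices):

```python
import numpy as np

def laplacian_eigenmaps(X, dim=2, eps=1.0):
    """Simplified Laplacian Eigenmaps: dense heat-kernel weights,
    embedding from the bottom generalized eigenvectors of (L, D)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / eps)
    deg = W.sum(axis=1)
    L = np.diag(deg) - W                       # graph Laplacian
    # Solve L f = lam * D f via the symmetric normalized Laplacian.
    Dih = np.diag(1.0 / np.sqrt(deg))
    lam, V = np.linalg.eigh(Dih @ L @ Dih)     # eigenvalues in ascending order
    F = Dih @ V                                # generalized eigenvectors of (L, D)
    return F[:, 1:dim + 1]                     # skip the trivial constant vector

rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0, 2 * np.pi, 80))
circle = np.stack([np.cos(t), np.sin(t)], axis=1)
Y = laplacian_eigenmaps(circle, dim=2)         # 2-D embedding of 80 points
```

Skipping the first eigenvector is essential: for a connected graph it is constant and carries no geometric information, which is the locality-preserving property the summary refers to.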

Orientability and Diffusion Maps
