Nonnegative Matrix Underapproximation for Robust Multiple Model Fitting

@inproceedings{Tepper2016NonnegativeMU,
  title={Nonnegative Matrix Underapproximation for Robust Multiple Model Fitting},
  author={Mariano Tepper and Guillermo Sapiro},
  booktitle={2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2017},
  pages={655-663}
}
  • Mariano Tepper, Guillermo Sapiro
  • Published 4 November 2016
  • Computer Science
  • 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
In this work, we introduce a highly efficient algorithm to address the nonnegative matrix underapproximation (NMU) problem, i.e., nonnegative matrix factorization (NMF) with an additional underapproximation constraint. NMU results are interesting as, compared to traditional NMF, they present additional sparsity and part-based behavior, explaining unique data features. To show these features in practice, we first present an application to the analysis of climate data. We then present an NMU… 
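For context, the underapproximation constraint tightens standard NMF by forcing the factorization to stay elementwise below the data. A common rank-one formulation of NMU (standard in the NMU literature; the notation here is mine and may differ from the paper's) is

\[
\min_{u \ge 0,\; v \ge 0} \; \lVert M - u v^{\top} \rVert_F^2
\quad \text{subject to} \quad u v^{\top} \le M \ \text{(elementwise)},
\]

where M is the nonnegative data matrix. Higher-rank factorizations are obtained by recursively applying this rank-one step to the residual M - u v^{\top}, which the constraint keeps nonnegative; this recursion is what induces the extra sparsity and parts-based behavior mentioned in the abstract.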

Citations

DGSAC: Density Guided Sampling and Consensus

Presents an automatic multi-model fitting pipeline that robustly fits the multiple geometric models present in corrupted and noisy data, achieving accuracy and computational time competitive with state-of-the-art approaches.

MultiLink: Multi-class Structure Recovery via Agglomerative Clustering and Model Selection

Presents a new algorithm, termed MultiLink, that simultaneously handles multiple classes of models and combines on-the-fly model fitting with model selection in a novel linkage scheme that decides whether two clusters should be merged.

Motion Consistency Guided Robust Geometric Model Fitting With Severe Outliers

Proposes a motion consistency guided fitting method (MCF) that robustly and efficiently estimates the parameters of model instances in data with severe outliers, achieving higher fitting accuracy at a much lower computational cost than several state-of-the-art fitting methods.

Learning for Multi-Type Subspace Clustering

Formulates the multi-type subspace clustering problem as one of learning non-linear subspace filters via deep multi-layer perceptrons (MLPs) and applies K-means to the network output to cluster the data.

Co-Clustering on Bipartite Graphs for Robust Model Fitting

Proposes a model fitting method based on co-clustering on bipartite graphs (CBG) that estimates multiple model instances in data contaminated with outliers and noise, performing favorably compared with several state-of-the-art fitting methods.

Higher-Order Multicuts for Geometric Model Fitting and Motion Segmentation

Presents a pseudo-Boolean formulation of the multiple model fitting problem based on any-order minimum cost lifted multicuts, which allows partitioning an undirected graph with pairwise connectivity so as to minimize costs defined over arbitrary sets of hyper-edges.
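For reference, the pairwise minimum cost multicut problem underlying this formulation is usually stated as the following integer program over edge indicators (a standard statement, not quoted from the cited paper; the lifted and higher-order variants extend the costs to additional edge sets and hyper-edges):

\[
\min_{x \in \{0,1\}^{E}} \; \sum_{e \in E} c_e\, x_e
\quad \text{s.t.} \quad
x_e \le \sum_{e' \in C \setminus \{e\}} x_{e'}
\ \ \text{for every cycle } C \subseteq E \text{ and every } e \in C,
\]

where x_e = 1 marks edge e as cut and the cycle inequalities guarantee that the cut edges induce a valid decomposition of the graph into connected components.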

Non-Negative Novelty Extraction: A New Non-Negativity Constraint for NMF

Proposes an algorithm for extracting novel sounds from a monaural signal containing learned noise, using non-negative matrix factorization (NMF) or semi-supervised NMF, and applies a constraint to the novel sound component: non-negativity alone, without low-rankness.

What to Select: Pursuing Consistent Motion Segmentation from Multiple Geometric Models

Proposes a geometric-model-fusion framework for motion segmentation that exploits the structural information shared by affinity matrices to select semantically consistent entries, adopting a multiplicative decomposition scheme to ensure structural consistency among the multiple affinities.

References

Showing 1-10 of 27 references

Using underapproximations for sparse nonnegative matrix factorization

Fast L1-NMF for Multiple Parametric Model Estimation

Presents a comprehensive algorithmic pipeline for multiple parametric model estimation, using a biclustering algorithm based on L1 nonnegative matrix factorization (L1-NMF) to handle medium-sized problems faster while also extending the algorithm's usability to much larger datasets.
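The L1-NMF mentioned here swaps the usual quadratic fit for an outlier-robust L1 penalty; a hedged sketch of the corresponding rank-one underapproximation problem (my notation, patterned on the general NMU form above rather than the cited paper's exact objective) is

\[
\min_{u \ge 0,\; v \ge 0} \; \lVert M - u v^{\top} \rVert_1
\quad \text{subject to} \quad u v^{\top} \le M,
\]

where \lVert \cdot \rVert_1 is the elementwise sum of absolute values; in the multiple-model-fitting setting of these papers, M is typically a points-versus-hypotheses preference matrix, and the L1 cost limits the influence of spurious entries.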

Dimensionality reduction, classification, and spectral mixture analysis using non-negative underapproximation

Explains why the additional underapproximation constraints make NMU particularly well suited to achieving a parts-based and sparse representation of the data, enabling it to recover the constitutive elements in hyperspectral images.

Compressed Nonnegative Matrix Factorization Is Fast and Accurate

Proposes structured random compression, i.e., random projections that exploit the data structure, for two NMF variants, classical and separable, and shows that the resulting compressed techniques are faster than their uncompressed counterparts and vastly reduce memory demands without any significant deterioration in performance.

Sparse nonnegative matrix underapproximation and its application to hyperspectral image analysis

  • Nicolas Gillis, R. Plemmons
  • Computer Science
  • 2011 3rd Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS)
  • 2011
Introduces sparse NMU by adding a sparsity constraint on the abundance matrix and uses it to extract materials individually in a more efficient way than NMU, as experimentally demonstrated on a HYDICE image of the San Diego airport.

Algorithms for Nonnegative Matrix Factorization with the β-Divergence

This letter describes algorithms for nonnegative matrix factorization (NMF) with the β-divergence (β-NMF). The β-divergence is a family of cost functions parameterized by a single shape parameter β.
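For reference, the β-divergence between nonnegative scalars x and y is commonly defined as (standard definition, reproduced here for convenience rather than quoted from the letter)

\[
d_{\beta}(x \mid y) =
\begin{cases}
\dfrac{x^{\beta} + (\beta - 1)\, y^{\beta} - \beta\, x\, y^{\beta - 1}}{\beta(\beta - 1)}, & \beta \in \mathbb{R} \setminus \{0, 1\},\\[1ex]
x \log \dfrac{x}{y} - x + y, & \beta = 1,\\[1ex]
\dfrac{x}{y} - \log \dfrac{x}{y} - 1, & \beta = 0,
\end{cases}
\]

recovering the squared Euclidean distance (β = 2), the Kullback-Leibler divergence (β = 1), and the Itakura-Saito divergence (β = 0) as special cases.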

An alternating direction algorithm for matrix completion with nonnegative factors

This paper introduces an algorithm for the nonnegative matrix factorization-and-completion problem, which aims to find nonnegative low-rank matrices X and Y so that the product XY approximates a partially observed nonnegative data matrix.
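A hedged sketch of the kind of objective this describes (my notation; the paper's exact formulation may introduce auxiliary splitting variables for the alternating direction updates) is

\[
\min_{X \ge 0,\; Y \ge 0} \; \bigl\lVert P_{\Omega}(X Y - M) \bigr\rVert_F^2,
\]

where Ω indexes the observed entries of M and P_Ω zeroes out the unobserved ones.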

Robust Multiple Model Fitting with Preference Analysis and Low-rank Approximation

Experimental validation on public, real data-sets demonstrates that the extraction of multiple models from outlier-contaminated data compares favourably with the state of the art.

A penalized matrix decomposition, with applications to sparse principal components and canonical correlation analysis.

Proposes a penalized matrix decomposition (PMD), a new framework for computing a rank-K approximation of a matrix, and establishes connections between the SCoTLASS method for sparse principal component analysis and the method of Zou and others (2006).
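The rank-one PMD criterion is usually stated as follows (reproduced from memory, so treat the exact constraint form as approximate): find sparse factors u and v via

\[
\max_{u,\, v} \; u^{\top} X v
\quad \text{subject to} \quad
\lVert u \rVert_2^2 \le 1,\ \ \lVert v \rVert_2^2 \le 1,\ \ P_1(u) \le c_1,\ \ P_2(v) \le c_2,
\]

where P_1 and P_2 are convex sparsity-inducing penalties (typically L1 norms); a rank-K decomposition is built by repeating this step on deflated residuals.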

A Biclustering Framework for Consensus Problems

Formally poses, for the first time, the task of finding and fitting multiple parametric models to a dataset as a consensus problem, and proposes a biclustering framework and perspective for reaching consensus in such grouping problems.