# Riemannian kernel based Nyström method for approximate infinite-dimensional covariance descriptors with application to image set classification

@article{Chen2018RiemannianKB, title={Riemannian kernel based Nystr{\"o}m method for approximate infinite-dimensional covariance descriptors with application to image set classification}, author={Kai Chen and Xiaojun Wu and Rui Wang and Josef Kittler}, journal={2018 24th International Conference on Pattern Recognition (ICPR)}, year={2018}, pages={651-656} }

In the domain of pattern recognition, using CovDs (Covariance Descriptors) to represent data, while taking into account the metric of the resulting Riemannian manifold, has been widely adopted for the task of image set classification. Recently, it has been proven that infinite-dimensional CovDs are more discriminative than their low-dimensional counterparts. However, the form of infinite-dimensional CovDs is implicit and their computational load is high. We propose a novel framework for…
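The abstract's core idea rests on the Nyström method, which approximates an implicit (possibly infinite-dimensional) kernel feature map with an explicit finite-dimensional one built from a small set of landmark points. Below is a minimal NumPy sketch of the standard Nyström approximation with a hypothetical RBF kernel; it is an illustration of the general technique, not the paper's exact Riemannian-kernel variant:

```python
import numpy as np

def nystrom_features(X, landmarks, kernel, r=None):
    """Map samples X to explicit finite-dimensional features whose inner
    products approximate the (possibly infinite-dimensional) kernel."""
    W = kernel(landmarks, landmarks)   # m x m kernel among landmarks
    C = kernel(X, landmarks)           # n x m kernel between X and landmarks
    vals, vecs = np.linalg.eigh(W)     # eigendecomposition of the small matrix
    if r is not None:                  # optional rank truncation
        vals, vecs = vals[-r:], vecs[:, -r:]
    vals = np.maximum(vals, 1e-12)     # guard against numerical negatives
    # Feature map: Phi = C U diag(1/sqrt(lambda)), so Phi @ Phi.T ~ K
    return (C @ vecs) / np.sqrt(vals)

def rbf(A, B, gamma=0.5):
    """Gaussian RBF kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
Z = X[rng.choice(100, 20, replace=False)]   # landmark subset
Phi = nystrom_features(X, Z, rbf)           # explicit 20-dim features
# An approximate CovD is then just the covariance of the rows of Phi.
approx_covd = np.cov(Phi, rowvar=False)
```

On the landmark points themselves the approximation is exact (up to conditioning), which is a convenient sanity check when experimenting with landmark counts.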

#### 4 Citations

More About Covariance Descriptors for Image Set Coding: Log-Euclidean Framework Based Kernel Matrix Representation

- Computer Science, Engineering
- 2019 IEEE/CVF International Conference on Computer Vision Workshop (ICCVW)
- 2019

This framework characterises the covariance structure in terms of the arc-cosine kernel, which satisfies Mercer's condition, and proposes the operation of mean centralization on SPD matrices, providing a lower-dimensional and more discriminative data representation for the task of image set classification.

Covariance descriptors on a Gaussian manifold and their application to image set classification

- Computer Science
- Pattern Recognit.
- 2020

This paper extracts pixel-wise features of image regions and represents them by Gaussian models, and extends the conventional covariance computation onto a special type of Riemannian manifold, namely a Gaussian manifold, so that it is applicable to image set data represented in terms of Gaussian models.

Dimensionality reduction on the symmetric positive definite manifold with application to image set classification

- Computer Science, Engineering
- J. Electronic Imaging
- 2020

An SPD manifold dimensionality reduction (DR) algorithm is proposed that maps the original SPD manifold into a more discriminative lower-dimensional one via a learned mapping and demonstrates the superiority of this method over the state-of-the-art image set classification methods.

Neighborhood preserving sparse representation based on Nyström method for image set classification on symmetric positive definite matrices

- Mathematics
- Journal of Algorithms & Computational Technology
- 2019

In the field of pattern recognition, using symmetric positive-definite matrices to represent image sets has been widely studied, and sparse representation-based classification algorithms on the…

#### References

Showing 1–10 of 21 references

Approximate infinite-dimensional Region Covariance Descriptors for image classification

- Mathematics, Computer Science
- 2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2015

It is empirically shown that the proposed finite-dimensional approximations of infinite-dimensional RCovDs consistently outperform the low-dimensional RCovDs for the image classification task, while enjoying the Riemannian structure of the SPD manifolds.

Log-Euclidean Kernels for Sparse Representation and Dictionary Learning

- Mathematics, Computer Science
- 2013 IEEE International Conference on Computer Vision
- 2013

This paper proposes a kernel-based method for sparse representation (SR) and dictionary learning (DL) of SPD matrices by developing a broad family of kernels that satisfies Mercer's condition, and considers the geometric structure in the DL process by updating atom matrices in the Riemannian space.
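Several of the referenced works build on the Log-Euclidean geometry of SPD matrices, where the manifold-aware distance is simply the Frobenius distance between matrix logarithms. A small NumPy sketch (using an eigendecomposition-based matrix log, which is well defined for SPD inputs; the helper names are illustrative, not from any of the cited papers):

```python
import numpy as np

def spd_logm(S):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)           # real eigenvalues, all positive for SPD
    return (V * np.log(w)) @ V.T       # V diag(log w) V^T

def log_euclidean_distance(A, B):
    """Log-Euclidean distance: d(A, B) = ||logm(A) - logm(B)||_F."""
    return np.linalg.norm(spd_logm(A) - spd_logm(B), ord='fro')

def random_spd(n, seed):
    """A random well-conditioned SPD matrix for demonstration."""
    rng = np.random.default_rng(seed)
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B = random_spd(4, 0), random_spd(4, 1)
d = log_euclidean_distance(A, B)
```

The appeal of this metric is that after the one-time `logm` mapping, SPD matrices can be treated as ordinary Euclidean vectors, which is what makes kernels and dictionary learning on the log-domain tractable.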

Dimensionality Reduction on SPD Manifolds: The Emergence of Geometry-Aware Methods

- Computer Science, Medicine
- IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2018

This paper proposes to model the mapping from the high-dimensional SPD manifold to the low-dimensional one with an orthonormal projection and shows that learning can be expressed as an optimization problem on a Grassmann manifold and discusses fast solutions for special cases.

Covariance discriminative learning: A natural and efficient approach to image set classification

- Mathematics, Computer Science
- 2012 IEEE Conference on Computer Vision and Pattern Recognition
- 2012

A novel discriminative learning approach to image set classification is proposed, modeling the image set with its natural second-order statistic, i.e. the covariance matrix; experiments show not only the superiority of this method over state-of-the-art ones in both accuracy and efficiency, but also its stability under two real challenges: noisy set data and varying set size.
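The "natural second-order statistic" used throughout these works is straightforward to compute: a covariance matrix over the per-image feature vectors of a set, lightly regularized so it stays positive definite. A minimal sketch (the feature dimensions and regularizer are illustrative assumptions):

```python
import numpy as np

def covariance_descriptor(features, eps=1e-6):
    """Second-order statistic of an image set: a d x d SPD matrix.
    `features` has shape (n_images, d); eps keeps the result positive definite."""
    Fc = features - features.mean(axis=0)        # center the features
    C = (Fc.T @ Fc) / (len(features) - 1)        # sample covariance
    return C + eps * np.eye(features.shape[1])   # SPD regularization

rng = np.random.default_rng(0)
image_set = rng.standard_normal((50, 8))   # e.g. 50 frames, 8-dim features each
C = covariance_descriptor(image_set)
```

The `eps * I` term matters in practice: when the set is small relative to the feature dimension, the raw sample covariance is rank-deficient and manifold metrics (which need strictly positive eigenvalues) break down.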

Log-Hilbert-Schmidt metric between positive definite operators on Hilbert spaces

- Computer Science, Mathematics
- NIPS
- 2014

This paper introduces a novel mathematical and computational framework, namely the Log-Hilbert-Schmidt metric between positive definite operators on a Hilbert space. This is a generalization of the…

Bregman Divergences for Infinite Dimensional Covariance Matrices

- Computer Science, Mathematics
- 2014 IEEE Conference on Computer Vision and Pattern Recognition
- 2014

This work proposes an approach to computing and comparing Covariance Descriptors (CovDs) in infinite-dimensional spaces by first mapping the original data to a high-dimensional Hilbert space, and only then computing the CovDs.

Projection Metric Learning on Grassmann Manifold with Application to Video based Face Recognition

- Mathematics, Computer Science
- 2015 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
- 2015

This work proposes a novel method to learn the Projection Metric directly on the Grassmann manifold rather than in Hilbert space, which can be regarded as performing a geometry-aware dimensionality reduction from the original Grassmann manifold to a lower-dimensional, more discriminative Grassmann manifold where more favorable classification can be achieved.

RAID-G: Robust Estimation of Approximate Infinite Dimensional Gaussian with Application to Material Recognition

- Mathematics, Computer Science
- 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
- 2016

The explicit feature mapping (EFM) is first introduced for effective approximation of the infinite-dimensional Gaussian induced by an additive kernel function, and then a new regularized MLE method based on the von Neumann divergence is proposed for robust estimation of the covariance matrix.

Log-Euclidean Metric Learning on Symmetric Positive Definite Manifold with Application to Image Set Classification

- Mathematics, Computer Science
- ICML
- 2015

This paper proposes a novel metric learning approach that works directly on logarithms of SPD matrices by learning a tangent map that can directly transform the matrix Log-Euclidean metric from the original tangent space to a new tangent space with greater discriminability.

Riemannian Dictionary Learning and Sparse Coding for Positive Definite Matrices

- Computer Science, Medicine
- IEEE Transactions on Neural Networks and Learning Systems
- 2017

This paper formulates a novel Riemannian optimization objective for dictionary learning and sparse coding, in which the representation loss is characterized via the affine-invariant Riemannian metric, and presents a computationally simple algorithm for optimizing the model.