# Random projections on manifolds of Symmetric Positive Definite matrices for image classification

@article{Alavi2014RandomPO, title={Random projections on manifolds of Symmetric Positive Definite matrices for image classification}, author={Azadeh Alavi and Arnold Wiliem and Kun-li Zhao and Brian C. Lovell and Conrad Sanderson}, journal={IEEE Winter Conference on Applications of Computer Vision}, year={2014}, pages={301-308} }

Recent advances suggest that encoding images through Symmetric Positive Definite (SPD) matrices and then interpreting such matrices as points on Riemannian manifolds can lead to increased classification performance. Taking into account manifold geometry is typically done via (1) embedding the manifolds in tangent spaces, or (2) embedding into Reproducing Kernel Hilbert Spaces (RKHS). While embedding into tangent spaces allows the use of existing Euclidean-based learning algorithms, manifold…
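As an illustration of the encoding the abstract describes (not code from the paper), the sketch below builds a covariance descriptor, a standard way to summarise an image region as an SPD matrix, and compares two such descriptors with the log-Euclidean distance, one common way to respect the manifold geometry. The feature dimensions and regularisation constant are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import logm

def covariance_descriptor(features):
    """Encode a set of per-pixel feature vectors (n x d) as a d x d SPD matrix."""
    c = np.cov(features, rowvar=False)
    # Small ridge term (illustrative) guarantees strict positive definiteness.
    return c + 1e-6 * np.eye(c.shape[0])

def log_euclidean_distance(a, b):
    """Distance under the log-Euclidean metric: ||logm(A) - logm(B)||_F."""
    return np.linalg.norm(logm(a) - logm(b), ord="fro")

rng = np.random.default_rng(0)
f1 = rng.normal(size=(500, 5))  # stand-in for per-pixel features of image 1
f2 = rng.normal(size=(500, 5))  # stand-in for per-pixel features of image 2
d = log_euclidean_distance(covariance_descriptor(f1), covariance_descriptor(f2))
```

Because the descriptors are symmetric positive definite, `logm` returns a real symmetric matrix, so the Frobenius norm is a genuine distance on the manifold under this metric.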


## 15 Citations

Distance Preserving Projection Space of Symmetric Positive Definite Manifolds

- Computer Science, ArXiv
- 2016

This paper proposes an optimized distance-preserving projection space (DPS), built from local and global sparse similarity graphs that encode the association of each data point with its underlying subspace; the resulting projection can be followed by any Euclidean-based classification algorithm.

Optimized Kernel-based Projection Space of Riemannian Manifolds

- Computer Science
- 2016

Dictionary learning, sparse coding, and discriminative analysis are adopted for the optimized kernel-based projection space (OPS) on SPD manifolds, employing the concept of subspace clustering.

Riemannian competitive learning for symmetric positive definite matrices clustering

- Computer Science, Neurocomputing
- 2018

This paper introduces a conscious competition mechanism to enhance the performance of the RCL algorithm and develops a robust algorithm termed Riemannian Frequency Sensitive Competitive Learning (rFSCL), which inherits the online nature of competitive learning and is capable of handling very large data sets.

Clustering Symmetric Positive Definite Matrices on the Riemannian Manifolds

- Computer Science, ACCV
- 2016

Using structured features such as symmetric positive definite (SPD) matrices to encode visual information has been found to be effective in computer vision. Traditional pattern recognition methods…

Efficient clustering on Riemannian manifolds: A kernelised random projection approach

- Computer Science, Mathematics, Pattern Recognit.
- 2016

This work proposes a kernelised random projection framework for clustering manifold points via kernel space, which can preserve the geometric structure of the original space, but is computationally efficient.

Kernelised orthonormal random projection on grassmann manifolds with applications to action and gait-based gender recognition

- Mathematics, Computer Science, IEEE International Conference on Identity, Security and Behavior Analysis (ISBA 2015)
- 2015

Experimental results in two biometric applications show that the method achieves better accuracy than the state-of-the-art random projection method for manifold points; comparisons with kernelised classifiers show a nearly 3-fold speed-up on average whilst maintaining accuracy.

SPD Data Dictionary Learning Based on Kernel Learning and Riemannian Metric

- Computer Science, IEEE Access
- 2020

The proposed framework designs a positive definite kernel function defined by the Log-Euclidean metric, which can be transformed into a corresponding Riemannian kernel.

Data-independent Random Projections from the feature-map of the homogeneous polynomial kernel of degree two

- Computer Science, Inf. Sci.
- 2018

A novel non-linear extension of the Random Projection method, based on the degree-2 homogeneous polynomial kernel, that implicitly maps data points to the high-dimensional feature space of that kernel and from there performs a Random Projection to a Euclidean space of the desired dimensionality.
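As a minimal sketch of the idea in this blurb (an assumption, not the cited paper's implementation): the degree-2 homogeneous polynomial kernel k(x, y) = (x . y)^2 has the explicit feature map phi(x) = vec(x x^T), so a data-independent Gaussian random projection can be applied directly to that feature space. Dimensions d and k below are illustrative.

```python
import numpy as np

def poly2_feature_map(x):
    """Explicit feature map of the degree-2 homogeneous polynomial kernel:
    <phi(x), phi(y)> = (x . y)**2, realised as the vectorised outer product."""
    return np.outer(x, x).ravel()

rng = np.random.default_rng(1)
d, k = 8, 16
x, y = rng.normal(size=d), rng.normal(size=d)

# The kernel value matches the explicit map's inner product exactly.
assert np.isclose(np.dot(x, y) ** 2, poly2_feature_map(x) @ poly2_feature_map(y))

# Data-independent Gaussian random projection from the d^2-dim feature space.
R = rng.normal(size=(k, d * d)) / np.sqrt(k)
z = R @ poly2_feature_map(x)  # k-dimensional Euclidean embedding of x
```

The explicit map makes the projection data-independent: `R` can be drawn once and reused for every input, which is the usual appeal of Random Projections.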

Data-independent Random Projections from the feature-space of the homogeneous polynomial kernel

- Computer Science, Pattern Recognit.
- 2018

This paper presents a novel method to perform Random Projections from the feature space of homogeneous polynomial kernels, focusing on a specific kernel family to preserve some of the beneficial properties of the original Random Projection algorithm.

Comparative Evaluation of Action Recognition Methods via Riemannian Manifolds, Fisher Vectors and GMMs: Ideal and Challenging Conditions

- Computer Science, Mathematics, PAKDD Workshops
- 2016

Traditional action recognition techniques based on Gaussian mixture models (GMMs) and Fisher vectors (FVs) are compared; the FV approach obtains the highest accuracy under ideal conditions and best handles moderate scale and translation changes.

## References

Showing 1-10 of 44 references

Relational divergence based classification on Riemannian manifolds

- Computer Science, 2013 IEEE Workshop on Applications of Computer Vision (WACV)
- 2013

Experiments on face recognition, person re-identification and texture classification show that the proposed method outperforms state-of-the-art approaches, such as Tensor Sparse Coding, Histogram Plus Epitome and the recent Riemannian Locality Preserving Projection.

Sparse Coding and Dictionary Learning for Symmetric Positive Definite Matrices: A Kernel Approach

- Computer Science, Mathematics, ECCV
- 2012

Experiments show that the proposed sparse coding approach achieves notable improvements in discrimination accuracy, in comparison to state-of-the-art methods such as tensor sparse coding, Riemannian locality preserving projection, and symmetry-driven accumulation of local features.

Kernel analysis over Riemannian manifolds for visual recognition of actions, pedestrians and textures

- Computer Science, 2012 IEEE Workshop on the Applications of Computer Vision (WACV)
- 2012

Experiments on several visual classification tasks show that the proposed embedding into a Reproducing Kernel Hilbert Space, via a Riemannian pseudo kernel, obtains considerable improvements in discrimination accuracy.

Dictionary Learning and Sparse Coding on Grassmann Manifolds: An Extrinsic Solution

- Computer Science, Mathematics, 2013 IEEE International Conference on Computer Vision
- 2013

To handle non-linearity in data, this paper proposes to embed Grassmann manifolds into the space of symmetric matrices by an isometric mapping, which enables it to devise a closed-form solution for updating a Grassmann dictionary, atom by atom.

K-tangent spaces on Riemannian manifolds for improved pedestrian detection

- Computer Science, 2012 19th IEEE International Conference on Image Processing
- 2012

A general discriminative model based on the combination of several tangent spaces is proposed in order to preserve more details of the structure of the Riemannian manifold.

Dirichlet process mixture models on symmetric positive definite matrices for appearance clustering in video surveillance applications

- Mathematics, Computer Science, CVPR 2011
- 2011

A novel application of the Dirichlet Process Mixture Model framework is proposed for unsupervised clustering of symmetric positive definite matrices, extending existing K-means-type clustering algorithms based on the log-det divergence measure and deriving their counterpart in a Bayesian framework, which leads to the Wishart-Inverse Wishart conjugate pair.

Intrinsic Statistics on Riemannian Manifolds: Basic Tools for Geometric Measurements

- Mathematics, Computer Science, Journal of Mathematical Imaging and Vision
- 2006

This paper provides a new proof of the characterization of Riemannian centers of mass and an original gradient descent algorithm to efficiently compute them and develops the notions of mean value and covariance matrix of a random element, normal law, Mahalanobis distance and χ2 law.
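The Riemannian centre of mass (Karcher mean) mentioned in this reference is commonly computed, for SPD matrices under the affine-invariant metric, by averaging in the tangent space at the current estimate and mapping back via the exponential map. The sketch below is an illustrative implementation of that standard fixed-step iteration, not code from the cited paper.

```python
import numpy as np
from scipy.linalg import sqrtm, expm, logm

def karcher_mean(mats, iters=20):
    """Gradient-descent iteration for the Riemannian centre of mass of SPD
    matrices under the affine-invariant metric (illustrative sketch)."""
    m = np.mean(mats, axis=0)  # initialise with the arithmetic mean
    for _ in range(iters):
        s = np.real(sqrtm(m))
        s_inv = np.linalg.inv(s)
        # Lift the data to the tangent space at the current estimate...
        tangent = np.mean([np.real(logm(s_inv @ x @ s_inv)) for x in mats], axis=0)
        # ...average there, then map back to the manifold.
        m = s @ expm(tangent) @ s
    return m

rng = np.random.default_rng(2)
spd = [a @ a.T + np.eye(3) for a in rng.normal(size=(4, 3, 3))]
mean = karcher_mean(spd)
```

At a fixed point the averaged tangent vector vanishes, which is exactly the first-order characterisation of the Riemannian centre of mass that the paper proves.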

Positive definite matrices and the S-divergence

- Mathematics
- 2011

Positive definite matrices abound in a dazzling variety of applications. This ubiquity can be in part attributed to their rich geometric structure: positive definite matrices form a self-dual convex…

Generalized Dictionary Learning for Symmetric Positive Definite Matrices with Application to Nearest Neighbor Retrieval

- Computer Science, ECML/PKDD
- 2011

Experiments on several covariance matrix datasets show that GDL achieves performance rivaling state-of-the-art techniques, and allows performing "sparse coding" of positive definite matrices, which enables better NN retrieval.

Extended Grassmann Kernels for Subspace-Based Learning

- Computer Science, Mathematics, NIPS
- 2008

The relationship between Grassmann kernels and probabilistic similarity measures is analyzed, and it is shown that the KL distance in the limit yields the Projection kernel on the Grassmann manifold, whereas the Bhattacharyya kernel becomes trivial in the limit and is suboptimal for subspace-based problems.