Random projections on manifolds of Symmetric Positive Definite matrices for image classification

@inproceedings{Alavi2014RandomPO,
  title={Random projections on manifolds of Symmetric Positive Definite matrices for image classification},
  author={Azadeh Alavi and Arnold Wiliem and Kun Zhao and Brian C. Lovell and Conrad Sanderson},
  booktitle={IEEE Winter Conference on Applications of Computer Vision (WACV)},
  year={2014},
  pages={301--308}
}
Recent advances suggest that encoding images through Symmetric Positive Definite (SPD) matrices and then interpreting such matrices as points on Riemannian manifolds can lead to increased classification performance. Taking into account manifold geometry is typically done via (1) embedding the manifolds in tangent spaces, or (2) embedding into Reproducing Kernel Hilbert Spaces (RKHS). While embedding into tangent spaces allows the use of existing Euclidean-based learning algorithms, manifold… 
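To make the two-step idea in the abstract concrete, here is a minimal sketch, assuming a log-Euclidean tangent-space embedding at the identity followed by a Gaussian random projection; the dimensions and helper names are illustrative, not the paper's implementation:

```python
import numpy as np

def spd_log(S):
    """Principal matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return (V * np.log(w)) @ V.T

def spd_to_tangent(spd_matrices):
    """Map each SPD matrix to the tangent space at the identity (log-Euclidean
    embedding) and flatten the symmetric result into a Euclidean vector."""
    return np.array([spd_log(S).ravel() for S in spd_matrices])

def random_projection(features, target_dim, rng):
    """Gaussian random projection, scaled so pairwise distances are
    approximately preserved (Johnson-Lindenstrauss style)."""
    R = rng.normal(size=(features.shape[1], target_dim)) / np.sqrt(target_dim)
    return features @ R

# Toy usage: SPD matrices built as A @ A.T + eps * I.
rng = np.random.default_rng(0)
spds = [A @ A.T + 0.1 * np.eye(8) for A in rng.normal(size=(5, 8, 8))]
X = spd_to_tangent(spds)           # shape (5, 64)
Z = random_projection(X, 16, rng)  # shape (5, 16), usable by any Euclidean classifier
```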
Distance Preserving Projection Space of Symmetric Positive Definite Manifolds
TLDR
This paper proposes an optimized distance-preserving projection space (DPS), built from local and global sparse similarity graphs that encode the association of each data point with its underlying subspace; the resulting space can be followed by any Euclidean-based classification algorithm.
Optimized Kernel-based Projection Space of Riemannian Manifolds
TLDR
Dictionary learning, sparse coding, and discriminative analysis are adopted for an optimized kernel-based projection space (OPS) on SPD manifolds, by employing subspace clustering.
Riemannian competitive learning for symmetric positive definite matrices clustering
TLDR
This paper introduces a conscious competition mechanism to enhance the performance of the RCL algorithm and develops a robust algorithm termed Riemannian Frequency Sensitive Competitive Learning (rFSCL), which inherits the online nature of competitive learning and is capable of handling very large data sets.
Clustering Symmetric Positive Definite Matrices on the Riemannian Manifolds
Using structured features such as symmetric positive definite (SPD) matrices to encode visual information has been found to be effective in computer vision. Traditional pattern recognition methods…
Efficient clustering on Riemannian manifolds: A kernelised random projection approach
TLDR
This work proposes a kernelised random projection framework for clustering manifold points via kernel space, which preserves the geometric structure of the original space while remaining computationally efficient.
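One simple way to realise a random projection in a kernel-induced space is sketched below; this is a hedged reconstruction, not necessarily the cited paper's exact construction. It assumes each random direction lies in the span of the training samples' feature maps, and the RBF kernel is an illustrative choice:

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian RBF kernel matrix between row-stacked datasets A and B."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernelised_random_projection(K_train, K_new_train, target_dim, rng):
    """Each random direction w_j is a weighted sum of training feature maps,
    so projecting a point needs only kernel evaluations. Directions are
    normalised by their RKHS norm sqrt(w_j^T K w_j)."""
    n = K_train.shape[0]
    W = rng.normal(size=(n, target_dim))
    W = W / np.sqrt(np.einsum('id,ij,jd->d', W, K_train, W))
    return K_new_train @ W

# Toy usage on Euclidean vectors standing in for manifold points.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(50, 10))
X_new = rng.normal(size=(5, 10))
K_train = rbf_kernel(X_train, X_train)
K_new_train = rbf_kernel(X_new, X_train)
Z = kernelised_random_projection(K_train, K_new_train, 8, rng)  # shape (5, 8)
```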
Kernelised orthonormal random projection on grassmann manifolds with applications to action and gait-based gender recognition
  • Kun Zhao, A. Wiliem, B. Lovell
  • Mathematics, Computer Science
    IEEE International Conference on Identity, Security and Behavior Analysis (ISBA 2015)
  • 2015
TLDR
Experiments in two biometric applications achieve better accuracy than the state-of-the-art random projection method for manifold points, and comparisons with kernelised classifiers show that the method achieves a nearly 3-fold speed-up on average whilst maintaining accuracy.
SPD Data Dictionary Learning Based on Kernel Learning and Riemannian Metric
TLDR
The proposed framework designs a positive definite kernel function, defined by the Log-Euclidean metric, that can be transformed into a corresponding Riemannian kernel.
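For reference, a short sketch of a Log-Euclidean Gaussian kernel of the kind described, k(X, Y) = exp(-γ‖log X − log Y‖_F²), which is positive definite for γ > 0; the value of γ here is an illustrative assumption:

```python
import numpy as np

def spd_log(S):
    """Principal matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return (V * np.log(w)) @ V.T

def log_euclidean_kernel(X, Y, gamma=1.0):
    """Gaussian kernel on SPD matrices under the Log-Euclidean metric."""
    D = spd_log(X) - spd_log(Y)
    return np.exp(-gamma * np.sum(D * D))
```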
Data-independent Random Projections from the feature-map of the homogeneous polynomial kernel of degree two
TLDR
A novel non-linear extension of the Random Projection method based on the degree-2 homogeneous polynomial kernel, which implicitly maps data points to the high-dimensional feature space of that kernel and from there performs a Random Projection to a Euclidean space of the desired dimensionality.
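Because this kernel's feature map is explicit, the scheme can be sketched directly from the identity (xᵀy)² = ⟨xxᵀ, yyᵀ⟩_F; the following is an illustrative reconstruction, not the paper's exact algorithm:

```python
import numpy as np

def poly2_feature_map(X):
    """Map each row x to vec(x x^T); dot products of these vectors equal the
    degree-2 homogeneous polynomial kernel (x^T y)^2."""
    return np.einsum('ni,nj->nij', X, X).reshape(X.shape[0], -1)

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 16))                        # 10 points in R^16
F = poly2_feature_map(X)                             # shape (10, 256)
R = rng.normal(size=(F.shape[1], 32)) / np.sqrt(32)  # Gaussian projection matrix
Z = F @ R                                            # random projection to R^32
```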
Data-independent Random Projections from the feature-space of the homogeneous polynomial kernel
TLDR
This paper presents a novel method to perform Random Projections from the feature space of homogeneous polynomial kernels, focusing on a specific kernel family to preserve some of the beneficial properties of the original Random Projection algorithm.
Comparative Evaluation of Action Recognition Methods via Riemannian Manifolds, Fisher Vectors and GMMs: Ideal and Challenging Conditions
TLDR
Traditional action recognition techniques based on Gaussian mixture models (GMMs) and Fisher vectors (FVs) are compared; the FV approach obtains the highest accuracy under ideal conditions and best deals with moderate scale and translation changes.

References

SHOWING 1-10 OF 44 REFERENCES
Relational divergence based classification on Riemannian manifolds
TLDR
Experiments on face recognition, person re-identification and texture classification show that the proposed method outperforms state-of-the-art approaches, such as Tensor Sparse Coding, Histogram Plus Epitome and the recent Riemannian Locality Preserving Projection.
Sparse Coding and Dictionary Learning for Symmetric Positive Definite Matrices: A Kernel Approach
TLDR
Experiments show that the proposed sparse coding approach achieves notable improvements in discrimination accuracy, in comparison to state-of-the-art methods such as tensor sparse coding, Riemannian locality preserving projection, and symmetry-driven accumulation of local features.
Kernel analysis over Riemannian manifolds for visual recognition of actions, pedestrians and textures
TLDR
Experiments on several visual classification tasks show that the proposed embedding into a Reproducing Kernel Hilbert Space via a Riemannian pseudo-kernel obtains considerable improvements in discrimination accuracy.
Dictionary Learning and Sparse Coding on Grassmann Manifolds: An Extrinsic Solution
TLDR
To handle non-linearity in data, this paper proposes to embed Grassmann manifolds into the space of symmetric matrices via an isometric mapping, which enables a closed-form solution for updating the Grassmann dictionary, atom by atom.
K-tangent spaces on Riemannian manifolds for improved pedestrian detection
TLDR
A general discriminative model based on the combination of several tangent spaces is proposed, in order to preserve more of the Riemannian structure.
Dirichlet process mixture models on symmetric positive definite matrices for appearance clustering in video surveillance applications
TLDR
A novel application of the Dirichlet Process Mixture Model framework is proposed for unsupervised clustering of symmetric positive definite matrices, extending existing K-means-type clustering algorithms based on the log-det divergence measure and deriving their counterpart in a Bayesian framework, which leads to the Wishart-Inverse Wishart conjugate pair.
Intrinsic Statistics on Riemannian Manifolds: Basic Tools for Geometric Measurements
  • X. Pennec
  • Mathematics, Computer Science
    Journal of Mathematical Imaging and Vision
  • 2006
TLDR
This paper provides a new proof of the characterization of Riemannian centers of mass and an original gradient descent algorithm to compute them efficiently, and develops the notions of the mean value and covariance matrix of a random element, the normal law, the Mahalanobis distance, and the χ² law.
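As an illustration of the gradient-descent computation this summary mentions, here is a minimal sketch of the Karcher (Riemannian centre of mass) iteration, written for SPD matrices under the affine-invariant metric; the unit step size and the tolerance are assumptions, not taken from the reference:

```python
import numpy as np

def _eig_apply(S, f):
    """Apply a scalar function to a symmetric matrix via its eigenvalues."""
    w, V = np.linalg.eigh(S)
    return (V * f(w)) @ V.T

def karcher_mean(spd_list, max_iter=50, tol=1e-10):
    """Gradient descent for the Riemannian centre of mass of SPD matrices."""
    M = sum(spd_list) / len(spd_list)                    # arithmetic mean as a start
    for _ in range(max_iter):
        Mh = _eig_apply(M, np.sqrt)                      # M^{1/2}
        Mih = _eig_apply(M, lambda w: 1.0 / np.sqrt(w))  # M^{-1/2}
        # Average of the log-maps of the data at the current estimate.
        T = sum(_eig_apply(Mih @ X @ Mih, np.log) for X in spd_list) / len(spd_list)
        M = Mh @ _eig_apply(T, np.exp) @ Mh              # exponential map back
        if np.linalg.norm(T) < tol:                      # near-zero gradient: done
            break
    return M
```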
Positive definite matrices and the S-divergence
Positive definite matrices abound in a dazzling variety of applications. This ubiquity can be in part attributed to their rich geometric structure: positive definite matrices form a self-dual convex cone…
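For context, the S-divergence (also known as the Stein or Jensen-Bregman LogDet divergence) between SPD matrices X and Y takes the standard form used in this literature:

```latex
S(X, Y) \;=\; \log\det\!\left(\frac{X + Y}{2}\right) \;-\; \frac{1}{2}\,\log\det(XY)
```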
Generalized Dictionary Learning for Symmetric Positive Definite Matrices with Application to Nearest Neighbor Retrieval
TLDR
Experiments on several covariance matrix datasets show that GDL achieves performance rivaling state-of-the-art techniques and allows "sparse coding" of positive definite matrices, which enables better nearest-neighbour (NN) retrieval.
Extended Grassmann Kernels for Subspace-Based Learning
TLDR
The relationship between Grassmann kernels and probabilistic similarity measures is analyzed, and it is shown that the KL distance in the limit yields the Projection kernel on the Grassmann manifold, whereas the Bhattacharyya kernel becomes trivial in the limit and is suboptimal for subspace-based problems.