Maximization of the sum of the trace ratio on the Stiefel manifold, I: Theory

@article{Zhang2014MaximizationOT,
  title={Maximization of the sum of the trace ratio on the Stiefel manifold, I: Theory},
  author={Lei-Hong Zhang and Ren-Cang Li},
  journal={Science China Mathematics},
  year={2014},
  volume={57},
  pages={2495-2508}
}
Abstract: We are concerned with the maximization of $\frac{\operatorname{tr}(V^\top AV)}{\operatorname{tr}(V^\top BV)} + \operatorname{tr}(V^\top CV)$ over the Stiefel manifold $\{V \in \mathbb{R}^{m \times \ell} \mid V^\top V = I_\ell\}$ $(\ell < m)$, where $B$ is a given symmetric and positive definite matrix, $A$ and $C$ are symmetric matrices, and $\operatorname{tr}(\cdot)$ is the trace of a square matrix. This is a subspace version of the maximization problem studied in Zhang (2013), which arises from real-world applications in, for example, the downlink…
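For concreteness, the objective above can be evaluated at a feasible point obtained by QR orthonormalization. This is an illustrative sketch only, not code from the paper; the names `objective`, `V`, and the random test matrices are all hypothetical:

```python
import numpy as np

def objective(V, A, B, C):
    """f(V) = tr(V^T A V) / tr(V^T B V) + tr(V^T C V)."""
    return (np.trace(V.T @ A @ V) / np.trace(V.T @ B @ V)
            + np.trace(V.T @ C @ V))

rng = np.random.default_rng(0)
m, ell = 6, 2
S = rng.standard_normal((m, m))
T = rng.standard_normal((m, m))
A = (S + S.T) / 2                       # symmetric
C = (T + T.T) / 2                       # symmetric
B = S @ S.T + m * np.eye(m)             # symmetric positive definite
# A feasible point: QR gives orthonormal columns, so V^T V = I_ell
V, _ = np.linalg.qr(rng.standard_normal((m, ell)))
val = objective(V, A, B, C)
```

Since $B$ is positive definite, the denominator $\operatorname{tr}(V^\top BV)$ is strictly positive for any feasible $V$, so the objective is well defined everywhere on the manifold.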
Maximization of the sum of the trace ratio on the Stiefel manifold, II: Computation
TLDR
This part analyzes the global and local convergence of the SCF iteration, and shows that the necessary condition for the global maximizers is fulfilled at any convergent point of the sequences of approximations generated by the SCF iteration.
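For context, a self-consistent-field (SCF) iteration for an objective of the form tr(VᵀAV)/tr(VᵀBV) + tr(VᵀCV) can be sketched as a fixed-point loop: hold the current ratio fixed, form a symmetric surrogate matrix, and take its leading eigenvectors. This is an assumed surrogate built from the Euclidean gradient, not necessarily the authors' exact scheme:

```python
import numpy as np

def scf_step(V, A, B, C):
    """One assumed SCF step: V_next spans the leading eigenvectors of M(V),
    where 2*M(V)@V is the Euclidean gradient of the objective at V."""
    den = np.trace(V.T @ B @ V)
    rho = np.trace(V.T @ A @ V) / den   # current trace ratio
    M = (A - rho * B) / den + C         # symmetric surrogate matrix
    _, Q = np.linalg.eigh(M)            # eigenvalues in ascending order
    return Q[:, -V.shape[1]:]           # leading-eigenvector block

rng = np.random.default_rng(1)
m, ell = 8, 3
S = rng.standard_normal((m, m)); A = (S + S.T) / 2
T = rng.standard_normal((m, m)); C = (T + T.T) / 2
B = T @ T.T + m * np.eye(m)             # symmetric positive definite
V = np.linalg.qr(rng.standard_normal((m, ell)))[0]
for _ in range(50):                     # fixed-point iteration
    V = scf_step(V, A, B, C)
```

Each iterate automatically has orthonormal columns, since it is an eigenvector block of a symmetric matrix; at a fixed point, V spans an invariant subspace of M(V), which matches the first-order necessary condition.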
Sampling Algebraic Varieties for Sum of Squares Programs
TLDR
A new methodology is proposed that, rather than relying on some algebraic description, represents ${\mathcal V}$ with a generic set of complex samples, avoiding representation issues such as multiplicity and choice of generators.
On Optimizing the Sum of Rayleigh Quotients on the Unit Sphere
Ph.D. dissertation by Aohud Abdulrahman Binbuhaer, The University of Texas at Arlington, 2019. Supervising professor: Ren-Cang Li. Given symmetric…
On Generalizing Trace Minimization
Ky Fan’s trace minimization principle is extended along the line of the Brockett cost function tr(DXᵀAX) in X on the Stiefel manifold, where D of an apt size is positive definite. Specifically, we…
A Riemannian conjugate gradient method for optimization on the Stiefel manifold
TLDR
Dai’s nonmonotone conjugate gradient method is generalized to the Riemannian case and global convergence of the new algorithm is established under standard assumptions.
A Self-Consistent-Field Iteration for Orthogonal Canonical Correlation Analysis
TLDR
An alternating numerical scheme whose core is the sub-maximization problem in the trace-fractional form with an orthogonality constraint is devised and it is proved that the SCF iteration is globally convergent to a KKT point and that the alternating numerical Scheme always converges.
Polynomial systems: graphical structure, geometry, and applications
  • D. Pardo
  • Computer Science, Mathematics
  • 2018
TLDR
This thesis proposes novel methods to derive more efficient (smaller) relaxations, by leveraging the geometrical structure of the underlying variety, and introduces a methodology to describe the variety with a generic set of samples, instead of relying on an algebraic description.
Rayleigh-Ritz Majorization Error Bounds of Mixed Type
TLDR
This work substitutes multidimensional subspaces for the vectors and derives new bounds on absolute changes of eigenvalues of the matrix Rayleigh quotient in terms of singular values of residual matrices and principal angles between subspaces, using majorization.

References

Showing 1-10 of 17 references
Maximization of the sum of the trace ratio on the Stiefel manifold, II: Computation
TLDR
This part analyzes the global and local convergence of the SCF iteration, and shows that the necessary condition for the global maximizers is fulfilled at any convergent point of the sequences of approximations generated by the SCF iteration.
On optimizing the sum of the Rayleigh quotient and the generalized Rayleigh quotient on the unit sphere
TLDR
This paper first presents a real-world application arising from sparse Fisher discriminant analysis, and turns the Riemannian trust-region method of Absil, Baker and Gallivan into a practical algorithm, which enjoys nice convergence properties: global convergence and local superlinear convergence.
A note on the trace quotient problem
TLDR
The classical first- and second-order optimality conditions for TRP are established, and a simple proof is contributed for the property that TRP does not admit a local non-global maximizer, which was first proved by Shen et al.
A feasible method for optimization with orthogonality constraints
TLDR
The Cayley transform, a Crank-Nicolson-like update scheme, is applied to preserve the constraints; based on it, curvilinear search algorithms with lower flop counts are developed that perform efficiently on polynomial optimization, nearest correlation matrix estimation, and extreme eigenvalue problems.
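A minimal sketch of how a Cayley-transform update preserves feasibility, in the style of the general Wen-Yin scheme (the gradient `G` and step size `tau` here are placeholders, not from the cited paper): with W = GVᵀ - VGᵀ skew-symmetric, the map (I + τW/2)⁻¹(I - τW/2) is orthogonal, so the updated point stays on the Stiefel manifold.

```python
import numpy as np

def cayley_update(V, G, tau):
    """Crank-Nicolson-like feasible update: returns V(tau) with
    V(tau)^T V(tau) = I, for any V with orthonormal columns."""
    m = V.shape[0]
    W = G @ V.T - V @ G.T               # skew-symmetric by construction
    I = np.eye(m)
    # (I + tau/2 W) is invertible because W has purely imaginary eigenvalues
    return np.linalg.solve(I + 0.5 * tau * W, (I - 0.5 * tau * W) @ V)

rng = np.random.default_rng(2)
m, ell = 7, 2
V = np.linalg.qr(rng.standard_normal((m, ell)))[0]
G = rng.standard_normal((m, ell))       # placeholder gradient
V1 = cayley_update(V, G, 0.3)
```

In practice `tau` would be chosen by a curvilinear line search along V(tau); the sketch above only demonstrates the constraint-preserving property.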
A Geometric Revisit to the Trace Quotient Problem
TLDR
A simple, efficient algorithm is proposed, which employs only one step of the parallel Rayleigh quotient iteration at each iteration of the Iterative Trace Ratio scheme.
Optimization Algorithms on Matrix Manifolds
TLDR
Optimization Algorithms on Matrix Manifolds offers techniques with broad applications in linear algebra, signal processing, data mining, computer vision, and statistical analysis and will be of interest to applied mathematicians, engineers, and computer scientists.
Optimality conditions for the nonlinear programming problems on Riemannian manifolds
In recent years, many traditional optimization methods have been successfully generalized to minimize objective functions on manifolds. In this paper, we first extend the general traditional…
The Geometry of Algorithms with Orthogonality Constraints
TLDR
The theory proposed here provides a taxonomy for numerical linear algebra algorithms, offering a top-level mathematical view of previously unrelated algorithms; developers of new algorithms and perturbation theories will also benefit from the theory.
The Orthogonally Constrained Regression Revisited
The Penrose regression problem, including the orthonormal Procrustes problem and the rotation problem to a partially specified target, is an important class of data matching problems arising frequently…
On sparse linear discriminant analysis algorithm for high‐dimensional data classification
TLDR
In this paper, a sparse linear discriminant analysis (LDA) algorithm for high‐dimensional objects in subspaces is presented and an iterative algorithm for computing such sparse and orthogonal vectors in the LDA is developed.