Minimizing Condition Number via Convex Programming

@article{Lu2011MinimizingCN,
  title={Minimizing Condition Number via Convex Programming},
  author={Zhaosong Lu and Ting Kei Pong},
  journal={SIAM J. Matrix Anal. Appl.},
  year={2011},
  volume={32},
  pages={1193--1211}
}
In this paper we consider minimizing the spectral condition number of a positive semidefinite matrix over a nonempty closed convex set $\Omega$. We show that it can be solved as a convex programming problem, and moreover, the optimal value of the latter problem is achievable. As a consequence, when $\Omega$ is positive semidefinite representable, it can be cast into a semidefinite programming problem. We then propose a first-order method to solve the convex programming problem. The… 
Controlling singular values with semidefinite programming
TLDR
A convex framework is introduced for problems that involve singular values, enabling the optimization of functionals and constraints expressed in terms of the extremal singular values of matrices.
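The standard device behind such frameworks is the linear matrix inequality characterization $\sigma_{\max}(A) \le t \iff \begin{bmatrix} tI & A \\ A^T & tI \end{bmatrix} \succeq 0$, which makes the extremal singular values SDP-representable. A small numpy check of this equivalence (the random test matrix is illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))
sigma_max = np.linalg.svd(A, compute_uv=False)[0]

def lmi_holds(A, t, tol=1e-9):
    """True iff [[t*I, A], [A.T, t*I]] is positive semidefinite,
    which holds exactly when sigma_max(A) <= t."""
    m, n = A.shape
    M = np.block([[t * np.eye(m), A], [A.T, t * np.eye(n)]])
    return np.linalg.eigvalsh(M).min() >= -tol

print(lmi_holds(A, sigma_max + 0.01))  # True
print(lmi_holds(A, sigma_max - 0.01))  # False
```

The block matrix has eigenvalues $t \pm \sigma_i$ (plus $t$ itself), so its smallest eigenvalue is $t - \sigma_{\max}$, which is what the check exploits.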
K-Optimal Design via Semidefinite Programming and Entropy Optimization
TLDR
This paper considers the problem of optimal design of experiments and proposes a two-step inference strategy that entails the minimization of a convex integral functional under linear constraints.
Minimizing the Condition Number to Construct Design Points for Polynomial Regression Models
TLDR
A new optimality criterion, the $K$-optimality criterion, is studied for constructing optimal experimental designs for polynomial regression models; it is shown that there is always a symmetric $K$-optimal design with exactly $p+1$ support points, including the boundary points $-1$ and $1$.
K-Optimal Gradient Encoding Scheme for Fourth-Order Tensor-Based Diffusion Profile Imaging
TLDR
This paper proposes a new approach to solve the K-optimal GES design problem for fourth-order tensor-based diffusion profile imaging as a tractable semidefinite programming problem and shows that the proposed design leads to the minimum signal deviation.
Sparse shift-varying FIR preconditioners for fast volume denoising
Splitting-based CT reconstruction algorithms decompose the reconstruction problem into an iterated sequence of “easier” subproblems. One relatively memory-efficient algorithm decomposes the…
Robust online motion planning via contraction theory and convex optimization
TLDR
This work presents a framework for online generation of robust motion plans for robotic systems with nonlinear dynamics subject to bounded disturbances, control constraints, and online state constraints such as obstacles, and demonstrates the approach through simulations of a 6-state planar quadrotor navigating cluttered environments in the presence of a cross-wind.
Metric selection in fast dual forward-backward splitting
TLDR
This paper proposes several methods, of differing computational complexity, to find a space on which the algorithm performs well, and evaluates the proposed metric selection procedures by comparing performance to the case in which the Euclidean metric is used.
A Unified Framework for Manifold Landmarking
TLDR
This paper proposes a novel active manifold learning method that combines geometric manifold landmarking methods with algebraic ones, using the Gershgorin circle theorem to construct an upper bound on the learning error that depends on the landmarks and the manifold's alignment matrix.
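For reference, the Gershgorin circle theorem used above confines every eigenvalue to a disc centered at a diagonal entry with radius equal to the off-diagonal row sum. A minimal numpy sketch of the resulting eigenvalue upper bound (the matrix is a made-up example, not an alignment matrix from the paper):

```python
import numpy as np

# Hypothetical symmetric matrix standing in for an alignment-style matrix.
A = np.array([[4.0, 1.0, 0.5],
              [1.0, 3.0, 0.2],
              [0.5, 0.2, 5.0]])

centers = np.diag(A)
radii = np.abs(A).sum(axis=1) - np.abs(centers)  # off-diagonal row sums
upper = (centers + radii).max()  # every eigenvalue lies in some Gershgorin disc

eigs = np.linalg.eigvalsh(A)
print(upper, eigs.max())  # the bound dominates the largest eigenvalue
```

Because the bound is computable from the entries alone, it can be optimized over landmark choices without repeated eigendecompositions, which is the kind of use the paper makes of it.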
Active manifold learning via a unified framework for manifold landmarking.
The success of semi-supervised manifold learning is highly dependent on the quality of the labeled samples. Active manifold learning aims to select and label representative landmarks on a manifold
Robust Feedback Motion Planning via Contraction Theory
We present a framework for online generation of robust motion plans for robotic systems with nonlinear dynamics subject to bounded disturbances, control constraints, and online state constraints such

References

SHOWING 1-10 OF 24 REFERENCES
Optimizing Condition Numbers
TLDR
It is shown that the condition number is a Clarke regular, strongly pseudoconvex function, and it is proved that a global solution of the problem can be approximated by an exact or inexact solution of a nonsmooth convex program.
Primal-dual first-order methods with $\mathcal{O}(1/\epsilon)$ iteration-complexity for cone programming
TLDR
This paper considers the general cone programming problem and proposes primal-dual convex (smooth and/or nonsmooth) minimization reformulations for it; it then discusses suitable first-order methods and proposes a variant of Nesterov's optimal method, which outperformed the latter in the authors' computational experiments.
A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization
TLDR
A nonlinear programming algorithm for solving semidefinite programs (SDPs) in standard form that replaces the symmetric, positive semidefinite variable $X$ with a rectangular variable $R$ according to the factorization $X = RR^T$.
Probing the Pareto Frontier for Basis Pursuit Solutions
TLDR
A root-finding algorithm for finding arbitrary points on a curve that traces the optimal trade-off between the least-squares fit and the one-norm of the solution is described, and it is proved that this curve is convex and continuously differentiable over all points of interest.
Solving semidefinite-quadratic-linear programs using SDPT3
TLDR
Computational experiments with linear optimization problems involving semidefinite, quadratic, and linear cone constraints (SQLPs) are discussed and computational results on problems from the SDPLIB and DIMACS Challenge collections are reported.
On the Douglas–Rachford splitting method and the proximal point algorithm for maximal monotone operators
TLDR
This paper shows, by means of an operator called a splitting operator, that the Douglas–Rachford splitting method for finding a zero of the sum of two monotone operators is a special case of the proximal point algorithm, which allows the unification and generalization of a variety of convex programming algorithms.
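As a concrete sketch of the Douglas–Rachford iteration in its convex-minimization special case (minimizing $f + g$ through their proximal maps), here is a tiny numpy example; the quadratic $f$ and box-indicator $g$ are illustrative choices, not taken from the paper:

```python
import numpy as np

# Illustrative problem: minimize f(x) + g(x) with
#   f(x) = 0.5*||x - a||^2          (prox: averaging with a)
#   g(x) = indicator of [0, 1]^n    (prox: clipping to the box)
a = np.array([-0.5, 0.3, 1.7])

def prox_f(v):
    return (v + a) / 2.0           # argmin_x 0.5*||x-a||^2 + 0.5*||x-v||^2

def prox_g(v):
    return np.clip(v, 0.0, 1.0)    # Euclidean projection onto the box

# Douglas–Rachford iteration (unit step):
#   y = prox_g(z);  z <- z + prox_f(2y - z) - y
z = np.zeros_like(a)
for _ in range(200):
    y = prox_g(z)
    z = z + prox_f(2 * y - z) - y

x = prox_g(z)
print(x)  # converges to clip(a, 0, 1) = [0, 0.3, 1]
```

At the fixed point, $\operatorname{prox}_g(z)$ is the minimizer of $f + g$, here the projection of $a$ onto the box, consistent with the proximal-point view described in the paper.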
On the Closedness of the Linear Image of a Closed Convex Cone
  • G. Pataki
  • Mathematics, Computer Science
    Math. Oper. Res.
  • 2007
TLDR
Very simple and intuitive necessary conditions are presented that unify and generalize seemingly disparate classical sufficient conditions, such as polyhedrality of the cone and Slater-type conditions.
Convex Optimization
TLDR
A comprehensive introduction to the subject of convex optimization shows in detail how such problems can be solved numerically with great efficiency.
A Matlab toolbox for optimization over symmetric cones
TLDR
This paper describes how to work with SeDuMi, an add-on for MATLAB, which lets you solve optimization problems with linear, quadratic and semidefiniteness constraints by exploiting sparsity.