# Minimizing Condition Number via Convex Programming

```bibtex
@article{Lu2011MinimizingCN,
  title   = {Minimizing Condition Number via Convex Programming},
  author  = {Zhaosong Lu and Ting Kei Pong},
  journal = {SIAM J. Matrix Anal. Appl.},
  year    = {2011},
  volume  = {32},
  pages   = {1193-1211}
}
```

In this paper we consider minimizing the spectral condition number of a positive semidefinite matrix over a nonempty closed convex set $\Omega$. We show that this problem can be solved as a convex programming problem and, moreover, that the optimal value of the latter problem is achievable. As a consequence, when $\Omega$ is positive semidefinite representable, the problem can be cast as a semidefinite programming problem. We then propose a first-order method to solve the convex programming problem. The…
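
The key structural fact behind such a reformulation is that the spectral condition number $\kappa(X) = \lambda_{\max}(X)/\lambda_{\min}(X)$ is invariant under positive scaling of $X$; this homogeneity is what allows a quasiconvex objective to be traded for a convex program over a scaled variable. A minimal numerical sketch of that property (our own illustration, not code from the paper):

```python
import numpy as np

def spectral_cond(X):
    """Spectral condition number of a symmetric positive definite matrix:
    ratio of the largest eigenvalue to the smallest."""
    w = np.linalg.eigvalsh(X)  # eigenvalues in ascending order
    return w[-1] / w[0]

# A small positive definite test matrix.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# kappa is invariant under positive scaling: kappa(mu * X) = kappa(X).
print(abs(spectral_cond(A) - spectral_cond(5.0 * A)) < 1e-12)  # True
```

Because $\kappa(\mu X) = \kappa(X)$ for every $\mu > 0$, one can normalize the smallest eigenvalue and minimize the largest over a homogenized feasible set, which is the convex program the abstract refers to.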

## 25 Citations

Controlling singular values with semidefinite programming

- Mathematics, Computer Science · ACM Trans. Graph.
- 2014

A convex framework is introduced for problems that involve singular values, enabling the optimization of functionals and constraints expressed in terms of the extremal singular values of matrices with bounded singular values.

K-Optimal Design via Semidefinite Programming and Entropy Optimization

- Computer Science, Mathematics · Math. Oper. Res.
- 2015

This paper considers the problem of optimal design of experiments, and a two-step inference strategy is proposed that entails the minimization of a convex integral functional under linear constraints.

Minimizing the Condition Number to Construct Design Points for Polynomial Regression Models

- Mathematics · SIAM J. Optim.
- 2013

A new optimality criterion, the $K$-optimality criterion, is studied for constructing optimal experimental designs for polynomial regression models; it is shown that there is always a symmetric $K$-optimal design with exactly $p+1$ support points, including the boundary points $-1$ and $1$.

Multi-Step Gradient Methods for Networked Optimization

- Computer Science · IEEE Transactions on Signal Processing
- 2013

This work develops multi-step gradient methods for network-constrained optimization of strongly convex functions with Lipschitz-continuous gradients and applies the proposed technique to three engineering problems: resource allocation under a network-wide budget constraint, distributed averaging, and Internet congestion control.
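
As a hedged illustration of the multi-step idea (not the specific method of the cited paper), a two-step "heavy-ball" iteration on a strongly convex quadratic combines the current gradient with the previous step:

```python
import numpy as np

# f(x) = 0.5 * x^T Q x with strong convexity m = 1 and gradient
# Lipschitz constant L = 10; the minimizer is x* = 0.
Q = np.diag([1.0, 10.0])
L, m = 10.0, 1.0

# Classical heavy-ball tuning for quadratics.
alpha = 4.0 / (np.sqrt(L) + np.sqrt(m)) ** 2
beta = ((np.sqrt(L) - np.sqrt(m)) / (np.sqrt(L) + np.sqrt(m))) ** 2

x_prev = x = np.array([5.0, 5.0])
for _ in range(200):
    grad = Q @ x
    # Multi-step update: gradient step plus momentum from the previous iterate.
    x_next = x - alpha * grad + beta * (x - x_prev)
    x_prev, x = x, x_next

print(np.linalg.norm(x) < 1e-8)  # True: converged to x* = 0
```

The momentum term lets the iteration exploit information from two past iterates, which is the sense in which such methods are "multi-step".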

K-Optimal Gradient Encoding Scheme for Fourth-Order Tensor-Based Diffusion Profile Imaging

- Computer Science · BioMed Research International
- 2015

This paper proposes a new approach that solves the $K$-optimal GES design problem for fourth-order tensor-based diffusion profile imaging as a tractable semidefinite programming problem and shows that the proposed design leads to the minimum signal deviation.

Sparse shift-varying FIR preconditioners for fast volume denoising

- Computer Science
- 2013

This work presents an algorithm to design a positive-definite, Schatten $p$-norm optimal, finite impulse response (FIR) approximation to a given circulant matrix and demonstrates that PCG with an efficient space-varying preconditioner can converge at least as quickly as a split-Bregman-like algorithm while using considerably less memory.

Robust online motion planning via contraction theory and convex optimization

- Engineering · 2017 IEEE International Conference on Robotics and Automation (ICRA)
- 2017

This work presents a framework for online generation of robust motion plans for robotic systems with nonlinear dynamics subject to bounded disturbances, control constraints, and online state constraints such as obstacles and demonstrates the approach through simulations of a 6-state planar quadrotor navigating cluttered environments in the presence of a cross-wind.

A Unified Framework for Manifold Landmarking

- Computer Science · IEEE Transactions on Signal Processing
- 2018

This paper proposes a novel active manifold learning method that combines geometric manifold landmarking methods with algebraic ones and achieves this by using the Gershgorin circle theorem to construct an upper bound on the learning error that depends on the landmarks and the manifold's alignment matrix.
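
The Gershgorin circle theorem invoked here localizes every eigenvalue of a matrix in a union of discs centered at the diagonal entries, with radii equal to the off-diagonal absolute row sums, so it yields cheap eigenvalue bounds. A small numerical sketch (our illustration, not the cited paper's construction):

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.5],
              [1.0, 3.0, 0.2],
              [0.5, 0.2, 5.0]])

# Gershgorin disc radii: off-diagonal absolute row sums.
radii = np.sum(np.abs(A), axis=1) - np.abs(np.diag(A))

# For a symmetric matrix, every (real) eigenvalue is bounded above by
# the rightmost point of the rightmost disc.
upper = np.max(np.diag(A) + radii)
eigmax = np.linalg.eigvalsh(A)[-1]
print(eigmax <= upper)  # True
```

Bounds of this form depend only on matrix entries, not on an eigendecomposition, which is what makes them usable inside an optimization over landmark choices.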

Active manifold learning via a unified framework for manifold landmarking.

- Computer Science
- 2017

This paper proposes a novel active manifold learning method that combines geometric manifold landmarking methods with algebraic ones and achieves this by using the Gershgorin circle theorem to construct an upper bound on the learning error that depends on the landmarks and the manifold's alignment matrix.

## References

Showing 1-10 of 24 references

Optimizing Condition Numbers

- Mathematics · SIAM J. Optim.
- 2009

The condition number is a Clarke regular strongly pseudoconvex function and it is proved that a global solution of the problem can be approximated by an exact or an inexact solution of a nonsmooth convex program.

Primal-dual first-order methods with $${\mathcal {O}(1/\epsilon)}$$ iteration-complexity for cone programming

- Computer Science, Mathematics · Math. Program.
- 2011

This paper discusses first-order methods suitable for solving primal-dual convex and nonsmooth minimization reformulations of the cone programming problem, and proposes a variant of Nesterov’s optimal method which has outperformed the latter one in the authors' computational experiments.

A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization

- Computer Science · Math. Program.
- 2003

A nonlinear programming algorithm for solving semidefinite programs (SDPs) in standard form that replaces the symmetric, positive semidefinite variable $X$ with a rectangular variable $R$ according to the factorization $X = RR^T$.

Probing the Pareto Frontier for Basis Pursuit Solutions

- Mathematics · SIAM J. Sci. Comput.
- 2008

A root-finding algorithm for finding arbitrary points on a curve that traces the optimal trade-off between the least-squares fit and the one-norm of the solution is described, and it is proved that this curve is convex and continuously differentiable over all points of interest.

Solving semidefinite-quadratic-linear programs using SDPT3

- Computer Science · Math. Program.
- 2003

Computational experiments with linear optimization problems involving semidefinite, quadratic, and linear cone constraints (SQLPs) are discussed and computational results on problems from the SDPLIB and DIMACS Challenge collections are reported.

On the Douglas–Rachford splitting method and the proximal point algorithm for maximal monotone operators

- Computer Science, Mathematics · Math. Program.
- 1992

This paper shows, by means of an operator called a splitting operator, that the Douglas–Rachford splitting method for finding a zero of the sum of two monotone operators is a special case of the proximal point algorithm, which allows the unification and generalization of a variety of convex programming algorithms.
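
A minimal sketch of the proximal point iteration that the splitting method specializes, on a toy monotone operator (our illustration; the operator $T(x) = x - b$ and its resolvent formula below are assumptions, not from the paper):

```python
import numpy as np

# Proximal point iteration: x_{k+1} = (I + lam*T)^{-1}(x_k), which
# converges to a zero of the maximal monotone operator T.
# For T(x) = x - b, the zero is x = b and the resolvent has the
# closed form (I + lam*T)^{-1}(x) = (x + lam*b) / (1 + lam).
b = np.array([1.0, -2.0, 3.0])
lam = 1.0
x = np.zeros(3)
for _ in range(100):
    x = (x + lam * b) / (1.0 + lam)

print(np.allclose(x, b))  # True: the iterates reach the zero of T
```

Douglas–Rachford splitting applies the same resolvent machinery to a sum $T = T_1 + T_2$, evaluating the two resolvents separately rather than the (possibly intractable) resolvent of the sum.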

On the Closedness of the Linear Image of a Closed Convex Cone

- Mathematics · Math. Oper. Res.
- 2007

Very simple and intuitive necessary conditions are presented that unify and generalize seemingly disparate classical sufficient conditions, such as polyhedrality of the cone and Slater-type conditions.

Convex Optimization

- Computer Science · IEEE Transactions on Automatic Control
- 2006

A comprehensive introduction to the subject of convex optimization shows in detail how such problems can be solved numerically with great efficiency.

A Matlab toolbox for optimization over symmetric cones

- Computer Science
- 1999

This paper describes how to work with SeDuMi, an add-on for MATLAB, which lets you solve optimization problems with linear, quadratic and semidefiniteness constraints by exploiting sparsity.