
- Shiqian Ma, Donald Goldfarb, Lifeng Chen
- Math. Program.
- 2011

The linearly constrained matrix rank minimization problem is widely applicable in many fields such as control, signal processing and system identification. The tightest convex relaxation of this problem is the linearly constrained nuclear norm minimization. Although the latter can be cast as a semidefinite programming problem, such an approach is…
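The nuclear-norm relaxation mentioned here is typically attacked with first-order methods whose core step is singular value shrinkage, the proximal operator of the nuclear norm. A minimal numpy sketch of that operator on toy data; the function name `svt`, the threshold value, and the test matrix are illustrative choices, not details from the paper:

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of tau * ||.||_*.

    Shrinks every singular value of M toward zero by tau; values below
    tau become exactly zero, which is what produces low-rank iterates.
    """
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# A rank-1 matrix plus small noise: shrinkage wipes out the noise
# directions and leaves a numerically rank-1 matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 1)) @ rng.standard_normal((1, 15))
M_noisy = M + 0.01 * rng.standard_normal((20, 15))
X = svt(M_noisy, tau=0.5)
print(np.linalg.matrix_rank(X, tol=1e-6))
```

The shrinkage is applied to all singular values, so the dominant direction survives (reduced by `tau`) while the small noise directions vanish entirely.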

- Donald Goldfarb, Shiqian Ma
- Foundations of Computational Mathematics
- 2011

The matrix rank minimization problem has applications in many fields such as system identification, optimal control, low-dimensional embedding, etc. As this problem is NP-hard in general, its convex relaxation, the nuclear norm minimization problem, is often solved instead. Recently, Ma, Goldfarb and Chen proposed a fixed-point continuation algorithm for…

- Shiqian Ma, Wotao Yin, Yin Zhang, Amit Chakraborty
- 2008 IEEE Conference on Computer Vision and…
- 2008

Compressed sensing, an emerging multidisciplinary field involving mathematics, probability, optimization, and signal processing, focuses on reconstructing an unknown signal from a very limited number of samples. Because information such as boundaries of organs is very sparse in most MR images, compressed sensing makes it possible to reconstruct the same MR…
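The reconstruction idea — a signal that is sparse in some representation can be recovered from far fewer measurements than unknowns — can be illustrated with a greedy solver. The sketch below uses orthogonal matching pursuit as a simple stand-in for the l1/TV-based reconstruction the paper actually develops; the dimensions and sparsity level are illustrative:

```python
import numpy as np

def omp(A, b, k):
    """Orthogonal matching pursuit: greedily pick the column most
    correlated with the residual, then refit on the chosen support."""
    n = A.shape[1]
    support, r = [], b.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ r)))   # most correlated column
        support.append(j)
        As = A[:, support]
        coef, *_ = np.linalg.lstsq(As, b, rcond=None)
        r = b - As @ coef                      # residual after refit
    x = np.zeros(n)
    x[support] = coef
    return x

rng = np.random.default_rng(3)
A = rng.standard_normal((30, 80)) / np.sqrt(30)  # 30 measurements, 80 unknowns
x_true = np.zeros(80)
x_true[[5, 17, 42]] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = omp(A, b, k=3)
print(np.linalg.norm(x_hat - x_true))
```

With random Gaussian measurements and a 3-sparse signal, 30 samples are comfortably enough for exact recovery, even though the linear system is heavily underdetermined.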

- Katya Scheinberg, Shiqian Ma, Donald Goldfarb
- NIPS
- 2010

Gaussian graphical models are of great interest in statistical learning. Because the conditional independencies between different nodes correspond to zero entries in the inverse covariance matrix of the Gaussian distribution, one can learn the structure of the graph by estimating a sparse inverse covariance matrix from sample data, by solving a convex…
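The zero-pattern fact the abstract relies on can be checked numerically: for a Gaussian chain X1 → X2 → X3, X1 and X3 are conditionally independent given X2, so the (1,3) entry of the inverse covariance matrix vanishes. A small simulation (the variable names and coefficients are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
x1 = rng.standard_normal(n)
x2 = 0.8 * x1 + rng.standard_normal(n)   # X2 depends only on X1
x3 = 0.8 * x2 + rng.standard_normal(n)   # X3 depends only on X2
S = np.cov(np.vstack([x1, x2, x3]))      # sample covariance (rows = variables)
P = np.linalg.inv(S)                     # precision (inverse covariance)
print(np.round(P, 2))                    # the (1,3)/(3,1) entries are ~0
```

Graphical-lasso-type estimators add an l1 penalty on the precision matrix so that such entries come out exactly zero from finite samples rather than merely small.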

- Xiangfeng Wang, Mingyi Hong, Shiqian Ma, Zhi-Quan Luo
- ArXiv
- 2013

In this paper, we consider solving multiple-block separable convex minimization problems using the alternating direction method of multipliers (ADMM). Motivated by the fact that the existing convergence theory for ADMM is mostly limited to the two-block case, we analyze, both theoretically and numerically, a new strategy that first transforms a…
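For context, the two-block case that the existing convergence theory covers is the classic ADMM. A self-contained sketch of that baseline on the lasso problem min 0.5||Ax−b||² + λ||z||₁ subject to x = z; the problem data and parameters are illustrative, and this is the textbook two-block method, not the multi-block strategy the paper analyzes:

```python
import numpy as np

def lasso_admm(A, b, lam, rho=1.0, iters=200):
    """Two-block ADMM for min 0.5*||Ax-b||^2 + lam*||z||_1  s.t.  x = z."""
    n = A.shape[1]
    x, z, u = np.zeros(n), np.zeros(n), np.zeros(n)
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))  # cache the factor
    Atb = A.T @ b
    for _ in range(iters):
        # x-update: exact solve of the regularized least-squares subproblem
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: soft-thresholding, the prox of the l1 term
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # scaled dual (multiplier) update
        u = u + x - z
    return z

rng = np.random.default_rng(2)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 1.5]
b = A @ x_true
x_hat = lasso_admm(A, b, lam=0.1)
print(np.linalg.norm(x_hat - x_true))
```

Each iteration alternates one subproblem per block plus a multiplier update; extending this alternation naively to three or more blocks is exactly where the standard convergence guarantees break down.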

- Wei Liu, Shiqian Ma, Dacheng Tao, Jianzhuang Liu, Peng Liu
- KDD
- 2010

In many scenarios, data can be represented as vectors and then mathematically abstracted as points in a Euclidean space. Because a great number of machine learning and data mining applications need proximity measures over data, a simple and universal distance metric is desirable, and metric learning methods have been explored to produce sensible…
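The "sensible" metric such methods learn is typically a Mahalanobis metric d_M(x, y) = sqrt((x−y)ᵀ M (x−y)) with M positive semidefinite; learning M amounts to reweighting and correlating coordinates. A tiny sketch of the definition itself — the matrix M here is hand-picked for illustration, not learned:

```python
import numpy as np

def mahalanobis(x, y, M):
    """Distance under a PSD matrix M; M = I recovers the Euclidean metric."""
    d = x - y
    return float(np.sqrt(d @ M @ d))

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
M = np.diag([4.0, 1.0])          # weight the first coordinate more heavily
d_euclid = mahalanobis(x, y, np.eye(2))
d_learned = mahalanobis(x, y, M)
print(d_euclid, d_learned)       # sqrt(2) vs sqrt(5)
```

The same pair of points is farther apart under M because disagreement in the first coordinate is penalized four times as strongly — the kind of task-dependent geometry metric learning tries to discover from data.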

- Donald Goldfarb, Shiqian Ma, Zaiwen Wen
- 2009 47th Annual Allerton Conference on…
- 2009

We present several first-order algorithms for solving the low-rank matrix completion problem and the tightest convex relaxation of it obtained by minimizing the nuclear norm instead of the rank of the matrix. Our first algorithm is a fixed point continuation algorithm that incorporates an approximate singular value decomposition procedure (FPCA). FPCA can…

- Bo Jiang, Shiqian Ma, Shuzhong Zhang
- Math. Program.
- 2015

This paper is concerned with the computation of the principal components for a general tensor, known as the tensor principal component analysis (PCA) problem. We show that the general tensor PCA problem is reducible to its special case where the tensor in question is super-symmetric with an even degree. In that case, the tensor can be embedded into a…

- Donald Goldfarb, Shiqian Ma, Katya Scheinberg
- Math. Program.
- 2013

We present in this paper first-order alternating linearization algorithms based on an alternating direction augmented Lagrangian approach for minimizing the sum of two convex functions. Our basic methods require at most O(1/ε) iterations to obtain an ε-optimal solution, while our accelerated (i.e., fast) versions of them require at most O(1/√ε)…

- Bo Huang, Shiqian Ma, Donald Goldfarb
- J. Sci. Comput.
- 2013

In this paper, we propose and analyze an accelerated linearized Bregman (ALB) method for solving the basis pursuit and related sparse optimization problems. This accelerated algorithm is based on the fact that the linearized Bregman (LB) algorithm is equivalent to a gradient descent method applied to a certain dual formulation. We show that the LB method…
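The LB iteration being accelerated alternates a gradient step on the residual with soft-thresholding, which is what makes the dual gradient-descent interpretation possible. A minimal sketch on a synthetic basis pursuit instance; the rows of A are orthonormalized so a unit step size is safe, and mu, the sparsity pattern, and the iteration count are illustrative choices, not values from the paper:

```python
import numpy as np

def linearized_bregman(A, b, mu, delta=1.0, iters=5000):
    """Linearized Bregman iteration for basis-pursuit-type problems.

    v accumulates residual-driven steps A^T(b - Ax); x is obtained by
    soft-thresholding v, matching the gradient-descent-on-a-dual view.
    """
    v = np.zeros(A.shape[1])
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        v = v + A.T @ (b - A @ x)                                  # gradient step
        x = delta * np.sign(v) * np.maximum(np.abs(v) - mu, 0.0)   # shrink
    return x

rng = np.random.default_rng(4)
Q, _ = np.linalg.qr(rng.standard_normal((100, 40)))
A = Q.T                                   # 40 x 100 with orthonormal rows
x_true = np.zeros(100)
x_true[[3, 30, 77]] = [2.0, -2.0, 1.5]
b = A @ x_true
x_hat = linearized_bregman(A, b, mu=3.0)
print(np.linalg.norm(A @ x_hat - b))      # residual shrinks toward zero
```

Because the shrinkage step can stall progress for long stretches when mu is large, plain LB converges slowly — the motivation for the accelerated variant the paper develops.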