The approximation of one matrix by another of lower rank

@article{Eckart1936TheAO,
  title={The approximation of one matrix by another of lower rank},
  author={Carl Eckart and G. Marion Young},
  journal={Psychometrika},
  year={1936},
  volume={1},
  pages={211--218}
}
The mathematical problem of approximating one matrix by another of lower rank is closely related to the fundamental postulate of factor-theory. When formulated as a least-squares problem, the normal equations cannot be immediately written down, since the elements of the approximate matrix are not independent of one another. The solution of the problem is simplified by first expressing the matrices in a canonic form. It is found that the problem always has a solution which is usually unique…
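The result summarized above is what is now known as the Eckart-Young theorem: the best rank-r least-squares approximation of a matrix is obtained by truncating its singular value decomposition (essentially the "canonic form" of the abstract). A minimal NumPy sketch of that construction; the matrix and rank below are illustrative:

```python
import numpy as np

def best_rank_r(A, r):
    """Best rank-r approximation of A in the least-squares (Frobenius)
    sense, obtained by truncating the singular value decomposition."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

# Illustrative usage on a random 6x4 matrix.
A = np.random.default_rng(0).normal(size=(6, 4))
A2 = best_rank_r(A, 2)
print(np.linalg.matrix_rank(A2))       # 2
print(np.linalg.norm(A - A2, "fro"))   # minimal error over all rank-2 matrices
```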
Grassmann algorithms for low rank approximation of matrices with missing values
The problem of approximating a matrix by another matrix of lower rank, when a modest portion of its elements are missing, is considered. The solution is obtained using Newton’s algorithm to find a …
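As a rough illustration of the missing-data variant, and not the Newton-on-Grassmann method this paper develops, one can alternate least-squares updates of two low-rank factors over the observed entries only; the mask, rank, and iteration count below are made up for the sketch:

```python
import numpy as np

def masked_low_rank_als(A, M, r, iters=50):
    """Rank-r fit of A on the observed entries (M == True) by alternating
    least squares over the two factors; a simple sketch, not the paper's method."""
    m, n = A.shape
    rng = np.random.default_rng(0)
    W = rng.normal(size=(m, r))
    H = rng.normal(size=(r, n))
    for _ in range(iters):
        for i in range(m):                      # update row i of the left factor
            obs = M[i]
            W[i] = np.linalg.lstsq(H[:, obs].T, A[i, obs], rcond=None)[0]
        for j in range(n):                      # update column j of the right factor
            obs = M[:, j]
            H[:, j] = np.linalg.lstsq(W[obs], A[obs, j], rcond=None)[0]
    return W @ H

# Illustrative usage: fit a rank-1 matrix with roughly 30% of entries hidden.
A = np.outer(np.arange(1.0, 7.0), np.arange(1.0, 5.0))
M = np.random.default_rng(1).random(A.shape) > 0.3
print(np.abs(masked_low_rank_als(A, M, 1) - A).max())
```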
On a Problem of Weighted Low-Rank Approximation of Matrices
  • Aritra Dutta, Xin Li
  • SIAM J. Matrix Anal. Appl., 2017
An algorithm based on the alternating direction method is proposed to solve the weighted low-rank approximation problem and is compared with state-of-the-art general algorithms such as weighted total alternating least squares and the EM algorithm.
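Of the baselines named in this summary, the EM algorithm has a particularly simple form for 0/1 weights: impute the zero-weight entries from the current estimate, truncate the SVD back to rank r, and repeat. A hedged sketch of that baseline only, not the alternating-direction method proposed in the paper; the weight matrix and rank are illustrative:

```python
import numpy as np

def em_weighted_low_rank(A, W, r, iters=200):
    """EM-style iteration for weighted low-rank approximation with 0/1
    weights: fill the weight-0 entries with the current estimate, then
    project back to rank r via a truncated SVD."""
    X = np.zeros_like(A)
    for _ in range(iters):
        filled = W * A + (1 - W) * X
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        X = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
    return X
```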
On a general class of matrix nearness problems
The problem of finding the nearest rank-deficient matrix to a given rectangular matrix is considered. For a wide class of matrix norms, and arbitrary sparsity imposed on the matrix of perturbations, …
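In the unstructured Frobenius or spectral-norm case the answer is classical (Eckart-Young, Mirsky): the nearest rank-deficient matrix is obtained by zeroing the smallest singular value, and the distance equals that value. A sketch of that special case only; the structured, sparsity-constrained perturbations considered in the paper are harder:

```python
import numpy as np

def nearest_singular(A):
    """Nearest rank-deficient matrix to a square A in the spectral and
    Frobenius norms: zero the smallest singular value. The distance to it
    equals that singular value."""
    U, s, Vt = np.linalg.svd(A)
    s_trunc = s.copy()
    s_trunc[-1] = 0.0
    return U @ np.diag(s_trunc) @ Vt, s[-1]
```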
On the L1-Norm Approximation of a Matrix by Another of Lower Rank
This paper first shows that the problem is NP-hard and then introduces a theorem on the sparsity of the residual matrix. That theorem is the foundation for a novel algorithm that outperforms all existing counterparts under the L1-norm error metric and exhibits high outlier resistance, in comparison to the usual L2-norm error, in machine learning applications.
Critical points of matrix least square distance functions
A classical problem in matrix analysis, total least squares estimation and model reduction theory is that of finding a best approximant of a given matrix by lower rank ones. In this paper …
The geometry of weighted low-rank approximations
It is demonstrated here that the weighted low-rank approximation problem can be solved by finding the subspace that minimizes a particular cost function.
Universal Optimality of Rank Constrained Matrix Approximation
The major results of this paper consist of an extension of optimality properties concerning the best rank-r matrix approximation (cf. Eckart and Young (1936), Mirsky (1960)), and of the solution of …
On best uniform approximation by low-rank matrices
We study the problem of best approximation, in the elementwise maximum norm, of a given matrix by another matrix of lower rank. We generalize a recent result by Pinkus that describes the …
Computing Lower Rank Approximations of Matrix Polynomials
It is proved that such lower rank matrices at minimal distance always exist, satisfy regularity conditions, and are all isolated and surrounded by a basin of attraction of non-minimal solutions.
Convex Envelopes for Low Rank Approximation
This paper shows how to efficiently compute the convex envelopes of a class of rank minimization formulations, which opens up the possibility of adding further convex constraints and functions to the minimization problem, resulting in strong convex relaxations.
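The best-known instance of this idea, though not the envelope family computed in this paper, is the nuclear norm, the convex envelope of rank on the spectral-norm unit ball; its proximal operator simply soft-thresholds the singular values. A small sketch under that assumption, with the threshold tau chosen arbitrarily:

```python
import numpy as np

def nuclear_norm_prox(A, tau):
    """Proximal operator of tau * (nuclear norm): soft-threshold the
    singular values of A. Larger tau zeroes more singular values, i.e.
    a stronger convex pressure toward low rank."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# Illustrative usage: tau = 1.0 zeroes every singular value below 1.
A = np.random.default_rng(2).normal(size=(5, 5))
print(np.linalg.matrix_rank(nuclear_norm_prox(A, 1.0)))
```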