The approximation of one matrix by another of lower rank

@article{Eckart1936TheAO,
  title={The approximation of one matrix by another of lower rank},
  author={Carl Eckart and G. Marion Young},
  journal={Psychometrika},
  year={1936},
  volume={1},
  pages={211--218}
}
The mathematical problem of approximating one matrix by another of lower rank is closely related to the fundamental postulate of factor-theory. When formulated as a least-squares problem, the normal equations cannot be immediately written down, since the elements of the approximate matrix are not independent of one another. The solution of the problem is simplified by first expressing the matrices in a canonic form. It is found that the problem always has a solution which is usually unique… 
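The result described in this abstract is now known as the Eckart–Young theorem: the nearest lower-rank matrix in the least-squares (Frobenius) sense is obtained by truncating the singular value decomposition. A minimal sketch using NumPy (the library and the helper name `best_rank_r` are illustrative, not from the paper):

```python
import numpy as np

def best_rank_r(A, r):
    """Closest rank-r matrix to A in Frobenius norm (Eckart-Young):
    keep the r largest singular values, zero out the rest."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 4))
A1 = best_rank_r(A, 1)
assert np.linalg.matrix_rank(A1) == 1
# No other rank-1 matrix can be closer; check one random candidate.
B = np.outer(rng.standard_normal(5), rng.standard_normal(4))
assert np.linalg.norm(A - A1) <= np.linalg.norm(A - B)
```

The solution is "usually unique" in the paper's sense because ties can occur only when the r-th and (r+1)-th singular values coincide.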

Grassmann algorithms for low rank approximation of matrices with missing values

The problem of approximating a matrix by another matrix of lower rank, when a modest portion of its elements are missing, is considered. The solution is obtained using Newton’s algorithm to find a

On a Problem of Weighted Low-Rank Approximation of Matrices

TLDR
An algorithm based on the alternating direction method is proposed to solve the weighted low-rank approximation problem, and it is compared with state-of-the-art general algorithms such as weighted total alternating least squares and the EM algorithm.
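The alternating structure these methods share can be sketched as weighted alternating least squares: fix one factor, solve a small weighted least-squares problem for each row of the other, and repeat. This is a generic sketch under the weighted Frobenius objective, not the specific algorithm of the paper; the function name and the tiny ridge term are assumptions for numerical safety:

```python
import numpy as np

def weighted_lra(A, W, r, iters=50, seed=0):
    """Approximate A (m x n) by U @ V.T of rank r, minimizing
    sum_ij W_ij * (A_ij - (U V^T)_ij)^2 by alternating row-wise
    weighted least-squares solves."""
    m, n = A.shape
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((m, r))
    V = rng.standard_normal((n, r))
    eye = 1e-12 * np.eye(r)  # tiny ridge to keep the solves well-posed
    for _ in range(iters):
        for i in range(m):      # update each row of U given V
            G = V.T * W[i]      # r x n, columns scaled by the weights
            U[i] = np.linalg.solve(G @ V + eye, G @ A[i])
        for j in range(n):      # update each row of V given U
            G = U.T * W[:, j]
            V[j] = np.linalg.solve(G @ U + eye, G @ A[:, j])
    return U @ V.T
```

Each inner solve is an r-by-r normal-equation system, so the cost per sweep is modest; with uniform weights the problem reduces to the unweighted case solved by the truncated SVD.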

On a general class of matrix nearness problems

The problem is considered of finding the nearest rank-deficient matrix to a given rectangular matrix. For a wide class of matrix norms, and arbitrary sparsity imposed on the matrix of perturbations,

On the L1-Norm Approximation of a Matrix by Another of Lower Rank

TLDR
This paper first shows that the problem is NP-hard, then introduces a theorem on the sparsity of the residual matrix that lays the foundation for a novel algorithm; the algorithm outperforms all existing counterparts on the L1-norm error-minimization metric and exhibits high outlier resistance compared with the usual L2-norm errors in machine-learning applications.

The geometry of weighted low-rank approximations

TLDR
It is demonstrated here that the weighted low-rank approximation problem can be solved by finding the subspace that minimizes a particular cost function.

Universal Optimality of Rank Constrained Matrix Approximation

  • R. Meyer
  • Mathematics, Computer Science
  • 1994
TLDR
The major results of this paper consist of an extension of optimality properties concerning the best rank-r matrix approximation and the solution of classical MDS from the class of orthogonally invariant norms to a wider class of approximation criteria.

Computing Lower Rank Approximations of Matrix Polynomials

Convex Envelopes for Low Rank Approximation

TLDR
This paper shows how to efficiently compute the convex envelopes of a class of rank minimization formulations, which opens up the possibility of adding additional convex constraints and functions to the minimization problem resulting in strong convex relaxations.
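A standard instance of this idea (a textbook fact, not this paper's specific construction) is that the nuclear norm is the convex envelope of rank on the unit spectral-norm ball, and its proximal operator is singular-value soft-thresholding:

```python
import numpy as np

def svt(A, tau):
    """Proximal operator of tau * ||.||_* (nuclear norm):
    soft-threshold the singular values of A by tau."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt
```

Because the shrunk problem is convex, additional convex constraints can be layered on top without losing tractability, which is the flexibility the abstract alludes to.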
...