# Singular value decomposition and least squares solutions

@article{Golub2007SingularVD, title={Singular value decomposition and least squares solutions}, author={Gene H. Golub and Christian H. Reinsch}, journal={Numerische Mathematik}, year={1970}, volume={14}, pages={403--420} }

Let A be a real m×n matrix with m ≧ n. It is well known (cf. [4]) that
$$A = U \Sigma V^T \tag{1}$$
where
$$U^T U = V^T V = V V^T = I_n \quad \text{and} \quad \Sigma = \operatorname{diag}(\sigma_1, \ldots, \sigma_n).$$
The matrix U consists of n orthonormalized eigenvectors associated with the n largest eigenvalues of $AA^T$, and the matrix V consists of the orthonormalized eigenvectors of $A^TA$. The diagonal elements of $\Sigma$ are the non-negative…
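The decomposition (1) and its connection to the eigenvectors of $A^TA$ can be checked numerically; a minimal sketch with NumPy, using a hypothetical random 5×3 matrix:

```python
import numpy as np

# Hypothetical 5x3 example of the decomposition A = U Sigma V^T (m >= n).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

# Thin SVD: U is 5x3 with orthonormal columns, V is 3x3 orthogonal.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Reconstruction: A = U diag(sigma_1, ..., sigma_n) V^T.
assert np.allclose(A, U @ np.diag(s) @ Vt)

# U^T U = V^T V = V V^T = I_n.
I = np.eye(3)
assert np.allclose(U.T @ U, I)
assert np.allclose(Vt @ Vt.T, I)

# The squared singular values are the eigenvalues of A^T A.
eigvals = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
assert np.allclose(s**2, eigvals)
```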

## 2,122 Citations

### A simultaneous decomposition of four real quaternion matrices encompassing $\eta$-Hermicity and its applications

- Mathematics
- 2017

Let $\mathbb{H}$ be the real quaternion algebra and $\mathbb{H}^{m\times n}$ denote the set of all $m\times n$ matrices over $\mathbb{H}$. Let $\mathbf{i},\mathbf{j},\mathbf{k}$ be the imaginary…

### An iterative method for computing a symplectic SVD-like decomposition

- Computer Science
- 2018

The proposed method is a block-power iterative method with the ortho-symplectic SR decomposition in the normalization step to efficiently compute the desired number of ordered in magnitude eigenvalues of structured matrices.

### Provable Approximations for Constrained $\ell_p$ Regression

- Computer Science, Mathematics, ArXiv
- 2019

The first provable constant-factor approximation algorithm that solves constrained $\ell_p$ regression directly, for every constant $p, d \geq 1$, using core-sets; its running time is $O(n \log n)$, including extensions for streaming and distributed (big) data.

### Numerical methods for computing angles between linear subspaces

- Mathematics, Milestones in Matrix Computation
- 2007

Experimental results are given which indicate that MGS gives $\theta_k$ with equal precision and fewer arithmetic operations than HT; however, HT gives principal vectors that are orthogonal to working accuracy, which is not in general true for MGS.
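The SVD-based route to principal angles discussed in this line of work can be sketched briefly: the cosines of the angles $\theta_k$ between two subspaces are the singular values of $Q_1^T Q_2$, where $Q_1, Q_2$ hold orthonormal bases. Here the bases come from Householder QR via `np.linalg.qr` rather than MGS or HT; the subspaces are hypothetical random ones:

```python
import numpy as np

# Orthonormal bases for two hypothetical subspaces of R^6 (dims 2 and 3).
rng = np.random.default_rng(1)
Q1, _ = np.linalg.qr(rng.standard_normal((6, 2)))
Q2, _ = np.linalg.qr(rng.standard_normal((6, 3)))

# cos(theta_k) are the singular values of Q1^T Q2.
cosines = np.linalg.svd(Q1.T @ Q2, compute_uv=False)
theta = np.arccos(np.clip(cosines, -1.0, 1.0))  # principal angles theta_k

# Principal angles lie in [0, pi/2].
assert np.all(theta >= 0) and np.all(theta <= np.pi / 2 + 1e-12)
```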

### Product Eigenvalue Problems

- Computer Science, Mathematics, SIAM Rev.
- 2005

The intent of this paper is to demonstrate that the product eigenvalue problem is a powerful unifying concept and that the standard algorithms for solving them are instances of a generic $GR$ algorithm applied to a related cyclic matrix.

### Truncated Cauchy Non-Negative Matrix Factorization

- Computer Science, IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2019

A Truncated Cauchy NMF loss that handles outliers by truncating large errors is proposed and developed to robustly learn the subspace on noisy datasets contaminated by outliers.

### Computation of the Singular Value Decomposition

- Mathematics
- 2006

then σ is a singular value of A and u and v are corresponding left and right singular vectors, respectively. (For generality it is assumed that the matrices here are complex, although given these…
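The defining relations quoted above, $Av = \sigma u$ and $A^H u = \sigma v$, can be verified directly; a minimal sketch with a hypothetical complex matrix:

```python
import numpy as np

# Hypothetical complex 4x3 matrix, as the quoted text assumes complex entries.
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))

U, s, Vh = np.linalg.svd(A, full_matrices=False)
# Take the leading singular triple (sigma, u, v); v is the first column of V.
u, sigma, v = U[:, 0], s[0], Vh[0].conj()

# A v = sigma u  and  A^H u = sigma v.
assert np.allclose(A @ v, sigma * u)
assert np.allclose(A.conj().T @ u, sigma * v)
```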

### DEVELOPMENT OF ORTHOGONALITY OF SINGULAR VECTORS COMPUTED BY I-SVD ALGORITHM

- Mathematics
- 2007

ABSTRACT Recently an $O(m^2)$ algorithm named Integrable Singular Value Decomposition (I-SVD) for bidiagonal singular value decomposition was developed. Here m is the dimension size. The modified…

### Fast and Accurate Least-Mean-Squares Solvers

- Computer Science, NeurIPS
- 2019

An algorithm that gets a finite set of n d-dimensional real vectors and returns a subset of d+1 vectors with positive weights whose weighted sum is *exactly* the same, based on a novel paradigm of fusion between different data summarization techniques known as sketches and coresets.

### Faster Projective Clustering Approximation of Big Data

- Computer Science, ArXiv
- 2020

This work reduces the size of existing coresets by giving the first $O(\log(m))$ approximation for the case of lines clustering in $O(ndm)$ time, and proves that for a sufficiently large $m$ a coreset for projective clustering is obtained.

## References


### Linear least squares solutions by householder transformations

- Mathematics
- 1965

Let A be a given m×n real matrix with m ≧ n and of rank n, and b a given vector. We wish to determine a vector $\hat{x}$ such that
$$\| b - A\hat x \| = \min,$$
where ∥ … ∥ indicates the…
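The least squares problem stated above can be solved by a Householder QR factorization followed by a triangular solve; a minimal sketch on hypothetical data, cross-checked against NumPy's `lstsq`:

```python
import numpy as np

# Hypothetical full-rank 7x3 problem: minimize ||b - A x|| (m >= n).
rng = np.random.default_rng(3)
A = rng.standard_normal((7, 3))
b = rng.standard_normal(7)

# Householder-QR route: A = QR, then solve the triangular system R x = Q^T b.
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)

# Reference solution from NumPy's least-squares solver.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

assert np.allclose(x_qr, x_lstsq)
```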

### Calculating the singular values and pseudo-inverse of a matrix

- Mathematics, Milestones in Matrix Computation
- 2007

The use of the pseudo-inverse $A^I = V\Sigma ^I U^* $ to solve least squares problems in a way which dampens spurious oscillation and cancellation is mentioned.
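The pseudo-inverse $A^I = V\Sigma^I U^*$ and its damping effect can be sketched by reciprocating only the singular values above a tolerance and zeroing the rest; the tolerance `1e-8` and the near-rank-deficient test matrix below are illustrative choices:

```python
import numpy as np

# Hypothetical 5x3 matrix made nearly rank-deficient on purpose.
rng = np.random.default_rng(4)
A = rng.standard_normal((5, 3))
A[:, 2] = A[:, 0] + 1e-12 * rng.standard_normal(5)

# A^I = V Sigma^I U^T, where Sigma^I inverts only singular values above tol;
# dropping the tiny ones damps spurious oscillation from near-cancellation.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
tol = 1e-8 * s[0]
s_inv = np.zeros_like(s)
mask = s > tol
s_inv[mask] = 1.0 / s[mask]
A_pinv = Vt.T @ np.diag(s_inv) @ U.T

# Matches NumPy's truncated pseudo-inverse with the same relative cutoff.
assert np.allclose(A_pinv, np.linalg.pinv(A, rcond=1e-8))
```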

### Solution of linear equations by diagonalization of coefficients matrix

- Mathematics
- 1955

This form is very convenient since the multiplication of matrices is performed by an electronic computer in almost no time. The unitary matrices U and T are known to exist (for example modal matrices…

### The cyclic Jacobi method for computing the principal values of a complex matrix

- Mathematics
- 1960

is diagonal (T denotes the transpose), then the main diagonal of A is made up of the numbers $\lambda_i$ in some order. If it is desired to compute the $\lambda_i$ numerically, this result is of no immediate use,…

### On some algorithms for the solution of the complete eigenvalue problem

- Mathematics, Computer Science
- 1962

### On the Stationary Values of a Second-Degree Polynomial on the Unit Sphere

- Mathematics
- 1965

(H denotes complex conjugate transpose) is a real number. C. R. Rao of the Indian Statistical Institute, Calcutta, suggested to us the problem of maximizing (or minimizing) $\phi(x)$ for complex x on the…

### Householder's tridiagonalization of a symmetric matrix

- Mathematics
- 1968

In an early paper in this series [4] Householder’s algorithm for the tridiagonalization of a real symmetric matrix was discussed. In the light of experience gained since its publication and in view…