# Robust Least Squares for Quantized Data

@article{Clancy2020RobustLS, title={Robust Least Squares for Quantized Data}, author={Richard J Clancy and Stephen Becker}, journal={ArXiv}, year={2020}, volume={abs/2003.12004} }

In this paper we formulate and solve a robust least squares problem for a system of linear equations subject to quantization error. Ordinary least squares fails to consider uncertainty in the data matrix, modeling all noise in the observed signal. Total least squares accounts for uncertainty in the data matrix, but necessarily increases the condition number of the system compared to ordinary least squares. Tikhonov regularization, or ridge regression, is frequently employed to combat ill…
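The abstract contrasts ordinary least squares and Tikhonov regularization (ridge) as baselines for quantized data. The sketch below is not the paper's robust formulation; it only illustrates those two baselines in NumPy, with an illustrative quantization step `delta` and regularization weight `lam` (both assumed, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic problem: the "true" matrix A is observed only after quantization,
# so the measured matrix Aq carries bounded, entrywise error.
n, d = 200, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

delta = 0.25                      # quantization step (illustrative choice)
Aq = delta * np.round(A / delta)  # quantized data matrix; |Aq - A| <= delta/2

# Ordinary least squares on the quantized matrix: all error attributed to b.
x_ols, *_ = np.linalg.lstsq(Aq, b, rcond=None)

# Tikhonov regularization (ridge): argmin ||Aq x - b||^2 + lam ||x||^2,
# solved via the normal equations (Aq^T Aq + lam I) x = Aq^T b.
lam = 0.1                         # regularization weight (assumed, not tuned)
x_ridge = np.linalg.solve(Aq.T @ Aq + lam * np.eye(d), Aq.T @ b)

print("OLS error:  ", np.linalg.norm(x_ols - x_true))
print("Ridge error:", np.linalg.norm(x_ridge - x_true))
```

Neither estimator models the quantization error in `Aq` explicitly, which is the gap the paper's robust formulation targets.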

## References

Showing 1–10 of 26 references.

Robust Solutions to Least-Squares Problems with Uncertain Data

- Mathematics
- 1997

We consider least-squares problems where the coefficient matrices A,b are unknown but bounded. We minimize the worst-case residual error using (convex) second-order cone programming, yielding an…
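For Frobenius-norm-bounded perturbations, the worst-case residual mentioned in this snippet has a well-known closed form. This is a sketch of the standard result (not quoted from the reference), with \(\rho\) and \(\xi\) denoting the assumed bounds on the matrix and vector perturbations:

```latex
\max_{\|\delta A\|_F \le \rho,\; \|\delta b\|_2 \le \xi}
  \bigl\| (A + \delta A)\,x - (b + \delta b) \bigr\|_2
  \;=\;
  \|Ax - b\|_2 \;+\; \rho\,\|x\|_2 \;+\; \xi .
```

Minimizing the right-hand side over \(x\) is a second-order cone program, consistent with the formulation described above.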

An analysis of the total least squares problem

- Mathematics, Milestones in Matrix Computation
- 2007

An algorithm for solving the TLS problem is proposed that utilizes the singular value decomposition and provides a measure of the underlying problem's sensitivity.

Tikhonov Regularization and Total Least Squares

- Mathematics, SIAM J. Matrix Anal. Appl.
- 1999

It is shown how Tikhonov's regularization method can be recast in a total least squares formulation suited for problems in which both the coefficient matrix and the right-hand side are known only approximately.

Robust Regression and Lasso

- Computer Science, IEEE Transactions on Information Theory
- 2010

It is shown that the robustness of the solution explains why the solution is sparse; a theorem is also proved stating that sparsity and algorithmic stability contradict each other, and hence Lasso is not stable.

Updating Quasi-Newton Matrices With Limited Storage

- Computer Science
- 1980

An update formula is presented which generates matrices using information from the last m iterations, where m is any number supplied by the user; among the quasi-Newton methods considered, the BFGS method is the most efficient.
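The limited-memory update described in this snippet is commonly implemented via the L-BFGS two-loop recursion. The sketch below is a minimal NumPy version under assumed choices (memory m = 5, Armijo backtracking, a synthetic convex quadratic), not the reference's own code:

```python
import numpy as np

def two_loop(grad, pairs):
    """L-BFGS two-loop recursion: multiply `grad` by the implicit
    inverse-Hessian approximation built from the stored (s, y) pairs."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in pairs]
    alphas = []
    for (s, y), rho in zip(reversed(pairs), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q = q - a * y
    if pairs:  # initial scaling H0 = gamma * I, a common heuristic
        s, y = pairs[-1]
        q = q * ((s @ y) / (y @ y))
    for (s, y), rho, a in zip(pairs, rhos, reversed(alphas)):
        beta = rho * (y @ q)
        q = q + (a - beta) * s
    return q  # approximates H_k @ grad; the search direction is -q

# Demo on a convex quadratic f(x) = 0.5 x^T Q x - c^T x.
rng = np.random.default_rng(1)
M = rng.standard_normal((8, 8))
Q = M @ M.T + 8.0 * np.eye(8)           # symmetric positive definite
c = rng.standard_normal(8)
f = lambda x: 0.5 * x @ Q @ x - c @ x
df = lambda x: Q @ x - c

x, m, pairs = np.zeros(8), 5, []
g = df(x)
for _ in range(50):
    d = -two_loop(g, pairs)
    t = 1.0
    while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
        t *= 0.5                        # Armijo backtracking line search
    x_new = x + t * d
    g_new = df(x_new)
    s, y = x_new - x, g_new - g
    if s @ y > 1e-10:                   # curvature check keeps rho positive
        pairs = (pairs + [(s, y)])[-m:] # store only the last m pairs
    x, g = x_new, g_new

print("final gradient norm:", np.linalg.norm(df(x)))
```

Storing only the last m pairs gives O(m·n) memory instead of the O(n²) needed for a dense Hessian approximation.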

Computational Methods for Inverse Problems

- Mathematics
- 1987

Inverse problems arise in a number of important practical applications, ranging from biomedical imaging to seismic prospecting. This book provides the reader with a basic understanding of both the…

Data Uncertainties and Least Squares Regression

- Sociology
- 1972

Least squares regression analysis makes the assumption that the independent variables can be measured without error. This paper examines the effect of errors in these variables and suggests some…

Second Order Cone Programming Approaches for Handling Missing and Uncertain Data

- Computer Science, Mathematics, J. Mach. Learn. Res.
- 2006

A novel second-order cone programming formulation is proposed for designing robust classifiers that can handle uncertainty in observations and that outperform imputation when observations have missing values.

Inexact proximal ϵ-subgradient methods for composite convex optimization problems

- Computer Science, Mathematics, J. Glob. Optim.
- 2019

An analysis of the convergence and rate-of-convergence properties of the proximal subgradient method, considering various stepsize rules, including both diminishing and constant stepsizes.
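A standard instance of the composite problems this snippet refers to is the lasso, solved by a proximal gradient iteration with a constant stepsize. The sketch below is a generic illustration with assumed problem sizes and regularization weight, not the inexact ϵ-subgradient method of the reference:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Proximal gradient method for the composite problem
#   min_x 0.5 ||A x - b||^2 + lam ||x||_1
# with constant stepsize 1/L, where L is the Lipschitz constant
# of the smooth part's gradient (largest eigenvalue of A^T A).
rng = np.random.default_rng(2)
A = rng.standard_normal((60, 20))
x_star = np.zeros(20)
x_star[:3] = [2.0, -1.5, 1.0]            # sparse ground truth
b = A @ x_star + 0.01 * rng.standard_normal(60)

lam = 0.5                                 # regularization weight (assumed)
L = np.linalg.eigvalsh(A.T @ A).max()
x = np.zeros(20)
for _ in range(500):
    grad = A.T @ (A @ x - b)                    # gradient of the smooth part
    x = soft_threshold(x - grad / L, lam / L)   # prox step on the l1 part

print("nonzero entries:", np.count_nonzero(np.abs(x) > 1e-6))
```

The diminishing-stepsize variants analyzed in the reference replace the constant 1/L with a decaying sequence and tolerate inexact prox and subgradient evaluations.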