Corpus ID: 214667176

Robust Least Squares for Quantized Data

@article{Clancy2020RobustLS,
  title={Robust Least Squares for Quantized Data},
  author={Richard J. Clancy and Stephen Becker},
  journal={ArXiv},
  year={2020},
  volume={abs/2003.12004}
}
In this paper we formulate and solve a robust least squares problem for a system of linear equations subject to quantization error. Ordinary least squares fails to consider uncertainty in the data matrices, modeling all noise in the observed signal. Total least squares accounts for uncertainty in the data matrix, but necessarily increases the condition number of the system compared to ordinary least squares. Tikhonov regularization, or ridge regression, is frequently employed to combat ill…
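An illustrative setup for the problem the abstract describes (this is not the paper's method, just a minimal sketch): uniformly quantize the entries of a data matrix, then compare an ordinary least squares solve against a Tikhonov-regularized (ridge) solve. The step size `delta` and penalty `lam` are assumed values for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 100, 5
A = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
b = A @ x_true                       # noiseless observations

delta = 0.25                         # assumed quantizer step size
A_q = delta * np.round(A / delta)    # uniformly quantized data matrix

# Ordinary least squares on A_q ignores the perturbation A - A_q
x_ols = np.linalg.lstsq(A_q, b, rcond=None)[0]

# Tikhonov regularization / ridge: minimize ||A_q x - b||^2 + lam ||x||^2
lam = 0.5                            # assumed regularization weight
x_ridge = np.linalg.solve(A_q.T @ A_q + lam * np.eye(n), A_q.T @ b)
```

The ridge solution shrinks toward zero relative to the least squares solution on the same quantized matrix, which is the conditioning trade-off the abstract alludes to.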


References

Showing 1–10 of 26 references
Robust Solutions to Least-Squares Problems with Uncertain Data
We consider least-squares problems where the coefficient matrices A, b are unknown but bounded. We minimize the worst-case residual error using (convex) second-order cone programming, yielding an…
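For a spectral-norm-bounded perturbation of A alone, the worst-case residual in this kind of formulation has a well-known closed form: max over ||ΔA||₂ ≤ ρ of ||(A+ΔA)x − b|| equals ||Ax−b|| + ρ||x||. A small numeric check of that identity (the specific matrices and ρ below are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((20, 4))
b = rng.standard_normal(20)
x = rng.standard_normal(4)
rho = 0.3

# The worst-case perturbation is rank one: dA = rho * u v^T with
# u = (Ax - b)/||Ax - b|| and v = x/||x||, aligning the perturbation
# with the residual direction.
r = A @ x - b
u = r / np.linalg.norm(r)
v = x / np.linalg.norm(x)
dA = rho * np.outer(u, v)            # satisfies ||dA||_2 = rho

worst = np.linalg.norm((A + dA) @ x - b)
closed_form = np.linalg.norm(r) + rho * np.linalg.norm(x)
# worst and closed_form agree to floating-point precision
```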
Overview of total least-squares methods
An analysis of the total least squares problem
An algorithm for solving the TLS problem is proposed that utilizes the singular value decomposition and provides a measure of the underlying problem's sensitivity.
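The SVD-based construction this reference describes is the classical total least squares solution: stack the augmented matrix [A | b], take the right singular vector of its smallest singular value, and normalize. A minimal sketch (the helper name `tls` and the test problem are ours, not from the reference):

```python
import numpy as np

def tls(A, b):
    """Total least squares via the SVD: find x minimizing the Frobenius
    norm of [dA db] subject to (A + dA) x = b + db."""
    n = A.shape[1]
    C = np.column_stack([A, b])          # augmented matrix [A | b]
    _, _, Vt = np.linalg.svd(C)
    v = Vt[-1]                           # right singular vector for the smallest singular value
    if np.isclose(v[-1], 0.0):
        raise np.linalg.LinAlgError("TLS solution does not exist (last component is zero)")
    return -v[:n] / v[-1]

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true
# perturb both A and b, as the errors-in-variables model assumes
A_noisy = A + 0.01 * rng.standard_normal(A.shape)
b_noisy = b + 0.01 * rng.standard_normal(b.shape)
x_tls = tls(A_noisy, b_noisy)
```

With small perturbations on both sides, the recovered `x_tls` lands close to `x_true`; the sensitivity measure the reference mentions relates to how separated the smallest singular value of [A | b] is from the rest.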
Tikhonov Regularization and Total Least Squares
It is shown how Tikhonov's regularization method can be recast in a total least squares formulation suited for problems in which both the coefficient matrix and the right-hand side are known only approximately.
Robust Regression and Lasso
It is shown that the robustness of the solution explains why the solution is sparse; a theorem is also proved stating that sparsity and algorithmic stability contradict each other, and hence Lasso is not stable.
Updating Quasi-Newton Matrices With Limited Storage
An update formula is described that generates matrices using information from the last m iterations, where m is any number supplied by the user; among such methods, the BFGS update is considered the most efficient.
Computational Methods for Inverse Problems
Inverse problems arise in a number of important practical applications, ranging from biomedical imaging to seismic prospecting. This book provides the reader with a basic understanding of both the…
Data Uncertainties and Least Squares Regression
Least squares regression analysis makes the assumption that the independent variables can be measured without error. This paper examines the effect of errors in these variables and suggests some…
Second Order Cone Programming Approaches for Handling Missing and Uncertain Data
A novel second-order cone programming formulation is proposed for designing robust classifiers that can handle uncertainty in observations and that outperform imputation in the case of missing values in observations.
Inexact proximal ϵ-subgradient methods for composite convex optimization problems
An analysis of the convergence and rate-of-convergence properties of the proximal subgradient method is provided, considering various stepsize rules, including both diminishing and constant stepsizes.