Solving least squares problems

@book{Lawson1995SolvingLS,
  title={Solving least squares problems},
  author={Charles L. Lawson and Richard J. Hanson},
  series={Classics in Applied Mathematics},
  publisher={Society for Industrial and Applied Mathematics},
  year={1995}
}
Since the lm function provides many features, it is rather complicated, so we will instead use the function lsfit as a model. It computes only the coefficient estimates and the residuals. Now would be a good time to read the help file for lsfit. Note that lsfit supports the fitting of multiple least squares models and weighted least squares; our function will not, so we can omit the arguments wt, weights and yname. Also, changing tolerances is a little advanced, so we will trust…
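A minimal sketch of what such a stripped-down fitter might look like, written here in Python with numpy rather than R; the name ls_fit, the intercept flag, and the dict return value are illustrative, not the tutorial's actual code:

```python
import numpy as np

def ls_fit(x, y, intercept=True):
    # Bare-bones least squares in the spirit of R's lsfit: it returns only
    # the coefficient estimates and the residuals -- no wt/weights argument,
    # no multiple responses, no tolerance argument.
    X = np.atleast_2d(np.asarray(x, dtype=float))
    if X.shape[0] != len(y):          # accept a 1-D predictor passed as a row
        X = X.T
    if intercept:
        X = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return {"coefficients": coef, "residuals": np.asarray(y) - X @ coef}
```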
An efficient Gauss–Newton algorithm for solving regularized total least squares problems
The total least squares (TLS) method is a well-known technique for solving an overdetermined linear system of equations Ax ≈ b that is appropriate when both the coefficient matrix A and the right-hand side b are subject to errors.
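For reference, the classical (unregularized) TLS solution can be read off from the SVD of the augmented matrix [A b]; a numpy sketch of that baseline, not of the paper's regularized Gauss–Newton algorithm:

```python
import numpy as np

def tls(A, b):
    # Classical total least squares via the SVD of [A | b]: the solution is
    # built from the right singular vector of the smallest singular value.
    n = A.shape[1]
    _, _, Vt = np.linalg.svd(np.column_stack([A, b]))
    v = Vt[-1]                       # singular vector for the smallest sigma
    if abs(v[n]) < 1e-12:            # generic-case check; TLS may not exist
        raise ValueError("TLS solution does not exist for this data")
    return -v[:n] / v[n]
```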
Least squares auto-tuning
A powerful proximal gradient method is presented that can be used to find good, if not the best, hyper-parameters for least squares problems; it is able to cut the test error of standard least squares in half.
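As a rough illustration of the idea only (not the paper's method), one could tune a single ridge weight by proximal gradient on a validation loss, where the prox step for the constraint λ ≥ 0 is projection onto the nonnegative half-line; the finite-difference hypergradient below stands in for the exact derivative the authors would compute through the solve:

```python
import numpy as np

def ridge_fit(X, y, lam):
    # Closed-form ridge solution for a single shared regularization weight.
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def tune_ridge(X_tr, y_tr, X_va, y_va, lam=1.0, step=1e-2, iters=200):
    def val_loss(l):
        r = X_va @ ridge_fit(X_tr, y_tr, l) - y_va
        return r @ r / len(y_va)
    for _ in range(iters):
        eps = 1e-6 * (lam + 1.0)
        g = (val_loss(lam + eps) - val_loss(lam)) / eps   # crude hypergradient
        lam = max(lam - step * g, 0.0)   # gradient step, then prox = projection
    return lam
```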
Efficient use of Toeplitz matrices for least squares data fitting by nonnegative differences
This work addresses the problem of making the least sum-of-squares change to the data subject to nonnegative differences of order r in the smoothed values. It constructs a basis that reduces the equality-constrained minimization for a search direction to an unconstrained maximization that depends on far fewer variables.
On direct elimination methods for solving the equality constrained least squares problem
Two closely related methods for solving the least squares problem with equality constraints (LSE) are considered. The first is the direct elimination (DE) method, implemented using Modified …
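A bare numpy sketch of the direct-elimination idea for min ‖Ax − b‖₂ subject to Cx = d: solve the constraints for the leading block of variables and substitute into the residual. It assumes the leading p×p block of C is nonsingular and omits the pivoting and orthogonalization a careful implementation needs:

```python
import numpy as np

def lse_direct_elimination(A, b, C, d):
    # Solve min ||A x - b||_2 subject to C x = d (C is p-by-n, full row rank)
    # by eliminating the first p variables: x1 = C1^{-1} (d - C2 x2).
    p = C.shape[0]
    C1, C2 = C[:, :p], C[:, p:]
    A1, A2 = A[:, :p], A[:, p:]
    K = np.linalg.solve(C1, C2)              # C1^{-1} C2
    g = np.linalg.solve(C1, d)               # C1^{-1} d
    # Reduced, unconstrained problem in the remaining variables x2.
    x2, *_ = np.linalg.lstsq(A2 - A1 @ K, b - A1 @ g, rcond=None)
    return np.concatenate([g - K @ x2, x2])  # [x1, x2]
```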
The method of (not so) ordinary least squares: what can go wrong and how to fix them
Most of us came to know the method of least squares while trying to fit a curve through a set of data points. The parameters of the curve are obtained by solving a set of equations (called the normal equations) …
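Stated concretely, for a design matrix X and response vector y those are the normal equations:

```latex
\min_{\beta}\ \lVert y - X\beta \rVert_2^2
\quad\Longrightarrow\quad
X^{\top}X\,\hat{\beta} = X^{\top}y,
\qquad
\hat{\beta} = (X^{\top}X)^{-1}X^{\top}y \quad (X^{\top}X \text{ invertible}).
```

Many of the classic failure modes, collinearity in particular, appear here as a nearly singular X^T X.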
Solving large linear least squares problems with linear equality constraints
This work considers efficiently solving large-scale linear least squares problems that have one or more linear constraints that must be satisfied exactly, and proposes modifications and new ideas, with an emphasis on satisfying the constraints with a small residual.
A Projection Method for Least Squares Problems with a Quadratic Equality Constraint
Numerical experiments indicate that PMCT is much more efficient than Newton-type methods when the LSQE problem is ill-conditioned: PMCT converges in about 90% of cases, while commonly used Newton-type iterations almost always fail.
Better Subset Regression Using the Nonnegative Garrote
A new method, called the nonnegative (nn) garrote, is proposed for doing subset regression. It both shrinks and zeroes coefficients. In tests on real and simulated data, it produces lower prediction error …
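Concretely, Breiman's garrote rescales the ordinary least squares estimates \hat{\beta}_k by nonnegative factors c_k fitted from the data:

```latex
\min_{c}\ \sum_{i}\Big(y_i - \sum_{k} c_k \hat{\beta}_k x_{ik}\Big)^{2}
\quad\text{subject to}\quad c_k \ge 0,\ \ \sum_{k} c_k \le s,
```

with shrunken coefficients \tilde{\beta}_k = c_k \hat{\beta}_k; tightening s drives more of the c_k to zero, which is how the method simultaneously shrinks and zeroes coefficients.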
Exactly initialized recursive least squares
Three order-recursive formulas for the Moore–Penrose pseudoinverses of matrices are presented; they improve and extend the Greville formulas (1960) and are much easier to derive, clearer, and simpler.
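For context, the conventional recursive least squares recursion looks as follows; its customary start P₀ = δI only approximates the batch pseudoinverse solution, which is the gap exact-initialization schemes close. A sketch of the conventional recursion only, not the paper's formulas:

```python
import numpy as np

def rls(xs, ys, delta=1e3):
    # Conventional recursive least squares with the approximate
    # initialization P_0 = delta * I (delta large).
    n = xs.shape[1]
    theta = np.zeros(n)
    P = delta * np.eye(n)
    for x, y in zip(xs, ys):
        k = P @ x / (1.0 + x @ P @ x)        # gain vector
        theta = theta + k * (y - x @ theta)  # correct by the innovation
        P = P - np.outer(k, x @ P)           # rank-one covariance update
    return theta
```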