Choosing Ridge Parameter for Regression Problems

@article{Khalaf2005ChoosingRP,
  title={Choosing Ridge Parameter for Regression Problems},
  author={Ghadban Khalaf and Ghazi Shukur},
  journal={Communications in Statistics - Theory and Methods},
  year={2005},
  month={May},
  volume={34},
  pages={1177--1182}
}
Abstract: Hoerl and Kennard (1970a) introduced the ridge regression estimator as an alternative to the ordinary least squares estimator in the presence of multicollinearity. In this article, a new approach for choosing the ridge parameter (K), when multicollinearity among the columns of the design matrix exists, is suggested and evaluated by simulation techniques in terms of mean squared errors (MSE). A number of factors that may affect the properties of these methods have been varied. The MSE…
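The setup the abstract describes can be sketched in a few lines of NumPy: the ridge estimator β̂(k) = (X′X + kI)⁻¹X′y, one classical choice of k due to Hoerl and Kennard (the paper's own proposed rule is not reproduced here), and a small Monte Carlo comparison of coefficient MSE against OLS on a collinear design. The sample size, correlation strength, and true coefficients below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def ridge_estimator(X, y, k):
    """Ridge estimate beta_hat(k) = (X'X + k I)^{-1} X'y (Hoerl & Kennard, 1970)."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def hoerl_kennard_k(X, y):
    """Classical Hoerl-Kennard choice k = sigma_hat^2 / max(alpha_hat_i^2),
    where alpha_hat are the OLS coefficients rotated into the eigenbasis of X'X.
    (Shown as one well-known rule; the paper under discussion proposes its own.)"""
    n, p = X.shape
    beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta_ols
    sigma2 = resid @ resid / (n - p)
    _, V = np.linalg.eigh(X.T @ X)      # eigenvectors of the design cross-product
    alpha = V.T @ beta_ols              # canonical-form OLS coefficients
    return sigma2 / np.max(alpha ** 2)

# Monte Carlo MSE comparison on a deliberately collinear design (illustrative)
rng = np.random.default_rng(0)
n, p, reps = 50, 3, 500
beta = np.array([1.0, 2.0, -1.5])
mse_ols = mse_ridge = 0.0
for _ in range(reps):
    z = rng.normal(size=(n, 1))
    X = z + 0.05 * rng.normal(size=(n, p))   # columns nearly identical
    y = X @ beta + rng.normal(size=n)
    b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
    b_rdg = ridge_estimator(X, y, hoerl_kennard_k(X, y))
    mse_ols += np.sum((b_ols - beta) ** 2) / reps
    mse_ridge += np.sum((b_rdg - beta) ** 2) / reps
```

On a design this ill-conditioned the shrunken estimator typically attains a much smaller coefficient MSE than OLS, which is the phenomenon the simulation studies in this literature quantify.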
New ridge parameters for ridge regression
Abstract Hoerl and Kennard (1970a) introduced the ridge regression estimator as an alternative to the ordinary least squares (OLS) estimator in the presence of multicollinearity. In ridge regression,
Some Modifications for Choosing Ridge Parameters
Standard least square regression can produce estimates having a large mean squares error (MSE) when predictor variables are highly correlated or multicollinear. In this article, we propose four
Ridge Regression and Ill-Conditioning
Hoerl and Kennard (1970) suggested the ridge regression estimator as an alternative to the Ordinary Least Squares (OLS) estimator in the presence of multicollinearity. This article proposes new
Alternative Method for Choosing Ridge Parameter for Regression
The parameter estimation method based on minimizing the residual sum of squares is unsatisfactory in the presence of multicollinearity. Hoerl and Kennard [1] introduced an alternative method called ridge
Performance of a new ridge regression estimator
Abstract Ridge regression estimator has been introduced as an alternative to the ordinary least squares estimator (OLS) in the presence of multicollinearity. Several studies concerning ridge
On Comparison of Some Ridge Parameters in Ridge Regression
In this article, a new approach to obtaining the ridge parameter is introduced for the multiple linear regression model when it suffers from multicollinearity. Furthermore, we compare the proposed
A RIDGE REGRESSION ESTIMATION APPROACH WHEN MULTICOLLINEARITY IS PRESENT
In regression problems, we usually try to estimate the parameters β in the general linear regression model Y = Xβ + u. We need a method to estimate the parameter vector β. The most common method
Modified Ridge Regression Estimators
Ridge regression is a variant of ordinary multiple linear regression whose goal is to circumvent the problem of predictors collinearity. It gives up the Ordinary Least Squares (OLS) estimator as a
A Comparative Study on the Performance of New Ridge Estimators
Least squares estimators in multiple linear regression under multicollinearity become unstable, as they produce large variances for the estimated regression coefficients. Hoerl and Kennard 1970,
Multicollinearity and A Ridge Parameter Estimation Approach
One of the main goals of the multiple linear regression model, Y = Xβ + u, is to assess the importance of independent variables in determining their predictive ability. However, in practical
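Several of the works listed above revolve around the same mechanics: as the ridge parameter k grows, the estimator trades variance for bias and the coefficient vector is shrunk toward zero. A minimal sketch of that shrinkage path (the design, sample size, and k-grid are illustrative assumptions):

```python
import numpy as np

def ridge_path(X, y, ks):
    """Ridge solutions (X'X + k I)^{-1} X'y for each k in ks; one row per k.
    Plotting these rows against k gives the classic 'ridge trace'."""
    p = X.shape[1]
    XtX, Xty = X.T @ X, X.T @ y
    return np.array([np.linalg.solve(XtX + k * np.eye(p), Xty) for k in ks])

rng = np.random.default_rng(1)
z = rng.normal(size=(40, 1))
X = z + 0.1 * rng.normal(size=(40, 2))       # two nearly collinear predictors
y = X @ np.array([1.0, 1.0]) + rng.normal(size=40)

ks = np.linspace(0.0, 2.0, 21)
path = ridge_path(X, y, ks)                  # the k = 0 row is the OLS solution
norms = np.linalg.norm(path, axis=1)         # coefficient norm shrinks as k grows
```

The norm of the ridge solution is strictly decreasing in k, which is why the trace of unstable, inflated OLS coefficients flattens out as k moves away from zero.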

References

A Monte Carlo Evaluation of Some Ridge-Type Estimators
Abstract Consider the standard linear model. Ridge regression, as viewed here, defines a class of estimators of β indexed by a scalar parameter k. Two analytic methods of specifying k are proposed and
Ridge Regression: Biased Estimation for Nonorthogonal Problems
The ridge trace, a method for showing in two dimensions the effects of nonorthogonality, is introduced, along with a way to augment X′X to obtain biased estimates with smaller mean square error.
Ridge regression: some simulations
An algorithm is given for selecting the biasing parameter, k, in ridge regression. By means of simulation it is shown that the algorithm has the following properties: (i) it produces an averaged
A class of almost unbiased and efficient estimators of regression coefficients
Abstract In this paper we develop a family of approximately unbiased non-linear estimators and show that they are more efficient than OLS as well as improved biased estimators.
Ridge regression: application to non-orthogonal problems. Technometrics, 1970.
An almost unbiased estimator. Sankhya, 1986.
Recent Advances in Regression Methods.
A Monte-Carlo evaluation of some ridge-type estimators, 1975.
Hoerl and Kennard (1970a). Ridge regression: biased estimation for nonorthogonal problems, 1970.