Corpus ID: 237562838

Adaptive Ridge-Penalized Functional Local Linear Regression

@inproceedings{Huang2021AdaptiveRF,
  title={Adaptive Ridge-Penalized Functional Local Linear Regression},
  author={Wentian Huang and David Ruppert},
  year={2021}
}
  • Wentian Huang, David Ruppert
  • Published 17 September 2021
  • Mathematics
We introduce an original method of multidimensional ridge penalization in functional local linear regression. Nonparametric regression of functional data extends its multivariate counterpart and is known to be sensitive to the choice of J, the dimension of the subspace onto which the data are projected. In the multivariate setting, a roughness penalty is helpful for variance reduction. However, among the limited works covering roughness penalties in the functional setting, most…

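The abstract outlines the ingredients of the method: project the functional predictors onto a J-dimensional subspace, run a local linear regression on the resulting scores, and add a ridge penalty to reduce variance. The following is a minimal sketch of that general recipe in Python, not the authors' implementation; the Gaussian kernel, the FPCA-score reading of the projection, and a single scalar penalty on the slope terms are illustrative assumptions.

```python
import numpy as np

def ridge_local_linear_fit(scores, y, x0, h, lam):
    """Ridge-penalized local linear fit at one evaluation point.

    scores : (n, J) projection scores of the functional predictors
             (e.g., FPCA scores) -- illustrative assumption
    y      : (n,) scalar responses
    x0     : (J,) evaluation point in the projection space
    h      : kernel bandwidth
    lam    : ridge penalty on the local slope coefficients
    """
    n, J = scores.shape
    Z = scores - x0                                   # centered local design
    w = np.exp(-0.5 * np.sum((Z / h) ** 2, axis=1))   # Gaussian kernel weights
    X = np.hstack([np.ones((n, 1)), Z])               # intercept + linear terms
    W = np.diag(w)
    P = np.diag([0.0] + [1.0] * J)                    # penalize slopes, not the intercept
    beta = np.linalg.solve(X.T @ W @ X + lam * P, X.T @ W @ y)
    return beta[0]                                    # local intercept = fitted value
```

Selecting h and the penalty adaptively, which is the focus of the paper, would sit on top of this building block (e.g., by cross-validation over a grid); that tuning step is not shown here.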

References

Showing 1–10 of 24 references
Penalized Nonparametric Scalar-on-Function Regression via Principal Coordinates
  • P. Reiss, David L. Miller, Pei-Shien Wu, Wen-Yu Hua
  • Mathematics, Medicine
  • Journal of computational and graphical statistics : a joint publication of American Statistical Association, Institute of Mathematical Statistics, Interface Foundation of North America
  • 2017
A new method of this type, extending intermediate-rank penalized smoothing to scalar-on-function regression, is introduced and shown to outperform a functional generalized linear model.
Data Adaptive Ridging in Local Polynomial Regression
When estimating a regression function or its derivatives, local polynomials are an attractive choice due to their flexibility and asymptotic performance. Seifert and Gasser proposed ridging…
Scalar-on-function local linear regression and beyond
Regressing a scalar response on a random function is nowadays a common situation. In the nonparametric setting, this paper paves the way for making the local linear regression based on a projection…
Design-adaptive Nonparametric Regression
In this article we study the method of nonparametric regression based on a weighted local linear regression. This method has advantages over other popular kernel methods. Moreover, such a…
An Effective Bandwidth Selector for Local Least Squares Regression
Local least squares kernel regression provides an appealing solution to the nonparametric regression, or “scatterplot smoothing,” problem, as demonstrated by Fan, for example. The practical…
Functional Principal Component Regression and Functional Partial Least Squares
Regression of a scalar response on signal predictors, such as near-infrared (NIR) spectra of chemical samples, presents a major challenge when, as is typically the case, the dimension of the signals…
Additive modelling of functional gradients
We consider the problem of estimating functional derivatives and gradients in the framework of a regression setting where one observes functional predictors and scalar responses. Derivatives are then…
Locally modelled regression and functional data
The general framework of this paper deals with the nonparametric regression of a scalar response on a functional variable (i.e. one observation can be a curve, surface, or any other object lying in…
Multivariate Locally Weighted Least Squares Regression
Nonparametric regression using locally weighted least squares was first discussed by Stone and by Cleveland. Recently, it was shown by Fan and by Fan and Gijbels that the local linear kernel-weighted…
Two-Step Estimation of Functional Linear Models with Applications to Longitudinal Data
  • Jianqing Fan, Jin-Ting Zhang
  • Department of Statistics, UNC-Chapel Hill
  • 1999