# Kernel-based estimation for partially functional linear model: Minimax rates and randomized sketches

@inproceedings{Lv2021KernelbasedEF, title={Kernel-based estimation for partially functional linear model: Minimax rates and randomized sketches}, author={Shaogao Lv and Xin He and Junhui Wang}, year={2021} }

This paper considers the partially functional linear model (PFLM), in which the predictive features consist of a functional covariate and a high-dimensional scalar vector. Over an infinite-dimensional reproducing kernel Hilbert space, the proposed estimator for PFLM is a least squares approach with two mixed regularizations: a function-norm penalty and an ℓ1-norm penalty. The main task of this paper is to establish the minimax rates for PFLM under the high-dimensional setting, and the optimal minimax rates of…
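The mixed-penalty least squares objective described in the abstract can be written down concretely. The sketch below is illustrative, not the paper's implementation: the slope function is parameterized in representer form f(t) = Σ_j c_j K(t, t_j) on a grid t_1, …, t_p, so the RKHS function-norm penalty becomes the quadratic form cᵀKc and the functional inner product ⟨X_i, f⟩ is approximated by a Riemann sum. All names (`pflm_objective`, `lam_f`, `lam_beta`) are hypothetical.

```python
import numpy as np

def pflm_objective(c, beta, X_grid, Z, y, K, dt, lam_f, lam_beta):
    """Penalized least-squares objective for PFLM (minimal sketch).

    c      : coefficients of f(t) = sum_j c_j K(t, t_j) on the grid
    beta   : slopes for the scalar covariates
    X_grid : (n, p) functional covariates sampled on the grid
    Z      : (n, d) scalar covariates
    K      : (p, p) kernel Gram matrix on the grid points
    dt     : grid spacing, for the Riemann-sum inner product
    """
    fitted = dt * X_grid @ (K @ c) + Z @ beta   # <X_i, f> + Z_i' beta
    resid = y - fitted
    return (np.mean(resid ** 2)
            + lam_f * c @ K @ c                 # RKHS function-norm penalty
            + lam_beta * np.sum(np.abs(beta)))  # l1 penalty on scalar slopes

# Toy evaluation on synthetic data
rng = np.random.default_rng(0)
p, d, n = 20, 5, 10
t = np.linspace(0.0, 1.0, p)
K = np.exp(-(t[:, None] - t[None, :]) ** 2 / 0.1)  # Gaussian kernel Gram matrix
X_grid = rng.normal(size=(n, p))
Z = rng.normal(size=(n, d))
y = rng.normal(size=n)
val = pflm_objective(np.zeros(p), np.zeros(d), X_grid, Z, y, K, t[1] - t[0], 0.1, 0.1)
```

With both parameter vectors at zero, the objective reduces to the mean squared response, which makes the decomposition into fit term plus two penalties easy to check by hand.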

## References

Showing 1–10 of 23 references.

Minimax and Adaptive Prediction for Functional Linear Regression

- Computer Science
- 2012

This article considers minimax and adaptive prediction with functional predictors in the framework of the functional linear model and reproducing kernel Hilbert spaces, and proposes an easily implementable, data-driven roughness-regularization predictor that is shown to attain the optimal rate of convergence adaptively, without knowledge of the covariance kernel.

Minimax optimal estimation in partially linear additive models under high dimension

- Computer Science, Bernoulli
- 2019

This paper derives minimax rates for estimating both the parametric and nonparametric components in partially linear additive models with high-dimensional sparse vectors and smooth functional components, and demonstrates that penalized least squares estimators can nearly achieve the minimax lower bounds.

Partially functional linear regression in high dimensions

- Mathematics, Computer Science
- 2016

This work proposes a new class of partially functional linear models to characterize the regression between a scalar response and covariates of both functional and scalar types, and establishes the consistency and oracle properties of the proposed method under mild conditions.

Structured functional additive regression in reproducing kernel Hilbert spaces.

- Computer Science, Journal of the Royal Statistical Society, Series B (Statistical Methodology)
- 2014

A new regularization framework for structure estimation in reproducing kernel Hilbert spaces is proposed, based on penalized least squares with a penalty that encourages sparsity of the additive components.

A Reproducing Kernel Hilbert Space Approach to Functional Linear Regression

- Mathematics
- 2010

We study in this paper a smoothness regularization method for functional linear regression and provide a unified treatment for both the prediction and estimation problems. By developing a tool on…

Randomized sketches for kernels: Fast and optimal non-parametric regression

- Computer Science, Mathematics, arXiv
- 2015

It is proved that choosing the sketch dimension $m$ proportional to the statistical dimension of the kernel matrix (modulo logarithmic factors) suffices, yielding fast and minimax-optimal approximations to the KRR estimate for non-parametric regression.
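The sketching idea summarized above can be illustrated in a few lines. The following is a minimal numpy sketch under stated assumptions, not the authors' code: a Gaussian sketch matrix S of size m × n restricts the kernel ridge regression (KRR) coefficients to the range of Sᵀ, so only an m × m linear system must be solved instead of an n × n one. The function name `sketched_krr` is hypothetical.

```python
import numpy as np

def sketched_krr(K, y, lam, m, rng):
    """Gaussian-sketch approximation to kernel ridge regression (minimal sketch).

    Solves  min_a (1/n)||y - K S^T a||^2 + lam * a^T (S K S^T) a
    with S an (m x n) Gaussian sketch, and returns the fitted values K S^T a.
    """
    n = K.shape[0]
    S = rng.normal(size=(m, n)) / np.sqrt(m)   # Gaussian sketch matrix
    SK = S @ K                                 # (m, n)
    A = SK @ SK.T / n + lam * SK @ S.T         # (m, m) system matrix
    b = SK @ y / n
    a = np.linalg.solve(A, b)
    return K @ S.T @ a                         # fitted values, length n

# Sanity check: with m = n the sketch is invertible almost surely, so the
# sketched solution coincides with exact KRR; in practice one takes m << n.
rng = np.random.default_rng(1)
n, m, lam = 6, 6, 0.1
t = np.linspace(0.0, 1.0, n)
K = np.exp(-np.abs(t[:, None] - t[None, :]))   # Laplacian kernel Gram matrix
y = np.sin(2 * np.pi * t)
fitted_sketch = sketched_krr(K, y, lam, m, rng)
a_exact = np.linalg.solve(K @ K / n + lam * K, K @ y / n)
fitted_exact = K @ a_exact
```

The payoff is computational: forming and solving the m × m system costs far less than the exact n × n solve when m is chosen near the statistical dimension, which is the regime the paper's guarantee addresses.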

Functional partially linear quantile regression model

- Mathematics
- 2014

This paper considers estimation of a functional partially linear quantile regression model whose parameters include an infinite-dimensional function as well as the slope parameters. We show asymptotical…

Methodology and convergence rates for functional linear regression

- Mathematics
- 2007

In functional linear regression, the slope "parameter" is a function. Therefore, in a nonparametric context, it is determined by an infinite number of unknowns. Its estimation involves solving an…

SPLINE ESTIMATORS FOR THE FUNCTIONAL LINEAR MODEL

- Mathematics, Computer Science
- 2003

This work considers a regression setting where the response is a scalar and the predictor is a random function defined on a compact subset of ℝ, and studies an estimator based on a B-spline expansion of the functional coefficient, which generalizes ridge regression.