# Kernel Partial Least Squares for Stationary Data

@article{Singer2017KernelPL, title={Kernel Partial Least Squares for Stationary Data}, author={Marco Singer and Tatyana Krivobokova and Axel Munk}, journal={J. Mach. Learn. Res.}, year={2017}, volume={18}, pages={123:1-123:41} }

We consider the kernel partial least squares algorithm for non-parametric regression with stationary dependent data. Probabilistic convergence rates of the kernel partial least squares estimator to the true regression function are established under a source and an effective dimensionality condition. It is shown both theoretically and in simulations that long-range dependence results in slower convergence rates. A protein dynamics example shows high predictive power of kernel partial least…
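To make the object of study concrete, here is a minimal sketch of kernel PLS in its Krylov-subspace form: after `a` components, the dual coefficients minimise the residual over the span of `{y, Ky, ..., K^(a-1)y}`. The helper names (`rbf_kernel`, `kernel_pls_fit`) and the toy data are illustrative, not from the paper, and the demo uses i.i.d. noise rather than the stationary dependent noise the paper analyses.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_pls_fit(K, y, n_components=4):
    # Krylov-subspace view of KPLS: after a components, the dual
    # coefficients minimise ||y - K @ alpha|| over
    # alpha in span{y, K y, ..., K^(a-1) y}
    V = np.empty((len(y), n_components))
    v = y.astype(float).copy()
    for i in range(n_components):
        V[:, i] = v
        v = K @ v
    # least squares for the coordinates of alpha in the Krylov basis
    w, *_ = np.linalg.lstsq(K @ V, y, rcond=None)
    return V @ w  # dual coefficients alpha; fitted values are K @ alpha

# toy 1-D regression with i.i.d. noise (the paper's point is that
# dependent noise slows convergence; this demo keeps noise independent)
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3.0, 3.0, size=80))[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)
K = rbf_kernel(X, X)
alpha = kernel_pls_fit(K, y, n_components=4)
y_hat = K @ alpha
```

The number of components plays the role of the regularisation parameter: few components give a heavily smoothed fit, many components interpolate the noise.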

## 4 Citations

### Bump detection in the presence of dependency: Does it ease or does it load?

- Mathematics, Computer Science · Bernoulli
- 2020

This work provides the asymptotic minimax detection boundary for a bump in the mean function of a stationary Gaussian process and finds that it is generically determined by the value of its spectral density at zero.


### Approximate kernel partial least squares

- Computer Science · Annals of Mathematics and Artificial Intelligence
- 2020

This work considers the spectral properties of low-rank kernel matrices constructed as sums of random-feature dot products, presents a new method called randomized kernel partial least squares (RKPLS) to approximate KPLS, and shows that the solution of the algorithm converges in expectation to that of the exact kernel matrix.

## References

Showing 1–10 of 43 references

### Kernel Partial Least Squares for Nonlinear Regression and Discrimination

- Computer Science
- 2002

This paper summarizes recent results on applying the method of partial least squares (PLS) in a reproducing kernel Hilbert space (RKHS) and results on a two-class discrimination problem indicate usefulness of the method.

### Partial least squares for dependent data.

- Physics · Biometrika
- 2016

The partial least squares algorithm for dependent data is considered and the consequences of ignoring nonstationary dependence structures are studied both theoretically and numerically.

### Kernel Partial Least Squares is Universally Consistent

- Mathematics · AISTATS
- 2010

This work proves the statistical consistency of kernel partial least squares regression applied to a bounded regression learning problem on a reproducing kernel Hilbert space, and studies two empirical stopping rules that lead to universally consistent estimators provided the kernel is universal.

### ON THE STRUCTURE OF PARTIAL LEAST SQUARES REGRESSION

- Mathematics
- 1988

We prove that the two algorithms given in the literature for partial least squares regression are equivalent, and use this equivalence to give an explicit formula for the resulting prediction…

### Optimal learning rates for Kernel Conjugate Gradient regression

- Computer Science · NIPS
- 2010

We prove rates of convergence in the statistical sense for kernel-based least squares regression using a conjugate gradient algorithm, where regularization against overfitting is obtained by early…

### Optimal Learning Rates for Kernel Partial Least Squares

- Computer Science, Mathematics
- 2018

This work proposes a stopping rule for determining the number of iterations based on cross-validation, without assuming a priori knowledge of the underlying probability measure, and shows that optimal learning rates can be achieved.

### Optimal Rates for the Regularized Least-Squares Algorithm

- Mathematics, Computer Science · Found. Comput. Math.
- 2007

A complete minimax analysis of the problem is described, showing that the convergence rates obtained by regularized least-squares estimators are indeed optimal over a suitable class of priors defined by the considered kernel.

### Kernelizing PLS, degrees of freedom, and efficient model selection

- Computer Science · ICML '07
- 2007

It is shown that the kernelization of PLS introduces interesting properties not found in ordinary PLS, giving novel insights into the workings of kernel PLS and the connections to kernel ridge regression and conjugate gradient descent methods.
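The connection to conjugate gradient methods mentioned above can be illustrated directly: running CG on the dual system `K @ alpha = y` produces, after `a` steps, an iterate in the same Krylov subspace that `a`-component kernel PLS searches, with early stopping acting as the regulariser. This is a generic textbook CG sketch, not code from the cited paper; the variable names and toy data are illustrative.

```python
import numpy as np

def kernel_cg(K, y, n_iter=4):
    # standard conjugate gradient iterations on K @ alpha = y;
    # after a steps the iterate lies in span{y, K y, ..., K^(a-1) y},
    # the same Krylov subspace that a-component kernel PLS searches
    alpha = np.zeros_like(y, dtype=float)
    r = y.astype(float).copy()   # residual y - K @ alpha
    p = r.copy()                 # search direction
    rs = r @ r
    for _ in range(n_iter):
        Kp = K @ p
        step = rs / (p @ Kp)
        alpha += step * p
        r -= step * Kp
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return alpha

# few iterations = strong regularisation; more iterations fit y closer
rng = np.random.default_rng(1)
X = rng.uniform(-2.0, 2.0, size=(60, 1))
K = np.exp(-((X - X.T) ** 2))          # RBF kernel on 1-D inputs
y = np.cos(X[:, 0]) + 0.05 * rng.standard_normal(60)
res1 = np.linalg.norm(y - K @ kernel_cg(K, y, 1))
res4 = np.linalg.norm(y - K @ kernel_cg(K, y, 4))
```

Here the iteration count is the model-selection knob, which is exactly where degrees-of-freedom arguments for kernel PLS become useful.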

### Effective Dimension and Generalization of Kernel Learning

- Computer ScienceNIPS
- 2002

A concept of scale-sensitive effective data dimension is introduced, and it is shown that it characterizes the convergence rate of the underlying learning problem, and can naturally extend results for parametric estimation problems in finite dimensional spaces to non-parametric kernel learning methods.
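One widely used form of this scale-sensitive effective dimension is the trace quantity N(λ) = tr(K (K + nλI)^(-1)), which can be computed from the kernel eigenvalues. The sketch below assumes that specific form (the paper's exact definition may differ in normalisation), and the data are a made-up illustration.

```python
import numpy as np

def effective_dimension(K, lam):
    # scale-sensitive effective dimension in a common normalisation:
    # N(lam) = trace(K @ inv(K + n*lam*I)), via the eigenvalues of K
    n = K.shape[0]
    eig = np.linalg.eigvalsh(K)
    return float(np.sum(eig / (eig + n * lam)))

# for an RBF kernel the eigenvalues decay quickly, so N(lam) stays far
# below the sample size n even for fairly small lam
rng = np.random.default_rng(2)
X = rng.uniform(-2.0, 2.0, size=(100, 1))
K = np.exp(-((X - X.T) ** 2))
dims = [effective_dimension(K, lam) for lam in (1e-1, 1e-3, 1e-5)]
```

N(λ) increases monotonically as λ shrinks, interpolating between 0 and the rank of K, which is what lets it replace the ambient dimension in the convergence-rate analysis.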

### Kernel ridge vs. principal component regression: Minimax bounds and the qualification of regularization operators

- Mathematics
- 2017

Regularization is an essential element of virtually all kernel methods for nonparametric regression problems. A critical factor in the effectiveness of a given kernel method is the type of…