## 3 Citations

### Regression modelling with I-priors

- Mathematics
- 2020

We introduce the I-prior methodology as a unifying framework for estimating a variety of regression models, including varying coefficient, multilevel, longitudinal models, and models with functional…

### iprior: An R Package for Regression Modelling using I-priors.

- Computer Science
- 2019

The iprior package implements a unified methodology for fitting parametric and nonparametric regression models, including additive models, multilevel models, and models with one or more functional covariates, which is illustrated by analysing a simulated toy data set as well as three real-data examples.

## References

Showing 1–10 of 93 references

### Regression modelling using priors depending on Fisher information covariance kernels (I-priors)

- Mathematics, Computer Science
- 2018

This study advocates the I-prior methodology as a simple, intuitive alternative that is comparable to leading state-of-the-art models for tackling multicollinearity.

### iprior: An R Package for Regression Modelling using I-priors.

- Computer Science
- 2019

The iprior package implements a unified methodology for fitting parametric and nonparametric regression models, including additive models, multilevel models, and models with one or more functional covariates, which is illustrated by analysing a simulated toy data set as well as three real-data examples.

### Bayesian nonlinear model selection and neural networks: a conjugate prior approach

- Computer Science
- IEEE Trans. Neural Networks Learn. Syst.
- 2000

This Bayesian selection procedure allows us to compare general nonlinear regression models, and in particular feedforward neural networks, not only nested (embedded) models as is usual with asymptotic comparison tests.

### Generalized functional linear models

- Mathematics
- 2004

We propose a generalized functional linear regression model for a regression situation where the response variable is a scalar and the predictor is a random function. A linear predictor is obtained…

### Hybrid Regularisation of Functional Linear Models

- Mathematics
- 2016

We consider the problem of estimating the slope function in a functional regression with a scalar response and a functional covariate. This central problem of functional data analysis is well known…

### On Bayesian Analysis of Generalized Linear Models Using Jeffreys's Prior

- Mathematics
- 1991

Generalized linear models (GLMs) have proved suitable for modeling various kinds of data consisting of exponential family response variables with covariates. Bayesian analysis of such data…

### Improper Priors, Spline Smoothing and the Problem of Guarding Against Model Errors in Regression

- Mathematics
- 1978

Spline and generalized spline smoothing are shown to be equivalent to Bayesian estimation with a partially improper prior. This result supports the idea that spline smoothing is a natural…

### Penalized classification using Fisher's linear discriminant

- Computer Science
- Journal of the Royal Statistical Society, Series B (Statistical Methodology)
- 2011

This work proposes penalized LDA, a general approach for penalizing the discriminant vectors in Fisher's discriminant problem in a way that leads to greater interpretability, and uses a minorization–maximization approach to optimize the problem efficiently when convex penalties are applied to the discriminant vectors.

### Single and multiple index functional regression models with nonparametric link

- Mathematics, Computer Science
- 2011

A new technique for estimating the link function nonparametrically is introduced, along with an approach to multi-index modelling using adaptively defined linear projections of functional data; the methods are shown to enable prediction with polynomial convergence rates.

### Bayesian Variable Selection in Structured High-Dimensional Covariate Spaces With Applications in Genomics

- Mathematics, Computer Science
- 2010

This work considers the problem of variable selection in regression modelling in high-dimensional spaces where there is known structure among the covariates. It approaches this problem through the Bayesian variable selection framework, assuming that the covariates lie on an undirected graph and formulating an Ising prior on the model space to incorporate the structural information.