# Flexible Bayesian Nonlinear Model Configuration

@article{Hubin2021FlexibleBN, title={Flexible Bayesian Nonlinear Model Configuration}, author={Aliaksandr Hubin and Geir Storvik and Florian Frommlet}, journal={J. Artif. Intell. Res.}, year={2021}, volume={72}, pages={901-942} }

Regression models are used in a wide range of applications providing a powerful scientific tool for researchers from different fields. Linear, or simple parametric, models are often not sufficient to describe complex relationships between input variables and a response. Such relationships can be better described through flexible approaches such as neural networks, but this results in less interpretable models and potential overfitting. Alternatively, specific parametric nonlinear functions can…
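The paper's central idea is Bayesian model averaging over a space of model configurations. As a point of reference for that theme, the following is a minimal sketch of classical Bayesian model averaging for linear variable selection, with posterior model probabilities approximated via the BIC (it is not the paper's BGNLM algorithm, and the function name and BIC approximation are illustrative choices):

```python
import itertools
import numpy as np

def bma_inclusion_probs(X, y):
    """Bayesian model averaging over all subsets of the columns of X.

    Posterior model probabilities are approximated by
    p(M | y) ∝ exp(-BIC_M / 2); returns per-variable posterior
    inclusion probabilities (a hypothetical helper for illustration).
    """
    n, p = X.shape
    log_weights, models = [], []
    for k in range(p + 1):
        for subset in itertools.combinations(range(p), k):
            # Design matrix: intercept plus the selected columns
            Z = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            rss = np.sum((y - Z @ beta) ** 2)
            # Gaussian log-likelihood at the MLE and the BIC
            loglik = -0.5 * n * (np.log(2 * np.pi * rss / n) + 1)
            bic = -2 * loglik + Z.shape[1] * np.log(n)
            log_weights.append(-0.5 * bic)
            models.append(set(subset))
    # Normalize weights in a numerically stable way
    w = np.exp(np.array(log_weights) - max(log_weights))
    w /= w.sum()
    # Inclusion probability of variable j: total weight of models containing j
    return np.array([sum(w[m] for m, mod in enumerate(models) if j in mod)
                     for j in range(p)])
```

Enumerating all 2^p subsets is only feasible for small p; the mode jumping and adaptive sampling methods cited below exist precisely because realistic model spaces are too large for exhaustive enumeration.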

## 2 Citations

Reversible Genetically Modified Mode Jumping MCMC

- Computer Science
- 2021

This paper introduces a reversible version of a genetically modified mode jumping Markov chain Monte Carlo algorithm for inference on posterior model probabilities in complex model spaces, where the number of explanatory variables is prohibitively large for classical Markov chain Monte Carlo methods.

## References

Showing 1-10 of 82 references

Mode jumping MCMC for Bayesian variable selection in GLMM

- Computer Science, Comput. Stat. Data Anal.
- 2018

Scalable Variational Inference for Bayesian Variable Selection in Regression, and Its Accuracy in Genetic Association Studies

- Computer Science
- 2012

This work assesses an alternative to MCMC based on a simple variational approximation to retain useful features of Bayesian variable selection at a reduced cost and illustrates how these results guide the use of variational inference for a genome-wide association study with thousands of samples and hundreds of thousands of variables.

Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations

- Computer Science
- 2009

This work considers approximate Bayesian inference in a popular subset of structured additive regression models, latent Gaussian models, in which the latent field is Gaussian and controlled by a few hyperparameters while the response variables may be non-Gaussian, and shows that very accurate approximations to the posterior marginals can be computed directly.

A Practical Bayesian Framework for Backpropagation Networks

- Computer Science, Neural Computation
- 1992

A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks that automatically embodies "Occam's razor," penalizing overflexible and overcomplex models.

Model selection and multimodel inference: a practical information-theoretic approach

- Computer Science
- 2003

The second edition of this book is unique in that it focuses on methods for making formal statistical inference from all the models in an a priori set (Multi-Model Inference). A philosophy is…

Ensemble learning in Bayesian neural networks

- Computer Science
- 1998

This chapter shows how the ensemble learning approach can be extended to full-covariance Gaussian distributions while remaining computationally tractable, and extends the framework to deal with hyperparameters, leading to a simple re-estimation procedure.

Mixtures of g-Priors in Generalized Linear Models

- Mathematics, Journal of the American Statistical Association
- 2018

Mixtures of Zellner’s g-priors have been studied extensively in linear models and have been shown to have numerous desirable properties for Bayesian variable selection and model averaging.…

An Introduction to Variational Methods for Graphical Models

- Computer Science, Machine Learning
- 2004

This paper presents a tutorial introduction to the use of variational methods for inference and learning in graphical models (Bayesian networks and Markov random fields), and describes a general framework for generating variational transformations based on convex duality.

Bayesian training of backpropagation networks by the hybrid Monte-Carlo method

- Computer Science
- 1992

It is shown that Bayesian training of backpropagation neural networks can feasibly be performed by the Hybrid Monte Carlo method, and the method has been applied to a test problem, demonstrating that it can produce good predictions, as well as an indication of the uncertainty of these predictions.

Bayesian Adaptive Sampling for Variable Selection and Model Averaging

- Mathematics, Computer Science
- 2011

A Bayesian adaptive sampling algorithm (BAS), that samples models without replacement from the space of models, is introduced and it is shown that BAS can outperform Markov chain Monte Carlo methods.