# A Comparison of Some Error Estimates for Neural Network Models

```bibtex
@article{Tibshirani1996ACO,
  title   = {A Comparison of Some Error Estimates for Neural Network Models},
  author  = {Robert Tibshirani},
  journal = {Neural Computation},
  year    = {1996},
  volume  = {8},
  pages   = {152--163}
}
```

We discuss a number of methods for estimating the standard error of predicted values from a multilayer perceptron. These methods include the delta method based on the Hessian, bootstrap estimators, and the sandwich estimator. The methods are described and compared in a number of examples. We find that the bootstrap methods perform best, partly because they capture variability due to the choice of starting weights.
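The pairs-bootstrap standard-error estimate that the abstract favors can be sketched as follows. This is a minimal illustration, not the paper's experimental setup: the tiny NumPy MLP, the toy sine data, and all hyperparameters are assumptions. Note that each bootstrap replication also draws fresh starting weights, so the estimate captures the initialization variability the abstract mentions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (illustrative, not from the paper).
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.2, size=200)

def fit_mlp(X, y, seed):
    """Train a tiny one-hidden-layer tanh MLP by full-batch gradient descent."""
    r = np.random.default_rng(seed)
    H = 8
    W1 = r.normal(scale=0.5, size=(1, H)); b1 = np.zeros(H)
    W2 = r.normal(scale=0.5, size=(H, 1)); b2 = np.zeros(1)
    lr = 0.05
    for _ in range(2000):
        A = np.tanh(X @ W1 + b1)           # hidden activations
        err = (A @ W2 + b2) - y[:, None]   # residuals of the current fit
        # Backpropagate the squared-error loss.
        gW2 = A.T @ err / len(X); gb2 = err.mean(0)
        dA = err @ W2.T * (1 - A**2)
        gW1 = X.T @ dA / len(X); gb1 = dA.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Xnew: np.tanh(Xnew @ W1 + b1) @ W2 + b2

x0 = np.array([[0.5]])                     # point at which to estimate the SE
B = 20                                     # bootstrap replications
preds = []
for b in range(B):
    idx = rng.integers(0, len(X), len(X))  # resample (x, y) pairs with replacement
    # A fresh seed per replication also randomizes the starting weights,
    # so this source of variability enters the estimate as well.
    f_b = fit_mlp(X[idx], y[idx], seed=b)
    preds.append(f_b(x0).item())

se_hat = np.std(preds, ddof=1)             # bootstrap SE of the prediction at x0
print(f"bootstrap SE at x0: {se_hat:.3f}")
```

In practice B would be larger (the paper's comparisons use many more replications than 20); the small value here just keeps the sketch fast.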

## 302 Citations

### Standard Error Estimation in Neural Network Regression Models: the AR-Sieve Bootstrap Approach

- Mathematics, WIRN
- 2001

The use of the AR-sieve bootstrap method to estimate the standard error of the sampling distribution of neural network predicted values in a regression model with dependent errors is investigated.

### Bootstrap Variables Selection in Neural Network Regression Models

- Mathematics
- 2004

The use of the moving-block bootstrap technique to estimate the variability of relevance measures for the variables in a neural network model is proposed; the approach is shown to produce a correct ranking of relevant and irrelevant variables.

### Assessing the predictions of dynamic neural networks

- Mathematics
- 2005

New formulations are introduced for applying the bootstrap to nonlinear time series models with exogenous inputs, overcoming restrictions due to network parameter uncertainty and the non-normality of the error distribution.

### Estimations of error bounds for neural-network function approximators

- Computer Science, IEEE Trans. Neural Networks
- 1999

The effect of errors or noise in the presented input vector is examined, and a method based on perturbation analysis is presented and demonstrated for determining output bounds that account for both error in the input vector and imperfections in the weight values after training.

### Construction of confidence intervals for neural networks based on least squares estimation

- Mathematics, Neural Networks
- 2000

### Prediction intervals for neural network models

- Computer Science
- 2005

Preliminary results indicate a clear superiority of the combination of the bootstrap and maximum likelihood approaches in constructing prediction intervals, relative to the analytical approaches.

### CONSTRUCTION OF CONFIDENCE INTERVALS IN NEURAL MODELING USING A LINEAR TAYLOR EXPANSION

- Mathematics
- 1998

We introduce the theoretical results on the construction of confidence intervals for a nonlinear regression, based on the linear Taylor expansion of the corresponding nonlinear model output. The case…
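The linear Taylor-expansion (delta-method) construction this entry describes can be sketched for a simple nonlinear regression. Everything below is an illustrative assumption rather than the cited paper's method: the exponential model, the toy data, and the Gauss-Newton fit are stand-ins. The key step is that the prediction variance is approximated by `g' Σ g`, where `g` is the gradient of the model output with respect to the parameters and `Σ ≈ σ²(JᵀJ)⁻¹` is the estimated parameter covariance.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy nonlinear model y = a * exp(b * x) + noise (illustrative data).
a_true, b_true = 2.0, 0.5
x = np.linspace(0.0, 2.0, 50)
y = a_true * np.exp(b_true * x) + rng.normal(scale=0.1, size=x.size)

def f(theta, x):
    return theta[0] * np.exp(theta[1] * x)

def jac(theta, x):
    """Jacobian of f with respect to theta = (a, b)."""
    e = np.exp(theta[1] * x)
    return np.column_stack([e, theta[0] * x * e])

# Initialize via log-linear least squares, then refine with Gauss-Newton.
c = np.linalg.lstsq(np.column_stack([np.ones_like(x), x]), np.log(y), rcond=None)[0]
theta = np.array([np.exp(c[0]), c[1]])
for _ in range(10):
    J = jac(theta, x)
    theta = theta + np.linalg.solve(J.T @ J, J.T @ (y - f(theta, x)))

# Residual variance and approximate parameter covariance sigma^2 * (J'J)^{-1}.
J = jac(theta, x)
resid = y - f(theta, x)
sigma2 = resid @ resid / (x.size - theta.size)
cov = sigma2 * np.linalg.inv(J.T @ J)

# Linear Taylor expansion of the output at x0: Var[f(x0)] ~= g' cov g.
x0 = np.array([1.5])
g = jac(theta, x0)[0]
pred = f(theta, x0)[0]
se = np.sqrt(g @ cov @ g)
print(f"prediction {pred:.3f}, 95% CI ({pred - 1.96*se:.3f}, {pred + 1.96*se:.3f})")
```

This is the same linearization that underlies the delta method in the abstract above, applied here to a two-parameter model so the Jacobian can be written in closed form.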

### Confidence estimation methods for neural networks : a practical comparison

- Computer Science, ESANN
- 2000

This work shows that treating the data noise variance as a function of the inputs is appropriate for the curl prediction task. It also shows that the mean coverage probability can only gauge confidence-estimation performance as an average over the input space (i.e., global performance), and that the standard deviation of the coverage is unreliable as a measure of local performance.

### An Evolutionary Bootstrap Approach to Neural Network Pruning and Generalization

- Computer Science
- 1997

This paper combines techniques from the literature on evolutionary optimization algorithms with bootstrap-based statistical tests to create a network estimation and selection procedure that produces parsimonious network structures that generalize well.

### Evaluating Neural Network Predictors by Bootstrapping

- Computer Science
- 1994

We present a new method, inspired by the bootstrap, whose goal it is to determine the quality and reliability of a neural network predictor. Our method leads to more robust forecasting along with a…

## References

Showing 1-10 of 36 references.

### Evaluating Neural Network Predictors by Bootstrapping

- Computer Science
- 1994

We present a new method, inspired by the bootstrap, whose goal it is to determine the quality and reliability of a neural network predictor. Our method leads to more robust forecasting along with a…

### A Bootstrap Evaluation of the Effect of Data Splitting on Financial Time Series

- Computer Science, IEEE Trans. Neural Networks
- 1998

Exposes problems with the commonly used technique of splitting the available data into fixed training, validation, and test sets, and warns against drawing too strong conclusions from such…

### A Practical Bayesian Framework for Backpropagation Networks

- Computer Science, Neural Computation
- 1992

A quantitative and practical Bayesian framework is described for learning of mappings in feedforward networks that automatically embodies "Occam's razor," penalizing overflexible and overcomplex models.

### Introduction to the theory of neural computation

- Computer Science, The Advanced Book Program
- 1991

This book is a detailed, logically developed treatment that covers the theory and uses of collective computational networks, including associative memory, feedforward networks, and unsupervised learning.

### The bootstrap and its application in signal processing

- Mathematics, IEEE Signal Process. Mag.
- 1998

The motivations for using the bootstrap in typical signal processing applications are highlighted, and the use of the bootstrap for constructing confidence intervals for flight parameters in a passive acoustic emission problem is demonstrated.

### Estimating the mean and variance of the target probability distribution

- Mathematics, Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94)
- 1994

Introduces a method that estimates the mean and the variance of the probability distribution of the target as a function of the input, given an assumed target error-distribution model. Through the…

### Improved Option Pricing Using Artificial Neural Networks and Bootstrap Methods

- Economics, Int. J. Neural Syst.
- 1997

The modified bootstrap predictor outperforms the hybrid and bagging predictors; greatly improved performance was observed on the boundary of the training set and where only sparse training data exists.

### Improved option pricing using bootstrap methods

- Computer Science, Proceedings of International Conference on Neural Networks (ICNN'97)
- 1997

The results show that a modified bootstrap predictor outperforms the hybrid and bagging predictors; greatly improved performance was observed in particular regions of the input space, namely out-of-the-money options.

### Generalization: The Hidden Agenda of Learning

- Computer Science, IEEE Signal Processing Magazine
- 1997

Optimizing the neural network architecture may lead to better generalization ability and, preferably, a lower computational burden, which is one reason neural networks have proven useful in practical applications.