Joaquín Pizarro Junquera

This paper presents a new approach to model selection based on hypothesis testing. We first describe a procedure to generate different scores for any candidate model from a single sample of training data and then discuss how to apply multiple comparison procedures (MCP) to model selection. MCP statistical tests allow us to compare three or more groups of data.
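A minimal sketch of the kind of multiple-comparison ranking the abstract describes, assuming per-fold error scores for each candidate model; the models, the scores, and the choice of paired t-tests with a Bonferroni correction are illustrative assumptions, not the MCP procedure from the paper.

```python
# Illustrative sketch only: the candidate models, per-fold scores, and
# the paired-t-test-with-Bonferroni scheme are assumptions, not the
# paper's MCP procedure.
import numpy as np
from itertools import combinations
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
# Hypothetical error scores for three candidate models, one score per
# cross-validation fold (all models evaluated on the same splits).
scores = {
    "small": rng.normal(0.30, 0.02, 10),
    "medium": rng.normal(0.25, 0.02, 10),
    "large": rng.normal(0.26, 0.03, 10),
}

pairs = list(combinations(scores, 2))
alpha = 0.05 / len(pairs)  # Bonferroni-adjusted significance level
for a, b in pairs:
    p = ttest_rel(scores[a], scores[b]).pvalue  # paired: same folds
    verdict = "differ" if p < alpha else "statistically indistinguishable"
    print(f"{a} vs {b}: p = {p:.4f} -> {verdict}")
```

Such pairwise verdicts can then drive selection, for example by keeping the simplest model among those indistinguishable from the best scorer.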
One of the main research concerns in neural networks is finding the appropriate network size in order to minimize the trade-off between overfitting and poor approximation. In this paper, the choice among different competing models fitted to the same data set is addressed by applying statistical methods for model comparison. The study has been conducted to …
In statistical modelling, an investigator must often choose a suitable model among a collection of viable candidates. There is no consensus in the research community on how such a comparative study should be performed in a methodologically sound way. The ranking of several methods is usually performed by the use of a selection criterion, which assigns a score to …
This paper proposes a new complexity-penalization model selection strategy derived from the minimum risk principle and the behavior of candidate models under noisy conditions. This strategy seems to be robust in small-sample conditions and tends to the AIC criterion as the sample size grows. The simulation study at the end of the paper will show that the …
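For reference, a minimal sketch of the AIC score the proposed strategy is said to approach for large samples, in its common least-squares form AIC = n·ln(RSS/n) + 2k; the polynomial candidates and synthetic data are illustrative assumptions, not the paper's experiments.

```python
# Least-squares form of AIC; the polynomial candidates and synthetic
# data are illustrative assumptions.
import numpy as np

def aic(rss: float, n: int, k: int) -> float:
    """AIC = n * ln(RSS / n) + 2k; smaller is better."""
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 50)
y = np.sin(np.pi * x) + rng.normal(0, 0.2, 50)  # noisy target

for degree in (1, 3, 5, 9):
    coeffs = np.polyfit(x, y, degree)
    rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
    k = degree + 1  # number of fitted coefficients
    print(f"degree {degree}: AIC = {aic(rss, len(x), k):.2f}")
```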
In this paper we suggest an algorithm based on the Discrete Algebraic Reconstruction Technique (DART) which is capable of computing high-quality reconstructions from substantially fewer projections than required for conventional continuous tomography. Adaptive DART (ADART) goes a step further than DART in reducing the number of unknowns of the …
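To make the setting concrete, here is a heavily simplified sketch of the algebraic reconstruction step that DART-style methods alternate with a discretization step; the tiny 2×2 system, the Kaczmarz sweep, and the crude global thresholding are illustrative assumptions, and ADART's adaptive treatment of unknowns is not reproduced here.

```python
# Caricature of a DART-like loop: continuous algebraic reconstruction
# (a Kaczmarz sweep) alternated with snapping pixels to discrete gray
# levels. The toy system and thresholding rule are assumptions.
import numpy as np

def kaczmarz_sweep(A, b, x, relax=1.0):
    """One pass of Kaczmarz updates: project x onto each row's hyperplane."""
    for a_i, b_i in zip(A, b):
        norm2 = a_i @ a_i
        if norm2 > 0:
            x = x + relax * (b_i - a_i @ x) / norm2 * a_i
    return x

# Hypothetical 2x2 binary image (flattened) and its four projections.
truth = np.array([1.0, 0.0, 0.0, 0.0])
A = np.array([[1, 1, 0, 0],   # row sums
              [0, 0, 1, 1],
              [1, 0, 1, 0],   # column sums
              [0, 1, 0, 1]], dtype=float)
b = A @ truth

x = np.zeros(4)
for _ in range(20):
    x = kaczmarz_sweep(A, b, x)
    x = np.where(x > 0.5, 1.0, 0.0)  # discretization: snap to gray levels
print(x)  # recovers the binary image [1, 0, 0, 0]
```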
Estimating prediction risk is important because it provides a way of computing the expected error of a model's predictions, but it is also an important tool for model selection. This paper presents an empirical comparison of model selection techniques based on prediction risk estimation, with particular reference to the structure of nonlinear …
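A minimal sketch of one common prediction-risk estimator, K-fold cross-validation; the polynomial candidates, the synthetic data, and the use of squared error are illustrative assumptions, not the techniques compared in the paper.

```python
# K-fold cross-validation estimate of prediction risk; the models and
# data are illustrative assumptions.
import numpy as np

def cv_prediction_risk(x, y, degree, k=5):
    """Average held-out squared error of a polynomial fit."""
    idx = np.arange(len(x))
    errors = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        coeffs = np.polyfit(x[train], y[train], degree)
        pred = np.polyval(coeffs, x[fold])
        errors.append(np.mean((pred - y[fold]) ** 2))
    return float(np.mean(errors))

rng = np.random.default_rng(2)
x = rng.uniform(-1, 1, 60)
y = x**2 + rng.normal(0, 0.1, 60)
for degree in (1, 2, 6):
    print(f"degree {degree}: estimated risk = {cv_prediction_risk(x, y, degree):.4f}")
```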
Complexity-penalization strategies are one way to decide on the most appropriate network size in order to address the trade-off between overfitted and underfitted models. In this paper we propose a new penalty term, derived from the behaviour of candidate models under noisy conditions, that seems to be much more robust against catastrophic overfitting errors …
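A minimal sketch of one way to quantify "behaviour under noisy conditions": refit each candidate on perturbed targets and measure how much the fitted values chase the injected noise; the perturbation scheme and candidates are illustrative assumptions, not the penalty term proposed in the paper.

```python
# Noise-sensitivity probe: refit on perturbed targets and measure the
# shift in fitted values. Illustrative assumption, not the paper's
# penalty term.
import numpy as np

def noise_sensitivity(x, y, degree, sigma=0.1, trials=50, seed=0):
    """Mean squared shift in fitted values when targets are perturbed."""
    rng = np.random.default_rng(seed)
    base = np.polyval(np.polyfit(x, y, degree), x)
    shifts = []
    for _ in range(trials):
        y_noisy = y + rng.normal(0, sigma, len(y))
        refit = np.polyval(np.polyfit(x, y_noisy, degree), x)
        shifts.append(np.mean((refit - base) ** 2))
    return float(np.mean(shifts))

rng = np.random.default_rng(3)
x = rng.uniform(-1, 1, 40)
y = np.sin(2 * x) + rng.normal(0, 0.1, 40)
for degree in (1, 3, 9):
    print(f"degree {degree}: noise sensitivity = {noise_sensitivity(x, y, degree):.4f}")
```

More flexible models chase the injected noise more, so this sensitivity grows with model complexity and can serve as a data-driven penalty.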
One of the most important difficulties in using neural networks for real-world problems is the issue of model complexity and how it affects generalization performance. We present a new algorithm based on multiple comparison methods for finding low-complexity neural networks with high generalization capability.
In this paper we describe a new penalty-based model selection criterion for nonlinear models which is based on the influence of the noise on the fitting. Following Occam's razor, we should prefer simpler models over complex ones and optimize the trade-off between model complexity and the accuracy of the model's description of the training data. An empirical …