Scott E. Decatur

We consider the problem of determining the three-dimensional folding of a protein given its one-dimensional amino acid sequence. We use the HP model for protein folding proposed by Dill (1985), which models a protein as a chain of amino acid residues that are either hydrophobic or polar, and in which hydrophobic interactions are the dominant initial driving force for …
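The HP lattice model described above can be made concrete with a minimal sketch (illustrative only; the function name and 2D-lattice restriction are assumptions, not the paper's formulation): a protein is a string over {H, P}, a fold is a self-avoiding walk on the integer lattice, and the energy scores −1 for each hydrophobic–hydrophobic contact between residues that are lattice neighbors but not adjacent in the chain.

```python
def hp_energy(sequence, coords):
    """Energy of an HP-model fold on the 2D lattice.

    sequence: string over 'H'/'P'; coords: list of (x, y) lattice points,
    one per residue, forming a self-avoiding walk.
    """
    assert len(sequence) == len(coords)
    assert len(set(coords)) == len(coords), "fold must be self-avoiding"
    energy = 0
    for i in range(len(coords)):
        for j in range(i + 2, len(coords)):  # skip chain-adjacent pairs
            (x1, y1), (x2, y2) = coords[i], coords[j]
            if abs(x1 - x2) + abs(y1 - y2) == 1:  # lattice neighbors
                if sequence[i] == 'H' and sequence[j] == 'H':
                    energy -= 1
    return energy

# A U-shaped 4-residue fold that brings the two terminal H's into contact:
print(hp_energy("HPPH", [(0, 0), (0, 1), (1, 1), (1, 0)]))  # -1
```

Finding the minimum-energy fold under this objective is the combinatorial problem the paper studies.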
In this paper we study learning in the PAC model of Valiant [18] in which the example oracle used for learning may be faulty in one of two ways: either by misclassifying the example or by distorting the distribution of examples. We first consider models in which examples are misclassified. Kearns [12] recently showed that efficient learning in a new model …
1 Introduction In a variety of PAC learning models, a tradeoff between time and information seems to exist: with unlimited time, a small amount of information suffices, but with time restrictions, more information sometimes seems to be required. In addition, it has long been known that there are concept classes that can be learned in the absence of …
A recent innovation in computational learning theory is the statistical query (SQ) model. The advantage of specifying learning algorithms in this model is that SQ algorithms can be simulated in the PAC model, both in the absence and in the presence of noise. However, simulations of SQ algorithms in the PAC model have non-optimal time and sample …
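The simulation mentioned above can be sketched as follows (a minimal illustration, not the paper's construction; the names `simulate_sq` and `oracle` are assumptions): an SQ asks for E[χ(x, f(x))] to within a tolerance τ, and a PAC example oracle can answer it by averaging χ over enough labeled examples, with the sample size set by a Hoeffding bound.

```python
import math
import random

def simulate_sq(chi, oracle, tau, delta=0.05):
    """Estimate E[chi(x, label)] to within tau (w.p. 1 - delta) by sampling.

    chi: {0,1}-valued query function of (example, label);
    oracle: returns one labeled example (x, f(x)) per call.
    """
    m = math.ceil(math.log(2 / delta) / (2 * tau ** 2))  # Hoeffding bound
    return sum(chi(*oracle()) for _ in range(m)) / m

# Toy example: uniform random bits labeled by the identity function;
# the query asks for the probability that the label is 1 (truth: 0.5).
rng = random.Random(0)
def oracle():
    x = rng.randint(0, 1)
    return (x, x)

est = simulate_sq(lambda x, y: int(y == 1), oracle, tau=0.05)
```

The inefficiency the abstract refers to comes precisely from this resampling: each query costs Θ(1/τ²) fresh examples in the naive simulation.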
  • S. E. Decatur
  • 1989 International Joint Conference on Neural…
  • 1989
Results of an investigation into the applicability of neural networks to the classification of radar terrain images are reported. The neural network approach is described and compared with the conventional technique of Bayesian classification with maximum-likelihood estimation. Performance that was previously thought to be optimal is shown to be improved …
In this paper, we further characterize the complexity of noise-tolerant learning in the PAC model. Specifically, we show a general lower bound of Ω(log(1/δ) / (ε(1−2η)²)) on the number of examples required for PAC learning in the presence of classification noise. Combined with a result of Simon, we effectively show that the sample complexity of PAC learning …
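To see how such a bound scales, the expression inside the Ω can be evaluated directly (a hedged illustration: the Ω hides constants, so the numbers below show scaling only, not exact sample sizes; the function name is ours):

```python
import math

def lower_bound_examples(eps, delta, eta):
    """Evaluate log(1/delta) / (eps * (1 - 2*eta)**2), the expression
    inside the Omega(...) lower bound for PAC learning with
    classification noise rate eta (constants omitted)."""
    return math.log(1 / delta) / (eps * (1 - 2 * eta) ** 2)

# With accuracy eps = 0.1 and confidence delta = 0.01:
print(round(lower_bound_examples(0.1, 0.01, 0.0)))  # 46   (noise-free)
print(round(lower_bound_examples(0.1, 0.01, 0.4)))  # 1151 (eta near 1/2)
```

The blow-up as η approaches 1/2 reflects that noisy labels carry ever less information about the target concept.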
We derive general bounds on the complexity of learning in the Statistical Query (SQ) model and in the PAC model with classification noise. We do so by considering the problem of boosting the accuracy of weak learning algorithms which fall within the SQ model. This new model was introduced by Kearns to provide a general framework for efficient PAC learning in …