
We consider the problem of determining the three-dimensional folding of a protein given its one-dimensional amino acid sequence. We use the HP model for protein folding proposed by Dill (1985), which models a protein as a chain of amino acid residues that are either hydrophobic or polar, and in which hydrophobic interactions are the dominant initial driving force for…
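As a rough illustration (not taken from the paper above), the HP model's scoring rule can be sketched in a few lines: a folding is a self-avoiding walk on a lattice, and its energy is minus the number of hydrophobic (H) residue pairs that are adjacent on the lattice but not consecutive in the chain. The function name and 2D-lattice encoding below are my own assumptions for the sketch.

```python
def hp_energy(seq, coords):
    """Energy of an HP-model folding on a 2D square lattice.

    seq    -- string over {'H', 'P'}, one character per residue
    coords -- list of (x, y) lattice points, coords[i] is residue i's
              position; assumed to be a self-avoiding walk
    Energy = -(number of H-H contacts between non-consecutive residues).
    """
    pos = {c: i for i, c in enumerate(coords)}  # lattice point -> residue index
    contacts = 0
    for i, (x, y) in enumerate(coords):
        if seq[i] != 'H':
            continue
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            j = pos.get((x + dx, y + dy))
            # Count each contact once (j > i + 1 skips chain neighbors).
            if j is not None and j > i + 1 and seq[j] == 'H':
                contacts += 1
    return -contacts

# Example: "HPPH" folded into a unit square brings the two H's into contact.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(hp_energy("HPPH", square))
```

The folding problem is then to search over self-avoiding walks for the coordinates minimizing this energy.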

We investigate learnability in the PAC model when the data used for learning, attributes and labels, is either corrupted or incomplete. In order to prove our main results, we define a new complexity measure on statistical query (SQ) learning algorithms. The view of an SQ algorithm is the maximum, over all queries in the algorithm, of the number of input…

In this paper we study learning in the PAC model of Valiant [18] in which the example oracle used for learning may be faulty in one of two ways: either by misclassifying the example or by distorting the distribution of examples. We first consider models in which examples are misclassified. Kearns [12] recently showed that efficient learning in a new model using…

A recent innovation in computational learning theory is the statistical query (SQ) model. The advantage of specifying learning algorithms in this model is that SQ algorithms can be simulated in the PAC model, both in the absence and in the presence of noise. However, simulations of SQ algorithms in the PAC model have non-optimal time and sample…
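The noise-tolerant simulation mentioned above rests on a simple identity that is worth seeing concretely: if labels are flipped independently with known rate η < 1/2, the observed disagreement rate p_obs between a hypothesis and the noisy labels satisfies p_obs = η + (1 − 2η)·p_true, so the true rate can be recovered as (p_obs − η)/(1 − 2η). The sketch below (my own minimal illustration, not code from any of these papers) estimates a hypothesis's true error from noisy examples this way.

```python
import random

def noisy_error_estimate(h, target, draw_x, eta, n):
    """Estimate Pr[h(x) != target(x)] from n examples whose labels are
    flipped independently with known probability eta < 1/2.

    h, target -- functions mapping an example to a {0, 1} label
    draw_x    -- draws one example from the underlying distribution
    """
    disagree = 0
    for _ in range(n):
        x = draw_x()
        label = target(x)
        if random.random() < eta:   # classification noise: flip the label
            label = 1 - label
        if h(x) != label:
            disagree += 1
    p_obs = disagree / n
    # Invert p_obs = eta + (1 - 2*eta) * p_true to undo the noise.
    return (p_obs - eta) / (1 - 2 * eta)

# Example: a perfect hypothesis under 30% label noise should estimate
# an error rate near 0, even though it disagrees with ~30% of labels.
random.seed(0)
est = noisy_error_estimate(lambda x: 0, lambda x: 0, lambda: 0, 0.3, 100_000)
print(round(est, 3))
```

The sample-size blow-up of this correction, roughly a 1/(1 − 2η)² factor, is exactly the kind of overhead the non-optimal simulations referred to above incur.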

In a variety of PAC learning models, a tradeoff between time and information seems to exist: with unlimited time, a small amount of information suffices, but with time restrictions, more information sometimes seems to be required. In addition, it has long been known that there are concept classes that can be learned in the absence of computational…

In this paper, we further characterize the complexity of noise-tolerant learning in the PAC model. Specifically, we show a general lower bound of Ω(log(1/δ) / (ε(1 − 2η)²)) on the number of examples required for PAC learning in the presence of classification noise. Combined with a result of Simon, we effectively show that the sample complexity of PAC learning in the…

We consider formal models of learning from noisy data. Specifically, we focus on learning in the probably approximately correct model as defined by Valiant. Two of the most widely studied models of noise in this setting have been classification noise and malicious errors. However, a more realistic model combining the two types of noise has not been…

We derive general bounds on the complexity of learning in the Statistical Query model and in the PAC model with classification noise. We do so by considering the problem of boosting the accuracy of weak learning algorithms which fall within the Statistical Query model. This new model was introduced by Kearns [12] to provide a general framework for efficient…