
We consider the problem of determining the three-dimensional folding of a protein given its one-dimensional amino acid sequence. We use the HP model for protein folding proposed by Dill (1985), which models a protein as a chain of amino acid residues that are either hydrophobic or polar, with hydrophobic interactions as the dominant initial driving force for…
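The HP model's energy function can be sketched in a few lines. This is a minimal illustration on a 2D square lattice with unit contact energies; the lattice dimension and energy values are assumptions for the sketch, not details from the abstract.

```python
# Minimal HP-model energy scorer on a 2D square lattice.
# A folding is a self-avoiding walk: one lattice point per residue.
# Energy decreases by 1 for each pair of H residues that are not
# chain neighbors but occupy adjacent lattice sites (an H-H contact).

def hp_energy(sequence, coords):
    """sequence: string over {'H', 'P'}; coords: list of (x, y) lattice points."""
    assert len(sequence) == len(coords)
    assert len(set(coords)) == len(coords), "folding must be self-avoiding"
    energy = 0
    for i in range(len(sequence)):
        for j in range(i + 2, len(sequence)):  # skip chain neighbors
            if sequence[i] == 'H' and sequence[j] == 'H':
                (xi, yi), (xj, yj) = coords[i], coords[j]
                if abs(xi - xj) + abs(yi - yj) == 1:  # lattice adjacency
                    energy -= 1
    return energy

# A U-shaped fold that brings the two end H residues into contact:
fold = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(hp_energy("HPPH", fold))  # -1: one H-H contact
```

Finding the fold that minimizes this energy over all self-avoiding walks is the hard combinatorial problem the abstract refers to.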

We investigate learnability in the PAC model when the data used for learning, attributes and labels, is either corrupted or incomplete. In order to prove our main results, we define a new complexity measure on statistical query (SQ) learning algorithms. The view of an SQ algorithm is the maximum, over all queries in the algorithm, of the number of input…

In this paper we study learning in the PAC model of Valiant [18] in which the example oracle used for learning may be faulty in one of two ways: either by misclassifying the example or by distorting the distribution of examples. We first consider models in which examples are misclassified. Kearns [12] recently showed that efficient learning in a new model using…
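The two faulty oracles described here can be sketched as wrappers around a clean example oracle. The rate parameters and function names below are illustrative, not taken from the paper.

```python
import random

def noisy_oracle(clean_oracle, eta):
    """Classification noise: flip the label with probability eta."""
    def draw():
        x, y = clean_oracle()
        if random.random() < eta:
            y = 1 - y  # assumes binary labels in {0, 1}
        return x, y
    return draw

def malicious_oracle(clean_oracle, beta, adversary):
    """Malicious errors: with probability beta an adversary replaces
    the entire example (both point and label) arbitrarily."""
    def draw():
        if random.random() < beta:
            return adversary()
        return clean_oracle()
    return draw
```

The key difference is that classification noise corrupts only the label, independently at random, while a malicious error may distort the distribution of examples as well.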

A recent innovation in computational learning theory is the statistical query (SQ) model. The advantage of specifying learning algorithms in this model is that SQ algorithms can be simulated in the PAC model, both in the absence and in the presence of noise. However, simulations of SQ algorithms in the PAC model have non-optimal time and sample…
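The basic PAC simulation of a single statistical query is empirical averaging over examples drawn from the oracle. A toy sketch follows; the sample-size constant and all names are illustrative assumptions, not the paper's construction.

```python
import random

def simulate_sq(query, oracle, tolerance, samples=None):
    """Estimate E[query(x, label)] to within ~tolerance by averaging
    over examples drawn from a PAC example oracle.
    By a Chernoff bound, O(1/tolerance^2) samples suffice w.h.p."""
    n = samples or int(4 / tolerance ** 2)
    return sum(query(*oracle()) for _ in range(n)) / n

# Toy target: the label equals the first bit of x, so a query asking
# how often bit 0 agrees with the label should estimate 1.
def oracle():
    x = [random.randint(0, 1) for _ in range(3)]
    return x, x[0]

est = simulate_sq(lambda x, y: 1 if x[0] == y else 0, oracle, 0.1)
print(est)  # 1.0: the bit always agrees with the label
```

The non-optimality the abstract mentions comes from paying this per-query sampling cost for every query the SQ algorithm makes.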

In a variety of PAC learning models, a tradeoff between time and information seems to exist: with unlimited time, a small amount of information suffices, but with time restrictions, more information sometimes seems to be required. In addition, it has long been known that there are concept classes that can be learned in the absence of computational…

In this paper, we further characterize the complexity of noise-tolerant learning in the PAC model. Specifically, we show a general lower bound of Ω(log(1/δ) / (ε(1−2η)²)) on the number of examples required for PAC learning in the presence of classification noise. Combined with a result of Simon, we effectively show that the sample complexity of PAC learning in the…

We consider formal models of learning from noisy data. Specifically, we focus on learning in the probably approximately correct model as defined by Valiant. Two of the most widely studied models of noise in this setting have been classification noise and malicious errors. However, a more realistic model combining the two types of noise has not been…

Although many learning problems can be reduced to learning Boolean functions, in many cases a more efficient learning algorithm can be derived when the problem is considered over a larger domain. In this paper we give a natural generalization of DNF formulas: Z_N-DNF formulas over the ring of integers modulo N. We first show using elementary number theory…

In the course of research in Computational Learning Theory, we found ourselves in need of an error-correcting encoding scheme for which few bits in the codeword yield no information about the plain message. Being unaware of a previous solution, we came up with the scheme presented here. Since this scheme may be of interest to people working in Cryptography,…