- Dana Angluin, Philip D. Laird
- Machine Learning
- 1987

The basic question addressed in this paper is: how can a learning algorithm cope with incorrect training examples? Specifically, how can algorithms that produce an “approximately correct” identification with “high probability” for reliable data be adapted to handle noisy data? We show that when the teacher may make independent random errors in classifying…
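The adaptation the abstract alludes to can be illustrated with a toy experiment: when labels are flipped independently with probability below 1/2, the hypothesis that minimizes empirical disagreements with the noisy sample still tends toward the target. The sketch below is an illustrative construction, not the paper's algorithm; the threshold concept class, the noise rate, and all function names are assumptions for the example.

```python
import random

def sample_noisy(n, true_threshold=0.6, noise=0.2, rng=random.Random(0)):
    """Draw points uniformly in [0, 1]; the target concept labels x >= threshold
    as positive. Each label is independently flipped with probability `noise`,
    modeling a teacher who makes independent random classification errors."""
    data = []
    for _ in range(n):
        x = rng.random()
        y = x >= true_threshold
        if rng.random() < noise:
            y = not y  # independent random error
        data.append((x, y))
    return data

def minimize_disagreements(data):
    """Return the threshold hypothesis that disagrees with the fewest examples.
    With noise rate below 1/2 and enough samples, this recovers an approximately
    correct hypothesis with high probability despite the corrupted labels."""
    candidates = sorted(x for x, _ in data)
    def disagreements(t):
        return sum((x >= t) != y for x, y in data)
    return min(candidates, key=disagreements)

data = sample_noisy(2000)
h = minimize_disagreements(data)
```

On 2000 noisy examples the learned threshold lands close to the true value of 0.6, even though one label in five is wrong.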

- Philip D. Laird
- Machine Learning
- 1992

Learning from experience to predict sequences of discrete symbols is a fundamental problem in machine learning with many applications. We present a simple and practical algorithm (TDAG) for discrete sequence prediction. Based on a text-compression method, the TDAG algorithm limits the growth of storage by retaining the most likely prediction contexts and…
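The core idea, predicting the next symbol from counts kept for recent contexts, can be sketched in a few lines. This is a simplified illustration, not the TDAG algorithm itself: a fixed maximum context depth stands in for TDAG's likelihood-based retention of contexts, and all class and method names are invented for the example.

```python
from collections import defaultdict

class ContextPredictor:
    """Toy context-based sequence predictor (a sketch in the spirit of TDAG).
    Counts which symbol follows each recent context, up to a fixed depth,
    and predicts using the longest context with statistics."""
    def __init__(self, max_depth=3):
        self.max_depth = max_depth
        self.counts = defaultdict(lambda: defaultdict(int))  # context -> symbol -> count
        self.history = []

    def update(self, symbol):
        # Record `symbol` as the successor of every suffix context of the history.
        for d in range(1, self.max_depth + 1):
            if len(self.history) >= d:
                ctx = tuple(self.history[-d:])
                self.counts[ctx][symbol] += 1
        self.history.append(symbol)

    def predict(self):
        # Prefer the longest matching context; fall back to shorter ones.
        for d in range(min(self.max_depth, len(self.history)), 0, -1):
            ctx = tuple(self.history[-d:])
            if ctx in self.counts:
                stats = self.counts[ctx]
                return max(stats, key=stats.get)
        return None

p = ContextPredictor()
for s in "abcabcabc":
    p.update(s)
# The most recent context ends in 'c', which has always been followed by 'a'.
```

Bounding the depth (and, in TDAG, pruning unlikely contexts) is what keeps storage growth under control as the sequence lengthens.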

- Philip D. Laird, Evan Gamble
- AAAI
- 1990

We show that the familiar explanation-based generalization (EBG) procedure is applicable to a large family of programming languages, including three families of importance to AI: logic programming (such as Prolog); lambda calculus (such as LISP); and combinator languages (such as FP). The main application of this result is to extend the algorithm to domains…

- Philip D. Laird, Evan Gamble
- ALT
- 1990

- Philip D. Laird
- COLT
- 1988

The efficiency of learning from unclassified data (unsupervised learning) is examined by constructing a framework similar in style to the recent work on supervised concept learning inspired by Valiant. We define the framework and illustrate it with results on three model classes. The framework is compared to both the supervised learnability model and other…

- Philip D. Laird
- AAAI
- 1986

- Philip D. Laird, Ronald Saul
- International Conference on Evolutionary…
- 1994

- Philip D. Laird, Ronald Saul, Peter Dunning
- COLT
- 1993

We study sequence extrapolation as an abstract learning problem. The task is to learn a stream, a semi-infinite sequence (s1, s2, . . . , sn, . . .) of values all of the same data type, from a finite initial segment (s1, s2, . . . , sn). We assume that all elements of the stream are of the same type (e.g., integers, strings, etc.). In order to represent the…
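A concrete instance of extrapolating a stream from a finite initial segment: for integer streams, one simple hypothesis class is "some level of finite differences is constant". The sketch below is a toy example of that class only; the paper's framework is far more general, and the function name and approach are assumptions for illustration.

```python
def extrapolate(segment, steps=3):
    """Extend an integer stream by assuming some level of its finite
    differences is constant (e.g., polynomials of low degree)."""
    # Build the difference table until a constant row appears.
    table = [list(segment)]
    while len(set(table[-1])) > 1:
        row = table[-1]
        table.append([b - a for a, b in zip(row, row[1:])])
        if not table[-1]:
            raise ValueError("no constant difference level found")
    out = list(segment)
    for _ in range(steps):
        # Propagate one new value up through the difference table.
        table[-1].append(table[-1][-1])
        for level in range(len(table) - 2, -1, -1):
            table[level].append(table[level][-1] + table[level + 1][-1])
        out.append(table[0][-1])
    return out
```

For the initial segment 1, 4, 9, 16 the second differences are constant, so the hypothesis extrapolates the stream of squares: 25, 36, and so on.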

We describe how the TDAG algorithm for learning to predict symbol sequences can be used to design a predictive cache store. A model of a two-level mass storage system is developed and used to calculate the performance of the cache under various conditions. Experimental simulations provide good confirmation of the model.
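The idea of a predictive cache can be illustrated with a toy two-level model: a predictor guesses the next block from the current one, and the cache prefetches the guess after each access. This sketch is not the paper's model; it substitutes a first-order (single-block-context) predictor for TDAG, and all names and parameters are assumptions.

```python
from collections import defaultdict

def simulate_prefetch_cache(accesses, cache_size=2):
    """Toy simulation of a predictive cache over a two-level store.
    A first-order predictor counts which block most often follows each block,
    and the cache (LRU) prefetches that guess after every access."""
    follower = defaultdict(lambda: defaultdict(int))  # block -> next block -> count
    cache = []  # LRU order: most recently used at the end
    hits = 0
    prev = None

    def touch(block):
        if block in cache:
            cache.remove(block)
        cache.append(block)
        while len(cache) > cache_size:
            cache.pop(0)  # evict least recently used

    for block in accesses:
        if block in cache:
            hits += 1
        touch(block)
        if prev is not None:
            follower[prev][block] += 1
        # Prefetch the block most often observed to follow this one.
        if follower[block]:
            guess = max(follower[block], key=follower[block].get)
            touch(guess)
        prev = block
    return hits / len(accesses)

rate = simulate_prefetch_cache([1, 2, 3] * 50, cache_size=2)
```

On the cyclic reference string 1, 2, 3, … a plain LRU cache of size 2 never hits, while the predictive cache reaches a high hit rate once the successor counts warm up.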

Analytical learning is a set of machine-learning techniques for revising the representation of a theory based on a small set of examples of that theory. When the representation of the theory is correct and complete but perhaps inefficient, an important objective of such analysis is to improve the computational efficiency of the representation. Several…