We show that the familiar explanation-based generalization (EBG) procedure is applicable to a large family of programming languages, including three families of importance to AI: logic programming (such as Prolog); lambda calculus (such as LISP); and combinator languages (such as FP). The main application of this result is to extend the algorithm to…
We study sequence extrapolation as an abstract learning problem. The task is to learn a stream, a semi-infinite sequence of values all of the same data type (e.g., integers or strings), from a finite initial segment (s1, s2, ..., sn). In order to represent the hypotheses, we define a language…
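The extrapolation task can be made concrete with a minimal sketch. The hypothesis language below (just arithmetic and geometric progressions) is an illustrative assumption, far smaller than the language the abstract refers to; it only shows the shape of the problem: find a hypothesis consistent with the prefix, then predict the next element.

```python
def extrapolate(prefix):
    """Guess the next element of a stream from a finite prefix (s1..sn).

    A minimal sketch: try two simple hypothesis families, arithmetic
    (constant difference) and geometric (constant ratio), and use the
    first one that explains the entire prefix.
    """
    if len(prefix) < 2:
        raise ValueError("need at least two elements")
    # Hypothesis 1: arithmetic progression (constant difference).
    d = prefix[1] - prefix[0]
    if all(b - a == d for a, b in zip(prefix, prefix[1:])):
        return prefix[-1] + d
    # Hypothesis 2: geometric progression (constant ratio).
    if all(x != 0 for x in prefix):
        r = prefix[1] / prefix[0]
        if all(b / a == r for a, b in zip(prefix, prefix[1:])):
            return prefix[-1] * r
    return None  # no hypothesis in this tiny language fits the prefix
```

For example, `extrapolate([2, 4, 6, 8])` yields 10, while a prefix that neither family explains yields `None`.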
A model is presented for the class of inductive inference problems that are solved by refinement algorithms, that is, algorithms that modify a hypothesis by making it more general or more specific in response to examples. The separate effects of the syntax (rule space) and semantics, and the relevant orderings on these, are precisely specified. Relations…
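The two refinement moves the abstract mentions can be illustrated over a deliberately tiny rule space. The interval hypothesis language and the tie-breaking rule below are illustrative assumptions, not the paper's model: a positive example outside the hypothesis triggers generalization, a negative example inside it triggers specialization.

```python
def refine(hyp, example, label):
    """One refinement step over integer-interval hypotheses (lo, hi).

    Sketch of the general/specific moves: a positive example outside
    the interval generalizes it (widen to cover the example); a negative
    example inside it specializes it (shrink the nearer endpoint past
    the example). Tie-breaking toward the upper endpoint is arbitrary.
    """
    lo, hi = hyp
    x = example
    if label:                        # positive: generalize if uncovered
        return (min(lo, x), max(hi, x))
    if lo <= x <= hi:                # negative: specialize to exclude x
        if x - lo < hi - x:
            return (x + 1, hi)
        return (lo, x - 1)
    return (lo, hi)                  # already consistent; no change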
We describe how the TDAG algorithm for learning to predict symbol sequences can be used to design a predictive cache store. A model of a two-level mass storage system is developed and used to calculate the performance of the cache under various conditions. Experimental simulations provide good confirmation of the model.
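The cache-design idea can be sketched without reproducing TDAG itself. The class below is a hypothetical stand-in that uses a first-order (single-context) successor predictor with FIFO eviction, whereas TDAG tracks variable-length contexts; it only illustrates how sequence prediction drives prefetching into a cache.

```python
from collections import Counter, defaultdict

class PredictiveCache:
    """Sketch of a predictive cache store: after each block access,
    prefetch the block that most often followed the current one.
    First-order Markov prediction and FIFO eviction are simplifying
    assumptions; the paper's predictor is TDAG."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = []                          # FIFO eviction order
        self.successors = defaultdict(Counter)   # block -> next-block counts
        self.prev = None
        self.hits = self.misses = 0

    def access(self, block):
        if block in self.cache:
            self.hits += 1
        else:
            self.misses += 1
            self._insert(block)
        if self.prev is not None:                # record the transition
            self.successors[self.prev][block] += 1
        if self.successors[block]:
            # prefetch the most frequent historical successor
            self._insert(self.successors[block].most_common(1)[0][0])
        self.prev = block

    def _insert(self, block):
        if block not in self.cache:
            if len(self.cache) >= self.capacity:
                self.cache.pop(0)                # evict oldest block
            self.cache.append(block)
```

On a repeating access pattern such as 1, 2, 3, 1, 2, 3, ... even a capacity-2 cache hits on almost every access after warm-up, because the predicted successor is prefetched before it is requested.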