
- Tom Bylander
- Artif. Intell.
- 1994

I present several computational complexity results for propositional STRIPS planning, i.e., STRIPS planning restricted to ground formulas. Different planning problems can be defined by restricting the type of formulas, placing limits on the number of pre- and postconditions, by restricting negation in pre- and postconditions, and by requiring optimal plans. For… (More)
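The restrictions the abstract describes can be made concrete with a minimal propositional STRIPS model: operators over ground literals, with positive/negative preconditions and add/delete postconditions. This is a generic sketch of the formalism, not code from the paper; the operator and atom names are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Operator:
    # Propositional STRIPS operator: all four components are sets of
    # ground atoms (no variables), matching the restricted model.
    pre_pos: frozenset   # atoms that must hold
    pre_neg: frozenset   # atoms that must not hold
    add: frozenset       # atoms made true (positive postconditions)
    delete: frozenset    # atoms made false (negative postconditions)

def applicable(state, op):
    """An operator applies when its positive preconditions hold in the
    state and none of its negative preconditions do."""
    return op.pre_pos <= state and not (op.pre_neg & state)

def apply_op(state, op):
    """STRIPS transition: remove the delete list, then add the add list."""
    return (state - op.delete) | op.add

def executes(state, plan, goal_pos):
    """Check that a candidate plan is executable and reaches the goal."""
    for op in plan:
        if not applicable(state, op):
            return False
        state = apply_op(state, op)
    return goal_pos <= state

# Toy instance with hypothetical atoms: stack a block that is on the table.
on_table, clear, on_stack = "on_table", "clear", "on_stack"
stack = Operator(frozenset({on_table, clear}), frozenset(),
                 frozenset({on_stack}), frozenset({on_table}))
print(executes({on_table, clear}, [stack], {on_stack}))  # True
```

Plan *verification* like this is easy; the paper's complexity results concern plan *existence* and optimality under restrictions on exactly these components (how many pre/postconditions, whether negation is allowed).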

- Tom Bylander
- IJCAI
- 1991

I describe several computational complexity results for planning, some of which identify tractable planning problems. The model of planning, called "propositional planning," is simple: conditions within operators are literals with no variables allowed. The different planning problems are defined by different restrictions on the preconditions and… (More)

- Tom Bylander, Dean Allemang, Michael C. Tanner, John R. Josephson
- Artif. Intell.
- 1991

The problem of abduction can be characterized as finding the best explanation of a set of data. In this paper we focus on one type of abduction in which the best explanation is the most plausible combination of hypotheses that explains all the data. We then present several computational complexity results demonstrating that this type of abduction is… (More)
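The notion of "best explanation" in the abstract can be illustrated with a brute-force parsimonious-abduction sketch: find the smallest set of hypotheses whose combined coverage explains all the data. The exhaustive search is exponential in the number of hypotheses, which is consistent with the hardness results the paper establishes; the diagnosis example below is hypothetical.

```python
from itertools import combinations

def best_explanation(data, explains):
    """Smallest-cardinality set of hypotheses covering all data.
    `explains` maps each hypothesis to the set of data it accounts for.
    Brute force: try candidate sets in order of increasing size."""
    hyps = list(explains)
    for k in range(len(hyps) + 1):
        for combo in combinations(hyps, k):
            covered = set().union(*(explains[h] for h in combo)) if combo else set()
            if data <= covered:
                return set(combo)
    return None  # no combination explains all the data

# Hypothetical diagnosis instance.
explains = {"flu": {"fever", "ache"},
            "cold": {"cough"},
            "allergy": {"cough", "itch"}}
print(best_explanation({"fever", "cough"}, explains))
# a smallest covering set, here the pair of 'flu' and 'cold'
```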

- Tom Bylander, Sanjay Mittal
- AI Magazine
- 1986

The ability to map the state of an object into a category in a classification hierarchy has long been an important part of many fields, for example, biology and medicine. Recently, AI … languages is transforming AI theories into symbolic structures. This pattern can be seen in knowledge representation (for example, semantic nets and KL-ONE [Brachman… (More)

- Tom Bylander
- Artif. Intell.
- 1996

I present a probabilistic analysis of propositional STRIPS planning. The analysis considers two assumptions. One is that each possible precondition (likewise postcondition) of an operator is selected independently of other pre- and postconditions. The other is that each operator has a fixed number of preconditions (likewise postconditions). Under both… (More)

- Tom Bylander, B. Chandrasekaran
- International Journal of Man-Machine Studies
- 1987

- Tom Bylander
- J. Exp. Theor. Artif. Intell.
- 1991

- Tom Bylander
- COLT
- 1997

This paper develops and analyzes a new online algorithm for learning linear functions, called the Binary Exponentiated Gradient algorithm (BEG). BEG imposes a lower and upper bound on all the weights. Using Kivinen and Warmuth's methodology, the BEG algorithm is developed from a binary entropy distance function and the square loss function, and… (More)
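For readers unfamiliar with the family BEG belongs to, here is a sketch of the standard exponentiated-gradient (EG) update of Kivinen and Warmuth for square loss: a multiplicative weight update followed by renormalization. This is the generic EG algorithm, not Bylander's bounded-weight BEG variant, and the toy learning problem is invented for illustration.

```python
import math

def eg_update(w, x, y, eta):
    """One EG step with square loss: multiply each weight by an
    exponential of its (negative) gradient component, then renormalize
    so the weights stay a probability vector."""
    y_hat = sum(wi * xi for wi, xi in zip(w, x))
    grad = 2.0 * (y_hat - y)  # d/dy_hat of (y_hat - y)^2
    w = [wi * math.exp(-eta * grad * xi) for wi, xi in zip(w, x)]
    z = sum(w)
    return [wi / z for wi in w]

# Toy target y = x[0]: EG should shift weight onto the informative feature.
w = [0.5, 0.5]
for _ in range(200):
    w = eg_update(w, [1.0, 0.0], 1.0, eta=0.5)
print(w[0] > 0.9)  # True
```

The multiplicative form is what gives EG-style algorithms their good bounds when few input features are relevant; BEG's contribution is handling weights constrained to an interval rather than the probability simplex.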

- David E. Hirsch, Sheldon R. Simon, Tom Bylander, Michael A. Weintraub, Peter Szolovits
- Applied Artificial Intelligence
- 1989

- Tom Bylander
- Machine Learning
- 2002

For two-class datasets, we provide a method for estimating the generalization error of a bag using out-of-bag estimates. In bagging, each predictor (single hypothesis) is learned from a bootstrap sample of the training examples; the output of a bag (a set of predictors) on an example is determined by voting. The out-of-bag estimate is based on recording the… (More)
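The out-of-bag mechanism the abstract describes can be sketched in a few lines: train each predictor on a bootstrap sample, then evaluate each training example only with the predictors whose bootstrap omitted it. The base learner below (a threshold stump on 1-D data) and the dataset are assumptions for illustration, not the paper's setup.

```python
import random

def learn_stump(sample):
    """Trivial base learner (hypothetical): threshold halfway between
    the class means of a 1-D two-class sample."""
    m0 = [x for x, y in sample if y == 0]
    m1 = [x for x, y in sample if y == 1]
    if not m0 or not m1:  # degenerate bootstrap: predict the only class seen
        c = 0 if m0 else 1
        return lambda x: c
    t = (sum(m0) / len(m0) + sum(m1) / len(m1)) / 2
    return lambda x: int(x > t)

def oob_error(data, n_bags=25, seed=0):
    """Bagging with the out-of-bag generalization estimate: each example
    is voted on only by the predictors that never saw it in training."""
    rng = random.Random(seed)
    n = len(data)
    bags = []
    for _ in range(n_bags):
        idx = [rng.randrange(n) for _ in range(n)]
        bags.append((set(idx), learn_stump([data[i] for i in idx])))
    wrong = tried = 0
    for i, (x, y) in enumerate(data):
        votes = [h(x) for used, h in bags if i not in used]
        if votes:  # out-of-bag for at least one predictor
            tried += 1
            wrong += int(round(sum(votes) / len(votes)) != y)
    return wrong / tried

# Separable toy set: class 0 near x=0..1, class 1 near x=2..3.
data = [(x / 10, 0) for x in range(10)] + [(x / 10 + 2, 1) for x in range(10)]
print(oob_error(data))  # 0.0 on this separable toy set
```

Because each example is in a given bootstrap with probability about 1 − 1/e ≈ 0.63, roughly a third of the predictors are "out of bag" for it, which is what makes the estimate nearly free to compute.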