
A new regression technique based on Vapnik’s concept of support vectors is introduced. We compare support vector regression (SVR) with a committee regression technique (bagging) based on regression trees, and with ridge regression performed in feature space. On the basis of these experiments, it is expected that SVR will have advantages in high-dimensional spaces…

- Harris Drucker, Donghui Wu, Vladimir Vapnik
- IEEE Trans. Neural Networks
- 1999

We study the use of support vector machines (SVMs) in classifying e-mail as spam or nonspam by comparing them to three other classification algorithms: Ripper, Rocchio, and boosted decision trees. These four algorithms were tested on two different data sets: one where the number of features was constrained to the 1000 best features, and another…
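The "1000 best features, then an SVM" pipeline described above can be sketched as follows. The corpus, labels, and `k` value here are toy assumptions purely for illustration; the paper's feature-scoring method is not reproduced, and mutual information is used as a stand-in selection criterion.

```python
# Illustrative sketch: linear SVM spam classifier on a bag-of-words
# representation restricted to the k best features (toy data; the
# paper constrained to the 1000 best features on real corpora).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

emails = ["win money now", "meeting at noon", "cheap pills online",
          "lunch tomorrow?", "free prize claim now", "project status update"]
labels = [1, 0, 1, 0, 1, 0]   # 1 = spam, 0 = nonspam

clf = make_pipeline(
    CountVectorizer(),                        # word-count features
    SelectKBest(mutual_info_classif, k=5),    # keep the k best features
    LinearSVC(),                              # linear-kernel SVM
)
clf.fit(emails, labels)
print(clf.predict(["claim your free money"]))
```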

- Yann LeCun, Larry D. Jackel, +5 authors Isabelle Guyon
- 1995

This paper compares the performance of several classifier algorithms on a standard database of handwritten digits. We consider not only raw accuracy, but also rejection, training time, recognition time, and memory requirements.

- Harris Drucker
- ICML
- 1997

In the regression context, boosting and bagging are techniques to build a committee of regressors that may be superior to a single regressor. We use regression trees as fundamental building blocks in bagging committee machines and boosting committee machines. Performance is analyzed on three non-linear functions and the Boston housing database. In all…
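A minimal sketch of that comparison, with regression trees as the building blocks of both committees, might look like the following. The synthetic target function and all parameter values are assumptions for illustration; the paper's three functions and the Boston housing setup are not reproduced here.

```python
# Hedged sketch: single regression tree vs. bagging and boosting
# committees built from regression trees (synthetic nonlinear data).
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import BaggingRegressor, AdaBoostRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.RandomState(1)
X = rng.uniform(0, 1, size=(400, 3))
y = 10 * np.sin(np.pi * X[:, 0] * X[:, 1]) + 5 * X[:, 2] + rng.randn(400)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

single = DecisionTreeRegressor(max_depth=4).fit(X_tr, y_tr)
bag = BaggingRegressor(DecisionTreeRegressor(max_depth=4),
                       n_estimators=50, random_state=1).fit(X_tr, y_tr)
boost = AdaBoostRegressor(DecisionTreeRegressor(max_depth=4),
                          n_estimators=50, random_state=1).fit(X_tr, y_tr)

for name, model in [("single tree", single), ("bagging", bag),
                    ("boosting", boost)]:
    print(name, mean_squared_error(y_te, model.predict(X_te)))
```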

This paper compares the performance of several classifier algorithms on a standard database of handwritten digits. We consider not only raw accuracy, but also training time, recognition time, and memory requirements. When available, we report measurements of the fraction of patterns that must be rejected so that the remaining patterns have misclassification…

- Harris Drucker, Corinna Cortes
- NIPS
- 1995

A new boosting algorithm of Freund and Schapire is used to improve the performance of decision trees which are constructed using the information ratio criterion of Quinlan’s C4.5 algorithm. This boosting algorithm iteratively constructs a series of decision trees, each decision tree being trained and pruned on examples that have been filtered by previously…
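The scheme above can be sketched with AdaBoost over decision trees. Scikit-learn's entropy criterion is used here as a stand-in for C4.5's information-ratio splitting (it is not Quinlan's implementation), and the digits dataset, depth, and estimator count are illustrative assumptions.

```python
# Illustrative sketch: AdaBoost over entropy-criterion decision trees,
# standing in for boosted C4.5 trees (not Quinlan's C4.5 itself).
from sklearn.datasets import load_digits
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ada = AdaBoostClassifier(
    DecisionTreeClassifier(criterion="entropy", max_depth=3),
    n_estimators=50, random_state=0,
).fit(X_tr, y_tr)

print("test accuracy:", ada.score(X_te, y_te))
```

Each round reweights the training examples so the next tree concentrates on those the previous trees misclassified, which mirrors the "trained on examples filtered by previously constructed trees" description above.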

- Harris Drucker, Robert E. Schapire, Patrice Y. Simard
- IJPRAI
- 1993

- Harris Drucker, Yann LeCun
- IEEE Trans. Neural Networks
- 1992

In order to generalize from a training set to a test set, it is desirable that small changes in the input space of a pattern do not change the output components. This behavior can be enforced as part of the training algorithm: double backpropagation forms an energy function that is the sum of the normal energy term found in…
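The energy function sketched above can be written as follows (the notation here is assumed for illustration, not quoted from the paper):

```latex
E \;=\; \sum_{p} \Bigg[ E^{p}_{\mathrm{bp}} \;+\; \lambda \sum_{i} \left( \frac{\partial E^{p}_{\mathrm{bp}}}{\partial x_i} \right)^{2} \Bigg]
```

where \(E^{p}_{\mathrm{bp}}\) is the ordinary backpropagation error for training pattern \(p\), \(x_i\) are the components of the input, and \(\lambda\) weights the penalty on input-space sensitivity. Minimizing the second term requires propagating gradients backward a second time, hence the name "double backpropagation".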

- Harris Drucker, Corinna Cortes, Lawrence D. Jackel, Yann LeCun, Vladimir Vapnik
- Neural Computation
- 1994

Patrice Simard, AT&T Bell Laboratories, Holmdel, NJ 07733

A boosting algorithm converts a learning machine with error rate less than 50% to one with an arbitrarily low error rate. However, the algorithm discussed here depends on having a large supply of independent training samples. We show how to circumvent this problem and generate an ensemble of learning…