A concrete statistical realization of Kleinberg's stochastic discrimination for pattern recognition. Part I. Two-class classification

@article{Chen2002ACS,
  title={A concrete statistical realization of Kleinberg's stochastic discrimination for pattern recognition. Part I. Two-class classification},
  author={Dechang Chen and Peng Huang and Xiuzhen Cheng},
  journal={Annals of Statistics},
  year={2003},
  volume={31},
  pages={1393--1412}
}
The method of stochastic discrimination (SD) introduced by Kleinberg is a new method in statistical pattern recognition. It works by producing many weak classifiers and then combining them to form a strong classifier. However, the strict mathematical assumptions in Kleinberg [The Annals of Statistics 24 (1996) 2319-2349] are rarely met in practice. This paper provides an applicable way to realize the SD algorithm. We recast SD in a probability-space framework and present a concrete statistical…
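As a rough illustration of the weak-classifier-combination idea described in the abstract, the following is a minimal two-class sketch in Python. The rectangular weak models, the enrichment threshold, and every name in it (random_box, sd_ensemble, and so on) are assumptions made for this toy example, not the specific construction given in the paper.

import numpy as np

def random_box(X, rng, min_frac=0.2):
    # One random hyper-rectangle ("weak model"): a random sub-interval per feature,
    # widened so it cannot be degenerate.
    lo, hi = X.min(axis=0), X.max(axis=0)
    a = rng.uniform(lo, hi)
    width = np.maximum(rng.uniform(0, hi - lo), min_frac * (hi - lo))
    return a, a + width

def membership(box, Q):
    # 1.0 for points inside the box, 0.0 outside.
    a, b = box
    return np.all((Q >= a) & (Q <= b), axis=1).astype(float)

def sd_ensemble(X, y, n_models=500, enrich=0.05, max_tries=100000, seed=0):
    # Keep only weak models whose coverage differs between the two classes
    # ("enriched" models); store the class-conditional coverage rates r1, r0.
    rng = np.random.default_rng(seed)
    models, tries = [], 0
    while len(models) < n_models and tries < max_tries:
        tries += 1
        box = random_box(X, rng)
        m = membership(box, X)
        r1, r0 = m[y == 1].mean(), m[y == 0].mean()
        if abs(r1 - r0) > enrich:
            models.append((box, r1, r0))
    return models

def sd_predict(models, Q):
    # Average the standardized per-model discriminants and threshold at 1/2:
    # (C_M(q) - r0) / (r1 - r0) has mean 1 over class 1 and mean 0 over class 0.
    scores = np.zeros(len(Q))
    for box, r1, r0 in models:
        scores += (membership(box, Q) - r0) / (r1 - r0)
    return (scores / len(models) > 0.5).astype(int)

# Toy usage: two Gaussian blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (100, 2)), rng.normal(2.0, 1.0, (100, 2))])
y = np.repeat([0, 1], 100)
models = sd_ensemble(X, y)
print((sd_predict(models, X) == y).mean())

Each retained box is a very weak classifier on its own; it is the averaging of many standardized discriminants that yields a usable decision rule.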
Citations

A Simple Implementation of the Stochastic Discrimination for Pattern Recognition
A simple algorithm of stochastic discrimination for two-class pattern recognition is presented, and the experimental results show that SD is fast, effective, and applicable.
On Kleinberg's Stochastic Discrimination Procedure
  • A. Irle, J. Kauschke
  • Mathematics, Medicine
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 2011
A new condition for high accuracy on test sets is given for the method of stochastic discrimination (SD), a pattern recognition method introduced by Kleinberg. This condition provides a simple…
A Statistical Look at Stochastic Discrimination
Stochastic discrimination (SD) has been shown to be a useful pattern recognition tool in the literature. A large number of experiments conducted indicate that SD has a low error rate. This paper…
On improvement of classification accuracy for stochastic discrimination
  • N. Zong, X. Hong
  • Computer Science, Medicine
  • IEEE Trans. Syst. Man Cybern. Part B
  • 2005
The proposed improved SD achieves higher classification accuracy than standard SD by exploiting the observation that the smaller the variance of the discriminant function, the lower the error rate of the classifier.
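The variance-error link stated above can be made concrete with a standard Chebyshev-type argument (a generic sketch, not that paper's derivation). Suppose the combined discriminant Y(q) has mean 1 on class-1 points and mean 0 on class-2 points, and a point is assigned to class 1 when Y(q) > 1/2. Then, for a class-1 point,

\[
P(\mathrm{error} \mid \mathrm{class}\ 1)
  = P\left(Y(q) \le \tfrac{1}{2}\right)
  \le P\left(\lvert Y(q) - 1 \rvert \ge \tfrac{1}{2}\right)
  \le 4\,\mathrm{Var}\left(Y(q)\right),
\]

so reducing the variance of the discriminant directly tightens the bound on the error rate.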
Ensemble Methods – Classifier Combination in Machine Learning
The last ten years have seen a research explosion in machine learning. The rapid growth is largely driven by the following two forces. First, separate research communities in symbolic machine…
Hyper-Rectangular and k-Nearest-Neighbor Models in Stochastic Discrimination
Analysis of simple examples shows that for high-dimensional data, parallel model generation with the nearest-neighbor approach is a favorable alternative to interval model generation with random manipulation of the feature subspaces (one possible reading of such a nearest-neighbor weak model is sketched below).
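Sketched here is one possible reading of the nearest-neighbor alternative, written so it can replace random_box/membership in the earlier box-based sketch. The seed-point construction below is an illustrative assumption, not necessarily the exact model studied in that paper.

import numpy as np  # as in the earlier sketch

def random_nn_model(X, rng, n_seeds=8):
    # One nearest-neighbor-style weak model: pick a few random training points as
    # "seeds", mark each seed as in/out of the model at random, and let a query
    # belong to the model when its nearest seed is an "in" seed.
    idx = rng.choice(len(X), size=n_seeds, replace=False)
    in_flags = rng.integers(0, 2, size=n_seeds).astype(bool)
    if not in_flags.any():
        in_flags[0] = True      # keep the model non-empty
    return X[idx], in_flags

def nn_membership(model, Q):
    # 1.0 if the query's nearest seed is flagged "in", else 0.0.
    seeds, in_flags = model
    d = np.linalg.norm(Q[:, None, :] - seeds[None, :, :], axis=-1)
    return in_flags[d.argmin(axis=1)].astype(float)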
Principles of Stochastic Discrimination and Ensemble Learning
Learning in everyday life is often accomplished by making many random guesses and synthesizing the feedback. Kleinberg's analysis of this process resulted in a new method for classifier design…
Improved Uniformity Enforcement in Stochastic Discrimination
A new uniformity enforcement method is introduced which, on benchmark datasets, leads to greater predictive efficiency than the currently published method.
Multiple Classifier Systems for Adversarial Classification Tasks
A measure of the hardness of evasion of a classifier architecture is proposed, an analytical evaluation and comparison of an individual classifier and a classifier ensemble architecture are given, and an experimental evaluation on a spam filtering task is reported.
The Use of Low-Dose CT Intra- and Extra-Nodular Image Texture Features to Improve Small Lung Nodule Diagnosis in Lung Cancer Screening
The CAD framework incorporating the clinical reading with the texture features extracted from LDCT increased the PPV and reduced the false positive (FP) rate in the early diagnosis of lung cancer.

References

Showing 1-10 of 30 references
Estimates of classification accuracies for Kleinberg's method of stochastic discrimination in pattern recognition
In this dissertation we mainly derive statistical estimates of classification accuracy for the method of stochastic discrimination (SD) in pattern recognition introduced by Kleinberg. In order to…
An alternative method of stochastic discrimination with applications to pattern recognition
This dissertation introduces an alternative method of performing stochastic discrimination in pattern recognition which differs in several aspects from the original method introduced by Kleinberg.
On the Algorithmic Implementation of Stochastic Discrimination
  • E. Kleinberg
  • Computer Science
  • IEEE Trans. Pattern Anal. Mach. Intell.
  • 2000
The underlying mathematical theory of stochastic discrimination is outlined, and a remark concerning boosting is made which provides a theoretical justification for properties of that method observed in practice, including its ability to generalize.
Special Invited Paper. Additive logistic regression: A statistical view of boosting
Boosting is one of the most important recent developments in classification methodology. Boosting works by sequentially applying a classification algorithm to reweighted versions of the training data…
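The reweighting loop referred to here can be sketched with a generic AdaBoost-style update, using depth-1 trees from scikit-learn as the weak learner. This illustrates boosting in general, not the additive logistic regression view developed in the cited paper, and the function names are assumptions.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost(X, y, n_rounds=50):
    # Toy AdaBoost for labels y in {-1, +1}: each round fits a stump on the
    # reweighted data, then increases the weight of the examples it got wrong.
    n = len(y)
    w = np.full(n, 1.0 / n)
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # vote weight of this weak classifier
        w *= np.exp(-alpha * y * pred)          # misclassified points get heavier
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    # Sign of the weighted vote of the weak classifiers.
    score = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
    return np.sign(score)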
Experiments with a New Boosting Algorithm
This paper describes experiments carried out to assess how well AdaBoost, with and without pseudo-loss, performs on real learning problems, and compares boosting to Breiman's "bagging" method when used to aggregate various classifiers.
An overtraining-resistant stochastic modeling method for pattern recognition
We will introduce a generic approach for solving problems in pattern recognition based on the synthesis of accurate multiclass discriminators from large numbers of very inaccurate weak models through…
A decision-theoretic generalization of on-line learning and an application to boosting
The model studied can be interpreted as a broad, abstract extension of the well-studied on-line prediction model to a general decision-theoretic setting, and the multiplicative weight-update Littlestone-Warmuth rule can be adapted to this model, yielding bounds that are slightly weaker in some cases but applicable to a considerably more general class of learning problems.
Pattern Classification with Compact Distribution Maps
For applications to arbitrary domains, this work presents a method to automatically construct feature transformations that are suitable for such mappings, and illustrates the method by an application in a challenging character recognition problem with thousands of classes.
Pattern Recognition and Neural Networks
In this self-contained account, Professor Ripley brings together two crucial ideas in pattern recognition: statistical methods and machine learning via neural networks.
The Random Subspace Method for Constructing Decision Forests
  • T. Ho
  • Mathematics, Computer Science
  • IEEE Trans. Pattern Anal. Mach. Intell.
  • 1998
A method to construct a decision-tree-based classifier is proposed that maintains highest accuracy on training data and improves on generalization accuracy as it grows in complexity.
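The random subspace idea named in this title is commonly described as training each tree on all rows but only a random subset of feature columns, then combining the trees by majority vote. Below is a minimal sketch along those lines; the defaults and function names are assumptions, not the paper's exact procedure.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def random_subspace_forest(X, y, n_trees=100, n_features=None, seed=0):
    # Each tree sees every training row but only a random subset of the columns.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    k = n_features or max(1, d // 2)
    forest = []
    for _ in range(n_trees):
        cols = rng.choice(d, size=k, replace=False)
        forest.append((cols, DecisionTreeClassifier().fit(X[:, cols], y)))
    return forest

def forest_predict(forest, X):
    # Majority vote across trees (labels assumed to be non-negative integers).
    votes = np.stack([tree.predict(X[:, cols]) for cols, tree in forest]).astype(int)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)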