• Corpus ID: 296750

On Discriminative vs. Generative Classifiers: A comparison of logistic regression and naive Bayes

@inproceedings{Ng2001OnDV,
  title={On Discriminative vs. Generative Classifiers: A comparison of logistic regression and naive Bayes},
  author={A. Ng and Michael I. Jordan},
  booktitle={NIPS},
  year={2001}
}
We compare discriminative and generative learning as typified by logistic regression and naive Bayes. We show, contrary to a widely-held belief that discriminative classifiers are almost always to be preferred, that there can often be two distinct regimes of performance as the training set size is increased, one in which each algorithm does better. This stems from the observation—which is borne out in repeated experiments—that while discriminative learning has lower asymptotic error, a… 
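The comparison in the abstract can be illustrated with a minimal, self-contained sketch (not the paper's actual experimental setup): two 1-D Gaussian classes, a Gaussian naive Bayes classifier fit by estimating class means, and a logistic regression fit by batch gradient ascent. All data, class means, and hyperparameters below are illustrative assumptions.

```python
import math
import random

random.seed(0)

def make_data(n):
    """Synthetic 1-D data: class 0 ~ N(0, 1), class 1 ~ N(2, 1)."""
    xs, ys = [], []
    for _ in range(n):
        y = random.randint(0, 1)
        xs.append(random.gauss(2.0 * y, 1.0))
        ys.append(y)
    return xs, ys

def fit_gnb(xs, ys):
    """Gaussian naive Bayes: estimate each class mean (unit variance assumed)."""
    m, c = [0.0, 0.0], [0, 0]
    for x, y in zip(xs, ys):
        m[y] += x
        c[y] += 1
    return m[0] / max(c[0], 1), m[1] / max(c[1], 1)

def fit_lr(xs, ys, steps=2000, rate=0.1):
    """Logistic regression by batch gradient ascent on the log-likelihood."""
    w, b, n = 0.0, 0.0, len(xs)
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (y - p) * x
            gb += (y - p)
        w += rate * gw / n
        b += rate * gb / n
    return w, b

def accuracy(pred, xs, ys):
    return sum(pred(x) == y for x, y in zip(xs, ys)) / len(ys)

xs_tr, ys_tr = make_data(200)
xs_te, ys_te = make_data(2000)

m0, m1 = fit_gnb(xs_tr, ys_tr)
gnb = lambda x: int(abs(x - m1) < abs(x - m0))  # nearer class mean wins

w, b = fit_lr(xs_tr, ys_tr)
lrc = lambda x: int(w * x + b > 0)              # sign of the linear score

print("GNB accuracy:", accuracy(gnb, xs_te, ys_te))
print("LR  accuracy:", accuracy(lrc, xs_te, ys_te))
```

Rerunning this with training sets of different sizes is one way to probe the two-regime behavior the abstract describes: the generative model's parameters stabilize with very few samples, while the discriminative model's asymptotic accuracy is at least as good.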


Classification: Naive Bayes vs Logistic Regression

This paper will explore both logistic regression and naive Bayes classifiers by giving a background on both, describing how to implement them, and attempting to recreate several of the learning curves found in [1].

The Tradeoff Between Generative and Discriminative Classifiers

A family of classifiers that interpolates between the two approaches to classification is introduced, providing a new way to compare them and yielding an estimation procedure whose classification performance balances the bias of generative classifiers against the variance of discriminative ones.

Naive Bayes vs Logistic Regression: Theory, Implementation and Experimental Validation

From the experiments, it is observed that LR trained with gradient ascent outperforms the general NB classifier; however, under the Gaussian naive Bayes assumption, the NB and LR classifiers perform similarly.
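The similarity noted above reflects a standard result: a Gaussian naive Bayes model with class-conditional densities $x_j \mid y=k \sim \mathcal{N}(\mu_{kj}, \sigma_j^2)$ (per-feature variances shared across classes) induces exactly a logistic-regression-shaped posterior. Sketching the derivation,

$$
P(y=1 \mid x) \;=\; \frac{1}{1 + \exp\!\big(-(w^\top x + b)\big)},
\qquad
w_j = \frac{\mu_{1j} - \mu_{0j}}{\sigma_j^2},
\qquad
b = \log\frac{\pi_1}{\pi_0} + \sum_j \frac{\mu_{0j}^2 - \mu_{1j}^2}{2\sigma_j^2},
$$

where $\pi_k$ are the class priors. The two models share a hypothesis class but differ in how the parameters are estimated: NB fits $(\mu, \sigma, \pi)$ from the joint likelihood, LR fits $(w, b)$ from the conditional likelihood.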

A Hybrid Generative/Discriminative Bayesian Classifier

A new restricted Bayesian network classifier is introduced that extends naive Bayes by relaxing the conditional independence assumptions, and it is partly generative and partly discriminative.

Discriminative vs. Generative Classifiers for Cost Sensitive Learning

  • C. Drummond
  • Computer Science
    Canadian Conference on AI
  • 2006
Although some of these variants are better than a single discriminative classifier, the right choice of training set distribution plus careful calibration are needed to make them competitive with multiple discriminative classifiers.

Classification with Hybrid Generative/Discriminative Models

A hybrid model is described in which a high-dimensional subset of the parameters is trained to maximize the generative likelihood, and another, smaller subset is discriminatively trained to maximize the conditional likelihood.

Comparison of Two Learning Methods of the Tree Augmented Naïve Bayesian Network Classifier

  • Hong-bo Shi, Kun-Lun Li
  • Computer Science
    2006 International Conference on Machine Learning and Cybernetics
  • 2006
Two learning approaches for a restricted Bayesian network classifier are introduced, and the experimental results demonstrate that there are differences between the generative and the discriminative learning of the TAN classifier.

A Generative/Discriminative Hybrid Model: Bayes Perceptron Classifier

A novel model named the Bayes perceptron is proposed to take advantage of both the generative and discriminative approaches; it divides every feature vector into several subvectors, each of which is modeled under the naive Bayes assumption.

Deriving discriminative classifiers from generative models

A general theoretical result is presented, specifying how a generative classifier induced from a generative model can also be computed in a discriminative way from the same model.
...

References

SHOWING 1-7 OF 7 REFERENCES

Discriminative vs Informative Learning

The tradeoffs between informative and discriminative classifiers for simple classifiers are reviewed and synthesized, and the results are extended to modern techniques such as Naive Bayes and Generalized Additive Models.

The Efficiency of Logistic Regression Compared to Normal Discriminant Analysis

Abstract A random vector x arises from one of two multivariate normal distributions differing in mean but not covariance. A training set x 1, x 2, ··· x n of previous cases, along with their correct

Statistical learning theory

Presenting a method for determining the necessary and sufficient conditions for consistency of the learning process, the author covers function estimation from small data pools, applying these estimates to real-life problems, and much more.

Neural Network Learning - Theoretical Foundations

The authors explain the role of scale-sensitive versions of the Vapnik-Chervonenkis dimension in large margin classification and in real prediction, and discuss the computational complexity of neural network learning.

Bounding the Vapnik-Chervonenkis Dimension of Concept Classes Parameterized by Real Numbers

The results show that for two general kinds of concept class the V-C dimension is polynomially bounded in the number of real numbers used to define a problem instance, and that in the continuous case, as in the discrete, the real barrier to efficient learning in the Occam sense is complexity-theoretic and not information-theoretic.

Discriminant Analysis and Statistical Pattern Recognition

Provides a systematic account of the subject area, concentrating on the most recent advances in the field. While the focus is on practical considerations, both theoretical and practical issues are