Bayesian Network Classifiers
@article{Friedman1997BayesianNC, title={Bayesian Network Classifiers}, author={Nir Friedman and Dan Geiger and Mois{\'e}s Goldszmidt}, journal={Machine Learning}, year={1997}, volume={29}, pages={131-163} }
Recent work in supervised learning has shown that a surprisingly simple Bayesian classifier with strong assumptions of independence among features, called naive Bayes, is competitive with state-of-the-art classifiers such as C4.5. This fact raises the question of whether a classifier with less restrictive assumptions can perform even better. In this paper we evaluate approaches for inducing classifiers from data, based on the theory of learning Bayesian networks. These networks are factored…
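The point of comparison throughout is the naive Bayes classifier, which scores each class by combining the class prior with per-feature likelihoods under the assumption that features are conditionally independent given the class. Below is a minimal sketch of that decision rule for integer-coded discrete features; the Laplace-smoothed counting estimator and the class layout are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

class DiscreteNaiveBayes:
    """Naive Bayes for discrete features: P(c | x) is proportional to
    P(c) * prod_j P(x_j | c), under the conditional-independence assumption."""

    def fit(self, X, y, alpha=1.0):
        X, y = np.asarray(X), np.asarray(y)
        self.classes_ = np.unique(y)
        n_values = [int(X[:, j].max()) + 1 for j in range(X.shape[1])]
        self.log_prior_ = np.log(
            np.array([(y == c).mean() for c in self.classes_]))
        # One conditional probability table per feature, Laplace-smoothed.
        self.log_cpt_ = []
        for j, k in enumerate(n_values):
            table = np.empty((len(self.classes_), k))
            for ci, c in enumerate(self.classes_):
                counts = np.bincount(X[y == c, j], minlength=k) + alpha
                table[ci] = counts / counts.sum()
            self.log_cpt_.append(np.log(table))
        return self

    def predict(self, X):
        X = np.asarray(X)
        scores = np.tile(self.log_prior_, (len(X), 1))
        for j, log_table in enumerate(self.log_cpt_):
            scores += log_table[:, X[:, j]].T  # add log P(x_j | c) per row
        return self.classes_[scores.argmax(axis=1)]
```

Working in log space keeps the product of many small likelihoods numerically stable, which is the usual way this rule is implemented.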
4,527 Citations
Reasoning about Bayesian Network Classifiers
- Computer ScienceUAI
- 2003
This paper presents an algorithm for converting any naive Bayes classifier into an Ordered Decision Diagram (ODD), and it is shown theoretically and experimentally that this algorithm can give an ODD that is tractable in size even given an intractable number of instances.
A Theoretical and Experimental Evaluation of Augmented Bayesian Classifiers
- Computer Science
- 2006
Algorithms for learning Augmented Bayesian Classifiers with respect to the Minimum Description Length (MDL) and Bayesian-Dirichlet metrics are presented, together with experimental results on the performance of these algorithms on various datasets selected from the UCI Machine Learning Repository.
Discrete Bayesian Network Classifiers: A Survey
- Computer Science
- 2014
This article surveys the whole set of discrete Bayesian network classifiers devised to date, organized in increasing order of structure complexity: naive Bayes, selective naive Bayes, semi-naive Bayes, one-dependence Bayesian classifiers, k-dependence Bayesian classifiers, Bayesian network-augmented naive Bayes, and Bayesian multinets.
Improving Naive Bayes Classifier Using Conditional Probabilities
- Computer ScienceAusDM
- 2011
A new version of the naive Bayes classifier is developed that does not assume independence of features and instead approximates the interactions between features using conditional probabilities.
Discrete Bayesian Network Classifiers
- Computer ScienceACM Comput. Surv.
- 2014
This article surveys the whole set of discrete Bayesian network classifiers devised to date, organized in increasing order of structure complexity: naive Bayes, selective naive Bayes, semi-naive Bayes, one-dependence Bayesian classifiers, k-dependence Bayesian classifiers, Bayesian network-augmented naive Bayes, Markov blanket-based Bayesian classifiers, unrestricted Bayesian classifiers, and Bayesian multinets.
Learning Semi Naïve Bayes Structures by Estimation of Distribution Algorithms
- Computer ScienceEPIA
- 2003
This paper proposes to learn semi naive Bayes structures through estimation of distribution algorithms, which are non-deterministic, stochastic heuristic search strategies.
Learning Optimal Augmented Bayes Networks
- Computer ScienceArXiv
- 2005
A simple, polynomial time greedy algorithm for learning an optimal Augmented Bayes Network with respect to MDL score is presented.
Sisterhood of Classifiers: A Comparative Study of Naive Bayes and Noisy-or Networks
- Computer Science
- 2006
It is shown that naive Bayes performs better than noisy-or when the data fits its independence assumptions, and vice versa, and mathematical derivations of how to transform a classifier in one model into the other are shown.
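For orientation, the two posteriors being compared can be written side by side. The notation below (binary features x_i, a leak parameter \lambda_0, and per-cause parameters \lambda_i) is a common formulation assumed here for illustration, not necessarily the paper's exact one.

```latex
% Naive Bayes posterior over class c given binary features x_1,\dots,x_n:
P(c \mid x_1,\dots,x_n) \;\propto\; P(c)\prod_{i=1}^{n} P(x_i \mid c)

% Noisy-or probability of the positive class:
P(y = 1 \mid x_1,\dots,x_n) \;=\; 1 - (1-\lambda_0)\prod_{i:\,x_i = 1}(1-\lambda_i)
```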
Randomized Bayesian Network Classifiers
- Computer ScienceMCS
- 2013
Randomized Bayesian Network Classifiers (RBNC) borrow the idea of ensemble learning: a collection of semi-naive Bayesian network classifiers is constructed and their predictions are combined into the final output.
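The ensemble recipe described here can be sketched briefly: train several simple Bayesian members on randomly chosen feature subsets and average their class probabilities. The base learner (scikit-learn's BernoulliNB, so binary features are assumed), the subset fraction, and the averaging rule below are illustrative assumptions; RBNC's actual structure-randomization scheme differs in its details.

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB  # stand-in base learner (assumption)

def fit_random_ensemble(X, y, n_members=10, subset_frac=0.6, seed=0):
    """Train naive Bayes members on random feature subsets."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    k = max(1, int(subset_frac * n_features))
    members = []
    for _ in range(n_members):
        cols = rng.choice(n_features, size=k, replace=False)
        members.append((cols, BernoulliNB().fit(X[:, cols], y)))
    return members

def predict_ensemble(members, X):
    """Average the members' class probabilities and return the argmax class."""
    avg = np.mean([clf.predict_proba(X[:, cols]) for cols, clf in members], axis=0)
    return members[0][1].classes_[avg.argmax(axis=1)]
```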
Survey of Improving Naive Bayes for Classification
- Computer ScienceADMA
- 2007
Four main approaches to improving naive Bayes (feature selection, structure extension, local learning, and data expansion) are reviewed, and some main directions for future research on Bayesian network classifiers are discussed.
References
SHOWING 1-10 OF 61 REFERENCES
Building Classifiers Using Bayesian Networks
- Computer ScienceAAAI/IAAI, Vol. 2
- 1996
Tree Augmented Naive Bayes (TAN) is singled out; it outperforms naive Bayes while maintaining the computational simplicity and robustness that are characteristic of naive Bayes.
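TAN, the classifier highlighted here, augments naive Bayes with a tree over the features chosen to maximize class-conditional mutual information between feature pairs. Below is a compact sketch of that structure-learning step for discrete data; the empirical-counts estimator and the Prim-style tree construction (rooted at feature 0) are assumptions for illustration, and the parameter-estimation and smoothing details of the full algorithm are omitted.

```python
import numpy as np
from itertools import combinations

def conditional_mutual_information(xi, xj, c):
    """Empirical I(X_i; X_j | C) for three discrete arrays of equal length."""
    cmi = 0.0
    for cv in np.unique(c):
        mask = (c == cv)
        p_c = mask.mean()
        xi_c, xj_c = xi[mask], xj[mask]
        for a in np.unique(xi_c):
            for b in np.unique(xj_c):
                p_ab = np.mean((xi_c == a) & (xj_c == b))
                if p_ab > 0:
                    p_a, p_b = np.mean(xi_c == a), np.mean(xj_c == b)
                    cmi += p_c * p_ab * np.log(p_ab / (p_a * p_b))
    return cmi

def tan_tree(X, y):
    """Maximum-weight spanning tree over features, weighted by I(X_i; X_j | C).
    Returns (parent, child) feature-index pairs, directed away from feature 0."""
    n_features = X.shape[1]
    w = np.zeros((n_features, n_features))
    for i, j in combinations(range(n_features), 2):
        w[i, j] = w[j, i] = conditional_mutual_information(X[:, i], X[:, j], y)
    in_tree, edges = {0}, []
    while len(in_tree) < n_features:
        parent, child = max(((i, j) for i in in_tree
                             for j in range(n_features) if j not in in_tree),
                            key=lambda e: w[e])
        edges.append((parent, child))
        in_tree.add(child)
    return edges
```

In the full TAN classifier, each feature then gets the class plus its tree parent as parents in the network, so prediction combines P(c), P(x_root | c), and P(x_child | c, x_parent) terms.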
Efficient Learning of Selective Bayesian Network Classifiers
- Computer ScienceICML
- 1996
Experimental results show the resulting classifiers are competitive with (or superior to) the best classifiers, based on both Bayesian networks and other formalisms, and that the computational time for learning and using these classifiers is relatively small.
A Comparison of Induction Algorithms for Selective and non-Selective Bayesian Classifiers
- Computer ScienceICML
- 1995
Searching for Dependencies in Bayesian Classifiers
- Computer ScienceAISTATS
- 1995
It is shown that the backward sequential elimination and joining algorithm provides the most improvement over the naive Bayesian classifier and that the violations of the independence assumption that affect the accuracy of the classifier can be detected from training data.
Learning Bayesian Networks: The Combination of Knowledge and Statistical Data
- Computer ScienceMachine Learning
- 1995
A methodology for assessing the informative priors needed for learning Bayesian networks from a combination of prior knowledge and statistical data is developed, and it is shown how to compute the relative posterior probabilities of network structures given data.
An Analysis of Bayesian Classifiers
- Computer ScienceAAAI
- 1992
An average-case analysis of the Bayesian classifier, a simple induction algorithm that fares remarkably well on many learning tasks, is presented, and the behavioral implications of the analysis are explored through predicted learning curves for artificial domains.
Estimating Continuous Distributions in Bayesian Classifiers
- Computer Science, MathematicsUAI
- 1995
This paper abandons the normality assumption and instead uses statistical methods for nonparametric density estimation, namely kernel estimation; the results suggest that kernel estimation is a useful tool for learning Bayesian models.
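The flexible-Bayes idea summarized here replaces each per-class Gaussian with a kernel density estimate of the feature's class-conditional distribution. A minimal sketch using SciPy's Gaussian KDE follows; the default bandwidth rule and the small floor added before taking logs are assumptions for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde

class KDENaiveBayes:
    """Naive Bayes with one univariate kernel density estimate per
    (class, feature) pair instead of a fitted Gaussian."""

    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y)
        self.classes_ = np.unique(y)
        self.log_prior_ = np.log(
            np.array([(y == c).mean() for c in self.classes_]))
        self.kdes_ = [[gaussian_kde(X[y == c, j]) for j in range(X.shape[1])]
                      for c in self.classes_]
        return self

    def predict(self, X):
        X = np.asarray(X, float)
        scores = np.tile(self.log_prior_, (len(X), 1))
        for ci, kdes in enumerate(self.kdes_):
            for j, kde in enumerate(kdes):
                # Sum per-feature log densities under the independence assumption.
                scores[:, ci] += np.log(kde(X[:, j]) + 1e-300)
        return self.classes_[scores.argmax(axis=1)]
```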
Learning Bayesian Networks with Local Structure
- Computer ScienceUAI
- 1996
A novel addition to the known methods for learning Bayesian networks from data is presented that improves the quality of the learned networks; learning curves indicate that the procedure exploiting local structure converges faster than the standard procedure.
LEARNING BAYESIAN BELIEF NETWORKS: AN APPROACH BASED ON THE MDL PRINCIPLE
- Computer ScienceComput. Intell.
- 1994
A new approach for learning Bayesian belief networks from raw data is presented, based on Rissanen's minimum description length (MDL) principle; it can learn unrestricted multiply-connected belief networks and allows trading off accuracy against complexity in the learned model.
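For reference, the MDL score used by this line of work (and by the paper under discussion) to trade off fit against network complexity is, for a network B with |B| free parameters and a data set D of N instances, roughly:

```latex
\mathrm{MDL}(B \mid D) \;=\; \frac{\log N}{2}\,\lvert B\rvert \;-\; \mathrm{LL}(B \mid D),
\qquad
\mathrm{LL}(B \mid D) \;=\; \sum_{d \in D} \log P_B(d)
```

Lower scores are preferred: the first term charges for the bits needed to encode the parameters, and the second rewards how well the network fits the data.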