Dynamic Bayesian Combination of Multiple Imperfect Classifiers

@inproceedings{Simpson2013DynamicBC,
  title={Dynamic Bayesian Combination of Multiple Imperfect Classifiers},
  author={Edwin Simpson and Stephen J. Roberts and Ioannis Psorakis and Arfon M. Smith},
  booktitle={Decision Making and Imperfection},
  year={2013}
}
Classifier combination methods need to make the best use of the outputs of multiple, imperfect classifiers to enable higher-accuracy classifications. […] We apply the approach to real data from a large citizen science project, Galaxy Zoo Supernovae, and show that our method far outperforms other established approaches to imperfect decision combination.
Hierarchical Bayesian Classifier Combination
TLDR
The proposed Bayesian method, called Hierarchical Bayesian Classifier Combination (HBCC), is for discrete classifiers and assumes that the individual classifiers are conditionally independent given the true class label.
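The conditional-independence assumption described in this summary can be sketched in a few lines: given the true class, each classifier's output is scored by its confusion matrix, and the products are normalized into a posterior. The function name, priors, and confusion matrices below are invented for illustration, not taken from the paper.

```python
import numpy as np

def fuse_independent(prior, confusions, outputs):
    """Combine discrete classifier outputs assuming they are conditionally
    independent given the true class.

    prior: (K,) prior over true classes
    confusions[k]: (K, K) matrix, row j = p(output | true class j) for classifier k
    outputs[k]: the label actually emitted by classifier k
    """
    post = prior.astype(float).copy()
    for conf, out in zip(confusions, outputs):
        post *= conf[:, out]      # likelihood of this output under each true class
    return post / post.sum()      # normalize into a posterior distribution

prior = np.array([0.5, 0.5])
c1 = np.array([[0.8, 0.2], [0.2, 0.8]])   # 80%-accurate binary classifier
c2 = np.array([[0.7, 0.3], [0.3, 0.7]])   # 70%-accurate binary classifier
post = fuse_independent(prior, [c1, c2], [0, 0])   # both vote for class 0
```

When both imperfect classifiers agree, the fused posterior concentrates more sharply on their shared label than either classifier's accuracy alone would suggest.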
Bayesian Combination of Probabilistic Classifiers using Multivariate Normal Mixtures
TLDR
A new model is proposed, which overcomes the disadvantages of existing Bayesian ensembles and transforms the probabilistic predictions with the inverse additive logistic transformation to model the correlations with multivariate normal mixtures.
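The inverse additive logistic transformation mentioned here maps a probability vector off the simplex into unconstrained Euclidean space, where normal mixtures can be fitted; a minimal sketch, with function names invented for illustration:

```python
import numpy as np

def inv_additive_logistic(p):
    """Map a K-dim probability vector to R^{K-1} via log-ratios
    against the last component: y_i = log(p_i / p_K)."""
    p = np.asarray(p, dtype=float)
    return np.log(p[:-1] / p[-1])

def additive_logistic(y):
    """Map back from R^{K-1} to the probability simplex."""
    e = np.exp(np.append(y, 0.0))   # reference component has y_K = 0
    return e / e.sum()

p = np.array([0.6, 0.3, 0.1])
y = inv_additive_logistic(p)   # unconstrained coordinates
p_back = additive_logistic(y)  # round-trips back to p
```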
A Normative Model of Classifier Fusion
TLDR
It is shown that a change in performance of the fused classifier due to uncertainty reduction can be Bayes optimal even for highly correlated base classifiers, and naturally accommodates the classic Independent Opinion Pool and other independent fusion algorithms as special cases.
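The classic Independent Opinion Pool referenced in this summary multiplies each classifier's posterior together while dividing out the repeated prior; a hedged sketch, with all names and numbers invented:

```python
import numpy as np

def independent_opinion_pool(prior, posteriors):
    """Fuse per-classifier posteriors under conditional independence:
    p(c | all data) is proportional to p(c) * prod_k [ p(c | d_k) / p(c) ]."""
    prior = np.asarray(prior, dtype=float)
    fused = prior.copy()
    for p in posteriors:
        fused *= np.asarray(p, dtype=float) / prior
    return fused / fused.sum()

prior = np.array([0.5, 0.5])
fused = independent_opinion_pool(prior, [[0.8, 0.2], [0.7, 0.3]])
```

Because correlations between the base classifiers are ignored, agreeing posteriors reinforce each other multiplicatively, which is exactly the behavior the papers above analyze.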
Bayesian Classifier Fusion with an Explicit Model of Correlation
TLDR
It is shown that a change in performance of the fused classifier due to uncertainty reduction can be Bayes optimal even for highly correlated base classifiers, and that the proposed model naturally accommodates the classic Independent Opinion Pool and other independent fusion algorithms as special cases.
Modeling Analysts’ Recommendations via Bayesian Machine Learning
TLDR
The authors argue that there are many similarities between the analyst forecasting problem and a very successful application of IBCC in astronomy, a study in which it dominates heuristic alternatives including simple or weighted averages and majority voting.
Scalable Decision Making: Uncertainty, Imperfection, Deliberation
TLDR
This chapter presents an information-theoretic approach to selecting informative decision-making agents, assigning them to specific tasks and combining their responses using a Bayesian method, and demonstrates empirically how the intelligent task assignment approach improves the accuracy of combined decisions while requiring fewer responses from the crowd.
Non-parametric Bayesian annotator combination
Bayesian Methods for Intelligent Task Assignment in Crowdsourcing Systems
TLDR
This chapter presents an information-theoretic approach to selecting informative decision-making agents, assigning them to specific tasks and combining their responses using a Bayesian method, and demonstrates empirically how the intelligent task assignment approach improves the accuracy of combined decisions while requiring fewer responses from the crowd.
...
...

References

Showing 1-10 of 45 references
Bayesian Classifier Combination
TLDR
A general framework for Bayesian model combination (which differs from model averaging) in the context of classification is explored, which explicitly models the relationship between each model’s output and the unknown true label.
Turning Bayesian model averaging into Bayesian model combination
TLDR
It is shown that even the most simplistic of Bayesian model combination strategies outperforms the traditional ad hoc techniques of bagging and boosting, as well as outperforming BMA over a wide variety of cases, suggesting that the power of ensembles does not come from their ability to account for model uncertainty, but instead comes from the changes in representational and preferential bias inherent in the process of combining several different models.
Sequential Dynamic Classification Using Latent Variable Models
TLDR
This paper proposes a set of sequential dynamic classification algorithms based on extension of nonlinear variants of Bayesian Kalman processes and dynamic generalized linear models and extends the models to allow for active label requesting for use in situations in which there is a cost associated with such information and hence a fully labelled target set is prohibitive.
Learning From Crowds
TLDR
A probabilistic approach for supervised learning when the authors have multiple annotators providing (possibly noisy) labels but no absolute gold standard, and experimental results indicate that the proposed method is superior to the commonly used majority voting baseline.
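The contrast this summary draws, majority voting versus modelling annotator reliability, can be illustrated with a simplified single-pass stand-in for the paper's iterative procedure: estimate each annotator's agreement with the majority-vote consensus, then re-vote with those estimates as weights. All names and data below are invented for illustration.

```python
import numpy as np
from collections import Counter

def weighted_consensus(votes):
    """votes: (n_items, n_annotators) integer label matrix.
    Estimate annotator reliability from agreement with the majority
    vote, then re-vote with reliability weights (a crude, one-pass
    approximation to alternating estimation and aggregation)."""
    votes = np.asarray(votes)
    consensus = np.array([Counter(row.tolist()).most_common(1)[0][0]
                          for row in votes])
    weights = (votes == consensus[:, None]).mean(axis=0)  # per-annotator agreement
    labels = np.unique(votes)
    scores = np.array([[weights[votes[i] == c].sum() for c in labels]
                       for i in range(len(votes))])
    return labels[scores.argmax(axis=1)]

# Two reliable annotators and one who always answers 1.
votes = [[0, 0, 1], [1, 1, 1], [0, 0, 1], [1, 1, 1], [0, 0, 1]]
labels = weighted_consensus(votes)
```

The unreliable annotator's constant vote receives a low weight, so the consensus tracks the two annotators who agree with each other.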
A Variational Baysian Framework for Graphical Models
This paper presents a novel practical framework for Bayesian model averaging and model selection in probabilistic graphical models. Our approach approximates full posterior distributions over model …
Dynamic Generalized Linear Models and Bayesian Forecasting
TLDR
The structure of the models depends on the time evolution of underlying state variables, and the feedback of observational information to these variables is achieved using linear Bayesian prediction methods.
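In its simplest case, the linear Bayesian prediction idea in this summary reduces to a scalar Kalman-style predict/update cycle; a minimal sketch with invented noise parameters:

```python
def kalman_step(mean, var, obs, q=0.1, r=0.5):
    """One predict+update cycle for a random-walk state observed with
    Gaussian noise: the state diffuses (variance grows by q), then the
    observation pulls the mean back by the Kalman gain."""
    var += q                      # predict: state uncertainty grows
    k = var / (var + r)           # Kalman gain
    mean += k * (obs - mean)      # update toward the observation
    var *= (1 - k)                # posterior variance shrinks
    return mean, var

mean, var = 0.0, 1.0
for obs in [1.2, 0.9, 1.1]:
    mean, var = kalman_step(mean, var, obs)
```

Each observation feeds back into the state estimate, mirroring how dynamic models track slowly drifting quantities over time.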
Apprenticeship learning via inverse reinforcement learning
TLDR
This work thinks of the expert as trying to maximize a reward function that is expressible as a linear combination of known features, and gives an algorithm for learning the task demonstrated by the expert, based on using "inverse reinforcement learning" to try to recover the unknown reward function.
Variational Mixture of Bayesian Independent Component Analyzers
TLDR
This article employs recent developments in variational Bayesian inference and structure determination to construct a novel approach for modeling nongaussian, discontinuous manifolds and demonstrates its application to real data by decomposing functional magnetic resonance images into meaningful and medically useful features.
Review of Classifier Combination Methods
TLDR
This chapter introduces different categories of classifier combination and identifies a retraining effect and the effects of locality-based training as important properties of classifier combinations.
Bayesian Inference on Multiscale Models for Poisson Intensity Estimation: Applications to Photon-Limited Image Denoising
TLDR
An improved statistical model for analyzing Poisson processes is presented, adopting a multiscale representation of the Poisson process in which the ratios of the underlying Poisson intensities in adjacent scales are modeled as mixtures of conjugate parametric distributions.
...
...