Learning Ambiguities Using Bayesian Mixture of Experts

Abstract

Mixture of Experts (ME) is an ensemble of function approximators that fits a clustered data set locally rather than globally. ME provides a useful tool for learning multi-valued mappings (ambiguities) in a data set. Training a Mixture of Experts involves learning a multi-category classifier for the gating distribution and fitting a regressor within each of the…
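A minimal sketch of the idea the abstract describes: a softmax gate partitions the input space and one linear regressor ("expert") is fit per partition via EM, so a multi-valued mapping can be represented. This is an illustrative toy (two linear experts, maximum-likelihood EM, toy data where y = +x or -x), not the paper's Bayesian formulation; all names and constants here are assumptions.

```python
# Toy Mixture of Experts fit by EM: softmax gate + linear experts.
# Illustrative sketch only; not the paper's Bayesian training procedure.
import numpy as np

rng = np.random.default_rng(0)

# Ambiguous (multi-valued) data: for each x, y is either +x or -x.
N = 200
x = rng.uniform(0.5, 2.0, size=N)
branch = rng.choice([-1.0, 1.0], size=N)
y = branch * x + rng.normal(scale=0.05, size=N)
X = np.column_stack([x, np.ones_like(x)])   # design matrix with bias column

K = 2                                       # number of experts (assumed)
W = rng.normal(scale=0.1, size=(K, 2))      # expert weights: (slope, bias)
V = np.zeros((K, 2))                        # gate weights
sigma2 = 1.0                                # shared noise variance

def softmax(logits):
    z = np.exp(logits - logits.max(axis=1, keepdims=True))
    return z / z.sum(axis=1, keepdims=True)

for _ in range(50):
    # E-step: responsibilities = gate prior * expert likelihood, normalized.
    gate = softmax(X @ V.T)                 # (N, K) gating probabilities
    pred = X @ W.T                          # (N, K) expert predictions
    lik = np.exp(-(y[:, None] - pred) ** 2 / (2 * sigma2))
    r = gate * lik
    r /= r.sum(axis=1, keepdims=True)

    # M-step: responsibility-weighted least squares per expert.
    for k in range(K):
        Rk = r[:, k]
        A = X.T @ (Rk[:, None] * X) + 1e-6 * np.eye(2)
        W[k] = np.linalg.solve(A, X.T @ (Rk * y))
    sigma2 = np.sum(r * (y[:, None] - X @ W.T) ** 2) / N

    # Gate update: a few gradient steps of softmax regression toward r.
    for _ in range(5):
        g = softmax(X @ V.T)
        V += 0.1 * (r - g).T @ X / N

# Each expert should capture one branch of the ambiguity (slopes near ±1).
slopes = sorted(W[:, 0])
print(slopes)
```

The key point the abstract makes is visible here: a single global regressor would average the two branches to y ≈ 0, while the gated local experts recover both branches of the mapping.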
DOI: 10.1109/ICTAI.2006.73

5 Figures and Tables