Mixture of experts: a literature survey

@article{Masoudnia2012MixtureOE,
  title={Mixture of experts: a literature survey},
  author={Saeed Masoudnia and Reza Ebrahimpour},
  journal={Artificial Intelligence Review},
  year={2012},
  volume={42},
  pages={275-293}
}
Mixture of experts (ME) is one of the most popular and interesting combining methods, with great potential to improve performance in machine learning. ME is built on the divide-and-conquer principle, in which the problem space is divided among a few neural network experts, supervised by a gating network. In earlier work on ME, different strategies were developed to divide the problem space between the experts. To survey and analyse these methods more clearly, we present a…
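
The gated divide-and-conquer architecture described above can be made concrete with a short sketch: several experts produce candidate outputs, and a gating network produces softmax weights that softly partition the input space and combine the experts. The sketch below is a minimal NumPy illustration with linear experts; the class, parameter names, and dimensions are hypothetical and not taken from the survey.

import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

class MixtureOfExperts:
    """Minimal ME forward pass: K linear experts combined by a softmax gating network."""
    def __init__(self, in_dim, out_dim, n_experts):
        # One weight matrix per expert, plus a gating matrix over the same input.
        self.W_experts = rng.normal(0.0, 0.1, size=(n_experts, in_dim, out_dim))
        self.W_gate = rng.normal(0.0, 0.1, size=(in_dim, n_experts))

    def forward(self, x):
        # Expert outputs, shape (batch, n_experts, out_dim).
        expert_out = np.einsum('bi,kio->bko', x, self.W_experts)
        # Gating weights: a soft partition of the input space, shape (batch, n_experts).
        gate = softmax(x @ self.W_gate)
        # Convex combination of the expert outputs, weighted by the gate.
        return np.einsum('bk,bko->bo', gate, expert_out), gate

# Usage: 4 experts on 8-dimensional inputs, 3 output classes.
moe = MixtureOfExperts(in_dim=8, out_dim=3, n_experts=4)
y, g = moe.forward(rng.normal(size=(5, 8)))
print(y.shape, g.shape)  # (5, 3) (5, 4)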

Citations

Mixture of feature specified experts
Root-quatric mixture of experts for complex classification problems
Multi-objective Methodology for Classification by Committees of Class-experts (2019)
  The proposed classification methodology is based on a novel two-stage approach in which each objective is associated with one class; in contrast to earlier studies that used fixed-topology ANNs, a TWEANN-based algorithm is used to evolve the base classifiers.
Mixture of Convolutional Neural Networks for Image Classification (2018)
  A comprehensive series of experiments compares gating methods for mixtures of convolutional neural networks on image classification, and novel partitioning methods are presented in which the labels, rather than the images, are partitioned in order to improve the generalization of the experts.
A proposal for mixture of experts with entropic regularization
  This work presents a variant of the standard mixture-of-experts model in which the entropy of the gating network is maximized in addition to minimizing the classification cost, and shows the advantage of this approach on multiple datasets in terms of accuracy (a sketch of such an objective follows this list).
Towards data-free gating of heterogeneous pre-trained neural networks
  This paper proposes multiple data-free methods for combining heterogeneous neural networks, ranging from the use of simple output-logit statistics to the training of specialized gating networks.
Mixture of ELM-based experts with trainable gating network
  The experimental results show that the proposed approach outperforms the original ELM in prediction stability and classification accuracy, and statistical analysis confirms that MEETG performs acceptably on classification problems.
A flexible probabilistic framework for large-margin mixture of experts
  These models are significantly more efficient to train than other MoE training algorithms, while outperforming traditional non-linear models such as kernel SVMs and Gaussian processes on several benchmark datasets.
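
The entropic-regularization entry above pairs the usual classification cost with a term that rewards high gating entropy. Below is a minimal sketch of such an objective, assuming softmax mixture probabilities and softmax gate weights; the function name and the trade-off weight lam are illustrative assumptions, not taken from the cited paper.

import numpy as np

def entropy_regularized_loss(probs, gate, targets, lam=0.1, eps=1e-12):
    """Classification cross-entropy minus a gate-entropy bonus.

    probs   : (batch, n_classes) mixture output probabilities
    gate    : (batch, n_experts) softmax gating weights
    targets : (batch,) integer class labels
    lam     : hypothetical trade-off weight, not from the cited paper
    """
    rows = np.arange(len(targets))
    cross_entropy = -np.log(probs[rows, targets] + eps).mean()
    # Maximizing gate entropy discourages the gate from collapsing onto one expert,
    # so the entropy enters the objective with a negative sign.
    gate_entropy = -(gate * np.log(gate + eps)).sum(axis=1).mean()
    return cross_entropy - lam * gate_entropy

# Usage with toy values: 2 samples, 3 classes, 2 experts.
probs = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]])
gate = np.array([[0.5, 0.5], [0.9, 0.1]])
print(entropy_regularized_loss(probs, gate, targets=np.array([0, 1])))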

References

Showing 1–10 of 72 references
Improving mixture of experts for view-independent face recognition using teacher-directed learning
  The experimental results support the claim that directing the experts to a predetermined partitioning of the face space improves the performance of conventional ME for view-independent face recognition.
Improved learning algorithms for mixture of experts in multiclass classification
Combining Predictors: Comparison of Five Meta Machine Learning Methods
Boosted Mixture of Experts: An Ensemble Learning Scheme
  We present a new supervised learning procedure for ensemble machines, in which outputs of predictors trained on different distributions are combined by a dynamic classifier combination model. …
Incorporation of a Regularization Term to Control Negative Correlation in Mixture of Experts
  The capability of a control parameter for NCL is incorporated into the error function of ME, which enables the training algorithm to strike a better balance in the bias-variance-covariance trade-off and thus improves generalization ability (a sketch of the penalty follows this list).
Input partitioning to mixture of experts
  B. Tang, M. Heywood, M. Shepherd. Proceedings of the 2002 International Joint Conference on Neural Networks (IJCNN'02), 2002.
  Under the mixture-of-experts architecture, a method for designing the number of experts and assigning local regions of the input space to individual experts is investigated; the classification performance and transparency of the scheme are found to be significantly better than with a standard mixture of experts.
Improving combination method of NCL experts using gating network
  An improved version of the NCL method is proposed in which the capability of the gating network, as the combining part of the mixture-of-experts method, is used to combine the base NNs in the NCL ensemble.
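
The negative-correlation entries above incorporate an NCL penalty, weighted by a control parameter, into the ME error function. The sketch below writes out the standard per-expert NCL penalty for a single sample; using the gating weights to form the ensemble output and the value of lam are simplifying assumptions for illustration, not the cited papers' exact formulation.

import numpy as np

def ncl_expert_errors(expert_out, gate, target, lam=0.5):
    """Per-expert squared error plus a negative-correlation (NCL) penalty for one sample.

    expert_out : (n_experts, out_dim) expert outputs
    gate       : (n_experts,) gating weights
    target     : (out_dim,) desired output
    lam        : control parameter trading accuracy against expert diversity
    """
    f_bar = gate @ expert_out                    # gated ensemble output, shape (out_dim,)
    dev = expert_out - f_bar                     # each expert's deviation from the ensemble
    penalty = dev * (dev.sum(axis=0) - dev)      # p_i = dev_i * sum over j != i of dev_j
    sq_err = (expert_out - target) ** 2
    return (sq_err + lam * penalty).sum(axis=1)  # one training error per expert

# Usage with toy values: 3 experts, 2 outputs.
errors = ncl_expert_errors(
    expert_out=np.array([[0.9, 0.1], [0.6, 0.4], [0.2, 0.8]]),
    gate=np.array([0.5, 0.3, 0.2]),
    target=np.array([1.0, 0.0]),
)
print(errors)  # array of 3 per-expert error values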