Mixture of Experts (MoE) is a popular framework in statistics and machine learning for modeling heterogeneity in data for regression, classification, and clustering. MoE models for continuous data are usually based on the normal distribution. However, it is known that for data exhibiting asymmetric behavior, heavy tails, and atypical observations, the use of…
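To make the MoE framework concrete, the following is a minimal sketch of the standard normal-experts formulation the abstract refers to: the conditional density of a response y given a covariate x is a covariate-dependent mixture, p(y|x) = Σ_k g_k(x) N(y; μ_k(x), σ_k²), where the gating weights g_k(x) are a softmax of linear functions of x and each expert is a linear regression. All parameter names and values here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def normal_pdf(y, mu, sigma):
    # Density of N(mu, sigma^2) evaluated at y.
    return np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def moe_density(x, y, gate_w, expert_w, expert_sigma):
    """Conditional density p(y | x) under a K-expert MoE with normal experts.

    gate_w       : (K, 2) gating coefficients [intercept, slope]
    expert_w     : (K, 2) expert regression coefficients [intercept, slope]
    expert_sigma : (K,) expert noise standard deviations
    (All shapes are illustrative assumptions for a scalar covariate.)
    """
    feats = np.array([1.0, x])        # design vector [1, x]
    gates = softmax(gate_w @ feats)   # mixing weights g_k(x), sum to 1
    mus = expert_w @ feats            # expert means mu_k(x)
    return float(np.sum(gates * normal_pdf(y, mus, expert_sigma)))

# Hypothetical two-expert configuration for illustration.
gate_w = np.array([[0.0, 1.0], [0.0, -1.0]])
expert_w = np.array([[0.0, 1.0], [2.0, -1.0]])
expert_sigma = np.array([1.0, 1.0])
```

Because each expert is Gaussian, this density is symmetric around each expert's mean and light-tailed, which is exactly the limitation the abstract raises for skewed or heavy-tailed data; robust variants swap the normal expert densities for skewed or heavy-tailed families.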