Skew t mixture of experts


Mixture of Experts (MoE) is a popular framework in statistics and machine learning for modeling heterogeneity in data for regression, classification, and clustering. MoE models for continuous data are usually based on the normal distribution. However, it is known that for data with asymmetric behavior, heavy tails, and atypical observations, the use of…
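To make the normal-based MoE setup concrete, the following is a minimal sketch (not from the paper) of the conditional density of a mixture of linear-regression experts with softmax gating. All function names and parameter values here are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def normal_pdf(y, mean, sigma):
    # Density of N(mean, sigma^2) evaluated at y.
    return np.exp(-0.5 * ((y - mean) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def moe_density(y, x, gate_w, betas, sigmas):
    """Conditional density p(y | x) of a normal-based MoE with K experts.

    gate_w: (K, 2) gating coefficients [intercept, slope] per expert
    betas:  (K, 2) expert regression coefficients [intercept, slope]
    sigmas: (K,)   expert noise standard deviations
    (All shapes/names are hypothetical, for illustration only.)
    """
    X = np.array([1.0, x])            # design vector with an intercept term
    gates = softmax(gate_w @ X)       # mixing proportions pi_k(x), sum to 1
    means = betas @ X                 # expert conditional means mu_k(x)
    return float(np.sum(gates * normal_pdf(y, means, sigmas)))
```

Because each expert density integrates to one and the softmax gates sum to one for every `x`, `moe_density` integrates to one over `y`, so it is a valid conditional density; the paper's contribution is replacing the normal expert densities with skew-t densities to accommodate skewness, heavy tails, and outliers.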
DOI: 10.1016/j.neucom.2017.05.044


22 Figures and Tables
