On Fitting Finite Dirichlet Mixture Using ECM and MML

@inproceedings{Bouguila2005OnFF,
  title={On Fitting Finite Dirichlet Mixture Using ECM and MML},
  author={Nizar Bouguila and Djemel Ziou},
  booktitle={ICAPR},
  year={2005}
}
Gaussian mixture models are being increasingly used in pattern recognition applications. However, for some data sets other distributions can give better results. In this paper, we consider Dirichlet mixtures, which offer many advantages [1]. The use of the ECM algorithm and the minimum message length (MML) approach to fit this mixture model is described. Experimental results involve the summarization of texture image databases.
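The two ingredients the abstract names (an ECM-style fitting loop for a finite Dirichlet mixture, and MML for choosing the number of components) can be sketched as follows. This is a hedged illustration, not the paper's implementation: all parameter values are invented, and the parameter-cost term in `message_length` is a crude BIC-like stand-in for the paper's Fisher-information-based MML criterion.

```python
import numpy as np
from scipy.stats import dirichlet

# Mixture of M=2 Dirichlet components over the 3-dimensional simplex
# (all values illustrative, not from the paper).
weights = np.array([0.6, 0.4])            # mixing proportions p_j
alphas = [np.array([2.0, 5.0, 3.0]),      # component parameters alpha_j
          np.array([8.0, 1.5, 1.5])]

def mixture_pdf(x):
    """p(X | Theta) = sum_j p_j * Dir(X | alpha_j)."""
    return sum(w * dirichlet.pdf(x, a) for w, a in zip(weights, alphas))

def responsibilities(x):
    """E-step of an EM/ECM-style loop: posterior component memberships."""
    comp = np.array([w * dirichlet.pdf(x, a)
                     for w, a in zip(weights, alphas)])
    return comp / comp.sum()

def message_length(neg_log_likelihood, n_params, n_points):
    """Two-part MML idea (in nats): parameter cost + data cost.
    The parameter cost here is a BIC-like stand-in, NOT the exact
    Fisher-information-based term of the MML criterion."""
    return 0.5 * n_params * np.log(n_points) + neg_log_likelihood

x = np.array([0.5, 0.3, 0.2])             # a point on the simplex
r = responsibilities(x)                   # sums to 1 across components
```

In an ECM fit, the responsibilities drive the M-step updates of `weights` and `alphas`; model selection then compares `message_length` across candidate numbers of components and keeps the minimum, trading likelihood gains against the cost of encoding more parameters.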
Citations

Dynamic textures clustering using a hierarchical Pitman-Yor process mixture of Dirichlet distributions
  • Wentao Fan, N. Bouguila
  • Computer Science, Mathematics
    2015 IEEE International Conference on Image Processing (ICIP)
  • 2015
TLDR
This paper proposes a hierarchical Pitman-Yor (HPY) process mixture of Dirichlet distributions learned via a variational Bayes approach and applied to the challenging problem of dynamic textures clustering.
An Infinite Mixture of Inverted Dirichlet Distributions
TLDR
The results show that the proposed approach is effective for positive data modeling when compared to those reported using infinite Gaussian mixture.
Variational Inference of Finite Asymmetric Gaussian Mixture Models
TLDR
A variational Bayes learning framework for the asymmetric Gaussian mixture model is proposed; it incorporates the asymmetric shape of the data and is adaptive to different conditions in real-world image processing domains.
Finite Two-Dimensional Beta Mixture Models: Model Selection and Applications
TLDR
A clustering framework for learning a finite mixture model based on a bivariate Beta distribution with three parameters and the proper number of clusters is determined by Minimum Message Length (MML).
Accelerated variational inference for Beta-Liouville mixture learning with application to 3D shapes recognition
TLDR
A novel algorithm, based on an accelerated version of the variational Bayes approach, is developed to learn Beta-Liouville mixture models, which have been shown to be very efficient for clustering proportional data.
Finite Multi-dimensional Generalized Gamma Mixture Model Learning Based on MML
TLDR
An unsupervised learning algorithm for a finite multi-dimensional generalized Gamma mixture model (GGMM) is presented to simultaneously cluster positive vectors and determine the number of clusters through a method based on the minimum message length (MML) criterion.
Learning of Finite Two-Dimensional Beta Mixture Models
TLDR
A bivariate Beta distribution with three parameters as the main parent distribution which could be applied in skin detection and image segmentation is introduced.
Bayesian Learning of Finite Asymmetric Gaussian Mixtures
TLDR
This paper introduces a fully Bayesian learning approach using a Metropolis-Hastings-within-Gibbs sampling method to learn the AGM model and shows the merits of the proposed model using synthetic data and a challenging intrusion detection application.
A Finite Multi-Dimensional Generalized Gamma Mixture Model
  • Basim Alghabashi, N. Bouguila
  • Computer Science
    2018 IEEE International Conference on Internet of Things (iThings) and IEEE Green Computing and Communications (GreenCom) and IEEE Cyber, Physical and Social Computing (CPSCom) and IEEE Smart Data (SmartData)
  • 2018
TLDR
An unsupervised learning algorithm, based on a finite multi-dimensional generalized Gamma mixture model (GGMM), is presented for the purpose of clustering positive vectors, and is compared with Gamma and Gaussian mixture models.
...

References

Unsupervised learning of a finite mixture model based on the Dirichlet distribution and its application
TLDR
An unsupervised algorithm for learning a finite mixture model from multivariate data based on the Dirichlet distribution, which offers high flexibility for modeling data.
Finding overlapping components with MML
TLDR
MML coding considerations allows the derivation of useful results to guide the implementation of a mixture modelling program and allows model search to be controlled based on the minimum variance for a component and the amount of data required to distinguish two overlapping components.
Finite Mixture Models
TLDR
The aim of this article is to provide an up-to-date account of the theory and methodological developments underlying the applications of finite mixture models.
Sum and Difference Histograms for Texture Classification
  • M. Unser
  • Mathematics
    IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 1986
TLDR
Experimental results indicate that sum and difference histograms used conjointly are nearly as powerful as cooccurrence matrices for texture discrimination.
MML mixture modelling of multi-state, Poisson, von Mises circular and Gaussian distributions
TLDR
The MML theory can be regarded as the theory with the highest posterior probability, and the MML mixture modelling program, Snob, uses the message lengths from various parameter estimates to enable it to combine parameter estimation with selection of the number of components.
MML clustering of multi-state, Poisson, von Mises circular and Gaussian distributions
TLDR
This work outlines how MML is used for statistical parameter estimation, and how the MML mixture modelling program, Snob, uses the message lengths from various parameter estimates to enable it to combine parameter estimation with selection of the number of components and estimation of the relative abundances of the components.
Textural Features for Image Classification
TLDR
These results indicate that the easily computable textural features based on gray-tone spatial dependancies probably have a general applicability for a wide variety of image-classification applications.
Maximum likelihood estimation via the ECM algorithm: A general framework
Two major reasons for the popularity of the EM algorithm are that its maximum step involves only complete-data maximum likelihood estimation, which is often computationally simple, and that its convergence is stable, with each iteration increasing the likelihood.
Ockham's Razor and Bayesian Analysis
'Ockham's razor', the ad hoc principle enjoining the greatest possible simplicity in theoretical explanations, is presently shown to be justifiable as a consequence of Bayesian inference.
A mathematical theory of communication
In this final installment of the paper we consider the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now.
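The texture-summarization experiments rest on features like Unser's sum and difference histograms, cited above. A minimal sketch of that technique, assuming a grayscale `numpy` image; the feature formulas follow common formulations of Unser's definitions, and the function names are my own:

```python
import numpy as np

def sum_diff_histograms(img, dr=0, dc=1, levels=256):
    """Normalized sum and difference histograms of a grayscale image
    for a single displacement (dr, dc)."""
    a = img[:img.shape[0] - dr, :img.shape[1] - dc].astype(int)
    b = img[dr:, dc:].astype(int)
    s = (a + b).ravel()                  # sums in [0, 2*(levels-1)]
    d = (a - b).ravel()                  # differences in [-(levels-1), levels-1]
    hs = np.bincount(s, minlength=2 * levels - 1) / s.size
    hd = np.bincount(d + levels - 1, minlength=2 * levels - 1) / d.size
    return hs, hd

def texture_features(hs, hd):
    """A few histogram-based texture features (mean, contrast, homogeneity)."""
    i = np.arange(hs.size)
    j = np.arange(hd.size) - hd.size // 2   # center difference bins at 0
    mean = (i * hs).sum() / 2.0
    contrast = (j ** 2 * hd).sum()
    homogeneity = (hd / (1.0 + j ** 2)).sum()
    return mean, contrast, homogeneity
```

Features computed this way over several displacements form the vectors that a mixture model (Dirichlet or otherwise) can then cluster to summarize a texture database.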