Corpus ID: 21550557

INFERENCE FOR NORMAL MIXTURES IN MEAN AND VARIANCE

@inproceedings{Chen2008INFERENCEFN,
  title={INFERENCE FOR NORMAL MIXTURES IN MEAN AND VARIANCE},
  author={Jiahua Chen and Xianming Tan and Runchu Zhang},
  year={2008}
}
A finite mixture of normal distributions, in both mean and variance parameters, is a typical finite mixture in the location and scale families. Because the likelihood function is unbounded for any sample size, the ordinary maximum likelihood estimator is not consistent. Applying a penalty to the likelihood function to control the estimated component variances is thought to restore the optimal properties of the likelihood approach. Yet this proposal lacks practical guidelines, has not been… 
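
To make the unboundedness concrete, the following sketch uses generic notation rather than the paper's own. For an m-component normal mixture with mixing proportions \pi_j and component densities \phi(x; \mu_j, \sigma_j^2), the log-likelihood of a sample x_1, \ldots, x_n is

  \ell_n(\theta) = \sum_{i=1}^{n} \log \sum_{j=1}^{m} \pi_j \, \phi(x_i; \mu_j, \sigma_j^2).

Setting \mu_1 = x_1 and letting \sigma_1^2 \to 0 sends \ell_n(\theta) \to +\infty for every sample size, which is the unboundedness described above. The penalized approach maximizes instead

  p\ell_n(\theta) = \ell_n(\theta) + p_n(\sigma_1^2, \ldots, \sigma_m^2),

where the penalty satisfies p_n \to -\infty as any \sigma_j^2 \to 0; the specific form of p_n and the conditions under which the resulting estimator is consistent are what the paper investigates.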

Citations of this paper

Inference for multivariate normal mixtures

Consistency of the penalized MLE for two-parameter gamma mixture models

Two-parameter gamma distributions are widely used in reliability theory, lifetime data analysis, financial statistics, and other areas. Finite mixtures of gamma distributions are their natural

Hypothesis test for normal mixture models: The EM approach

Normal mixture distributions are arguably the most important mixture models, and also the most technically challenging. The likelihood function of the normal mixture model is unbounded based on a set

Consistency of the MLE under a two-parameter Gamma mixture model with a structural shape parameter

The finite Gamma mixture model is often used to describe randomness in income data, insurance data, and data from other applications. The popular likelihood approach, however, does not work for this

Penalized maximum likelihood estimation for skew normal mixtures

Skew normal mixture models provide a more flexible framework than the popular normal mixtures for modelling heterogeneous data with asymmetric behaviors. Due to the unboundedness of likelihood

A CLASSICAL INVARIANCE APPROACH TO THE NORMAL MIXTURE PROBLEM

Although normal mixture models have received great attention and are commonly used in different fields, they stand out for failing to have a finite maximum on the likelihood. In the univariate case,

Inference on the Order of a Normal Mixture

Finite normal mixture models are used in a wide range of applications. Hypothesis testing on the order of the normal mixture is an important yet unsolved problem. Existing procedures often lack a

On consistency of the MLE under finite mixtures of location-scale distributions with a structural parameter

Test for homogeneity in gamma mixture models using likelihood ratio

Estimation of finite mixture models of skew-symmetric circular distributions

Analysis of circular data is challenging, since the usual statistical methods are unsuitable and it is necessary to use circular periodic probabilistic models. Because some actual circular datasets
...

References

Showing 1-10 of 25 references

Estimating the components of a mixture of normal distributions

The problem of estimating the components of a mixture of two normal distributions, multivariate or otherwise, with common but unknown covariance matrices is examined. The maximum likelihood

Modified likelihood ratio test in finite mixture models with a structural parameter

TESTS FOR HOMOGENEITY IN NORMAL MIXTURES IN THE PRESENCE OF A STRUCTURAL PARAMETER

Often a question arises as to whether observed data are a sample from a homogeneous population or from a heterogeneous population. If, in particular, one wants to test for a single normal distribution

A Graphical Technique for Determining the Number of Components in a Mixture of Normals

When a population is assumed to be composed of a finite number of subpopulations, a natural model to choose is the finite mixture model. It will often be the case, however, that the number

Penalized maximum likelihood estimator for normal mixtures

The estimation of the parameters of a mixture of Gaussian densities is considered, within the framework of maximum likelihood, and a solution to likelihood function degeneracy which consists in penalizing the likelihood function is adopted.
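
As a concrete companion to the penalization idea, here is a minimal NumPy/SciPy sketch that evaluates a penalized log-likelihood for a univariate normal mixture. The function name, the default weight a_n = 1/sqrt(n), and the exact penalty term are illustrative assumptions, not the specific choices made in this reference or in the paper above.

    import numpy as np
    from scipy.stats import norm
    from scipy.special import logsumexp

    def penalized_loglik(x, w, mu, sigma2, a_n=None):
        """Illustrative penalized log-likelihood for a univariate normal mixture.

        x: (n,) data; w, mu, sigma2: (m,) mixing weights, means, variances.
        """
        x, w, mu, sigma2 = (np.asarray(a, dtype=float) for a in (x, w, mu, sigma2))
        if a_n is None:
            a_n = 1.0 / np.sqrt(x.size)  # assumed default; tuning is problem-specific

        # log f(x_i) = log sum_j w_j * phi(x_i; mu_j, sigma_j^2), computed stably
        comp = norm.logpdf(x[:, None], loc=mu, scale=np.sqrt(sigma2)) + np.log(w)
        loglik = logsumexp(comp, axis=1).sum()

        # Penalty diverging to -infinity as any sigma_j^2 -> 0, which keeps the
        # penalized likelihood bounded; the form below is one common choice.
        s2 = x.var()
        penalty = -a_n * np.sum(s2 / sigma2 + np.log(sigma2 / s2))
        return loglik + penalty

The logsumexp evaluation avoids underflow when a component density is negligible at some observations; maximizing this criterion, for example within a penalized EM iteration, is what keeps the fitted variances away from zero.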

A Constrained Formulation of Maximum-Likelihood Estimation for Normal Mixture Distributions

In this context, the normal densities are sometimes referred to as component densities. The log-likelihood function L(ψ), corresponding to a random sample {x_1, …, x_n}, is defined by L(ψ) = Σ_{k=1}
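
The truncated formula can be restated in standard notation; the version below is a sketch of how the constrained formulation is usually written, not a verbatim quotation from this reference. With mixing proportions \pi_k and component densities \phi(x; \mu_k, \sigma_k^2),

  L(\psi) = \sum_{i=1}^{n} \log \sum_{k=1}^{m} \pi_k \, \phi(x_i; \mu_k, \sigma_k^2),

and the constrained estimator maximizes L(\psi) subject to a lower bound on the ratios of component standard deviations, for example \min_{j \neq k} \sigma_j / \sigma_k \geq c for a fixed c \in (0, 1], which excludes the degenerate boundary where some \sigma_k \to 0.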

Penalized maximum likelihood estimation for univariate normal mixture distributions

Due to singularities of the likelihood function, the maximum likelihood approach for the estimation of the parameters of normal mixture models is an acknowledged ill posed optimization problem. Ill

Mixture densities, maximum likelihood, and the EM algorithm

This work discusses the formulation and theoretical and practical properties of the EM algorithm, a specialization to the mixture density context of a general algorithm used to approximate maximum-likelihood estimates for incomplete data problems.
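
Because the EM algorithm underlies most likelihood-based mixture fitting discussed on this page, here is a minimal sketch of one EM iteration for a univariate m-component normal mixture. The function name and the absence of any variance floor or penalty are simplifications for illustration; in a penalized or constrained setting the variance update would be modified accordingly.

    import numpy as np
    from scipy.stats import norm

    def em_step(x, w, mu, sigma2):
        """One EM iteration for a univariate normal mixture; returns (w, mu, sigma2)."""
        x, w, mu, sigma2 = (np.asarray(a, dtype=float) for a in (x, w, mu, sigma2))

        # E-step: responsibility of component j for observation i
        dens = w * norm.pdf(x[:, None], loc=mu, scale=np.sqrt(sigma2))  # shape (n, m)
        resp = dens / dens.sum(axis=1, keepdims=True)

        # M-step: weighted sample moments
        nk = resp.sum(axis=0)                      # effective component counts
        w_new = nk / x.size
        mu_new = (resp * x[:, None]).sum(axis=0) / nk
        sigma2_new = (resp * (x[:, None] - mu_new) ** 2).sum(axis=0) / nk
        return w_new, mu_new, sigma2_new

Iterating em_step until the log-likelihood change is small gives the usual unpenalized MLE search, which for normal mixtures in mean and variance must be paired with a penalty or constraint to avoid the degenerate solutions described above.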

CONSISTENCY OF THE MAXIMUM LIKELIHOOD ESTIMATOR IN THE PRESENCE OF INFINITELY MANY INCIDENTAL PARAMETERS

θ and α_i. The parameter θ, upon which all the distributions depend, is called "structural"; the parameters {α_i} are called "incidental". Throughout this paper we shall assume that the X_{ij} are

Optimal Rate of Convergence for Finite Mixture Models

In finite mixture models, we establish the best possible rate of convergence for estimating the mixing distribution. We find that the key for estimating the mixing distribution is the knowledge of