Corpus ID: 238408425

Revisiting consistency of a recursive estimator of mixing distributions

@inproceedings{Dixit2021RevisitingCO,
  title={Revisiting consistency of a recursive estimator of mixing distributions},
  author={Vaidehi Dixit and Ryan Martin},
  year={2021}
}
Estimation of the mixing distribution under a general mixture model is a very difficult problem, especially when the mixing distribution is assumed to have a density. Predictive recursion (PR) is a fast, recursive algorithm for nonparametric estimation of a mixing distribution/density in general mixture models. However, the existing PR consistency results make rather strong assumptions, some of which fail for a class of mixture models relevant for monotone density estimation, namely, scale… 
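
For orientation, the PR update has a simple closed form: with mixture density m(x) = ∫ k(x|u) f(u) du, initial guess f_0, and weights w_1, w_2, ... in (0, 1), each observation X_i updates the estimate via

f_i(u) = (1 - w_i) f_{i-1}(u) + w_i k(X_i|u) f_{i-1}(u) / ∫ k(X_i|v) f_{i-1}(v) dv.

Below is a minimal Python sketch of this recursion on a fixed grid. The Gaussian kernel, uniform initial guess, and weight sequence w_i = (i+1)^(-0.67) are illustrative assumptions for the sketch, not choices made in the paper.

import numpy as np

def predictive_recursion(x, kernel, grid, w=None):
    # Minimal PR sketch: fixed grid, uniform initial guess f_0, and
    # default weights w_i = (i + 1)^(-0.67) are all illustrative choices.
    du = grid[1] - grid[0]
    f = np.full(grid.shape, 1.0 / (grid[-1] - grid[0]))  # uniform f_0
    if w is None:
        w = (np.arange(1, len(x) + 1) + 1.0) ** -0.67
    for i, xi in enumerate(x):
        k = kernel(xi, grid)            # k(x_i | u) evaluated on the grid
        denom = np.sum(k * f) * du      # normalizing constant, by quadrature
        f = (1 - w[i]) * f + w[i] * k * f / denom
    return f

# Illustrative use: data from a two-point normal location mixture,
# estimated with a N(u, 1) kernel.
rng = np.random.default_rng(0)
x = rng.normal(rng.choice([-2.0, 2.0], size=500), 1.0)
grid = np.linspace(-5.0, 5.0, 400)
kernel = lambda xi, u: np.exp(-0.5 * (xi - u) ** 2) / np.sqrt(2.0 * np.pi)
f_hat = predictive_recursion(x, kernel, grid)

Note that a single pass over the data suffices, which is the source of PR's speed relative to iterative methods.
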
1 Citation

A PRticle filter algorithm for nonparametric estimation of multivariate mixing distributions

A new strategy is proposed, referred to as the PRticle filter, wherein the basic PR algorithm is augmented with an initial set of particles that are adaptively reweighted along the updating sequence and used to obtain Monte Carlo approximations of the normalizing constants.
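
Reading that description, the core idea appears to be representing the mixing distribution by a weighted particle set, so the integral in the PR denominator becomes a weighted sum. A rough sketch under that reading follows; the particle representation, weight sequence, and update rule below are our assumptions, not the cited paper's exact algorithm.

import numpy as np

def prticle_filter(x, kernel, particles, w=None):
    # Sketch only: particles u_j drawn from the initial guess f_0 carry
    # importance weights W_j; the PR normalizing constant is approximated
    # by the Monte Carlo sum over particles, sum_j W_j * k(x_i | u_j).
    W = np.full(len(particles), 1.0 / len(particles))
    if w is None:
        w = (np.arange(1, len(x) + 1) + 1.0) ** -0.67  # assumed weights
    for i, xi in enumerate(x):
        k = kernel(xi, particles)            # kernel at each particle
        denom = np.sum(W * k)                # Monte Carlo normalizing constant
        W *= (1 - w[i]) + w[i] * k / denom   # multiplicative PR reweighting
        W /= W.sum()
    return W   # f_n approximated by the weighted particle set

The reweighting factor is exactly the ratio f_i/f_{i-1} implied by the PR update, evaluated at each particle.
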

References

SHOWING 1-10 OF 47 REFERENCES

CONSISTENCY OF A RECURSIVE ESTIMATE OF MIXING DISTRIBUTIONS

Mixture models have received considerable attention recently and Newton [Sankhyā Ser. A 64 (2002) 306-322] proposed a fast recursive algorithm for estimating a mixing distribution. We prove almost…

Estimating a Mixing Distribution on the Sphere Using Predictive Recursion

Mixture models are commonly used when data show signs of heterogeneity and, often, it is important to estimate the distribution of the latent variable responsible for that heterogeneity. This is a…

Asymptotic properties of predictive recursion: Robustness and rate of convergence

Here we explore general asymptotic properties of Predictive Recursion (PR) for nonparametric estimation of mixing distributions. We prove that, when the mixture model is mis-specified, the estimated…

Concentration rate and consistency of the posterior distribution for selected priors under monotonicity constraints

It is proved that the posterior distribution based on both priors concentrates at the rate (n/log n)^(-1/3), which is the minimax rate of estimation up to a log(n) factor.

Stochastic Approximation and Newton’s Estimate of a Mixing Distribution

Many statistical problems involve mixture models and the need for computationally efficient methods to estimate the mixing distribution has increased dramatically in recent years. Newton [Sankhyā…

Permutation-based uncertainty quantification about a mixing distribution

Nonparametric estimation of a mixing distribution based on data coming from a mixture model is a challenging problem. Beyond estimation, there is interest in uncertainty quantification, e.g., …

On empirical estimation of mode based on weakly dependent samples

Bowen Liu, S. Ghosh. Comput. Stat. Data Anal., 2020.

EM Estimation for Finite Mixture Models with Known Mixture Component Size

This work considers the use of an EM algorithm for fitting finite mixture models when the mixture component size is known, and shows that the method is robust to the choice of starting values and exhibits numerically stable convergence properties.
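
For contrast with PR's one-pass recursion, a bare-bones EM loop for a univariate Gaussian mixture with a known number of components K looks like the following. The Gaussian components, random initialization, and fixed iteration count are our illustrative assumptions, not details from the cited work.

import numpy as np

def em_gaussian_mixture(x, K, iters=200, seed=0):
    # EM sketch for a K-component univariate Gaussian mixture, K known.
    rng = np.random.default_rng(seed)
    n = len(x)
    pi = np.full(K, 1.0 / K)                   # mixing weights
    mu = rng.choice(x, size=K, replace=False)  # means, initialized at data points
    sig2 = np.full(K, np.var(x))               # variances
    for _ in range(iters):
        # E-step: posterior responsibilities r[i, k], computed stably in logs
        logp = (-0.5 * (x[:, None] - mu) ** 2 / sig2
                - 0.5 * np.log(2 * np.pi * sig2) + np.log(pi))
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form updates of weights, means, and variances
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        sig2 = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, sig2
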

Empirical Priors and Posterior Concentration Rates for a Monotone Density

In a Bayesian context, prior specification for inference on monotone densities is conceptually straightforward, but proving posterior convergence theorems is complicated by the fact that desirable…

Maximum Smoothed Likelihood Density Estimation for Inverse Problems

We consider the problem of estimating a pdf f from samples X_1, X_2, ..., X_n of a random variable with pdf Kf, where K is a compact integral operator. We employ a maximum smoothed likelihood…