Publications
Minimum disparity estimation for continuous models: Efficiency, distributions and robustness
A general class of minimum distance estimators for continuous models, called minimum disparity estimators, is introduced. The conventional technique is to minimize a distance between a kernel density …
  • 149 citations · 19 highly influential · PDF
Statistical Inference: The Minimum Distance Approach
Contents: Introduction · General Notation · Illustrative Examples · Some Background and Relevant Definitions · Parametric Inference Based on the Maximum Likelihood Method · Hypothesis Testing by Likelihood Methods …
  • 213 citations · 16 highly influential · PDF
Penalized minimum disparity methods for multinomial models
Robust estimation of the probability vector is an important problem for the finite k-cell multinomial model. When the probability vector is unrestricted, its estimate is equal to the vector of the …
  • 36 citations · 7 highly influential · PDF
Weighted Likelihood Equations with Bootstrap Root Search
We discuss a method of weighting likelihood equations with the aim of obtaining fully efficient and robust estimators. We discuss the case of continuous probability models using unimodal …
  • 114 citations · 6 highly influential
Robust Bayes estimation using the density power divergence
The ordinary Bayes estimator based on the posterior density can have potential problems with outliers. Using the density power divergence measure, in this paper we develop an estimation method based …
  • 37 citations · 6 highly influential · PDF
Minimum distance estimation: The approach using density-based distances
This chapter discusses the concept of minimum distance estimation using density-based distances. Density-based minimum distance methods have proven to be valuable additions to the …
  • 43 citations · 4 highly influential
Minimum negative exponential disparity estimation in parametric models
Works of Lindsay (1994) and Basu and Sarkar (1994a) provide heuristic arguments and some empirical evidence that the minimum negative exponential disparity estimator (MNEDE), like the …
  • 26 citations · 4 highly influential
Robust estimation for non-homogeneous data and the selection of the optimal tuning parameter: the density power divergence approach
The density power divergence (DPD) measure, defined in terms of a single parameter α, has proved to be a popular tool in the area of robust estimation [1]. Recently, Ghosh and Basu [5] rigorously …
  • 29 citations · 3 highly influential
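To make the DPD idea in the entry above concrete: for a parametric density f_θ and tuning parameter α > 0, the minimum DPD estimator minimizes ∫ f_θ^{1+α} dx − (1 + 1/α) n⁻¹ Σ f_θ(Xᵢ)^α. The sketch below applies this recipe to a normal mean with known scale; the data set, the grid range, and the choice α = 0.5 are illustrative assumptions, and the crude grid search merely stands in for a proper numerical optimizer.

```python
import math

def dpd_objective(mu, data, sigma=1.0, alpha=0.5):
    """Empirical DPD objective for the N(mu, sigma^2) model.

    H(mu) = integral(f^(1+a)) - (1 + 1/a) * mean(f(x_i)^a); for the normal
    density the integral term has the closed form
    (2*pi*sigma^2)^(-a/2) / sqrt(1 + a).
    """
    integral = (2 * math.pi * sigma ** 2) ** (-alpha / 2) / math.sqrt(1 + alpha)
    norm = sigma * math.sqrt(2 * math.pi)
    dens = [math.exp(-0.5 * ((x - mu) / sigma) ** 2) / norm for x in data]
    return integral - (1 + 1 / alpha) * sum(d ** alpha for d in dens) / len(data)

def mdpde_mean(data, alpha=0.5):
    """Crude grid-search minimizer over mu in [-5, 5] (illustrative only)."""
    grid = [i / 100 for i in range(-500, 501)]
    return min(grid, key=lambda m: dpd_objective(m, data, alpha=alpha))

data = [0.1, -0.2, 0.05, 0.3, -0.1, 10.0]   # five inliers plus one gross outlier
sample_mean = sum(data) / len(data)          # the MLE, dragged toward the outlier
dpd_estimate = mdpde_mean(data, alpha=0.5)   # downweights the outlier
```

Because the outlier's model density f_θ(10)^α is vanishingly small near the inlier centre, it contributes almost nothing to the objective, so the DPD estimate stays near the inliers while the sample mean does not.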
A generalized divergence for statistical inference
The power divergence (PD) and the density power divergence (DPD) families have proven to be useful tools in the area of robust inference. In this paper, we consider a superfamily of divergences which …
  • 27 citations · 3 highly influential · PDF
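For reference, the two families named in the entry above have the following standard forms for densities g (data) and f (model); these are the textbook definitions, sketched here for context rather than taken from this paper:

```
% Power divergence (Cressie–Read) family, indexed by lambda:
\mathrm{PD}_{\lambda}(g,f)
  = \frac{1}{\lambda(\lambda+1)}
    \int g \left[ \left( \frac{g}{f} \right)^{\lambda} - 1 \right] dx

% Density power divergence family, indexed by alpha > 0:
d_{\alpha}(g,f)
  = \int \left[ f^{1+\alpha}
      - \left( 1 + \frac{1}{\alpha} \right) g f^{\alpha}
      + \frac{1}{\alpha} g^{1+\alpha} \right] dx
```

A superfamily in this sense interpolates between the two: it contains the PD family and the DPD family as special cases of a two-parameter index.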