Empirical Bayes posterior concentration in sparse high-dimensional linear models
A new empirical Bayes approach to inference in the normal linear model that uses the data in the prior in two ways, for centering and for regularization, relevant to both estimation and model selection.
Inferential Models: Reasoning with Uncertainty
Covers preliminaries (introduction, assumed background, an overview of scientific inference, prediction and inference, outline of the book), prior-free probabilistic inference, and further technical details.
Inferential Models: A Framework for Prior-Free Posterior Probabilistic Inference
Posterior probabilistic statistical inference without priors is an important but so far elusive goal. Fisher’s fiducial inference, Dempster–Shafer theory of belief functions, and Bayesian inference…
On ε-Optimality of the Pursuit Learning Algorithm
This paper identifies and fills a gap in existing proofs of probabilistic convergence for pursuit learning, and sheds light on the importance of a vanishing sequence of tuning parameters in a theoretical convergence analysis.
Dempster-Shafer Theory and Statistical Inference with Weak Beliefs
A general description of the weak beliefs (WB) method in the context of inferential models, its interplay with the Dempster-Shafer (DS) calculus, and the maximal belief solution is presented, and new applications of the WB method to two high-dimensional hypothesis testing problems are given.
Satellite conjunction analysis and the false confidence theorem
The Martin–Liu validity criterion is introduced as a benchmark by which to identify statistical methods that are free from false confidence, and it is shown that uncertainty ellipsoids satisfy the validity criterion.
Conditional inferential models: combining information for prior‐free probabilistic inference
The inferential model (IM) framework provides valid prior-free probabilistic inference by focusing on predicting unobserved auxiliary variables, but efficient IM-based inference can be challenging…
Consistency of a recursive estimate of mixing distributions
Mixture models have received considerable attention recently and Newton [Sankhyā Ser. A 64 (2002) 306-322] proposed a fast recursive algorithm for estimating a mixing distribution. We prove almost…
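Newton's recursive algorithm updates an estimate of the mixing density one observation at a time, taking a convex combination of the current estimate and its Bayes update. A minimal sketch on a discrete grid, assuming a user-supplied component kernel and the common (not prescribed) weight choice w_i = (i + 1)^(-0.67):

```python
import numpy as np

def predictive_recursion(x, theta_grid, f0, kernel, gamma=0.67):
    """Newton-style predictive recursion for a mixing density estimate.

    Sketch only: `kernel(x, theta)` is the component density k(x | theta)
    evaluated on the grid; the weights w_i = (i + 1)**(-gamma) are one
    common choice, not prescribed by the paper.
    """
    f = np.asarray(f0, dtype=float)
    d = theta_grid[1] - theta_grid[0]  # uniform grid spacing assumed
    for i, xi in enumerate(x, start=1):
        w = (i + 1.0) ** (-gamma)
        num = kernel(xi, theta_grid) * f   # k(x_i | theta) * f_{i-1}(theta)
        denom = num.sum() * d              # normalizing integral (Riemann sum)
        f = (1 - w) * f + w * num / denom  # convex update, stays a density
    return f

# Toy usage: normal kernel, data generated with mixing mass near zero
rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=200)
grid = np.linspace(-3.0, 3.0, 121)
d = grid[1] - grid[0]
f0 = np.ones_like(grid)
f0 /= f0.sum() * d  # uniform initial guess, normalized on the grid
normal_pdf = lambda x, t: np.exp(-0.5 * (x - t) ** 2) / np.sqrt(2 * np.pi)
f_hat = predictive_recursion(data, grid, f0, normal_pdf)
```

Because each step is a convex combination of two densities, the estimate remains a proper density on the grid at every iteration, which is what makes the recursion fast: one pass over the data, no iterative optimization.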
Calibrating general posterior credible regions
A scalar tuning parameter is introduced that controls the posterior distribution spread, and a Monte Carlo algorithm is developed that sets this parameter so that the corresponding credible region achieves the nominal frequentist coverage probability.
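The idea of tuning a scalar spread parameter until Monte Carlo coverage matches the nominal level can be sketched with a Robbins-Monro-style update. This is an illustrative toy, not the paper's exact algorithm: a Gaussian interval's half-width is scaled by omega, bootstrap resamples stand in for repeated sampling, and omega is nudged toward the value giving coverage 1 - alpha:

```python
import numpy as np

def calibrate_spread(data, alpha=0.05, steps=2000, seed=1):
    # Illustrative Robbins-Monro-style sketch (not the paper's exact
    # procedure): scale the credible interval's spread by omega and nudge
    # omega using bootstrap coverage checks until the interval covers at
    # the nominal rate 1 - alpha.
    rng = np.random.default_rng(seed)
    n = len(data)
    xbar, s = data.mean(), data.std(ddof=1)
    z = 1.959963984540054           # 97.5% normal quantile (matches alpha=0.05)
    omega = 1.0                     # spread multiplier to be calibrated
    for t in range(1, steps + 1):
        boot = rng.choice(data, size=n, replace=True)
        half = z * omega * s / np.sqrt(n)
        covered = abs(boot.mean() - xbar) <= half  # treat xbar as "truth"
        # undercoverage pushes omega up, overcoverage pushes it down
        omega += t ** (-0.51) * ((1 - alpha) - covered)
        omega = max(omega, 1e-3)    # keep the multiplier positive
    return omega

rng = np.random.default_rng(42)
omega_hat = calibrate_spread(rng.normal(size=100))
```

For well-specified Gaussian data the interval is already nearly calibrated, so the fitted multiplier should settle close to 1; the practical payoff of such a scheme is for posteriors whose default spread over- or under-covers.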
A nonparametric empirical Bayes framework for large-scale multiple testing.
Simulations and real data examples demonstrate that the proposed PRtest's careful handling of the nonnull density can give a much better fit in the tails of the mixture distribution which, in turn, can lead to more realistic conclusions.