Corpus ID: 220525800

A Bayesian Multiple Testing Paradigm for Model Selection in Inverse Regression Problems

@article{Chatterjee2020ABM,
  title={A Bayesian Multiple Testing Paradigm for Model Selection in Inverse Regression Problems},
  author={Debashis Chatterjee and Sourabh Bhattacharya},
  journal={arXiv: Statistics Theory},
  year={2020}
}
In this article, we propose a novel Bayesian multiple testing formulation for model and variable selection in inverse setups, judiciously embedding the idea of inverse reference distributions proposed by Bhattacharya (2013) in a mixture framework consisting of the competing models. We develop the theory and methods in the general context encompassing parametric and nonparametric competing models, dependent data, as well as misspecifications. Our investigation shows that asymptotically the… 
1 Citation

How Ominous is the Future Global Warming Premonition

Global warming, the phenomenon of increasing global average temperature in the recent decades, is receiving wide attention due to its very significant adverse effects on climate. Whether global

References

Showing 1-10 of 30 references

Asymptotic theory of dependent Bayesian multiple testing procedures under possible model misspecification

We study asymptotic properties of Bayesian multiple testing procedures and provide sufficient conditions for strong consistency under general dependence structure. We also consider a novel Bayesian

A Bayesian discovery procedure

It is shown that, under a coherent decision theoretic framework, a loss function combining true positive and false positive counts leads to a decision rule that is based on a threshold of the posterior probability of the alternative.
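The threshold rule described above can be made concrete for one standard loss. As a hedged sketch (the loss weights and function names here are illustrative assumptions, not taken from the cited paper): under a loss of the form L = lam·FP − TP, rejecting hypothesis i contributes lam·(1 − v_i) − v_i to the posterior expected loss, where v_i is the posterior probability of the alternative, so the optimal rule rejects exactly when v_i > lam/(1 + lam).

```python
def reject_set(post_probs, lam=1.0):
    """Indices i whose posterior alternative probability v_i exceeds
    the expected-loss-minimizing threshold lam / (1 + lam)."""
    t = lam / (1.0 + lam)
    return [i for i, v in enumerate(post_probs) if v > t]

# With lam = 1 the threshold is 0.5, so only the first and third
# hypotheses are rejected here.
probs = [0.95, 0.40, 0.70, 0.10]
print(reject_set(probs, lam=1.0))  # -> [0, 2]
```

Raising lam (penalizing false positives more heavily) pushes the threshold toward 1 and shrinks the rejection set.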

Convergence of Pseudo-Bayes Factors in Forward and Inverse Regression Problems

Almost sure exponential convergence of pseudo-Bayes factors for large samples under a general setup consisting of dependent data and model misspecifications is established, particularly on general parametric and nonparametric regression setups in both forward and inverse contexts.

The positive false discovery rate: a Bayesian interpretation and the q-value

This work introduces a modified version of the FDR called the “positive false discovery rate” (pFDR), which can be written as a Bayesian posterior probability and can be connected to classification theory.
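A minimal sketch of how q-values are typically computed from p-values in this framework (a Storey-style step-up calculation with a user-supplied estimate of the null proportion pi0; setting pi0 = 1 recovers the conservative Benjamini-Hochberg-like version — the function name and this simplified recipe are assumptions for illustration, not the cited paper's exact procedure):

```python
def q_values(pvals, pi0=1.0):
    """q-value for each p-value: the smallest estimated pFDR-style
    error rate at which that hypothesis would be rejected."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    q = [0.0] * m
    running_min = 1.0
    # Step-up pass from the largest p-value to the smallest:
    # the estimate at rank k is pi0 * m * p_(k) / k, forced monotone.
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        running_min = min(running_min, pi0 * m * pvals[i] / rank)
        q[i] = running_min
    return q

print(q_values([0.01, 0.02, 0.03, 0.5]))
```

For the example above, the three small p-values all receive a q-value of 0.04 (the monotone step-up estimate), while the large one keeps 0.5.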

Non-marginal decisions: A novel Bayesian multiple testing procedure

This article develops a novel Bayesian multiple testing procedure that substantially enhances efficiency by judicious exploitation of the dependence structure among the hypotheses, and proves theoretical results on the relevant error probabilities, establishing the coherence and usefulness of the method.

Dynamics of Bayesian Updating with Dependent Data and Misspecified Models

This work establishes sufficient conditions for posterior convergence when all hypotheses are wrong, and the data have complex dependencies, and derives a kind of large deviations principle for the posterior measure, extending in some cases to rates of convergence, and discusses the advantages of predicting using a combination of models known to be wrong.

A GENERAL DECISION THEORETIC FORMULATION OF PROCEDURES CONTROLLING FDR AND FNR FROM A BAYESIAN PERSPECTIVE

A general decision theoretic formulation is given to multiple testing, allowing descriptions of measures of false discoveries and false non-discoveries in terms of certain loss functions even when

Fundamentals of Nonparametric Bayesian Inference

This authoritative text draws on theoretical advances of the past twenty years to synthesize all aspects of Bayesian nonparametrics, from prior construction to computation and large sample behavior of posteriors, making it valuable for both graduate students and researchers in statistics and machine learning.

A Fully Bayesian Approach to Assessment of Model Adequacy in Inverse Problems

Posterior Consistency of Bayesian Inverse Regression and Inverse Reference Distributions

We consider Bayesian inference in inverse regression problems where the objective is to infer about unobserved covariates from observed responses and covariates. We establish posterior consistency of