Corpus ID: 239024552

Robustness against conflicting prior information in regression

@inproceedings{Gagnon2021RobustnessAC,
  title={Robustness against conflicting prior information in regression},
  author={Philippe Gagnon},
  year={2021}
}
Including prior information about model parameters is a fundamental step of any Bayesian statistical analysis. It is viewed positively by some because it allows, among other things, expert opinion about model parameters to be incorporated quantitatively. It is viewed negatively by others because it sets the stage for subjectivity in statistical analysis. Certainly, it creates problems when the inference is skewed due to a conflict with the collected data. According to the theory of conflict resolution (O… 
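
To make the notion of prior-data conflict concrete, here is a minimal sketch (my own toy, not the paper's code or model) of a conjugate Bayesian linear regression in which a fairly confident prior is centred far from the data-generating coefficients. The posterior mean is a precision-weighted compromise between the prior and the data, so the conflicting prior visibly skews the estimate away from the least-squares fit. All names and numbers are illustrative.

```python
# Minimal sketch (not from the paper): conjugate Bayesian linear regression with a
# prior that conflicts with the data. All values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one covariate
beta_true = np.array([1.0, 2.0])
sigma2 = 1.0                                            # error variance, assumed known
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)

# A fairly confident prior centred far from the data-generating coefficients.
prior_mean = np.array([-5.0, -5.0])
prior_cov = 0.1 * np.eye(2)

# Conjugate update for beta | y with known sigma2:
#   post_cov  = (X'X / sigma2 + prior_cov^{-1})^{-1}
#   post_mean = post_cov @ (X'y / sigma2 + prior_cov^{-1} @ prior_mean)
prior_prec = np.linalg.inv(prior_cov)
post_cov = np.linalg.inv(X.T @ X / sigma2 + prior_prec)
post_mean = post_cov @ (X.T @ y / sigma2 + prior_prec @ prior_mean)

ols = np.linalg.lstsq(X, y, rcond=None)[0]
print("least-squares fit:", ols)        # close to (1, 2)
print("posterior mean:   ", post_mean)  # pulled toward the conflicting prior
```
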
1 Citation


Detecting and diagnosing prior and likelihood sensitivity with power-scaling
TLDR
A diagnostic that can indicate the presence of prior-data conflict or likelihood noninformativity is suggested, and limitations of the power-scaling approach are discussed.
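
As a rough illustration of the power-scaling idea behind this diagnostic (my own sketch, not the cited paper's implementation): the prior is raised to a power alpha and existing posterior draws are reweighted by importance sampling; large movements in posterior summaries as alpha moves away from 1 suggest sensitivity to the prior.

```python
# Hedged sketch of power-scaling sensitivity via importance weighting.
# The function name, toy draws, and prior below are mine, not the cited paper's code.
import numpy as np

def power_scale_prior_mean(draws, log_prior, alpha):
    """Posterior mean after raising the prior to power alpha, estimated by
    importance-reweighting existing posterior draws (weights p(theta)^(alpha-1))."""
    logw = (alpha - 1.0) * log_prior(draws)
    logw -= logw.max()                 # stabilise before exponentiating
    w = np.exp(logw)
    w /= w.sum()
    return np.sum(w * draws)

# Toy usage: pretend these are MCMC draws of a scalar parameter.
rng = np.random.default_rng(1)
draws = rng.normal(loc=2.0, scale=0.5, size=4000)
log_prior = lambda t: -0.5 * (t - 0.0) ** 2 / 4.0   # N(0, 2^2) prior, up to a constant
for a in (0.5, 1.0, 2.0):
    print(a, power_scale_prior_mean(draws, log_prior, a))
```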

References

SHOWING 1-10 OF 21 REFERENCES
A New Bayesian Approach to Robustness Against Outliers in Linear Regression
TLDR
This paper proposes a model with super heavy-tailed errors, and proves that it is wholly robust, meaning that the impact of outliers gradually vanishes as they move further and further away from the general trend.
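
A quick numeric illustration of this kind of robustness (a toy of my own, not the paper's super heavy-tailed model; it uses a Student-t error model as the heavy-tailed stand-in): as a single outlier moves away from the bulk of the data, the posterior mean of a location parameter keeps drifting under normal errors but settles back toward the clean-data estimate under heavy tails.

```python
# Hedged toy illustration of robustness to an outlier under normal vs. heavy-tailed errors.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
clean = rng.normal(loc=0.0, scale=1.0, size=20)
grid = np.linspace(-10, 60, 4001)            # grid over the location parameter

def posterior_mean(data, logpdf):
    # Flat prior on the location; posterior computed by simple grid integration.
    loglik = np.array([logpdf(data - m).sum() for m in grid])
    w = np.exp(loglik - loglik.max())
    return np.sum(grid * w) / np.sum(w)

for outlier in (5.0, 20.0, 50.0):
    data = np.append(clean, outlier)
    m_norm = posterior_mean(data, lambda r: stats.norm.logpdf(r))
    m_t = posterior_mean(data, lambda r: stats.t.logpdf(r, df=2))
    print(f"outlier={outlier:5.1f}  normal errors: {m_norm: .3f}   t errors: {m_t: .3f}")
```
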
Robust Bayesian Regression Analysis Using Ramsay-Novick Distributed Errors with Student-t Prior
This paper investigates Bayesian treatment of regression modelling with the Ramsay-Novick (RN) distribution, specifically developed for robust inferential procedures. It falls into the category of
Bayesian heavy-tailed models and conflict resolution: A review
TLDR
Bayesian modelling with heavy-tailed distributions is shown to produce more reasonable conflict resolution, typically by favouring one source of information over the other.
Bayesian robustness to outliers in linear regression and ratio estimation
Whole robustness is a nice property to have for statistical models. It implies that the impact of outliers gradually decreases to nothing as they converge towards plus or minus infinity. So far, the
An automatic robust Bayesian approach to principal component regression
TLDR
A Bayesian approach that is robust to outliers in both the dependent variable and the covariates is introduced and compared to its nonrobust Bayesian counterpart, the traditional frequentist approach, and a commonly employed robust frequentist method.
Bayesian Model Averaging for Linear Regression Models
We consider the problem of accounting for model uncertainty in linear regression models. Conditioning on a single selected model ignores model uncertainty, and thus leads to the
The Bayesian Lasso
The Lasso estimate for linear regression parameters can be interpreted as a Bayesian posterior mode estimate when the regression parameters have independent Laplace (i.e., double-exponential) priors.
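
For reference, the correspondence stated here can be written out in a couple of lines (a standard derivation, not specific to this paper's notation):

```latex
With Gaussian errors $y \mid \beta \sim N(X\beta, \sigma^2 I)$ and independent
Laplace priors $p(\beta_j) = \tfrac{\lambda}{2}\exp(-\lambda|\beta_j|)$,
\[
-\log p(\beta \mid y) = \frac{1}{2\sigma^2}\lVert y - X\beta\rVert_2^2
  + \lambda \lVert \beta \rVert_1 + \text{const},
\]
so the posterior mode is exactly the Lasso solution
$\hat\beta = \arg\min_\beta \lVert y - X\beta\rVert_2^2 + 2\sigma^2\lambda \lVert \beta \rVert_1$.
```
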
The Choice of Variables in Multiple Regression
This paper is concerned with the analysis of data from a multiple regression of a single variable, y, on a set of independent variables, x1, x2, …
On Outlier Rejection Phenomena in Bayes Inference
Inference is considered for a location parameter given a random sample. Outliers are not explicitly modelled, but rejection of extreme observations occurs naturally in any Bayesian analysis
Robustness to outliers in location–scale parameter model using log-regularly varying distributions
Estimating the location and scale parameters is common in statistics, using, for instance, the well-known sample mean and standard deviation. However, inference can be contaminated by the presence of