Data augmentation and Gibbs sampling are two closely related, sampling-based approaches to the calculation of posterior moments. The fact that each produces a sample whose constituents are neither independent nor identically distributed complicates the assessment of convergence and numerical accuracy of the approximations to the expected value of functions …
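The dependence among Gibbs draws that this abstract highlights is the reason a naive iid standard error understates numerical uncertainty. A minimal sketch of one common remedy, the batch-means numerical standard error (the function name is hypothetical, and this is only one of several estimators in this literature):

```python
import numpy as np

def batch_means_nse(draws, n_batches=20):
    """Numerical standard error of the sample mean of a (dependent)
    MCMC output sequence, by batch means: split the chain into
    contiguous batches and use the spread of batch means to estimate
    the variance of the overall mean."""
    draws = np.asarray(draws, dtype=float)
    T = len(draws) // n_batches * n_batches   # drop the ragged tail
    batch_avgs = draws[:T].reshape(n_batches, -1).mean(axis=1)
    return batch_avgs.std(ddof=1) / np.sqrt(n_batches)
```

For a positively autocorrelated chain this estimate is larger than the iid formula `std/sqrt(T)`, which is the point: ignoring the dependence overstates the accuracy of the posterior-moment approximation.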
This paper surveys the fundamental principles of subjective Bayesian inference in econometrics and the implementation of those principles using posterior simulation methods. The emphasis is on the combination of models and the development of predictive distributions. Moving beyond conditioning on a fixed number of completely specified models, the paper …
The construction and implementation of a Gibbs sampler for efficient simulation from the truncated multivariate normal and Student-t distributions is described. It is shown how the accuracy and convergence of integrals based on the Gibbs sample may be assessed, and how an estimate of the probability of the constraint set under the unrestricted …
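The coordinate-wise sampler this abstract describes can be sketched briefly for the box-constrained multivariate normal case: each full conditional is a univariate truncated normal, drawn here by inverse-CDF. This is a minimal illustration under simplifying assumptions (box constraints only, no Student-t case), and the function name is hypothetical:

```python
import numpy as np
from scipy.stats import norm

def tmvn_gibbs(mu, Sigma, lower, upper, n_iter=1000, seed=0):
    """Gibbs sampler for N(mu, Sigma) truncated to the box
    lower <= x <= upper. Each coordinate is drawn from its univariate
    truncated-normal full conditional via the inverse CDF."""
    rng = np.random.default_rng(seed)
    mu, Sigma = np.asarray(mu, float), np.asarray(Sigma, float)
    d = len(mu)
    x = np.clip(mu.copy(), lower, upper)      # feasible starting point
    draws = np.empty((n_iter, d))
    for s in range(n_iter):
        for i in range(d):
            idx = [j for j in range(d) if j != i]
            S12 = Sigma[i, idx]
            S22inv = np.linalg.inv(Sigma[np.ix_(idx, idx)])
            cm = mu[i] + S12 @ S22inv @ (x[idx] - mu[idx])   # cond. mean
            cs = np.sqrt(Sigma[i, i] - S12 @ S22inv @ S12)   # cond. sd
            a = norm.cdf((lower[i] - cm) / cs)               # trunc. bounds
            b = norm.cdf((upper[i] - cm) / cs)               # in CDF units
            x[i] = cm + cs * norm.ppf(rng.uniform(a, b))
        draws[s] = x
    return draws
```

Every draw respects the constraints by construction; a byproduct of such a sampler, as the abstract notes, is an estimate of the probability of the constraint set under the unrestricted distribution.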
This article takes up methods for Bayesian inference in a linear model in which the disturbances are independent and have identical Student-t distributions. It exploits the equivalence of the Student-t distribution and an appropriate scale mixture of normals, and uses a Gibbs sampler to perform the computations. The new method is applied to some well-known …
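The scale-mixture equivalence mentioned above — a Student-t error is a normal whose precision is a Gamma-distributed latent variable — makes both Gibbs steps conjugate. A minimal sketch, assuming a flat prior on the coefficients and the error scale fixed at one to keep it short (the function name is hypothetical):

```python
import numpy as np

def t_regression_gibbs(X, y, nu=5.0, n_iter=2000, seed=0):
    """Gibbs sampler for y = X b + e with e_i ~ Student-t(nu), using
    e_i | lam_i ~ N(0, 1/lam_i), lam_i ~ Gamma(nu/2, rate nu/2)."""
    rng = np.random.default_rng(seed)
    n, k = X.shape
    lam = np.ones(n)
    draws = np.empty((n_iter, k))
    for s in range(n_iter):
        # b | lam, y: weighted-least-squares normal posterior
        XtL = X.T * lam                     # k x n, rows scaled by lam
        V = np.linalg.inv(XtL @ X)          # posterior covariance
        m = V @ (XtL @ y)                   # posterior mean
        b = rng.multivariate_normal(m, V)
        # lam_i | b, y: conjugate Gamma update (numpy uses scale = 1/rate)
        resid = y - X @ b
        lam = rng.gamma((nu + 1) / 2.0, 2.0 / (nu + resid**2))
        draws[s] = b
    return draws
```

Observations with large residuals receive small `lam` draws and are automatically downweighted in the next coefficient update, which is how the Student-t model robustifies the regression.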
This article provides an exact Bayesian framework for analyzing the arbitrage pricing theory (APT). Based on the Gibbs sampler, we show how to obtain the exact posterior distributions for functions of interest in the factor model. In particular, we propose a measure of the APT pricing deviations and obtain its exact posterior distribution. Using monthly …
In the specification of linear regression models it is common to indicate a list of candidate variables from which a subset enters the model with nonzero coefficients. This paper interprets this specification as a mixed continuous-discrete prior distribution for coefficient values. It then utilizes a Gibbs sampler to construct posterior moments. It is shown …
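A mixed continuous-discrete prior of this kind puts a point mass at zero on each coefficient and a continuous "slab" elsewhere, and the Gibbs sampler visits inclusion patterns by drawing each indicator from its full conditional. A minimal sketch, assuming a point mass at zero with a N(0, tau2) slab and error variance fixed at one (the function name is hypothetical, and this is a simplified variant, not the paper's exact prior):

```python
import numpy as np

def ssvs_gibbs(X, y, tau2=10.0, q=0.5, n_iter=2000, seed=0):
    """Gibbs sampler for variable selection: each coefficient is exactly
    zero with prior probability 1 - q, and N(0, tau2) otherwise.
    Returns posterior inclusion probabilities for each candidate."""
    rng = np.random.default_rng(seed)
    n, k = X.shape
    beta = np.zeros(k)
    incl = np.zeros((n_iter, k))
    for s in range(n_iter):
        for j in range(k):
            z = X[:, j]
            r = y - X @ beta + z * beta[j]     # partial residual
            v = 1.0 / (z @ z + 1.0 / tau2)     # slab posterior variance
            m = v * (z @ r)                    # slab posterior mean
            # log Bayes factor for inclusion, beta_j integrated out
            log_bf = 0.5 * np.log(v / tau2) + 0.5 * m * m / v
            logit = np.log(q / (1 - q)) + log_bf
            p1 = 1.0 / (1.0 + np.exp(-np.clip(logit, -50, 50)))
            beta[j] = (m + np.sqrt(v) * rng.normal()) if rng.uniform() < p1 else 0.0
        incl[s] = beta != 0
    return incl.mean(axis=0)
```

Averaging the indicators over the chain gives the posterior probability that each candidate variable enters with a nonzero coefficient.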
This paper extends the conventional Bayesian mixture of normals model by permitting state probabilities to depend on observed covariates. The dependence is captured by a simple multinomial probit model. A conventional and rapidly mixing MCMC algorithm provides access to the posterior distribution at modest computational cost. This model is competitive with …
A prediction model is any statement of a probability distribution for an outcome not yet observed. This study considers the properties of weighted linear combinations of n prediction models, or linear pools, evaluated using the conventional log predictive scoring rule. The log score is a concave function of the weights, and, in general, an optimal linear …
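Because the log score is concave in the weights, the optimal linear pool can be found reliably; one simple route is the EM-style fixed-point update for mixture weights, which increases the log score at every step. A minimal sketch (function names hypothetical; `p[t, j]` holds model j's predictive density evaluated at realized outcome t):

```python
import numpy as np

def optimal_pool_weights(p, n_iter=500):
    """Weights w on the simplex maximizing the average log score
    (1/T) * sum_t log( sum_j w_j * p[t, j] ). Concavity of the log
    score means the EM fixed-point update converges to the optimum."""
    T, n = p.shape
    w = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        pool = p @ w                          # pooled density per outcome
        w = w * (p / pool[:, None]).mean(axis=0)  # stays on the simplex
    return w

def log_score(p, w):
    """Average log predictive score of the pool with weights w."""
    return np.log(p @ w).mean()
```

When one model weakly dominates the others pointwise, the optimum sits at a vertex of the simplex; otherwise an interior pool can beat every individual model, which is the phenomenon this literature studies.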
This paper develops new econometric methods to infer hospital quality in a model with discrete dependent variables and non-random selection. Mortality rates in patient discharge records are widely used to infer hospital quality. However, hospital admission is not random and some hospitals may attract patients with greater unobserved severity of illness than …
We present a theoretical and empirical framework for computing and evaluating linear projections conditional on hypothetical paths of monetary policy. A modest policy intervention does not significantly shift agents’ beliefs about policy regime and does not induce the changes in behavior that Lucas (1976) emphasizes. Applied to an econometric model of U.S. …