Learn More
This paper proposes a new approach to sparse-signal detection called the horseshoe estimator. We show that the horseshoe is a close cousin of the lasso in that it arises from the same class of multivariate scale mixtures of normals, but that it is almost universally superior to the double-exponential prior at handling sparsity. A theoretical framework is …
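For readers unfamiliar with the scale-mixture-of-normals representation this abstract refers to, here is a minimal sketch (not code from the paper; the function name sample_horseshoe and the fixed global scale tau are illustrative choices) of drawing coefficients from the horseshoe prior:

```python
import numpy as np

def sample_horseshoe(p, tau=1.0, rng=None):
    """Draw p coefficients from the horseshoe prior via its
    scale-mixture-of-normals representation:
        lambda_i ~ C+(0, 1),   beta_i | lambda_i ~ N(0, tau^2 * lambda_i^2).
    """
    rng = np.random.default_rng(rng)
    # A half-Cauchy draw is the absolute value of a standard Cauchy draw.
    lam = np.abs(rng.standard_cauchy(size=p))
    return rng.normal(0.0, tau * lam)

# The draws show the horseshoe's signature: most are near zero, a few are huge.
draws = sample_horseshoe(100_000)
print(np.median(np.abs(draws)), np.abs(draws).max())
```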
Taylor & Francis makes every effort to ensure the accuracy of all the information (the " Content ") contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views(More)
This chapter discusses Markov Chain Monte Carlo (MCMC) based methods for estimating continuous-time asset pricing models. We describe the Bayesian approach to empirical asset pricing, the mechanics of MCMC algorithms and their strong theoretical underpinnings. We provide a tutorial on building MCMC algorithms and show how to estimate equity …
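As a hedged illustration of the MCMC mechanics such a tutorial covers (a generic sketch, not the chapter's own estimators; the toy data, flat priors, and step sizes are our assumptions), the snippet below runs a random-walk Metropolis sampler for the drift and volatility of an Euler-discretised geometric Brownian motion:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: log-returns observed at spacing dt, so under the Euler
# discretisation r_t ~ N(mu * dt, sigma^2 * dt).
dt, mu_true, sigma_true = 1 / 252, 0.08, 0.2
r = rng.normal(mu_true * dt, sigma_true * np.sqrt(dt), size=2_000)

def log_post(mu, log_sigma):
    """Log posterior with flat priors on mu and log(sigma)."""
    sigma = np.exp(log_sigma)
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2 * dt)
                  - (r - mu * dt) ** 2 / (2 * sigma**2 * dt))

# Random-walk Metropolis over (mu, log sigma).
theta = np.array([0.0, np.log(0.1)])
lp = log_post(*theta)
step = np.array([0.02, 0.02])
samples = []
for _ in range(20_000):
    prop = theta + step * rng.standard_normal(2)
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

samples = np.array(samples)[5_000:]          # discard burn-in
print("posterior mean mu   :", samples[:, 0].mean())
print("posterior mean sigma:", np.exp(samples[:, 1]).mean())
```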
We study the classic problem of choosing a prior distribution for a location parameter β = (β_1, ..., β_p) as p grows large. First, we study the standard "global-local shrinkage" approach, based on scale mixtures of normals. Two theorems are presented which characterize certain desirable properties of shrinkage priors for sparse problems. Next, we review …
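To make the global-local structure concrete, the sketch below (our illustration with unit noise and the global scale τ fixed at 1, not the paper's theorems) checks numerically a standard property of the normal-means model: conditional on the local scale λ_i, the posterior mean is (1 − κ_i)·y_i with shrinkage weight κ_i = 1/(1 + τ²λ_i²), and a half-Cauchy local scale gives κ_i the U-shaped Beta(1/2, 1/2) density.

```python
import numpy as np

rng = np.random.default_rng(1)

# Global-local shrinkage in the normal-means model with unit noise and tau = 1:
# conditional on lambda_i, E[beta_i | y_i, lambda_i] = (1 - kappa_i) * y_i,
# where kappa_i = 1 / (1 + tau^2 * lambda_i^2) is the shrinkage weight.
lam = np.abs(rng.standard_cauchy(size=1_000_000))   # half-Cauchy local scales
kappa = 1.0 / (1.0 + lam**2)

# Under a C+(0,1) local scale, kappa ~ Beta(1/2, 1/2): mass piles up near 0
# (signals escape shrinkage) and near 1 (noise is shrunk to zero).
hist, _ = np.histogram(kappa, bins=10, range=(0.0, 1.0), density=True)
print(np.round(hist, 2))   # largest values in the first and last bins
```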
This paper presents a general, fully Bayesian framework for sparse supervised-learning problems based on the horseshoe prior. The horseshoe prior is a member of the family of multivariate scale mixtures of normals, and is therefore closely related to widely used approaches for sparse Bayesian learning, including, among others, Laplacian priors (e.g. the …
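As a hedged sketch of how the scale-mixture representation is used in practice (a toy normal-means calculation, not the paper's full supervised-learning framework; the importance-sampling scheme and function name are our own choices), the following approximates the horseshoe posterior mean for a single noisy observation:

```python
import numpy as np

def horseshoe_posterior_mean(y, tau=1.0, n_draws=200_000, rng=None):
    """Approximate E[beta | y] in the normal-means model
        y | beta ~ N(beta, 1),  beta | lambda ~ N(0, tau^2 * lambda^2),
        lambda ~ C+(0, 1),
    by importance sampling over lambda drawn from its half-Cauchy prior.
    Given lambda, the posterior mean is (1 - kappa) * y with
    kappa = 1 / (1 + tau^2 * lambda^2), and the importance weight is the
    marginal density N(y; 0, 1 + tau^2 * lambda^2).
    """
    rng = np.random.default_rng(rng)
    lam = np.abs(rng.standard_cauchy(size=n_draws))
    var = 1.0 + tau**2 * lam**2                    # marginal variance of y given lambda
    w = np.exp(-0.5 * y**2 / var) / np.sqrt(var)   # unnormalised weights
    shrink = 1.0 - 1.0 / var                       # 1 - kappa
    return np.sum(w * shrink) / np.sum(w) * y

# Small observations are shrunk hard toward zero; large ones are left nearly alone.
for y in (0.5, 1.0, 2.0, 4.0, 8.0):
    print(y, round(horseshoe_posterior_mean(y, rng=0), 3))
```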
A Bayesian analysis of the multinomial probit model with fully identified parameters. We present a new prior and corresponding algorithm for Bayesian analysis of the multinomial probit model. Our new approach places a prior directly on the identified parameter space. The key is the specification of a prior on the covariance matrix so that the (1,1) …
We develop a simulation-based method for the online updating of Gaussian process regression and classification models. Our method exploits sequential Monte Carlo to produce a sequential design algorithm that is thriftier, in terms of computational speed, than the established MCMC alternative. The latter is less ideal for sequential design since it must be …
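To show the flavour of a sequential-Monte-Carlo update for an online Gaussian process (a minimal sketch under our own assumptions — an RBF kernel, particles over the lengthscale only, known noise, and no resampling or rejuvenation step — not the authors' algorithm), each incoming observation reweights every hyperparameter particle by that particle's GP predictive density:

```python
import numpy as np

rng = np.random.default_rng(2)

def rbf(x1, x2, ell):
    """Squared-exponential kernel between two 1-D input arrays."""
    return np.exp(-0.5 * ((x1[:, None] - x2[None, :]) / ell) ** 2)

def gp_predict(x_tr, y_tr, x_new, ell, noise=0.1):
    """GP predictive mean and variance for a noisy observation at x_new."""
    K = rbf(x_tr, x_tr, ell) + noise**2 * np.eye(len(x_tr))
    k = rbf(x_tr, np.array([x_new]), ell).ravel()
    mean = k @ np.linalg.solve(K, y_tr)
    var = 1.0 + noise**2 - k @ np.linalg.solve(K, k)
    return mean, max(var, 1e-9)

# Particles over the RBF lengthscale, drawn from a log-uniform prior.
n_particles = 200
ells = np.exp(rng.uniform(np.log(0.05), np.log(2.0), size=n_particles))
log_w = np.zeros(n_particles)

# Stream observations from a smooth test function; each new point adds to a
# particle's log-weight its GP predictive log-density there (the incremental
# likelihood).
f = lambda x: np.sin(3 * x)
x_seen, y_seen = np.array([0.0]), np.array([f(0.0)])
for x_new in rng.uniform(0.0, 2.0, size=30):
    y_new = f(x_new) + 0.1 * rng.standard_normal()
    for i, ell in enumerate(ells):
        m, v = gp_predict(x_seen, y_seen, x_new, ell)
        log_w[i] += -0.5 * (y_new - m) ** 2 / v - 0.5 * np.log(2 * np.pi * v)
    x_seen, y_seen = np.append(x_seen, x_new), np.append(y_seen, y_new)

weights = np.exp(log_w - log_w.max())
weights /= weights.sum()
print("posterior mean lengthscale:", np.sum(weights * ells))
```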
This chapter develops Markov Chain Monte Carlo (MCMC) methods for Bayesian inference in continuous-time asset pricing models. The Bayesian solution to the inference problem is the distribution of parameters and latent variables conditional on observed data, and MCMC methods provide a tool for exploring these high-dimensional, complex distributions. We first …