• Corpus ID: 14173412

GIBBS SAMPLING FOR THE UNINITIATED

@inproceedings{Resnik2010GIBBSSF,
  title={GIBBS SAMPLING FOR THE UNINITIATED},
  author={Philip Resnik and Eric A. Hardisty},
  year={2010}
}
This document is intended for computer scientists who would like to try out a Markov Chain Monte Carlo (MCMC) technique, particularly in order to do inference with Bayesian models on problems related to text processing. We try to keep theory to the absolute minimum needed, though we work through the details much more explicitly than you usually see even in "introductory" explanations. That means we've attempted to be ridiculously explicit in our exposition and notation. After providing the…
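
The paper's running example applies Gibbs sampling to a Naive Bayes model of documents. As a warm-up that is not taken from the paper, here is a minimal Python sketch of the core loop on a toy model whose full conditionals are known in closed form: a standard bivariate normal with correlation rho, where x | y ~ N(rho*y, 1 - rho^2) and symmetrically for y | x.

import math
import random

def gibbs_bivariate_normal(rho, n_samples=5000, burn_in=500):
    """Gibbs sampler for a standard bivariate normal with correlation rho."""
    sd = math.sqrt(1.0 - rho * rho)    # conditional standard deviation
    x, y = 0.0, 0.0                    # arbitrary initial state
    samples = []
    for t in range(n_samples + burn_in):
        x = random.gauss(rho * y, sd)  # resample x from p(x | y)
        y = random.gauss(rho * x, sd)  # resample y from p(y | x)
        if t >= burn_in:               # keep only post-burn-in draws
            samples.append((x, y))
    return samples

draws = gibbs_bivariate_normal(rho=0.8)
print(sum(x for x, _ in draws) / len(draws))  # close to the true mean of 0

Each step conditions on the most recent value of the other variable; that alternation of full-conditional draws is the entire algorithm.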

A Lightweight Guide on Gibbs Sampling and JAGS

In this document we give some insight into how Gibbs sampling works and how the JAGS modelling framework implements it. The hope is that, once the reader has understood these concepts, …

Gibbs Sampling with JAGS: Behind the Scenes

The aim of this paper is to give a high-level overview of the JAGS algorithms and extensions that implement Gibbs sampling, a Bayesian inference technique used in various scientific domains to generate samples from a posterior probability density function.
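
To make the workflow concrete, here is a sketch using the pyjags Python bindings on a conjugate beta-binomial model. The model string is ordinary JAGS syntax; the pyjags argument names follow its documented example usage but should be treated as assumptions that may vary across versions.

import pyjags  # assumes the pyjags bindings are installed

# Beta-binomial model in JAGS syntax: theta ~ Beta(1, 1), y successes in n trials.
model_code = """
model {
    theta ~ dbeta(1, 1)
    y ~ dbin(theta, n)
}
"""

model = pyjags.Model(code=model_code, data={'y': 7, 'n': 10}, chains=2)
samples = model.sample(2000, vars=['theta'])  # dict of arrays, one per variable

# The exact posterior is Beta(8, 4), mean 8/12, so this should print about 0.667.
print(samples['theta'].mean())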

Sampling Rankings

A recursive algorithm is presented for computing the number of rankings consistent with a set of optimal candidates in the framework of Optimality Theory; it improves on the mistake bound for Tesar and Smolensky's Constraint Demotion algorithm and comes within a logarithmic factor of the best possible mistake bound for learning rankings.

Collapsed Variational Bayesian Inference for Hidden Markov Models

This paper proposes two collapsed variational Bayesian inference algorithms for hidden Markov models, a popular framework for representing time series data, and validates them on the natural language processing task of unsupervised part-of-speech induction, showing that they are both more computationally efficient than sampling and more accurate than standard variational Bayesian inference for HMMs.

On the segmentation of switching autoregressive processes by nonparametric Bayesian methods

  • S. Dash, P. Djurić
  • Computer Science
    2014 22nd European Signal Processing Conference (EUSIPCO)
  • 2014
A variant of the nonparametric Bayesian (NPB) forward-backward (FB) method is demonstrated for sampling state sequences of hidden Markov models (HMMs), when the continuous-valued observations follow autoregressive (AR) processes.
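
The forward-backward sampling step can be sketched for a plain discrete HMM; the autoregressive observation model and nonparametric prior of the paper are omitted, and all names below are illustrative.

import numpy as np

def ffbs(log_lik, trans, init):
    """Forward-filtering backward-sampling for a discrete HMM.

    log_lik: (T, K) log observation likelihoods, log p(x_t | z_t = k)
    trans:   (K, K) transition matrix, trans[j, k] = p(z_{t+1} = k | z_t = j)
    init:    (K,) initial state distribution
    Returns one state sequence drawn from p(z_{1:T} | x_{1:T}).
    """
    T, K = log_lik.shape
    lik = np.exp(log_lik - log_lik.max(axis=1, keepdims=True))  # rescaled likelihoods

    # Forward pass: alpha[t] is the filtering distribution p(z_t | x_{1:t}).
    alpha = np.zeros((T, K))
    alpha[0] = init * lik[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ trans) * lik[t]
        alpha[t] /= alpha[t].sum()

    # Backward pass: sample the last state, then each earlier state in turn.
    z = np.zeros(T, dtype=int)
    z[T - 1] = np.random.choice(K, p=alpha[T - 1])
    for t in range(T - 2, -1, -1):
        w = alpha[t] * trans[:, z[t + 1]]
        z[t] = np.random.choice(K, p=w / w.sum())
    return z

Normalizing the forward messages at every step guards against numerical underflow without changing the distribution of the sampled sequence.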

Sometimes Average is Best: The Importance of Averaging for Prediction using MCMC Inference in Topic Modeling

This work systematically studies different strategies for averaging MCMC samples and shows empirically that proper averaging leads to significant improvements in prediction.
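
The prescription is simple to state in code: estimate predictive quantities by averaging over MCMC draws rather than plugging in a single draw. A minimal sketch with simulated posterior samples (all values illustrative):

import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for S posterior draws of one topic's word distribution
# over a vocabulary of size V (each row is one draw).
S, V = 100, 1000
theta_draws = rng.dirichlet(np.ones(V), size=S)

word = 42  # a held-out word id

p_single = theta_draws[-1, word]     # plug-in estimate from one draw
p_avg = theta_draws[:, word].mean()  # Monte Carlo average over all draws

print(p_single, p_avg)

Averaging the probabilities themselves, not their logs, is what approximates the posterior predictive distribution.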

Statistical Learning, Inductive Bias, and Bayesian Inference in Language Acquisition

This chapter discusses the contribution of an emerging theoretical framework, Bayesian learning, which can be used to investigate the inductive bias needed for language acquisition: a combination of hard and soft constraints.

On the Estimation and Use of Statistical Modelling in Information Retrieval

Overall, this thesis concludes that distributional assumptions can be replaced with an effective, efficient and principled method for determining the "true" distribution, and that using the "true" distribution can lead to improved retrieval performance.

Modeling Perspective Using Adaptor Grammars

It is demonstrated that an adaptive naive Bayes model captures multiword lexical usages associated with perspective, and new state-of-the-art results for perspective classification are established using the Bitter Lemons corpus, a collection of essays about Middle East issues written from Israeli and Palestinian points of view.

Bayesian Word Alignment for Statistical Machine Translation

This work proposes a Gibbs sampler for fully Bayesian inference in IBM Model 1, integrating over all possible parameter values in finding the alignment distribution, and shows that Bayesian inference outperforms EM in all of the tested language pairs, domains and data set sizes.
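
As a concrete illustration (a reconstruction under standard assumptions, not the authors' code), one collapsed Gibbs sweep over IBM Model 1 alignments with a symmetric Dirichlet prior on the translation tables looks roughly like this; ALPHA and V_F are illustrative values.

import random
from collections import defaultdict

ALPHA = 0.01   # symmetric Dirichlet hyperparameter (illustrative)
V_F = 10000    # target-vocabulary size in the denominator (illustrative)

def gibbs_sweep(bitext, align, pair_counts, src_totals):
    """Resample every alignment link from its collapsed conditional."""
    for s, (src, tgt) in enumerate(bitext):
        for j, f in enumerate(tgt):
            old = align[s][j]                      # remove the current link
            pair_counts[(src[old], f)] -= 1
            src_totals[src[old]] -= 1
            # p(a_j = i | rest) is proportional to
            # (n(e_i, f) + alpha) / (n(e_i) + alpha * V)
            weights = [(pair_counts[(e, f)] + ALPHA) / (src_totals[e] + ALPHA * V_F)
                       for e in src]
            new = random.choices(range(len(src)), weights=weights)[0]
            align[s][j] = new                      # add the sampled link back
            pair_counts[(src[new], f)] += 1
            src_totals[src[new]] += 1

# Tiny illustrative bitext with a random initial alignment.
bitext = [(["das", "haus"], ["the", "house"]),
          (["das", "buch"], ["the", "book"])]
align = [[random.randrange(len(src)) for _ in tgt] for src, tgt in bitext]
pair_counts, src_totals = defaultdict(int), defaultdict(int)
for s, (src, tgt) in enumerate(bitext):
    for j, a in enumerate(align[s]):
        pair_counts[(src[a], tgt[j])] += 1
        src_totals[src[a]] += 1

for _ in range(100):
    gibbs_sweep(bitext, align, pair_counts, src_totals)
print(align)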
...

References

Bayesian Inference for PCFGs via Markov Chain Monte Carlo

Two Markov chain Monte Carlo algorithms for Bayesian inference of probabilistic context-free grammars (PCFGs) from terminal strings are presented, providing an alternative to maximum-likelihood estimation using the Inside-Outside algorithm.

Bayesian Density Estimation and Inference Using Mixtures

We describe and illustrate Bayesian inference in models for density estimation using mixtures of Dirichlet processes. These models provide natural settings for density estimation and are…

Probabilistic Inference Using Markov Chain Monte Carlo Methods

The role of probabilistic inference in artificial intelligence is outlined, the theory of Markov chains is presented, and various Markov chain Monte Carlo algorithms are described, along with a number of supporting techniques.
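
Of the algorithms such surveys cover, random-walk Metropolis is the quickest to write down; a minimal one-dimensional sketch (all names illustrative), needing only an unnormalized log density:

import math
import random

def metropolis(log_p, x0, step=0.5, n_samples=10000):
    """Random-walk Metropolis sampler for a 1-D unnormalized log density."""
    x, lp = x0, log_p(x0)
    samples = []
    for _ in range(n_samples):
        prop = x + random.gauss(0.0, step)   # symmetric Gaussian proposal
        lp_prop = log_p(prop)
        # Accept with probability min(1, p(prop) / p(x)); the symmetric
        # proposal makes the Hastings correction cancel.
        if math.log(random.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Target: unnormalized standard normal, log p(x) = -x^2 / 2.
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0)
print(sum(draws) / len(draws))  # should land near 0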

Nonparametric Bayesian Models of Lexical Acquisition

This thesis rests on the assumption that the kinds of generalizations the learner may make are constrained by the interaction of many different types of stochastic information, including innate learning biases.

Bayesian Modeling of Dependency Trees Using Hierarchical Pitman-Yor Priors

Two hierarchical Bayesian models for dependency trees are described, and it is shown that Eisner's classic generative dependency model can be substantially improved by using a hierarchical Pitman-Yor process as a prior over the distribution over dependents of a word.

Probabilistic Graphical Models - Principles and Techniques

The framework of probabilistic graphical models, presented in this book, provides a general approach for causal reasoning and decision making under uncertainty, allowing interpretable models to be constructed and then manipulated by reasoning algorithms.

Learning Probabilistic Models of Word Sense Disambiguation

This dissertation presents several new methods of supervised and unsupervised learning of word sense disambiguation models that rely on the use of Gibbs Sampling and the Expectation Maximization algorithm.

An Introduction to MCMC for Machine Learning

The purpose of this introductory paper is to introduce the Monte Carlo method with emphasis on probabilistic machine learning and to review the main building blocks of modern Markov chain Monte Carlo simulation.

RNA Modeling Using Gibbs Sampling and Stochastic Context Free Grammars

A new method of discovering the common secondary structure of a family of homologous RNA sequences using Gibbs sampling and stochastic context-free grammars is proposed. Given an unaligned set of…

Knowledge Lean Word-Sense Disambiguation

A corpus-based approach to word-sense disambiguation is presented that requires only information that can be automatically extracted from untagged text; Gibbs sampling results in a small but consistent improvement in disambiguation accuracy over the EM algorithm.