Reducing Grounded Learning Tasks To Grammatical Inference
We show that the grounded task of learning a semantic parser from ambiguous training data, as discussed in Kim and Mooney (2010), can be reduced to a Probabilistic Context-Free Grammar learning task in a way that gives state-of-the-art results.
WikiNet: A Very Large Scale Multi-Lingual Concept Network
This paper describes a multi-lingual concept network obtained automatically by mining for concepts and relations and exploiting a variety of sources of knowledge from Wikipedia.
A summary of the 2012 JHU CLSP workshop on zero resource speech technologies and models of early language acquisition
We summarize the accomplishments of a multi-disciplinary workshop exploring the computational and scientific issues surrounding zero-resource (unsupervised) speech technologies and related models of early language acquisition.
Collocations in Multilingual Natural Language Generation: Lexical Functions meet Lexical Functional Grammar
In a collocation, the choice of one lexical item depends on the choice made for another.
Implementing lexical functions in XLE
Linguistic collocations such as pay attention or heavy rain are semi-compositional expressions that require special treatment in symbolic grammars for NLP. Within the Meaning-Text Theory framework, …
Unsupervised Word Segmentation in Context
We use topics from a Latent Dirichlet Allocation model as a proxy for "activity" contexts to label the Providence corpus.
A Particle Filter Algorithm for Bayesian Word Segmentation
We present a novel online algorithm for the word segmentation models of Goldwater et al. (2009) which is, to our knowledge, the first published version of a Particle Filter for this kind of model.
Exploring the Role of Stress in Bayesian Word Segmentation using Adaptor Grammars
We show that enabling a current state-of-the-art Bayesian word segmentation model to take advantage of stress cues noticeably improves its performance.
Using Rejuvenation to Improve Particle Filtering for Bayesian Word Segmentation
We present a novel incremental learning algorithm for the word segmentation problem originally introduced in Goldwater (2006).
Studying the Effect of Input Size for Bayesian Word Segmentation on the Providence Corpus
In this paper, we study the effect that input size has on the performance of word segmentation models embodying different kinds of linguistic assumptions.