
- Shay B. Cohen, Noah A. Smith
- HLT-NAACL
- 2009

We present a family of priors over probabilistic grammar weights, called the shared logistic normal distribution. This family extends the partitioned logistic normal distribution, enabling factored covariance between the probabilities of different derivation events in the probabilistic grammar, providing a new way to encode prior knowledge about an unknown…

- Shay B. Cohen, Kevin Gimpel, Noah A. Smith
- NIPS
- 2008

We explore a new Bayesian model for probabilistic grammars, a family of distributions over discrete structures that includes hidden Markov models and probabilistic context-free grammars. Our model extends the correlated topic model framework to probabilistic grammars, exploiting the logistic normal distribution as a prior over the grammar parameters. We…
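The logistic normal construction mentioned above can be sketched concretely: draw a Gaussian vector and push it through the softmax, so that correlations in the Gaussian induce correlations between rule probabilities. A minimal sketch; the dimensions and covariance values below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_logistic_normal(mu, cov, rng):
    """Draw one grammar multinomial: eta ~ N(mu, cov), theta = softmax(eta)."""
    eta = rng.multivariate_normal(mu, cov)
    e = np.exp(eta - eta.max())          # numerically stable softmax
    return e / e.sum()

# Toy multinomial over 3 rewrite rules; the off-diagonal covariance
# entry lets the probabilities of rules 1 and 2 co-vary.
mu = np.zeros(3)
cov = np.array([[1.0, 0.5, 0.0],
                [0.5, 1.0, 0.0],
                [0.0, 0.0, 1.0]])
theta = sample_logistic_normal(mu, cov, rng)
print(theta, theta.sum())  # a valid probability vector summing to 1
```

Unlike a Dirichlet prior, the covariance matrix here can encode that certain rules tend to be probable (or improbable) together.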

- Shay B. Cohen, Dipanjan Das, Noah A. Smith
- EMNLP
- 2011

We describe a method for prediction of linguistic structure in a language for which only unlabeled data is available, using annotated data from a set of one or more helper languages. Our approach is based on a model that locally mixes between supervised models from the helper languages. Parallel data is not used, allowing the technique to be applied even in…
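The idea of mixing between supervised helper-language models can be illustrated, in a heavily simplified form with hypothetical helper distributions, as a convex combination over a shared event space:

```python
import numpy as np

def mix_helpers(helper_probs, weights):
    """Convex combination of helper-language distributions over the
    same event space: p(x) = sum_i w_i * p_i(x)."""
    helper_probs = np.asarray(helper_probs)   # shape (n_helpers, n_events)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()         # normalize mixture weights
    return weights @ helper_probs

# Two hypothetical helper languages' distributions over 3 tags.
p_helper1 = [0.7, 0.2, 0.1]
p_helper2 = [0.3, 0.4, 0.3]
mixed = mix_helpers([p_helper1, p_helper2], weights=[0.6, 0.4])
print(mixed)  # [0.54 0.28 0.18]
```

The paper's model chooses such mixtures locally (per decision) and learns the weights; this global fixed-weight version only shows the basic combination.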

- Shay B. Cohen, David M. Blei, Noah A. Smith
- HLT-NAACL
- 2010

Adaptor grammars extend probabilistic context-free grammars to define prior distributions over trees with “rich get richer” dynamics. Inference for adaptor grammars seeks to find parse trees for raw text. This paper describes a variational inference algorithm for adaptor grammars, providing an alternative to Markov chain Monte Carlo methods. To derive this…
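The “rich get richer” dynamics the abstract mentions can be illustrated with a plain Chinese restaurant process. This is a simplification: adaptor grammars attach Pitman–Yor adaptors to whole subtrees, but the seating rule below shows the core dynamics:

```python
import random

def crp_assignments(n, alpha, rng):
    """Seat n customers by the Chinese restaurant process: an existing
    table with c customers is chosen with probability c / (i + alpha),
    a new table with probability alpha / (i + alpha)."""
    counts = []
    for i in range(n):
        r = rng.random() * (i + alpha)
        for k, c in enumerate(counts):
            if r < c:
                counts[k] += 1   # rich get richer: large tables attract more
                break
            r -= c
        else:
            counts.append(1)     # open a new table
    return counts

rng = random.Random(42)
tables = crp_assignments(100, alpha=1.0, rng=rng)
print(tables)  # table sizes; they sum to 100
```

In the adaptor-grammar analogy, a "table" is a cached subtree: frequently reused subtrees become increasingly likely to be reused again.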

- Shay B. Cohen, Noah A. Smith
- EMNLP-CoNLL
- 2007

In morphologically rich languages, should morphological and syntactic disambiguation be treated sequentially or as a single problem? We describe several efficient, probabilistically interpretable ways to apply joint inference to morphological and syntactic disambiguation using lattice parsing. Joint inference is shown to compare favorably to pipeline parsing…

- Shay B. Cohen, Karl Stratos, Michael Collins, Dean P. Foster, Lyle H. Ungar
- HLT-NAACL
- 2013

Latent-variable PCFGs (L-PCFGs) are a highly successful model for natural language parsing. Recent work (Cohen et al., 2012) has introduced a spectral algorithm for parameter estimation of L-PCFGs, which—unlike the EM algorithm—is guaranteed to give consistent parameter estimates (it has PAC-style guarantees of sample complexity). This paper describes…

- Shay B. Cohen, Karl Stratos, Michael Collins, Dean P. Foster, Lyle Ungar
- ACL
- 2012

Spectral Learning of Latent-Variable PCFGs. Jeju, Republic of Korea, 8–14 July 2012; ©2012 Association for Computational Linguistics. Dept. of Computer Science, Columbia University; Dept. of Statistics / Dept. of Computer and Information Science, University of Pennsylvania…
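A central step in spectral estimation is an SVD of an empirical cross-moment matrix between inside and outside features, whose top singular vectors project the features down to the latent-state dimensionality. A stripped-down sketch of that step on synthetic data; the matrix, dimensions, and feature setup are illustrative, not the paper's actual inside-outside features:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for an empirical cross-moment matrix between
# "inside" and "outside" feature functions of tree nodes (random
# here; the real features would come from parsed training trees).
m = 2                                   # assumed number of latent states
inside = rng.standard_normal((1000, 8))
outside = inside @ rng.standard_normal((8, 6))
outside += 0.01 * rng.standard_normal((1000, 6))
Omega = inside.T @ outside / 1000.0     # empirical cross-moment, shape (8, 6)

# Thin SVD; the top-m left/right singular vectors give the
# projection matrices used by the spectral method.
U, S, Vt = np.linalg.svd(Omega, full_matrices=False)
proj_inside, proj_outside = U[:, :m], Vt[:m, :].T
print(proj_inside.shape, proj_outside.shape)  # (8, 2) (6, 2)
```

The appeal over EM, as the abstract above notes, is that this moment-based pipeline has no local optima and comes with consistency guarantees.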

- Giorgio Satta, Shay B. Cohen, Marco Damonte
- EACL
- 2017

Abstract Meaning Representation (AMR) is a semantic representation for natural language that embeds annotations related to traditional tasks such as named entity recognition, semantic role labeling, word sense disambiguation and co-reference…

- Shay B. Cohen, Noah A. Smith
- Journal of Machine Learning Research
- 2010

Probabilistic grammars offer great flexibility in modeling discrete sequential data like natural language text. Their symbolic component is amenable to inspection by humans, while their probabilistic component helps resolve ambiguity. They also permit the use of well-understood, general-purpose learning algorithms. There has been an increased interest in…

- Shay B. Cohen, Noah A. Smith
- ACL
- 2010

We consider the search for a maximum likelihood assignment of hidden derivations and grammar weights for a probabilistic context-free grammar, the problem approximately solved by “Viterbi training.” We show that solving and even approximating Viterbi training for PCFGs is NP-hard. We motivate the use of uniform-at-random initialization for Viterbi EM as an…
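Viterbi (hard) EM alternates a best-assignment E-step with maximum-likelihood re-estimation. As a toy illustration of the scheme, here it is applied to a two-component unit-variance Gaussian mixture rather than a PCFG (the PCFG case is what the paper proves NP-hard); the data and initialization are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def viterbi_em(data, n_components=2, n_iters=20, rng=rng):
    """Hard (Viterbi) EM for a 1-D unit-variance Gaussian mixture:
    the E-step picks the single most likely component per point,
    the M-step is the MLE given those hard assignments."""
    means = rng.choice(data, size=n_components, replace=False)  # uniform-at-random init
    for _ in range(n_iters):
        # Hard E-step: with unit variance, the nearest mean is most likely.
        assign = np.argmin(np.abs(data[:, None] - means[None, :]), axis=1)
        # M-step: re-estimate each mean from its assigned points.
        for k in range(n_components):
            if np.any(assign == k):
                means[k] = data[assign == k].mean()
    return np.sort(means)

data = np.concatenate([rng.normal(-3, 1, 200), rng.normal(3, 1, 200)])
means_hat = viterbi_em(data)
print(means_hat)  # estimates near -3 and 3
```

The hard assignment makes each iteration cheap, but, as with the PCFG case the abstract discusses, the procedure only finds a local optimum, which is why initialization matters.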