Publications
The Curious Case of Neural Text Degeneration
TLDR
By sampling text from the dynamic nucleus of the probability distribution, which allows for diversity while effectively truncating the less reliable tail of the distribution, the generated text better matches the quality of human text, yielding enhanced diversity without sacrificing fluency and coherence.
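A minimal sketch of the top-p ("nucleus") filtering step described above, in plain NumPy: keep the smallest set of tokens whose cumulative probability reaches a threshold p, renormalize, and sample from that set. The threshold value and the toy distribution are illustrative choices, not settings from the paper.

```python
import numpy as np

def nucleus_sample(probs: np.ndarray, p: float = 0.9, rng=None) -> int:
    """Sample a token id from the smallest set of tokens with cumulative mass >= p."""
    rng = rng or np.random.default_rng()
    order = np.argsort(probs)[::-1]                                  # token ids, most probable first
    sorted_probs = probs[order]
    cumulative = np.cumsum(sorted_probs)
    cutoff = min(int(np.searchsorted(cumulative, p)) + 1, len(probs))  # smallest prefix with mass >= p
    nucleus = sorted_probs[:cutoff] / sorted_probs[:cutoff].sum()    # renormalize the nucleus
    return int(order[rng.choice(cutoff, p=nucleus)])

# Toy next-token distribution over a 5-word vocabulary (illustrative only).
probs = np.array([0.45, 0.30, 0.15, 0.07, 0.03])
print(nucleus_sample(probs, p=0.9))
```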
Learning to Write with Cooperative Discriminators
TLDR
Human evaluation demonstrates that text generated by the RNN system is preferred over that of baselines by a large margin, with significant gains in the overall coherence, style, and information content of the generated text.
Online Segment to Segment Neural Transduction
TLDR
An online neural sequence-to-sequence model that learns to alternate between encoding and decoding segments of the input as it is read, addressing the bottleneck of vanilla encoder-decoders, which must read and memorize the entire input sequence in a fixed-length hidden state.
Robust Incremental Neural Semantic Graph Parsing
TLDR
This work proposes a neural encoder-decoder transition-based parser, the first full-coverage semantic graph parser for Minimal Recursion Semantics (MRS), which uses stack-based embedding features and predicts graphs jointly with unlexicalized predicates and their token alignments.
Cross-Lingual Morphological Tagging for Low-Resource Languages
TLDR
This approach extends existing methods for projecting part-of-speech tags across languages, using bitext to infer constraints on the possible tags for a given word type or token, and employs Wsabie, a discriminative embedding-based model with rank-based learning.
BottleSum: Unsupervised and Self-supervised Sentence Summarization using the Information Bottleneck Principle
TLDR
This paper proposes a novel approach to unsupervised sentence summarization by mapping the Information Bottleneck principle to a conditional language modelling objective: given a sentence, the approach seeks a compressed sentence that can best predict the next sentence.
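A schematic sketch of the trade-off described above, not the paper's implementation: a compression should stay short (discard information) while still predicting the next sentence well (retain relevant information). The `lm_logprob` function is a crude word-overlap stand-in for a real conditional language model, and the weight `alpha` is an assumed hyperparameter.

```python
def lm_logprob(text: str, context: str) -> float:
    """Toy stand-in for log p(text | context): word overlap with the context."""
    return float(len(set(text.lower().split()) & set(context.lower().split())))

def ib_score(compression: str, next_sentence: str, alpha: float = 0.5) -> float:
    """Higher is better: predictive of the next sentence, penalised for retained length."""
    relevance = lm_logprob(next_sentence, context=compression)   # keep predictive information
    cost = alpha * len(compression.split())                      # proxy for information retained
    return relevance - cost

# Source sentence to be compressed; candidates are deletion-based compressions of it.
sentence = "the committee approved the new budget after a long debate on spending"
next_sentence = "the budget raises spending on schools"
candidates = [
    "the committee approved the new budget",
    "a long debate on spending",
    "the committee approved the budget after debate on spending",
]
print(max(candidates, key=lambda c: ib_score(c, next_sentence)))
```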
Chorale Harmonization with Weighted Finite-state Transducers
We approach the task of harmonizing chorales through style imitation by probabilistically modelling the harmony of music pieces in the framework of weighted finite-state transducers (WFSTs), which …
Non-Negative Matrix Factorization for Learning Alignment-Specific Models of Protein Evolution
TLDR
A method for estimating alignment-specific models of protein evolution in which the complexity of the model is adapted to suit the richness of the data; the basis matrices obtained confirm the expectation that amino acid properties tend to be conserved, and allow us to quantify, on specific alignments, how the strength of conservation varies across different properties.
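A generic non-negative matrix factorization sketch with scikit-learn, not the paper's estimation procedure: a non-negative 20x20 amino-acid substitution count matrix is factorized into a small number of basis profiles whose non-negative mixture can then be fit per alignment. The synthetic count matrix and the choice of four components are assumptions for illustration.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
counts = rng.poisson(lam=5.0, size=(20, 20)).astype(float)   # synthetic 20x20 substitution counts

model = NMF(n_components=4, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(counts)   # (20, 4) non-negative mixing weights
H = model.components_             # (4, 20) non-negative basis profiles

# counts is approximated by W @ H; each row of H is one basis profile.
print("reconstruction error:", round(model.reconstruction_err_, 3))
```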
Bridging HMMs and RNNs through Architectural Transformations
TLDR
This paper investigates a series of architectural transformations between HMMs and RNNs, both through theoretical derivations and empirical hybridization, and presents a comprehensive empirical study to provide insights into the interplay between expressivity and interpretability in this model family with respect to language modeling and parts-of-speech induction.
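A minimal sketch of the structural correspondence the paper builds on: the HMM forward recursion can be written as a recurrent state update, one step per symbol, much like an RNN cell. The transition and emission matrices below are random placeholders rather than parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
K, V = 3, 5                                    # hidden states, vocabulary size
A = rng.dirichlet(np.ones(K), size=K)          # A[i, j] = p(z_t = j | z_{t-1} = i)
B = rng.dirichlet(np.ones(V), size=K)          # B[i, v] = p(x_t = v | z_t = i)
pi = rng.dirichlet(np.ones(K))                 # state distribution before the first transition

def forward_cell(alpha_prev: np.ndarray, x_t: int):
    """One recurrent step: updated belief over states and the log-likelihood increment."""
    unnorm = B[:, x_t] * (A.T @ alpha_prev)    # emission gate * linear map of previous state
    norm = unnorm.sum()
    return unnorm / norm, np.log(norm)

sequence = [0, 3, 3, 1]
alpha, loglik = pi, 0.0
for x in sequence:
    alpha, step_ll = forward_cell(alpha, x)
    loglik += step_ll
print("log p(sequence) =", round(loglik, 4))
```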
A Tree Transducer Model for Grammatical Error Correction
TLDR
This work presents an approach to grammatical error correction for the CoNLL 2013 shared task based on a weighted tree-to-string transducer, which ranked 6th among the participating teams on both the original and revised test set annotations.