Publications
Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond
TLDR
This work proposes several novel models that address critical problems in summarization not adequately captured by the basic architecture, such as modeling keywords, capturing the hierarchy of sentence-to-word structure, and emitting words that are rare or unseen at training time.
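The "hierarchy of sentence-to-word structure" refers to hierarchical attention: word-level attention weights are rescaled by the attention paid to the sentence containing each word. A minimal numpy sketch of that re-weighting, with illustrative function and variable names rather than the paper's code:

```python
import numpy as np

def hierarchical_attention(word_scores, sent_scores, word_to_sent):
    """Re-weight word-level attention by sentence-level attention
    (illustrative sketch of hierarchical sentence-to-word attention)."""
    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    a_word = softmax(word_scores)             # attention over all words
    a_sent = softmax(sent_scores)             # attention over sentences
    combined = a_word * a_sent[word_to_sent]  # scale each word by its sentence
    return combined / combined.sum()          # renormalize to a distribution

# Example: 5 words across 2 sentences
attn = hierarchical_attention(
    np.array([0.2, 1.0, 0.5, 0.1, 0.3]),   # word logits
    np.array([2.0, 0.5]),                  # sentence logits
    np.array([0, 0, 0, 1, 1]),             # sentence index of each word
)
```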
Labeled LDA: A supervised topic model for credit attribution in multi-labeled corpora
TLDR
Labeled LDA is introduced, a topic model that constrains Latent Dirichlet Allocation by defining a one-to-one correspondence between LDA's latent topics and user tags, allowing it to directly learn word-tag correspondences.
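The label constraint has a simple concrete form: during inference, a word's topic may only be drawn from the topics corresponding to the document's observed labels. A sketch of one collapsed-Gibbs draw under that constraint; the count arrays and hyperparameters follow common LDA conventions and are not code from the paper:

```python
import numpy as np

def sample_topic(word_id, doc_labels, n_dk, n_kw, n_k, alpha, beta, V):
    """One collapsed-Gibbs topic draw restricted to the document's label
    set (the core Labeled LDA constraint). Counts are assumed to already
    exclude the current token; V is the vocabulary size."""
    probs = np.array([
        (n_dk[k] + alpha) * (n_kw[k, word_id] + beta) / (n_k[k] + V * beta)
        for k in doc_labels  # topics outside the label set get zero mass
    ])
    probs /= probs.sum()
    return doc_labels[np.random.choice(len(doc_labels), p=probs)]
```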
SummaRuNNer: A Recurrent Neural Network Based Sequence Model for Extractive Summarization of Documents
We present SummaRuNNer, a Recurrent Neural Network (RNN) based sequence model for extractive summarization of documents, and show that it achieves performance better than or comparable to the state of the art.
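SummaRuNNer scores each sentence with a logistic layer that combines content, salience against a document representation, (negative) novelty against the summary built so far, and position terms. A small numpy sketch in that spirit, with illustrative weight names and shapes:

```python
import numpy as np

def sentence_extract_prob(h_j, d, s_j, p_j, W_c, W_s, W_r, w_p, b):
    """Probability that sentence j joins the extractive summary, in the
    spirit of SummaRuNNer's scoring (names and shapes are illustrative).
    h_j: sentence vector, d: document vector, s_j: running summary
    vector so far, p_j: position embedding."""
    content  = W_c @ h_j                    # how informative the sentence is
    salience = h_j @ W_s @ d                # agreement with the document
    novelty  = -(h_j @ W_r @ np.tanh(s_j))  # penalize redundancy vs. summary
    position = w_p @ p_j                    # positional importance
    score = content + salience + novelty + position + b
    return 1.0 / (1.0 + np.exp(-score))
```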
Multi-instance Multi-label Learning for Relation Extraction
TLDR
This work proposes a novel approach to multi-instance multi-label learning for RE, which jointly models all the instances of a pair of entities in text and all their labels using a graphical model with latent variables, and performs competitively on two difficult domains.
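The multi-instance side rests on an at-least-one assumption: an entity pair holds a relation if at least one of its mentions expresses it. A toy sketch of that aggregation step, with hypothetical label names; the paper learns this jointly in the graphical model rather than as a post-hoc step:

```python
def pair_relations(mention_predictions, valid_relations):
    """At-least-one aggregation: an entity pair is assigned every relation
    expressed by at least one of its mentions (toy MIML-style sketch)."""
    expressed = set(mention_predictions) - {"NONE"}
    return expressed & set(valid_relations)

# Three mentions of the same entity pair, with hypothetical labels
rels = pair_relations(["NONE", "born_in", "employee_of"],
                      ["born_in", "employee_of", "lives_in"])
# rels == {"born_in", "employee_of"}
```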
Pointing the Unknown Words
TLDR
A novel way to deal with rare and unseen words in neural network models is proposed using attention: two softmax layers are used to predict the next word in conditional language models.
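The two softmax layers are a shortlist softmax over the vocabulary and a location softmax over source positions (copying via attention), mixed by a switching probability. A minimal numpy sketch of that mixture, with illustrative argument names:

```python
import numpy as np

def pointer_softmax(vocab_logits, attn_weights, src_token_ids,
                    switch_prob, vocab_size):
    """Mix a shortlist softmax with a location (copy) softmax, sketching
    the two-softmax pointer idea (argument names are illustrative).
    vocab_logits: (V,) scores over the shortlist vocabulary
    attn_weights: (T,) attention over source positions, sums to 1
    src_token_ids: (T,) vocabulary id of each source token
    switch_prob: scalar in [0, 1], generate vs. copy."""
    e = np.exp(vocab_logits - vocab_logits.max())
    p_vocab = e / e.sum()

    p_copy = np.zeros(vocab_size)
    np.add.at(p_copy, src_token_ids, attn_weights)  # scatter attention mass

    return switch_prob * p_vocab + (1.0 - switch_prob) * p_copy
```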
OCGAN: One-Class Novelty Detection Using GANs With Constrained Latent Representations
TLDR
A novel model called OCGAN is presented for the classical problem of one-class novelty detection, where, given a set of examples from a particular class, the goal is to determine whether a query example is from the same class; the in-class distribution is learned with a de-noising auto-encoder network.
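One key ingredient is that the de-noising auto-encoder's latent code is bounded (e.g., by tanh), so the entire latent box can be adversarially constrained to decode to in-class examples. A toy numpy sketch of that bounded encoder plus a reconstruction-error novelty score, with untrained, illustrative weights:

```python
import numpy as np

def encode(x, W_enc, noise_std=0.1, seed=0):
    """De-noising encoder with a bounded latent code (sketch): tanh keeps
    z inside [-1, 1]^d, the constraint that lets the whole latent box be
    adversarially pushed to decode to in-class images."""
    rng = np.random.default_rng(seed)
    x_noisy = x + rng.normal(0.0, noise_std, size=x.shape)  # corrupt input
    return np.tanh(W_enc @ x_noisy)

def novelty_score(x, W_enc, W_dec):
    """Reconstruction error as a one-class novelty score: queries far from
    the learned class reconstruct poorly (illustrative, untrained weights)."""
    z = encode(x, W_enc)
    x_hat = W_dec @ z
    return float(np.sum((x - x_hat) ** 2))
```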
Joint latent topic models for text and citations
TLDR
This work addresses the problem of jointly modeling text and citations in the topic modeling framework with two models, Pairwise-Link-LDA and Link-PLSA-LDA, which combine the LDA and PLSA models into a single graphical model.
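As a rough picture of how text and citations share one model: a citing document's topic mixture emits both words and citations, each from a topic-specific distribution. A hedged LaTeX sketch of that generative step; the symbols, in particular \Omega_k for topic-specific citation distributions, are illustrative rather than the paper's exact notation:

```latex
% Generative sketch, Link-PLSA-LDA flavor: the citing document's topic
% mixture \theta_d emits both words w_n and citations c_m.
\theta_d \sim \mathrm{Dirichlet}(\alpha)
\qquad z_n \sim \mathrm{Mult}(\theta_d), \quad w_n \sim \mathrm{Mult}(\beta_{z_n})
\qquad z'_m \sim \mathrm{Mult}(\theta_d), \quad c_m \sim \mathrm{Mult}(\Omega_{z'_m})
```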
Discriminative models for information retrieval
TLDR
It is argued that the main reason to prefer SVMs over language models is their ability to learn from arbitrary features automatically, as demonstrated by experiments on the home-page finding task of TREC-10.
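The practical point is that an SVM can rank documents from any hand-crafted features, whereas a language model is tied to term statistics. A toy scikit-learn sketch with hypothetical query-document features and data:

```python
import numpy as np
from sklearn.svm import LinearSVC

# Hypothetical query-document features for home-page finding,
# e.g. (title overlap, BM25-style score, URL depth); data is made up.
X = np.array([[0.8, 12.3, 1],
              [0.1,  2.0, 4],
              [0.7, 10.1, 1],
              [0.2,  3.5, 5]])
y = np.array([1, 0, 1, 0])  # 1 = relevant home page

clf = LinearSVC(C=1.0).fit(X, y)
scores = clf.decision_function(X)  # rank candidates by SVM margin
```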
Event threading within news topics
TLDR
This work attempts to capture the rich structure of events and their dependencies in a news topic through the authors' event models, and takes into account novel features such as temporal locality of stories for event recognition and time-ordering for capturing dependencies.
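Temporal locality can be made concrete as a decay factor on story similarity: stories published far apart in time are less likely to belong to the same event. An illustrative sketch; the exponential half-life form and the numbers are assumptions, not the paper's model:

```python
def story_affinity(content_sim, day1, day2, half_life_days=7.0):
    """Blend content similarity with temporal locality: similarity halves
    for every `half_life_days` separating two stories (toy sketch)."""
    decay = 0.5 ** (abs(day1 - day2) / half_life_days)
    return content_sim * decay

# Stories 14 days apart keep a quarter of their content similarity
print(story_affinity(0.8, day1=3, day2=17))  # 0.2
```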
Sequence-to-Sequence RNNs for Text Summarization
TLDR
This work casts text summarization as a sequence-to-sequence problem and applies the attentional encoder-decoder RNN that has been shown to be successful for Machine Translation, significantly outperforming the state-of-the-art model of Rush et al. (2015).
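At the heart of an attentional encoder-decoder is one attention step per output word: score the encoder states against the decoder state, softmax, and form a context vector. A simplified dot-product sketch; real models typically learn an alignment MLP or bilinear scorer instead:

```python
import numpy as np

def attention_step(dec_state, enc_states):
    """One decoding step of dot-product attention over encoder states,
    the core of an attentional encoder-decoder (simplified sketch)."""
    scores = enc_states @ dec_state          # (T,) alignment scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                 # softmax over source positions
    return weights @ enc_states, weights     # context vector + attention
```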