Publications
Abstractive Text Summarization using Sequence-to-sequence RNNs and Beyond
TLDR: In this work, we model abstractive text summarization using Attentional Encoder-Decoder Recurrent Neural Networks, and show that they achieve state-of-the-art performance on two different corpora.
1,138 citations (202 highly influential)
Labeled LDA: A supervised topic model for credit attribution in multi-labeled corpora
TLDR: This paper introduces Labeled LDA, a topic model that constrains Latent Dirichlet Allocation by defining a one-to-one correspondence between LDA's latent topics and user tags.
1,208 citations (201 highly influential)
Multi-instance Multi-label Learning for Relation Extraction
TLDR: We propose a novel approach to multi-instance multi-label learning for relation extraction, which jointly models all the instances of a pair of entities in text and all their labels using a graphical model with latent variables.
597 citations (97 highly influential)
SummaRuNNer: A Recurrent Neural Network Based Sequence Model for Extractive Summarization of Documents
TLDR: We present SummaRuNNer, a Recurrent Neural Network based sequence model for extractive summarization of documents, and show that it achieves performance better than or comparable to the state of the art.
560 citations (97 highly influential)
Pointing the Unknown Words
TLDR: The problem of rare and unknown words is an important issue that can potentially influence the performance of many NLP systems, including both traditional count-based and deep learning models.
382 citations (46 highly influential)
Joint latent topic models for text and citations
TLDR: In this work, we address the problem of jointly modeling text and citations in the topic modeling framework.
408 citations (32 highly influential)
OCGAN: One-Class Novelty Detection Using GANs With Constrained Latent Representations
TLDR: We present a novel model called OCGAN for the classical problem of one-class novelty detection, where, given a set of examples from a particular class, the goal is to determine whether a query example is from the same class.
107 citations (27 highly influential)
Discriminative models for information retrieval
TLDR: Discriminative models have recently been preferred over generative models in many machine learning problems, owing to some of their attractive theoretical properties.
305 citations (20 highly influential)
Event threading within news topics
TLDR: In this work, we attempt to capture the rich structure of events and their dependencies in a news topic through our event models.
239 citations (19 highly influential)
Sequence-to-Sequence RNNs for Text Summarization
TLDR: In this work, we cast text summarization as a sequence-to-sequence problem and apply the attentional encoder-decoder RNN that has been shown to be successful for machine translation (Bahdanau et al., 2014).
105 citations (14 highly influential)