A large annotated corpus for learning natural language inference
TLDR: We introduce the Stanford Natural Language Inference corpus, a new, freely available collection of labeled sentence pairs, written by humans in a grounded, naturalistic context.
  • 1,745 citations (474 highly influential)
Position-aware Attention and Supervised Data Improve Slot Filling
TLDR: We combine an LSTM sequence model with a form of entity position-aware attention that is better suited to relation extraction.
  • 235 citations (61 highly influential)
Leveraging Linguistic Structure For Open Domain Information Extraction
TLDR: We replace a large set of patterns with a few patterns for canonically structured sentences, and shift the focus to a classifier which learns to extract self-contained clauses from longer sentences.
  • 403 citations (56 highly influential)
A Simple Domain-Independent Probabilistic Approach to Generation
TLDR: We present a simple, robust generation system which performs content selection and surface realization in a unified, domain-independent framework, obtaining results comparable to state-of-the-art domain-specific systems.
  • 175 citations (25 highly influential)
Combining Distant and Partial Supervision for Relation Extraction
TLDR: We present an approach for providing partial supervision to a distantly supervised relation extractor using a small number of carefully selected examples, chosen with criteria inspired by active learning.
  • 140 citations (17 highly influential)
Bootstrapped Self Training for Knowledge Base Population
TLDR: We propose bootstrapped self-training to capture the benefits of both systems: the precision of patterns and the generalizability of trained models.
  • 29 citations (8 highly influential)
Evaluating Word Embeddings Using a Representative Suite of Practical Tasks
TLDR: We propose evaluating word embeddings in vivo, measuring their performance on a suite of popular downstream tasks.
  • 47 citations (7 highly influential)
Stanford's 2014 Slot Filling Systems
We describe Stanford’s entry in the TAC-KBP 2014 Slot Filling challenge. We submitted two broad approaches to Slot Filling: one based on the DeepDive framework (Niu et al., 2012), and another based on …
  • 42 citations (6 highly influential)
Combining Natural Logic and Shallow Reasoning for Question Answering
TLDR: We extend the breadth of inferences afforded by natural logic to include relational entailment (e.g., buy → own) and meronymy in a unified framework based on natural logic.
  • 18 citations (5 highly influential)
Robust Subgraph Generation Improves Abstract Meaning Representation Parsing
TLDR: The Abstract Meaning Representation (AMR) is a representation for open-domain rich semantics, with potential use in fields like event extraction and machine translation.
  • 46 citations (4 highly influential)