Enhanced LSTM for Natural Language Inference

@inproceedings{Chen2017EnhancedLF,
  title={Enhanced LSTM for Natural Language Inference},
  author={Qian Chen and Xiao-Dan Zhu and Zhenhua Ling and Si Wei and Hui Jiang and Diana Inkpen},
  booktitle={ACL},
  year={2017}
}
Reasoning and inference are central to human and artificial intelligence. [...] Particularly, incorporating syntactic parsing information contributes to our best result: it further improves the performance even when added to the already very strong model.
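
The enhancement step the abstract refers to can be made concrete. Below is a minimal sketch (ours, not the authors' code) of ESIM-style soft alignment followed by local inference enhancement, assuming the premise and hypothesis have already been encoded by a BiLSTM; the PyTorch framing and tensor shapes are illustrative.

import torch

def enhance(a, b):
    # a: (batch, len_a, dim) premise encodings; b: (batch, len_b, dim) hypothesis encodings.
    e = torch.bmm(a, b.transpose(1, 2))                    # alignment scores (batch, len_a, len_b)
    a_tilde = torch.bmm(torch.softmax(e, dim=2), b)        # hypothesis content aligned to each premise token
    b_tilde = torch.bmm(torch.softmax(e, dim=1).transpose(1, 2), a)  # and vice versa
    # Enhance local inference information with difference and element-wise product.
    m_a = torch.cat([a, a_tilde, a - a_tilde, a * a_tilde], dim=-1)
    m_b = torch.cat([b, b_tilde, b - b_tilde, b * b_tilde], dim=-1)
    return m_a, m_b   # in the full model these feed a composition BiLSTM
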
Neural Natural Language Inference Models Enhanced with External Knowledge
TLDR
This paper enriches state-of-the-art neural natural language inference models with external knowledge and demonstrates that the proposed models achieve state-of-the-art performance on the SNLI and MultiNLI datasets.
NeuralLog: Natural Language Inference with Joint Neural and Logical Reasoning
TLDR
This work proposes an inference framework called NeuralLog, which utilizes both a monotonicity-based logical inference engine and a neural network language model for phrase alignment, and shows that the joint logic and neural inference system improves accuracy on the NLI task and can achieve state-of-the-art accuracy on the SICK and MED datasets.
Knowledge Augmented Inference Network for Natural Language Inference
TLDR
A Knowledge Augmented Inference Network (K-AIN) that can effectively incorporate external knowledge into existing neural network models for the natural language inference task; it achieves better performance than a strong baseline on the SNLI dataset and surpasses the current state-of-the-art models on the SciTail dataset.
Knowledge Adaptive Neural Network for Natural Language Inference
TLDR
This paper proposes a knowledge adaptive neural network (KANN) that adaptively incorporates commonsense knowledge at the sentence encoding and inference stages, and is comparable to, if not better than, recent neural-network-based approaches to NLI.
Unsupervised Pre-training with Structured Knowledge for Improving Natural Language Inference
  • Xiaoyu Yang, Xiaodan Zhu, Zhan Shi, Tianda Li
  • Computer Science
  • ArXiv
  • 2021
TLDR
This paper proposes models that leverage structured knowledge in different components of pre-trained models for NLI and shows that the proposed models perform better than previous BERT-based state-of-the-art models.
DR-BiLSTM: Dependent Reading Bidirectional LSTM for Natural Language Inference
TLDR
A novel dependent reading bidirectional LSTM network (DR-BiLSTM) is proposed to efficiently model the relationship between a premise and a hypothesis during encoding and inference in the natural language inference (NLI) task.
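
A hedged sketch of the dependent-reading idea, simplifying away the paper's pooling and symmetric two-direction reading: one sentence is encoded starting from the final LSTM state produced by reading the other. The shared encoder and sizes are our assumptions.

import torch.nn as nn

class DependentReader(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.encoder = nn.LSTM(dim, dim, batch_first=True, bidirectional=True)

    def forward(self, premise, hypothesis):
        # Independent pass: read the premise on its own, keep its final (h, c) state.
        _, state = self.encoder(premise)
        # Dependent pass: read the hypothesis initialized from the premise's state.
        encoded, _ = self.encoder(hypothesis, state)
        return encoded   # (batch, len_h, 2*dim) hypothesis states conditioned on the premise
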
Natural Language Inference with Hierarchical BiLSTM Max Pooling Architecture
TLDR
This model beats the InferSent model in 8 out of 10 recently published SentEval probing tasks designed to evaluate sentence embeddings' ability to capture some of the important linguistic properties of sentences.
SDF-NN: A Deep Neural Network with Semantic Dropping and Fusion for Natural Language Inference
  • L. Tan, C. Wang, +4 authors Z. Huang
  • Computer Science
  • 2017 IEEE 29th International Conference on Tools with Artificial Intelligence (ICTAI)
  • 2017
TLDR
SDF-NN is designed as a new NLI model with two novel components: a Semantic Dropping Network (SDN) to automatically discard some of the interfering semantics, and a Semantic Fusion Alignment (SFA) method to effectively fuse all local inference results.
Sentence embeddings in NLI with iterative refinement encoders
TLDR
This work proposes a hierarchy of bidirectional LSTM and max pooling layers that implements an iterative refinement strategy and yields state-of-the-art results on the SciTail dataset as well as strong results on the Stanford Natural Language Inference and Multi-Genre Natural Language Inference datasets.
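
A rough sketch of such a hierarchy of BiLSTM and max-pooling layers: each level's outputs are max-pooled over time and the pooled vectors are concatenated into the sentence embedding. The exact wiring between levels in the paper (state passing versus the output chaining used here) differs, and all sizes are illustrative.

import torch
import torch.nn as nn

class HierBiLSTMMaxPool(nn.Module):
    def __init__(self, emb_dim, hidden, levels=3):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.LSTM(emb_dim if i == 0 else 2 * hidden, hidden,
                     batch_first=True, bidirectional=True) for i in range(levels)])

    def forward(self, x):                        # x: (batch, seq_len, emb_dim)
        pooled, h = [], x
        for lstm in self.layers:
            h, _ = lstm(h)                       # (batch, seq_len, 2*hidden)
            pooled.append(h.max(dim=1).values)   # max over time at this level
        return torch.cat(pooled, dim=-1)         # concatenation of all levels
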
SACNN: Self-attentive Convolutional Neural Network Model for Natural Language Inference
TLDR
A general Self-Attentive Convolutional Neural Network (SACNN) is presented for natural language inference and sentence pair modeling tasks; the proposed model uses CNNs to integrate mutual interactions between sentences, and each sentence, together with its counterpart, is taken into consideration for the formulation of its representation.

References

Showing 1-10 of 40 references
Learning Natural Language Inference with LSTM
TLDR
A special long short-term memory (LSTM) architecture for NLI that remembers important mismatches that are critical for predicting the contradiction or the neutral relationship label and achieves an accuracy of 86.1%, outperforming the state of the art.
Natural language inference
TLDR
This dissertation explores a range of approaches to NLI, beginning with methods which are robust but approximate, and proceeding to progressively more precise approaches, and greatly extends past work in natural logic to incorporate both semantic exclusion and implicativity.
A Neural Architecture Mimicking Humans End-to-End for Natural Language Inference
TLDR
This work uses the recent advances in representation learning to propose a neural architecture for the problem of natural language inference that achieves better accuracy than all published models in the literature.
A large annotated corpus for learning natural language inference
TLDR
The Stanford Natural Language Inference corpus is introduced, a new, freely available collection of labeled sentence pairs, written by humans doing a novel grounded task based on image captioning, which allows a neural network-based model to perform competitively on natural language inference benchmarks for the first time.
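
For orientation, SNLI is distributed as JSON-lines files whose records carry the premise, hypothesis, and gold label; a minimal reader (file name illustrative) might look like this:

import json

def load_snli(path):
    # Yield (premise, hypothesis, label) triples, skipping pairs where
    # annotators reached no consensus (gold label recorded as "-").
    with open(path, encoding="utf-8") as f:
        for line in f:
            ex = json.loads(line)
            if ex["gold_label"] == "-":
                continue
            yield ex["sentence1"], ex["sentence2"], ex["gold_label"]

# Example: pairs = list(load_snli("snli_1.0_train.jsonl"))
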
A Decomposable Attention Model for Natural Language Inference
We propose a simple neural architecture for natural language inference. Our approach uses attention to decompose the problem into subproblems that can be solved separately, thus making it trivially parallelizable.
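
The attend-compare-aggregate decomposition can be sketched compactly. The feed-forward networks below (attend, compare, aggregate) are stand-ins for the paper's F, G, and H MLPs, and all sizes are illustrative.

import torch
import torch.nn as nn

class DecomposableAttention(nn.Module):
    def __init__(self, dim, n_classes=3):
        super().__init__()
        def mlp(n_in, n_out):
            return nn.Sequential(nn.Linear(n_in, n_out), nn.ReLU(),
                                 nn.Linear(n_out, n_out), nn.ReLU())
        self.attend = mlp(dim, dim)        # F: projects tokens before alignment
        self.compare = mlp(2 * dim, dim)   # G: compares a token with its alignment
        self.aggregate = nn.Linear(2 * dim, n_classes)  # H: final classifier

    def forward(self, a, b):               # a: (batch, len_a, dim); b: (batch, len_b, dim)
        e = torch.bmm(self.attend(a), self.attend(b).transpose(1, 2))
        beta = torch.bmm(torch.softmax(e, dim=2), b)                    # b aligned to a
        alpha = torch.bmm(torch.softmax(e, dim=1).transpose(1, 2), a)  # a aligned to b
        v1 = self.compare(torch.cat([a, beta], dim=-1)).sum(dim=1)     # compare, sum over a
        v2 = self.compare(torch.cat([b, alpha], dim=-1)).sum(dim=1)    # compare, sum over b
        return self.aggregate(torch.cat([v1, v2], dim=-1))             # 3-way NLI logits
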
Learning Natural Language Inference using Bidirectional LSTM model and Inner-Attention
TLDR
A sentence encoding-based model for recognizing text entailment that uses the sentence's first-stage representation to attend over the words appearing in the sentence itself, which is called "Inner-Attention" in this paper.
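
A hedged sketch of the inner-attention idea, where a first-stage summary (here, mean-pooled BiLSTM states) attends over the sentence's own hidden states to produce the final embedding; the scoring form and sizes are our assumptions.

import torch
import torch.nn as nn

class InnerAttentionEncoder(nn.Module):
    def __init__(self, emb_dim, hidden):
        super().__init__()
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.score = nn.Linear(4 * hidden, 1)    # scores each (state, summary) pair

    def forward(self, x):                        # x: (batch, seq_len, emb_dim)
        h, _ = self.lstm(x)                      # (batch, seq_len, 2*hidden)
        summary = h.mean(dim=1, keepdim=True)    # first-stage representation
        pair = torch.cat([h, summary.expand_as(h)], dim=-1)
        w = torch.softmax(self.score(pair), dim=1)   # attention over the sentence itself
        return (w * h).sum(dim=1)                # attention-weighted sentence embedding
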
Modeling Semantic Containment and Exclusion in Natural Language Inference
TLDR
This work proposes an approach to natural language inference based on a model of natural logic, which identifies valid inferences by their lexical and syntactic features, without full semantic interpretation, and extends natural logic to incorporate both semantic exclusion and implicativity.
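
As an illustration of containment-based inference (example ours, not drawn from this work), monotone contexts turn a single lexical containment into sentence-level entailments:

\textit{dog} \sqsubseteq \textit{animal} \;\Rightarrow\;
\begin{cases}
\textit{Some dog barks} \models \textit{Some animal barks} & \text{(upward-monotone context)}\\
\textit{No animal barks} \models \textit{No dog barks} & \text{(downward-monotone context)}
\end{cases}
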
Reasoning about Entailment with Neural Attention
TLDR
This paper proposes a neural model that reads two sentences to determine entailment using long short-term memory units and extends this model with a word-by-word neural attention mechanism that encourages reasoning over entailments of pairs of words and phrases, and presents a qualitative analysis of attention weights produced by this model.
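
A hedged sketch of word-by-word attention: while reading the hypothesis, each step attends over all premise states given the current hypothesis state. The additive scoring form and sizes are our assumptions.

import torch
import torch.nn as nn

class WordByWordAttention(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.w_y = nn.Linear(dim, dim, bias=False)  # projects premise states
        self.w_h = nn.Linear(dim, dim, bias=False)  # projects current hypothesis state
        self.v = nn.Linear(dim, 1, bias=False)      # scores each premise token

    def forward(self, premise_states, hyp_state):
        # premise_states: (batch, len_p, dim); hyp_state: (batch, dim)
        scores = self.v(torch.tanh(self.w_y(premise_states)
                                   + self.w_h(hyp_state).unsqueeze(1)))
        alpha = torch.softmax(scores, dim=1)        # (batch, len_p, 1)
        return (alpha * premise_states).sum(dim=1)  # attended premise vector
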
Neural Semantic Encoders
TLDR
This paper demonstrated the effectiveness and the flexibility of NSE on five different natural language tasks: natural language inference, question answering, sentence classification, document sentiment analysis and machine translation, where NSE achieved state-of-the-art performance when evaluated on publicly available benchmarks.
A Fast Unified Model for Parsing and Sentence Understanding
TLDR
The Stack-augmented Parser-Interpreter Neural Network (SPINN) combines parsing and interpretation within a single tree-sequence hybrid model by integrating tree-structured sentence interpretation into the linear sequential structure of a shift-reduce parser.
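
The shift-reduce core can be sketched as follows; compose() stands in for the paper's learned composition function, and the string-encoded transitions are our simplification.

import torch
import torch.nn as nn

class ShiftReduceEncoder(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.compose = nn.Sequential(nn.Linear(2 * dim, dim), nn.Tanh())

    def forward(self, tokens, transitions):
        # tokens: list of (dim,) embeddings in sentence order;
        # transitions: "shift"/"reduce" moves derived from a binary parse.
        stack, buffer = [], list(tokens)
        for move in transitions:
            if move == "shift":
                stack.append(buffer.pop(0))   # push the next word onto the stack
            else:  # "reduce": combine the two topmost phrases into one
                right, left = stack.pop(), stack.pop()
                stack.append(self.compose(torch.cat([left, right])))
        return stack.pop()                    # root node = sentence encoding
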