Recurrent Neural Network-Based Sentence Encoder with Gated Attention for Natural Language Inference

@inproceedings{Chen2017RecurrentNN,
  title={Recurrent Neural Network-Based Sentence Encoder with Gated Attention for Natural Language Inference},
  author={Qian Chen and Xiao-Dan Zhu and Zhen-Hua Ling and Si Wei and Hui Jiang and Diana Inkpen},
  booktitle={RepEval@EMNLP},
  year={2017}
}
The RepEval 2017 Shared Task aims to evaluate natural language understanding models for sentence representation, in which a sentence is represented as a fixed-length vector with neural networks and the quality of the representation is tested with a natural language inference task. This paper describes our system (alpha) that is ranked among the top in the Shared Task, on both the in-domain test set (obtaining a 74.9% accuracy) and on the cross-domain test set (also attaining a 74.9% accuracy…
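The abstract describes encoding a variable-length sentence into a single fixed-length vector, with a gating mechanism modulating the attention over hidden states. The snippet below is only an illustrative sketch of that general idea, not the paper's actual architecture: it applies a per-position sigmoid gate (with made-up parameters `W_g`, `b_g`) to toy "BiLSTM-like" hidden states before average pooling.

```python
import numpy as np

rng = np.random.default_rng(0)

def gated_pool(H, W_g, b_g):
    """Collapse a (T, d) sequence of hidden states H into one
    fixed-length d-dimensional sentence vector.
    Illustrative only: a per-position sigmoid gate re-weights
    each hidden state before mean pooling."""
    gates = 1.0 / (1.0 + np.exp(-(H @ W_g + b_g)))  # (T, d), each in (0, 1)
    return (gates * H).mean(axis=0)                  # (d,)

d = 8                                   # toy hidden size
H = rng.standard_normal((5, d))         # stand-in for encoder outputs, 5 tokens
W_g = rng.standard_normal((d, d))       # hypothetical gate parameters
b_g = np.zeros(d)

v = gated_pool(H, W_g, b_g)
print(v.shape)  # (8,) -- fixed length regardless of sentence length
```

Note that the output dimension depends only on `d`, not on the number of tokens, which is what makes the representation usable as a fixed-length sentence vector for a downstream inference classifier.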
