Deep Unordered Composition Rivals Syntactic Methods for Text Classification

Abstract

Many existing deep learning models for natural language processing tasks focus on learning the compositionality of their inputs, which requires many expensive computations. We present a simple deep neural network that competes with and, in some cases, outperforms such models on sentiment analysis and factoid question answering tasks while taking only a fraction of the training time. While our model is syntactically-ignorant, we show significant improvements over previous bag-of-words models by deepening our network and applying a novel variant of dropout. Moreover, our model performs better than syntactic models on datasets with high syntactic variance. We show that our model makes similar errors to syntactically-aware models, indicating that for the tasks we consider, nonlinearly transforming the input is more important than tailoring a network to incorporate word order and syntax.
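The unordered composition the abstract describes can be sketched as a deep averaging network: average the word embeddings of a document (ignoring order and syntax), then pass the average through stacked nonlinear layers. The sketch below is a minimal illustration under assumed details — the layer sizes, the tanh nonlinearity, and the word-dropout rate are illustrative choices, not the paper's exact configuration; the dropout variant shown drops whole tokens before averaging.

```python
import numpy as np

rng = np.random.default_rng(0)

def dan_forward(embeddings, weights, biases, word_dropout=0.3, train=True):
    """Forward pass for one document.

    embeddings: (num_tokens, dim) array of word vectors.
    weights, biases: parameters of the stacked hidden layers.
    """
    if train:
        # Word dropout: randomly drop entire tokens before composing.
        keep = rng.random(len(embeddings)) >= word_dropout
        if keep.any():  # avoid averaging an empty set
            embeddings = embeddings[keep]
    h = embeddings.mean(axis=0)          # unordered composition: average
    for W, b in zip(weights, biases):    # deepen: nonlinear transformations
        h = np.tanh(W @ h + b)
    return h                             # would feed a softmax classifier

# Usage: 5 tokens with 50-dim embeddings, two 50->50 hidden layers.
emb = rng.standard_normal((5, 50))
ws = [rng.standard_normal((50, 50)) * 0.1 for _ in range(2)]
bs = [np.zeros(50) for _ in range(2)]
out = dan_forward(emb, ws, bs, train=False)
```

Because the composition step is a single average, the cost per document is essentially that of the feedforward layers, which is the source of the training-time advantage over tree-structured syntactic models.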


Citations

Semantic Scholar estimates that this publication has 125 citations based on the available data.

Cite this paper

@inproceedings{Iyyer2015DeepUC,
  title     = {Deep Unordered Composition Rivals Syntactic Methods for Text Classification},
  author    = {Mohit Iyyer and Varun Manjunatha and Jordan L. Boyd-Graber and Hal Daum{\'e}},
  booktitle = {ACL},
  year      = {2015}
}