Dependency-based Convolutional Neural Networks for Sentence Embedding

@inproceedings{Ma2015DependencybasedCN,
  title={Dependency-based Convolutional Neural Networks for Sentence Embedding},
  author={M. Ma and Liang Huang and Bowen Zhou and Bing Xiang},
  booktitle={ACL},
  year={2015}
}
In sentence modeling and classification, convolutional neural network approaches have recently achieved state-of-the-art results, but all such efforts process word vectors sequentially and neglect long-distance dependencies. To exploit both deep learning and linguistic structures, we propose a tree-based convolutional neural network model which exploits various long-distance relationships between words. Our model improves the sequential baselines on all three sentiment and question …
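The core idea can be sketched roughly as follows (a minimal illustration, not the authors' implementation): instead of convolving over windows of *adjacent* words, each convolution window spans a word and its ancestors along the dependency parse tree, so long-distance syntactic relations fall inside a single window. The toy sentence, parse, and filter values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sentence of 5 words with d-dimensional embeddings (made-up values).
d, n = 4, 5
words = rng.normal(size=(n, d))

# head[i] = index of word i's parent in the dependency tree (-1 for root).
# This parse is invented purely for illustration.
head = [2, 2, -1, 2, 3]

def ancestor(i, k):
    """Return the k-th ancestor of word i, stopping at the root."""
    for _ in range(k):
        if head[i] == -1:
            break
        i = head[i]
    return i

# A dependency convolution window: each word concatenated with its
# parent and grandparent (window size 3 along the ancestor path).
window, m = 3, 8
W = rng.normal(size=(window * d, m))  # one filter bank, m feature maps

feats = []
for i in range(n):
    path = [words[ancestor(i, k)] for k in range(window)]
    feats.append(np.tanh(np.concatenate(path) @ W))
feats = np.stack(feats)             # (n, m)

# Max pooling over all tree positions yields a fixed-size embedding.
sentence_vec = feats.max(axis=0)    # (m,)
```

Because every word contributes one window rooted at itself, the pooled vector has a fixed size regardless of sentence length or tree shape, which is what makes it usable as a sentence embedding for classification.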
Attention-Based Convolutional Neural Networks for Sentence Classification
This paper proposes a novel convolutional neural network with an attention mechanism to improve the performance of sentence classification and demonstrates that the proposed model significantly outperforms the traditional CNN model and achieves competitive performance with the ones that exploit rich syntactic features.
An Efficient Cross-lingual Model for Sentence Classification Using Convolutional Neural Network
This paper proposes a cross-lingual convolutional neural network model based on word and phrase embeddings learned from unlabeled data in two languages and on dependency grammar, which achieves comparable or even better performance than the traditional MT-based method.
Dependency Sensitive Convolutional Neural Networks for Modeling Sentences and Documents
DSCNN hierarchically builds textual representations by processing pretrained word embeddings via Long Short-Term Memory networks and subsequently extracting features with convolution operators; it does not rely on parsers or expensive phrase labeling, and thus is not restricted to sentence-level tasks.
Part-Of-Speech Tag Embedding for Modeling Sentences and Documents
A very simple convolutional neural network model that incorporates Part-Of-Speech tag information (PCNN), which shows strong performance on both sentence and document modeling tasks and explores the effect of POS tag embeddings more thoroughly.
Jointly Trained Sequential Labeling and Classification by Sparse Attention Neural Networks
This work proposes a jointly trained model for learning the two tasks simultaneously via Long Short-Term Memory (LSTM) networks and introduces a novel mechanism of "sparse attention" to weigh words differently based on their semantic relevance to sentence-level classification.
Dependency-Tree Based Convolutional Neural Networks for Aspect Term Extraction
This work proposes a novel dependency-tree based convolutional stacked neural network (DTBCSNN) for aspect term extraction, in which tree-based convolution is introduced over sentences' dependency parse trees to capture syntactic features.
Convolutional neural network with pair-wise pure dependence for sentence classification
This paper proposes PPD-CNN, a convolutional neural network architecture with Pair-wise Pure Dependence (PPD) for sentence classification that significantly outperforms a wide range of baselines and state-of-the-art methods.
Encoding Dependency Representation with Convolutional Neural Network for Target-Polarity Word Collocation Extraction
A dependency representation is introduced to explore the most useful semantic features behind the syntactic rules, and a framework based on a convolutional neural network (CNN) is adopted to extract the T-P collocations.
Improving Sentence Classification by Multilingual Data Augmentation and Consensus Learning
The proposed model achieves very positive classification performance: it can make use of multilingual data generated by machine translation and mine their language-shared and language-specific knowledge for better representation and classification.
The joint effect of semantic and syntactic word embeddings on sentiment analysis
  • Shu Chen, G. Chen, W. Wang
  • 2016 IEEE International Conference on Network Infrastructure and Digital Content (IC-NIDC)
  • 2016
This work extends CNN models to coordinate two lookup tables, which exploit semantic word embeddings and syntactic word embeddings at the same time, and indicates a positive effect on sentiment analysis.

References

Showing 1-10 of 30 references
Discriminative Neural Sentence Modeling by Tree-Based Convolution
This paper proposes a tree-based convolutional neural network (TBCNN) for discriminative sentence modeling that outperforms previous state-of-the-art results, including existing neural networks and dedicated feature/rule engineering.
Convolutional Neural Networks for Sentence Classification
The CNN models discussed herein improve upon the state of the art on 4 out of 7 tasks, which include sentiment analysis and question classification, and are proposed to allow for the use of both task-specific and static vectors.
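For contrast with the dependency-based approach above, the standard sequential CNN this line of work builds on can be sketched as follows (a toy illustration with invented values, not the paper's implementation): each window covers k adjacent words, so relations between distant words must be composed across layers rather than captured directly.

```python
import numpy as np

rng = np.random.default_rng(1)

d, n, k, m = 4, 7, 3, 8           # embed dim, sentence length, window, filters
words = rng.normal(size=(n, d))   # toy word vectors (made-up values)
W = rng.normal(size=(k * d, m))   # one filter bank, m feature maps
b = np.zeros(m)

# Sequential convolution: each window spans k *adjacent* words --
# the baseline that dependency-based convolution generalizes.
feats = np.stack([
    np.tanh(np.concatenate(words[i:i + k]) @ W + b)
    for i in range(n - k + 1)
])                                # (n - k + 1, m)

sentence_vec = feats.max(axis=0)  # max-over-time pooling -> (m,)
```

The max-over-time pooling step is what gives a fixed-size sentence representation; the dependency-based variant replaces the adjacent-word windows with windows drawn along parse-tree paths while keeping this pooling scheme.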
A Convolutional Neural Network for Modelling Sentences
A convolutional architecture dubbed the Dynamic Convolutional Neural Network (DCNN) is described that is adopted for the semantic modelling of sentences and induces a feature graph over the sentence that is capable of explicitly capturing short and long-range relations.
Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank
A Sentiment Treebank that includes fine-grained sentiment labels for 215,154 phrases in the parse trees of 11,855 sentences and presents new challenges for sentiment compositionality; the Recursive Neural Tensor Network is introduced.
Natural Language Processing (Almost) from Scratch
We propose a unified neural network architecture and learning algorithm that can be applied to various natural language processing tasks including part-of-speech tagging, chunking, named entity recognition …
Semi-Supervised Recursive Autoencoders for Predicting Sentiment Distributions
A novel machine learning framework based on recursive autoencoders for sentence-level prediction of sentiment label distributions that outperforms other state-of-the-art approaches on commonly used datasets, without using any pre-defined sentiment lexica or polarity shifting rules.
Deep Recursive Neural Networks for Compositionality in Language
The results show that deep RNNs outperform associated shallow counterparts that employ the same number of parameters and outperform previous baselines on the sentiment analysis task, including a multiplicative RNN variant as well as the recently introduced paragraph vectors.
TBCNN: A Tree-Based Convolutional Neural Network for Programming Language Processing
To the best of the authors' knowledge, this paper is the first to analyze programs with deep neural networks and extends the scope of deep learning to the field of programming language processing; the experimental results validate its feasibility and show a promising future for this new research area.
Sentiment Classification Using Word Sub-sequences and Dependency Sub-trees
Text mining techniques are used to extract frequent word sub-sequences and dependency sub-trees from sentences in a document dataset, which are then used as features of support vector machines for document sentiment classification.
A Neural Network for Factoid Question Answering over Paragraphs
This work introduces a recursive neural network model, qanta, that can reason over question text input by modeling textual compositionality and applies it to a dataset of questions from a trivia competition called quiz bowl.