Skipping Word: A Character-Sequential Representation based Framework for Question Answering

@inproceedings{Meng2016SkippingWA,
  title={Skipping Word: A Character-Sequential Representation based Framework for Question Answering},
  author={Lingxun Meng and Yan Li and Mengyi Liu and Peng Shu},
  booktitle={Proceedings of the 25th ACM International on Conference on Information and Knowledge Management},
  year={2016}
}
  • Published 2 September 2016
Recent work using artificial neural networks based on distributed word representations has greatly boosted the performance of various natural language learning tasks, especially question answering. However, these approaches also carry attendant problems, such as corpus selection for embedding learning and dictionary transformation across learning tasks. In this paper, we propose to model sentences directly as character sequences, and then utilize convolutional neural… 
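The abstract describes encoding a sentence directly as a character sequence and passing it through a convolutional network, sidestepping word-level embedding dictionaries. A minimal NumPy sketch of that general idea follows; the alphabet, dimensions, random untrained weights, and the cosine-similarity scorer are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Hypothetical character alphabet; the paper's actual character set is not given here.
ALPHABET = "abcdefghijklmnopqrstuvwxyz 0123456789?"
CHAR2ID = {c: i for i, c in enumerate(ALPHABET)}

rng = np.random.default_rng(0)
EMB_DIM, NUM_FILTERS, KERNEL = 16, 32, 3
emb = rng.normal(0, 0.1, (len(ALPHABET), EMB_DIM))            # character embeddings
filters = rng.normal(0, 0.1, (NUM_FILTERS, KERNEL, EMB_DIM))  # 1-D conv filters

def encode(sentence: str) -> np.ndarray:
    """Character-sequential encoding: embed chars, convolve, max-pool over time."""
    ids = [CHAR2ID[c] for c in sentence.lower() if c in CHAR2ID]
    x = emb[ids]                                   # (T, EMB_DIM)
    T = x.shape[0]
    # Slide a width-KERNEL window over the character sequence.
    windows = np.stack([x[t:t + KERNEL] for t in range(T - KERNEL + 1)])
    feats = np.tanh(np.einsum("tkd,fkd->tf", windows, filters))  # (T-K+1, F)
    return feats.max(axis=0)                       # max-over-time pooling -> (F,)

def score(question: str, answer: str) -> float:
    """Cosine similarity between question and candidate-answer encodings."""
    q, a = encode(question), encode(answer)
    return float(q @ a / (np.linalg.norm(q) * np.linalg.norm(a)))
```

With trained (rather than random) weights, `score` would rank candidate answers for a question; here it only illustrates the data flow from raw characters to a fixed-size sentence vector.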
4 Citations


A Hybrid Embedding Approach to Noisy Answer Passage Retrieval
TLDR
The flexibility of a character based approach on the task of answer passage retrieval is demonstrated, agnostic to the source of embeddings and with improved performance in P@1 and MRR metrics over a word based approach as the collections degrade in quality.
Chinese Character Embedding Based Semantic Query Algorithm for Semi-structured Corpora
TLDR
This paper proposes CSQ, a semantic query algorithm based on Chinese character embedding that computes the vectors of larger language units with those of smaller language units which are computed by classical embedding models.
Question Answering Systems: A Review on Present Developments, Challenges and Trends
TLDR
This study collected publications from top conferences and journals on information retrieval, knowledge management, artificial intelligence, web intelligence, natural language processing and the semantic web to help researchers gain an insight on the latest developments and trends of the research being done in the area of question answering.
Automatic video clip and mixing based on semantic sentence matching

References

SHOWING 1-10 OF 16 REFERENCES
Learning to Rank Short Text Pairs with Convolutional Deep Neural Networks
TLDR
This paper presents a convolutional neural network architecture for reranking pairs of short texts, where the optimal representation of text pairs and a similarity function to relate them in a supervised way from the available training data are learned.
A Convolutional Neural Network for Modelling Sentences
TLDR
A convolutional architecture dubbed the Dynamic Convolutional Neural Network (DCNN) is described that is adopted for the semantic modelling of sentences and induces a feature graph over the sentence that is capable of explicitly capturing short and long-range relations.
Deep Learning for Answer Sentence Selection
TLDR
This work proposes a novel approach to solving the answer sentence selection task via means of distributed representations, and learns to match questions with answers by considering their semantic encoding.
Question Answering Using Enhanced Lexical Semantic Models
TLDR
This work focuses on improving the performance using models of lexical semantic resources and shows that these systems can be consistently and significantly improved with rich lexical semantics information, regardless of the choice of learning algorithms.
Character-Aware Neural Language Models
TLDR
A simple neural language model that relies only on character-level inputs that is able to encode, from characters only, both semantic and orthographic information and suggests that on many languages, character inputs are sufficient for language modeling.
GloVe: Global Vectors for Word Representation
TLDR
A new global logbilinear regression model that combines the advantages of the two major model families in the literature: global matrix factorization and local context window methods and produces a vector space with meaningful substructure.
Semantic Parsing on Freebase from Question-Answer Pairs
TLDR
This paper trains a semantic parser that scales up to Freebase and outperforms their state-of-the-art parser on the dataset of Cai and Yates (2013), despite not having annotated logical forms.
What is the Jeopardy Model? A Quasi-Synchronous Grammar for QA
TLDR
A probabilistic quasi-synchronous grammar, inspired by one proposed for machine translation, and parameterized by mixtures of a robust nonlexical syntax/alignment model with a(n optional) lexical-semantics-driven log-linear model is proposed.
Convolutional Neural Networks for Sentence Classification
TLDR
The CNN models discussed herein improve upon the state of the art on 4 out of 7 tasks, which include sentiment analysis and question classification, and are proposed to allow for the use of both task-specific and static vectors.
Character-level Convolutional Networks for Text Classification
TLDR
This article constructed several large-scale datasets to show that character-level convolutional networks could achieve state-of-the-art or competitive results in text classification.
...