Generic soft pattern models for definitional question answering

Abstract

This paper explores probabilistic lexico-syntactic pattern matching, also known as <i>soft pattern matching</i>. While previous methods in soft pattern matching are ad hoc in computing the degree of match, we propose two formal matching models: one based on bigrams and the other on the Profile Hidden Markov Model (PHMM). Both models provide a theoretically sound method to model pattern matching as a probabilistic process that generates token sequences. We demonstrate the effectiveness of these models on definition sentence retrieval for definitional question answering. We show that both models significantly outperform state-of-the-art manually constructed patterns. A critical difference between the two models is that the PHMM technique handles language variations more effectively but requires more training data to converge. We believe that both models can be extended to other areas where lexico-syntactic pattern matching can be applied.
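To make the bigram variant concrete, the following is a minimal, illustrative sketch (not the authors' implementation): a smoothed bigram model is trained on generalized pattern token sequences, and the degree of match for a candidate sequence is the probability that the model generates it. The function names, the `TERM`/`NN` placeholder tokens, and the add-alpha smoothing choice are all assumptions for illustration.

```python
from collections import defaultdict

def train_bigram(sequences, alpha=1.0):
    """Estimate add-alpha smoothed bigram probabilities from training
    token sequences (e.g. slot/POS-generalized definition patterns).
    Returns a function prob(prev, cur)."""
    vocab = {tok for seq in sequences for tok in seq} | {"<s>", "</s>"}
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        padded = ["<s>"] + list(seq) + ["</s>"]
        for prev, cur in zip(padded, padded[1:]):
            counts[prev][cur] += 1
    vocab_size = len(vocab)

    def prob(prev, cur):
        total = sum(counts[prev].values())
        return (counts[prev][cur] + alpha) / (total + alpha * vocab_size)

    return prob

def match_score(prob, seq):
    """Soft-match degree: probability that the bigram model
    generates the candidate token sequence."""
    padded = ["<s>"] + list(seq) + ["</s>"]
    p = 1.0
    for prev, cur in zip(padded, padded[1:]):
        p *= prob(prev, cur)
    return p
```

For example, a model trained on sequences such as `["TERM", "is", "a", "NN"]` assigns a higher score to a definition-like candidate than to the same tokens in scrambled order, giving a graded rather than binary notion of pattern match.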

DOI: 10.1145/1076034.1076101

64 Citations


Cite this paper

@inproceedings{Cui2005GenericSP,
  title     = {Generic soft pattern models for definitional question answering},
  author    = {Hang Cui and Min-Yen Kan and Tat-Seng Chua},
  booktitle = {SIGIR},
  year      = {2005}
}