# Bilateral Multi-Perspective Matching for Natural Language Sentences

@inproceedings{Wang2017BilateralMM,
  title={Bilateral Multi-Perspective Matching for Natural Language Sentences},
  author={Zhiguo Wang and Wael Hamza and Radu Florian},
  booktitle={IJCAI},
  year={2017}
}
• Published in IJCAI, 13 February 2017
• Computer Science
Natural language sentence matching is a fundamental technology for a variety of tasks. […] Given two sentences $P$ and $Q$, our model first encodes them with a BiLSTM encoder. Next, we match the two encoded sentences in two directions, $P \rightarrow Q$ and $P \leftarrow Q$. In each matching direction, each time step of one sentence is matched against all time steps of the other sentence from multiple perspectives.
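The multi-perspective matching operation summarized above can be sketched in NumPy. This is an illustrative sketch, not the authors' implementation: each of $l$ perspectives re-weights the dimensions of the two time-step vectors with a trainable row of a weight matrix $W$ and then takes a cosine similarity; all names here are hypothetical.

```python
import numpy as np

def multi_perspective_match(v1, v2, W, eps=1e-8):
    """Multi-perspective cosine matching between two hidden vectors.

    v1, v2 : (d,) time-step vectors from the two BiLSTM encodings.
    W      : (l, d) trainable weights; row k re-weights the dimensions
             of both vectors under perspective k.
    Returns an (l,) vector of per-perspective cosine similarities.
    """
    p1 = W * v1  # (l, d): v1 scaled under each perspective
    p2 = W * v2  # (l, d): v2 scaled under each perspective
    num = (p1 * p2).sum(axis=1)
    den = np.linalg.norm(p1, axis=1) * np.linalg.norm(p2, axis=1) + eps
    return num / den  # (l,) matching vector

# Toy usage: d = 4 hidden dimensions, l = 3 perspectives.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 4))
v = rng.normal(size=4)
m = multi_perspective_match(v, v, W)
print(np.allclose(m, 1.0))  # identical vectors match under every perspective
```

In the full model this function is applied at every time step of one sentence against the time steps of the other, in both directions, before an aggregation layer produces the final matching decision.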
551 Citations

## Citations

MKPM: Multi keyword-pair matching for natural language sentences
• Appl. Intell., 2022
Proposes a sentence matching method based on multi keyword-pair matching (MKPM), which uses keyword pairs in the two sentences to represent the semantic relationship between them, avoiding the interference of redundancy and noise.
Structured Alignment Networks for Matching Sentences
• EMNLP, 2018
This work introduces a model of structured alignments between sentences, showing how to compare two sentences by matching their latent structures, and finds that modeling latent tree structures results in superior performance.
Syntax-Aware Sentence Matching with Graph Convolutional Networks
• KSEM, 2019
A new method that incorporates syntactic structure into the “matching-aggregation” framework for sentence matching tasks, using a gating mechanism to dynamically combine the raw contextual representation of a sentence with its syntactic representation, mitigating the noise caused by potentially wrong dependency-parsing results.
Siamese Network cooperating with Multi-head Attention for semantic sentence matching
• 19th International Symposium on Distributed Computing and Applications for Business Engineering and Science (DCABES), 2020
Presents a deep architecture that separates comparison from interaction to match two sentences, based on a Siamese network for comparison and multi-head attention for interaction information between sentence pairs.
Matching Natural Language Sentences with Hierarchical Sentence Factorization
• WWW, 2018
The proposed hierarchical sentence factorization can be used to significantly improve the performance of existing unsupervised distance-based metrics as well as multiple supervised deep learning models based on convolutional neural networks (CNN) and long short-term memory (LSTM).
Multi-Level Compare-Aggregate Model for Text Matching
• International Joint Conference on Neural Networks (IJCNN), 2019
A multi-level compare-aggregate model (MLCA) that matches each word in one text against the other text at three levels: word level (word-by-word matching), phrase level (word-by-phrase matching), and sentence level (word-by-sentence matching).
DEIM: An effective deep encoding and interaction model for sentence matching
• ArXiv, 2022
Experimental results show that the proposed algorithm can effectively extract deep semantic features, verifying its effectiveness on sentence matching.
Original Semantics-Oriented Attention and Deep Fusion Network for Sentence Matching
• EMNLP, 2019
Presents an original semantics-oriented attention and deep fusion network (OSOA-DFN) for sentence matching, whose attention is oriented to the original semantic representation of the other sentence, capturing relevant information from a fixed matching target.
Densely-Connected Transformer with Co-attentive Information for Matching Text Sequences
• APWeb/WAIM, 2020
A densely connected Transformer is proposed to perform multiple matching processes with co-attentive information, enhancing the interaction of the sentence pair in each matching process.
DRr-Net: Dynamic Re-Read Network for Sentence Semantic Matching
• AAAI, 2019
A Dynamic Re-read Network (DRr-Net) approach for sentence semantic matching that pays close attention to a small region of the sentences at each step and re-reads important words for better sentence semantic understanding.

## References

Showing 1-10 of 45 references
Learning Natural Language Inference using Bidirectional LSTM model and Inner-Attention
• ArXiv, 2016
A sentence-encoding-based model for recognizing textual entailment that uses the sentence's first-stage representation to attend to words appearing in the sentence itself, a mechanism the paper calls "Inner-Attention".
ABCNN: Attention-Based Convolutional Neural Network for Modeling Sentence Pairs
• Transactions of the Association for Computational Linguistics, 2016
This work presents a general Attention-Based Convolutional Neural Network (ABCNN) for modeling a pair of sentences and proposes three attention schemes that integrate mutual influence between sentences into CNNs; thus, the representation of each sentence takes its counterpart into consideration.
Sentence Similarity Learning by Lexical Decomposition and Composition
• COLING, 2016
This work proposes a model that takes into account both similarities and dissimilarities by decomposing and composing lexical semantics over sentences, achieving state-of-the-art performance on the answer sentence selection task and a comparable result on paraphrase identification.
Multi-Perspective Context Matching for Machine Comprehension
• ArXiv, 2016
A Multi-Perspective Context Matching (MPCM) model is proposed: an end-to-end system that directly predicts the answer's beginning and ending points in a passage.
Natural Language Inference by Tree-Based Convolution and Heuristic Matching
• ACL, 2016
This model, a tree-based convolutional neural network (TBCNN), captures sentence-level semantics; heuristic matching layers such as concatenation and element-wise product/difference then combine the information from the individual sentences.
FAQ-based Question Answering via Word Alignment
• ArXiv, 2015
A novel word-alignment-based method for the FAQ-based question answering task, in which the word alignment between two questions is used to extract features, together with a bootstrap-based method for extracting a small set of effective lexical features.
Learning Natural Language Inference with LSTM
• NAACL, 2016
A special long short-term memory (LSTM) architecture for NLI that remembers important mismatches critical for predicting the contradiction or neutral relationship label, achieving an accuracy of 86.1% and outperforming the state of the art.
Reasoning about Entailment with Neural Attention
• ICLR, 2016
This paper proposes a neural model that reads two sentences with long short-term memory units to determine entailment, extends it with a word-by-word neural attention mechanism that encourages reasoning over entailments of pairs of words and phrases, and presents a qualitative analysis of the attention weights the model produces.
What is the Jeopardy Model? A Quasi-Synchronous Grammar for QA
• EMNLP, 2007
Proposes a probabilistic quasi-synchronous grammar, inspired by one proposed for machine translation, parameterized by mixtures of a robust non-lexical syntax/alignment model with an (optional) lexical-semantics-driven log-linear model.
A Compare-Aggregate Model for Matching Text Sequences
• ICLR, 2017
A general "compare-aggregate" framework that performs word-level matching followed by aggregation with convolutional neural networks, finding that some simple comparison functions based on element-wise operations can work better than standard neural networks and neural tensor networks.