Local and Global Context-Based Pairwise Models for Sentence Ordering

@article{Manku2021LocalAG,
  title={Local and Global Context-Based Pairwise Models for Sentence Ordering},
  author={Ruskin Raj Manku and Aditya Jyoti Paul},
  journal={Knowl. Based Syst.},
  year={2021},
  volume={243},
  pages={108453}
}

References

Showing 1-10 of 57 references

Enhancing Pointer Network for Sentence Ordering with Pairwise Ordering Predictions

This work proposes to enhance the pointer network decoder by using two pairwise ordering prediction modules: the FUTURE module predicts the relative orientations of other unordered sentences with respect to the candidate sentence, and the HISTORY module measures the local coherence between several sentences without the influence of noisy left-side context.
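
As an illustration of the pairwise idea above, here is a minimal sketch (PyTorch; not the paper's code) of a scorer that, given the encodings of two sentences, predicts the probability that the first should precede the second. The 128-dimensional encodings and the MLP over [a; b; a*b] features are assumptions for illustration.

```python
import torch
import torch.nn as nn

class PairwiseOrderScorer(nn.Module):
    """Hypothetical pairwise scorer: P(sentence a precedes sentence b)."""
    def __init__(self, dim=128):
        super().__init__()
        # Score the concatenation [a; b; a*b] of the two sentence vectors.
        self.mlp = nn.Sequential(
            nn.Linear(3 * dim, dim), nn.ReLU(), nn.Linear(dim, 1)
        )

    def forward(self, a, b):
        # a, b: (batch, dim) sentence encodings
        feats = torch.cat([a, b, a * b], dim=-1)
        return torch.sigmoid(self.mlp(feats)).squeeze(-1)

scorer = PairwiseOrderScorer()
a, b = torch.randn(4, 128), torch.randn(4, 128)
p_ab = scorer(a, b)  # (4,) probabilities that a precedes b
```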

End-to-End Neural Sentence Ordering Using Pointer Network

An end-to-end neural approach to the sentence ordering problem is proposed, which uses a pointer network (Ptr-Net) to alleviate error propagation and exploit the whole contextual information.
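
For readers unfamiliar with Ptr-Net decoding, the following is a minimal greedy sketch of a pointer-network decoder over sentence encodings (not the paper's implementation). The additive attention form, the mean-pooled initial state, and greedy selection are simplifying assumptions.

```python
import torch
import torch.nn as nn

class PointerDecoder(nn.Module):
    """Greedy pointer decoder: repeatedly attend over the unordered
    sentence encodings and pick the next sentence."""
    def __init__(self, dim=128):
        super().__init__()
        self.cell = nn.GRUCell(dim, dim)
        self.W_enc = nn.Linear(dim, dim, bias=False)
        self.W_dec = nn.Linear(dim, dim, bias=False)
        self.v = nn.Linear(dim, 1, bias=False)

    def forward(self, enc):
        # enc: (n, dim) encodings of the shuffled sentences
        n, dim = enc.shape
        h = enc.mean(dim=0)               # init decoder state from the set
        inp = torch.zeros(dim)
        order, mask = [], torch.zeros(n, dtype=torch.bool)
        for _ in range(n):
            h = self.cell(inp.unsqueeze(0), h.unsqueeze(0)).squeeze(0)
            # Additive attention scores over all sentences
            scores = self.v(torch.tanh(self.W_enc(enc) + self.W_dec(h))).squeeze(-1)
            scores = scores.masked_fill(mask, float("-inf"))  # hide used ones
            i = int(scores.argmax())      # "point" at the next sentence
            order.append(i)
            mask[i] = True
            inp = enc[i]                  # feed the chosen sentence back in
        return order

dec = PointerDecoder()
print(dec(torch.randn(5, 128)))  # e.g. [2, 0, 4, 1, 3]
```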

Efficient Relational Sentence Ordering Network

A novel deep Efficient Relational Sentence Ordering Network (ERSON) is proposed, leveraging a pre-trained language model in both the encoder and decoder architectures to strengthen the coherence modeling of the entire model.

Graph-based Neural Sentence Ordering

A novel and flexible graph-based neural sentence ordering model is proposed, which adopts a graph recurrent network (Zhang et al., 2018) to accurately learn semantic representations of the sentences; it outperforms existing state-of-the-art systems on several benchmark datasets.

Deep Attentive Ranking Networks for Learning to Order Sentences

This work presents an attention-based ranking framework for learning to order sentences given a paragraph, built on a bidirectional sentence encoder and a self-attention-based transformer network; it outperforms various state-of-the-art methods on a variety of evaluation metrics.

Neural Sentence Ordering Based on Constraint Graphs

This work devises a new approach based on multi-granular orders between sentences, which form multiple constraint graphs; the graphs are encoded by Graph Isomorphism Networks and fused into sentence representations, and the final order is determined using these order-enhanced sentence representations.
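
A toy sketch of the constraint-graph idea (not the paper's GIN-based method): pairwise order probabilities are turned into directed "precedes" edges, and a simple greedy heuristic orders sentences by out-degree. The 0.5 threshold and the out-degree heuristic are assumptions standing in for the learned fusion.

```python
import itertools
import numpy as np

def order_from_pairwise(prob):
    """prob[i, j] = predicted probability that sentence i precedes sentence j."""
    n = prob.shape[0]
    # Add a directed edge i -> j whenever the model is confident i comes first.
    edges = {(i, j) for i, j in itertools.permutations(range(n), 2)
             if prob[i, j] > 0.5}
    # Greedy stand-in for the learned ordering: sentences with more
    # outgoing "precedes" edges come earlier.
    out_deg = [sum((i, j) in edges for j in range(n)) for i in range(n)]
    return sorted(range(n), key=lambda i: -out_deg[i])

prob = np.array([[0.5, 0.9, 0.8],
                 [0.1, 0.5, 0.7],
                 [0.2, 0.3, 0.5]])
print(order_from_pairwise(prob))  # [0, 1, 2]
```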

Evaluating Text Coherence at Sentence and Paragraph Levels

Evaluation results show that, except under certain extreme conditions, the recurrent graph neural network-based model is the preferred choice for coherence modeling.

Using Conditional Sentence Representation in Pointer Networks for Sentence Ordering

This work proposes a conditional sentence representation, which incorporates the information of the previously selected sentences into the candidate sentence representations, so that the Pointer Network is able to better capture dependencies among sentences.
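
A minimal sketch of the conditioning step described above: candidate representations are gated with a summary of the already-selected sentences before pointer scoring. The mean-pooled history and the sigmoid gate are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class ConditionalRepr(nn.Module):
    """Mix each candidate encoding with a summary of the ordered history."""
    def __init__(self, dim=128):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, candidates, history):
        # candidates: (n, dim); history: (k, dim), k >= 1 selected sentences
        ctx = history.mean(dim=0, keepdim=True).expand_as(candidates)
        g = torch.sigmoid(self.gate(torch.cat([candidates, ctx], dim=-1)))
        return g * candidates + (1 - g) * ctx  # history-aware candidates

cr = ConditionalRepr()
cand, hist = torch.randn(6, 128), torch.randn(2, 128)
print(cr(cand, hist).shape)  # torch.Size([6, 128])
```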

Sentence Ordering and Coherence Modeling using Recurrent Neural Networks

This work proposes an end-to-end unsupervised deep learning approach based on the set-to-sequence framework to model the structure of coherent texts, and shows that useful text representations can be obtained by learning to order sentences.

Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models

Conpono, an inter-sentence objective for pretraining language models that captures discourse coherence and the distance between sentences, is proposed; it yields absolute gains of 2%-6% even on tasks that do not explicitly evaluate discourse: textual entailment, common-sense reasoning, and reading comprehension.
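
A rough sketch of an inter-sentence contrastive objective in the spirit of Conpono (not its exact formulation): given an anchor sentence encoding and a set of candidate encodings, the true neighboring sentence is treated as the positive class under a softmax over dot-product similarities. The dot-product scorer and cross-entropy form are assumptions.

```python
import torch
import torch.nn.functional as F

def contrastive_sentence_loss(anchor, candidates, positive_idx):
    """Cross-entropy over similarity scores; the true neighbor is the target."""
    logits = candidates @ anchor            # (k,) dot-product scores
    target = torch.tensor([positive_idx])
    return F.cross_entropy(logits.unsqueeze(0), target)

anchor = torch.randn(128)
cands = torch.randn(8, 128)
print(contrastive_sentence_loss(anchor, cands, positive_idx=3))
```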
...