Paper Abstract Writing through Editing Mechanism

@inproceedings{Wang2018PaperAW,
  title={Paper Abstract Writing through Editing Mechanism},
  author={Qingyun Wang and Zhihao Zhou and Lifu Huang and Spencer Whitehead and Boliang Zhang and Heng Ji and Kevin Knight},
  booktitle={ACL},
  year={2018}
}
We present a paper abstract writing system based on an attentive neural sequence-to-sequence model that can take a title as input and automatically generate an abstract. We design a novel Writing-editing Network that can attend to both the title and the previously generated abstract drafts and then iteratively revise and polish the abstract. With two series of Turing tests, where the human judges are asked to distinguish the system-generated abstracts from human-written ones, our system passes… 
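To make the write-then-edit loop concrete, below is a minimal PyTorch sketch of a decoder that attends to both the title and the previously generated draft and revises over several passes. The GRU layers, dot-product attention, greedy decoding, pass count, and the use of the title memory as a stand-in draft on the first pass are illustrative assumptions, not the authors' exact architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Attention(nn.Module):
    # Dot-product attention of a decoder state over a memory of encoder states.
    def forward(self, query, memory):
        # query: (batch, hidden); memory: (batch, length, hidden)
        scores = torch.bmm(memory, query.unsqueeze(2)).squeeze(2)   # (batch, length)
        weights = F.softmax(scores, dim=1)
        return torch.bmm(weights.unsqueeze(1), memory).squeeze(1)   # (batch, hidden)

class WritingEditingNetwork(nn.Module):
    def __init__(self, vocab_size, hidden=256, passes=2):
        super().__init__()
        self.passes = passes                          # one writing pass + editing passes
        self.embed = nn.Embedding(vocab_size, hidden)
        self.title_enc = nn.GRU(hidden, hidden, batch_first=True)
        self.draft_enc = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRUCell(hidden, hidden)
        self.attn = Attention()
        self.out = nn.Linear(3 * hidden, vocab_size)  # state + title context + draft context

    def _decode(self, title_mem, draft_mem, max_len, bos_id):
        batch = title_mem.size(0)
        state = title_mem.mean(dim=1)                 # initialize from the title encoding
        token = torch.full((batch,), bos_id, dtype=torch.long, device=title_mem.device)
        tokens = []
        for _ in range(max_len):
            state = self.decoder(self.embed(token), state)
            title_ctx = self.attn(state, title_mem)   # attend to the title
            draft_ctx = self.attn(state, draft_mem)   # attend to the previous draft
            logits = self.out(torch.cat([state, title_ctx, draft_ctx], dim=1))
            token = logits.argmax(dim=1)              # greedy decoding, for simplicity
            tokens.append(token)
        return torch.stack(tokens, dim=1)             # (batch, max_len) token ids

    def forward(self, title_ids, max_len=50, bos_id=1):
        title_mem, _ = self.title_enc(self.embed(title_ids))
        # Writing pass: no draft exists yet, so the title memory stands in (an assumption).
        draft = self._decode(title_mem, title_mem, max_len, bos_id)
        for _ in range(self.passes - 1):              # editing passes: revise the previous draft
            draft_mem, _ = self.draft_enc(self.embed(draft))
            draft = self._decode(title_mem, draft_mem, max_len, bos_id)
        return draft

# Example: generate a 50-token "abstract" from a fake 8-token title.
model = WritingEditingNetwork(vocab_size=10000)
title = torch.randint(2, 10000, (1, 8))
abstract_ids = model(title)                           # (1, 50)

In this sketch the writing pass and the editing passes share a single decoder; whether to share or separate them is a design choice the abstract leaves open.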

Citations

PaperRobot: Incremental Draft Generation of Scientific Ideas
We present a PaperRobot who performs as an automatic research assistant by (1) conducting deep understanding of a large collection of human-written papers in a target domain and constructing…
Automatic Title Generation for Text with Pre-trained Transformer Language Model
A novel approach to automatic title generation for a given text using the pre-trained Transformer language model GPT-2: a pool of candidate titles is generated, an appropriate title is selected from the pool, and the selected title is then refined or de-noised to produce the final title.
Learning to Generate Explainable Plots for Neural Story Generation
A latent variable model is proposed for neural story generation that treats an outline, which is a natural language sentence explainable to humans, as a latent variable to represent a high-level plot that bridges the input and output.
ReviewRobot: Explainable Paper Review Generation based on Knowledge Synthesis
A novel ReviewRobot is built to automatically assign a review score and write comments for multiple categories such as novelty and meaningful comparison, and can serve as an assistant for paper reviewers, program chairs and authors.
Describing a Knowledge Base
This work builds a generation framework based on a pointer network that can copy facts from the input KB, and adds two attention mechanisms: (i) slot-aware attention to capture the association between a slot type and its corresponding slot value; and (ii) a new table-position self-attention to capture the inter-dependencies among related slots (a minimal sketch of slot-aware attention appears after this list).
An Improved Coarse-to-Fine Method for Solving Generation Tasks
An improved coarse-to-fine model with a control mechanism is proposed; the mechanism controls how strongly the sketch influences the final result in the fine stage, so that even when the sketch is wrong the model can still recover a correct result.
Text Generation from Knowledge Graphs with Graph Transformers
This work addresses the problem of generating coherent multi-sentence texts from the output of an information extraction system, in particular a knowledge graph, by introducing a novel graph-transforming encoder that can leverage the relational structure of such knowledge graphs without imposing linearization or hierarchical constraints.
Building the Directed Semantic Graph for Coherent Long Text Generation
A novel two-stage approach to generating coherent long text: first, a document-level path is built for each output text with each sentence embedding as a node; then a revised self-organising map (SOM) clusters similar nodes across a family of document-level paths to construct the directed semantic graph.
AutoCite: Multi-Modal Representation Fusion for Contextual Citation Generation
An automatic writing assistant model, AutoCite, not only infers potentially related work but also generates the citation context at the same time; it is validated on five real-world citation network datasets.
...
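As flagged in the Describing a Knowledge Base entry above, here is a minimal sketch of the slot-aware attention idea: attention over a KB of (slot type, slot value) facts whose scores couple the decoder state with both the type and the value embeddings. The module name, concatenation-based scoring layer, and sizes are hypothetical illustrations, not the cited paper's exact formulation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SlotAwareAttention(nn.Module):
    # Scores each KB fact by coupling the decoder state with both the
    # slot-type embedding and the slot-value embedding of that fact.
    def __init__(self, hidden=128):
        super().__init__()
        self.score = nn.Linear(3 * hidden, 1)

    def forward(self, state, type_emb, value_emb):
        # state: (batch, hidden); type_emb, value_emb: (batch, n_facts, hidden)
        n_facts = type_emb.size(1)
        expanded = state.unsqueeze(1).expand(-1, n_facts, -1)
        scores = self.score(torch.cat([expanded, type_emb, value_emb], dim=2)).squeeze(2)
        weights = F.softmax(scores, dim=1)            # distribution over KB facts
        context = torch.bmm(weights.unsqueeze(1), value_emb).squeeze(1)
        return context, weights

The returned weights can also drive a copy mechanism over slot values, which is how pointer-style generators typically use such a distribution.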
