Discourse Embellishment Using a Deep Encoder-Decoder Network

@article{Berov2018DiscourseEU,
  title={Discourse Embellishment Using a Deep Encoder-Decoder Network},
  author={Leonid Berov and Kai Standvoss},
  journal={ArXiv},
  year={2018},
  volume={abs/1810.08076}
}
We suggest a new NLG task in the context of the discourse generation pipeline of computational storytelling systems. This task, textual embellishment, is defined by taking a text as input and generating a semantically equivalent output with increased lexical and syntactic complexity. Ideally, this would allow the authors of computational storytellers to implement only a lightweight NLG system and use a domain-independent embellishment module to translate its output into more literary text. We…
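As a rough illustration of the proposed setup, the sketch below frames textual embellishment as sequence-to-sequence learning with an LSTM encoder-decoder, in the spirit of the paper. All names, dimensions, and the toy data are assumptions made for illustration, not the authors' code.

    import torch
    import torch.nn as nn

    class EmbellishmentSeq2Seq(nn.Module):
        """Toy LSTM encoder-decoder: plain token ids in, embellished ids out."""
        def __init__(self, vocab_size=10000, emb_dim=128, hidden_dim=256):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
            self.decoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)

        def forward(self, src_ids, tgt_ids):
            _, state = self.encoder(self.embed(src_ids))            # compress the plain text
            dec_out, _ = self.decoder(self.embed(tgt_ids), state)   # decode conditioned on it
            return self.out(dec_out)                                # per-step vocabulary logits

    model = EmbellishmentSeq2Seq()
    src = torch.randint(0, 10000, (2, 12))   # batch of "plain" sentences
    tgt = torch.randint(0, 10000, (2, 16))   # longer "embellished" targets
    logits = model(src, tgt)                 # shape: (2, 16, 10000)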
Citations

Text Embellishment using Attention Based Encoder-Decoder Model
Text embellishment is a natural language generation problem that aims to enhance the lexical and syntactic complexity of a text; i.e., for a given sentence, the goal is to generate a sentence that is…

References

Showing 1-10 of 30 references
An Experimental Study of LSTM Encoder-Decoder Model for Text Simplification
Preliminary experiments find that the LSTM Encoder-Decoder model is able to learn operation rules such as reversing, sorting, and replacing from sequence pairs, which shows that the model may potentially discover and apply rules such as modifying sentence structure, substituting words, and removing words for text simplification.
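The operation-rule probes described in this entry can be reproduced with synthetic data. A minimal sketch, with arbitrary token and length choices assumed:

    import random

    def make_pairs(op, n=1000, length=8, vocab=range(100)):
        """Synthetic (source, target) pairs for probing what a seq2seq model learns."""
        pairs = []
        for _ in range(n):
            src = [random.choice(list(vocab)) for _ in range(length)]
            if op == "reverse":
                tgt = src[::-1]
            elif op == "sort":
                tgt = sorted(src)
            elif op == "replace":            # e.g. map every token to its successor
                tgt = [(t + 1) % 100 for t in src]
            pairs.append((src, tgt))
        return pairs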
Improving Text Simplification Language Modeling Using Unsimplified Text Data
This paper examines language modeling for text simplification and finds that a combined model using both simplified and normal English data achieves a 23% improvement in perplexity and a 24% improvement on the lexical simplification task over a model trained only on simplified data.
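A hedged sketch of the underlying idea: interpolate a model estimated on simplified text with one estimated on normal text, and compare perplexities. The corpora, smoothing, and interpolation weight below are placeholders:

    import math
    from collections import Counter

    def unigram_lm(tokens):
        counts = Counter(tokens)
        total = sum(counts.values())
        return lambda w: counts.get(w, 1e-10) / total   # crude unseen-word floor

    def perplexity(lm, tokens):
        return math.exp(-sum(math.log(lm(w)) for w in tokens) / len(tokens))

    simple_lm = unigram_lm("the cat sat on the mat".split())
    normal_lm = unigram_lm("the feline reposed upon the rug".split())
    combined = lambda w: 0.5 * simple_lm(w) + 0.5 * normal_lm(w)  # linear interpolation
    print(perplexity(combined, "the cat sat".split()))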
Sequence to Sequence Learning with Neural Networks
This paper presents a general end-to-end approach to sequence learning that makes minimal assumptions about sequence structure, and finds that reversing the order of the words in all source sentences markedly improved the LSTM's performance, because doing so introduced many short-term dependencies between the source and the target sentence, which made the optimization problem easier.
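The reversal trick amounts to one line of preprocessing; a minimal sketch with an assumed (source, target) pair:

    def reverse_source(pairs):
        """Reverse each source sentence so its head lands next to the decoder start,
        creating the short-term dependencies credited with easier optimization."""
        return [(src[::-1], tgt) for src, tgt in pairs]

    print(reverse_source([(["the", "dog", "ran"], ["le", "chien", "courait"])]))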
Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation
Qualitatively, the proposed RNN Encoder-Decoder model learns a semantically and syntactically meaningful representation of linguistic phrases.
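The gated hidden unit introduced alongside this encoder-decoder is commonly written as follows (one standard formulation; the notation here is assumed):

    z_t = \sigma(W_z x_t + U_z h_{t-1})
    r_t = \sigma(W_r x_t + U_r h_{t-1})
    \tilde{h}_t = \tanh\bigl(W x_t + U (r_t \odot h_{t-1})\bigr)
    h_t = z_t \odot h_{t-1} + (1 - z_t) \odot \tilde{h}_t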
Semantically Conditioned LSTM-based Natural Language Generation for Spoken Dialogue Systems
A statistical language generator based on a semantically controlled Long Short-Term Memory (LSTM) structure that can learn from unaligned data by jointly optimising sentence planning and surface realisation using a simple cross-entropy training criterion; language variation can be easily achieved by sampling from output candidates.
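A loose sketch of semantic conditioning: broadcast a dialogue-act vector onto every decoder input step. The full SC-LSTM additionally gates a dedicated dialogue-act cell, which is omitted here, and all sizes are assumptions:

    import torch
    import torch.nn as nn

    class ConditionedGenerator(nn.Module):
        def __init__(self, vocab=5000, emb=64, hid=128, da_dim=10):
            super().__init__()
            self.embed = nn.Embedding(vocab, emb)
            self.rnn = nn.LSTM(emb + da_dim, hid, batch_first=True)
            self.out = nn.Linear(hid, vocab)

        def forward(self, tokens, da_vec):
            # Broadcast the dialogue-act vector onto every time step.
            da = da_vec.unsqueeze(1).expand(-1, tokens.size(1), -1)
            x = torch.cat([self.embed(tokens), da], dim=-1)
            h, _ = self.rnn(x)
            return self.out(h)

    gen = ConditionedGenerator()
    logits = gen(torch.randint(0, 5000, (2, 9)), torch.rand(2, 10))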
A Survey on Lexical Simplification
Lexical Simplification is the process of replacing complex words in a given sentence with simpler alternatives of equivalent meaning. This task has wide applicability both as an assistive technology…
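At its simplest, the process reduces to substitution under a simplicity criterion; a toy version with an assumed synonym table and an assumed "simple" vocabulary:

    SYNONYMS = {"reposed": "rested", "feline": "cat"}   # assumed lookup table
    COMMON = {"rested", "cat", "the", "on", "mat"}      # assumed simple vocabulary

    def simplify(sentence):
        """Replace words outside the simple vocabulary when a synonym exists."""
        return " ".join(
            SYNONYMS.get(w, w) if w not in COMMON else w
            for w in sentence.split()
        )

    print(simplify("the feline reposed on the mat"))  # -> the cat rested on the mat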
Effective Approaches to Attention-based Neural Machine Translation
A global approach which always attends to all source words and a local one that only looks at a subset of source words at a time are examined, demonstrating the effectiveness of both approaches on the WMT translation tasks between English and German in both directions.
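A minimal sketch of the two variants with dot-product scoring. The shapes and window radius are assumptions, and the Gaussian position weighting of the local variant is omitted:

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def global_attention(query, keys):
        """Attend over ALL source states."""
        return softmax(keys @ query)

    def local_attention(query, keys, center, radius=2):
        """Attend over a window of source states around a predicted position."""
        lo, hi = max(0, center - radius), min(len(keys), center + radius + 1)
        weights = np.zeros(len(keys))
        weights[lo:hi] = softmax(keys[lo:hi] @ query)
        return weights

    keys = np.random.randn(7, 4)     # 7 source states, dimension 4
    q = np.random.randn(4)
    print(global_attention(q, keys), local_attention(q, keys, center=3))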
Neural Machine Translation by Jointly Learning to Align and Translate
It is conjectured that the use of a fixed-length vector is a bottleneck in improving the performance of this basic encoder-decoder architecture, and it is proposed to extend it by allowing the model to automatically (soft-)search for parts of a source sentence that are relevant to predicting a target word, without having to form these parts as a hard segment explicitly.
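The soft-search is usually written as additive attention over the encoder states (notation assumed):

    e_{ij} = v_a^\top \tanh(W_a s_{i-1} + U_a h_j), \qquad
    \alpha_{ij} = \frac{\exp(e_{ij})}{\sum_k \exp(e_{ik})}, \qquad
    c_i = \sum_j \alpha_{ij} h_j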
A Survey of Automated Text Simplification
This survey identifies and classifies simplification research within the period 1998-2013 and gives an overview of contemporary research whilst taking into account the history that has brought text simplification to its current state.
GloVe: Global Vectors for Word Representation
A new global log-bilinear regression model that combines the advantages of the two major model families in the literature, global matrix factorization and local context window methods, and produces a vector space with meaningful substructure.
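The model's objective is a weighted least-squares fit to the logarithm of the word co-occurrence counts X_{ij}:

    J = \sum_{i,j=1}^{V} f(X_{ij}) \left( w_i^\top \tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^2,
    \qquad
    f(x) = \begin{cases} (x / x_{\max})^{\alpha} & \text{if } x < x_{\max} \\ 1 & \text{otherwise} \end{cases}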