PlotMachines: Outline-Conditioned Generation with Dynamic Plot State Tracking

@inproceedings{Rashkin2020PlotMachinesOG,
  title={PlotMachines: Outline-Conditioned Generation with Dynamic Plot State Tracking},
  author={Hannah Rashkin and Asli Celikyilmaz and Yejin Choi and Jianfeng Gao},
  booktitle={EMNLP},
  year={2020}
}
We propose the task of outline-conditioned story generation: given an outline as a set of phrases that describe key characters and events to appear in a story, the task is to generate a coherent narrative that is consistent with the provided outline. This task is challenging as the input only provides a rough sketch of the plot, and thus, models need to generate a story by weaving through the key points provided in the outline. This requires the model to keep track of the dynamic states of the… 
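
To make the task concrete, here is a minimal toy sketch of generating a story paragraph by paragraph while tracking which outline phrases have been covered so far. The PlotState class, its string-matching update rule, and the generate_paragraph callback are illustrative stand-ins, not the paper's neural memory mechanism.

```python
from dataclasses import dataclass, field

@dataclass
class PlotState:
    outline: list                               # key phrases the story must cover
    covered: set = field(default_factory=set)   # phrases mentioned so far

    def update(self, paragraph: str) -> None:
        # Mark any outline phrase that appears in the newly generated paragraph.
        for phrase in self.outline:
            if phrase.lower() in paragraph.lower():
                self.covered.add(phrase)

    def pending(self) -> list:
        return [p for p in self.outline if p not in self.covered]

def generate_story(outline, n_paragraphs, generate_paragraph):
    # generate_paragraph(state) stands in for a language model conditioned
    # on the outline and the current plot state.
    state = PlotState(outline)
    story = []
    for _ in range(n_paragraphs):
        paragraph = generate_paragraph(state)
        state.update(paragraph)   # dynamic plot-state tracking
        story.append(paragraph)
    return "\n\n".join(story)
```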

Outline to Story: Fine-grained Controllable Story Generation from Cascaded Events

TLDR
This paper proposes a model that fine-tunes pre-trained language models on augmented sequences of outline-story pairs with a simple language modeling objective, creates datasets for the task, and thereby instantiates research interest in fine-grained controllable generation of open-domain long text, where controlling inputs are represented by short text.
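
A hedged sketch of the augmented training sequence this summary describes, assuming outline phrases and the story are simply concatenated with delimiter tokens (the markers below are invented for illustration, not the paper's actual markup):

```python
def build_training_sequence(outline_phrases, story):
    # Concatenate outline and story into one sequence; the model is then
    # trained with an ordinary next-token language modeling loss over it.
    outline = " <sep> ".join(outline_phrases)
    return f"<outline> {outline} <story> {story} <eos>"

print(build_training_sequence(
    ["haunted lighthouse", "missing keeper", "storm at sea"],
    "The lighthouse had been dark for three nights when the boat came in.",
))
```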

Plot-guided Adversarial Example Construction for Evaluating Open-domain Story Generation

TLDR
Experiments show that the evaluation metrics trained on the generated data result in more reliable automatic assessments that correlate remarkably better with human judgments compared to the baselines.

Conditional Generation of Temporally-ordered Event Sequences

TLDR
A single model is proposed that addresses both temporal ordering (sorting given events into the order they occurred) and event infilling (predicting new events that fit into an existing temporally-ordered sequence).

Text Editing by Command

TLDR
This work proposes a novel text editing task and introduces WikiDocEdits, a dataset of single-sentence edits crawled from Wikipedia; the Interactive Editor, a transformer-based model trained on this dataset, outperforms baselines and obtains positive results in both automatic and human evaluations.

Automatic Story Generation: Challenges and Attempts

TLDR
This paper analyzes works in story generation that utilize machine learning approaches to address story generation controllability, incorporate commonsense knowledge, infer reasonable character actions, and generate creative language.

Changing the Mind of Transformers for Topically-Controllable Language Generation

TLDR
A framework that produces a set of candidate upcoming topics by predicting the centers of word clusters in the possible continuations and displays them to the user, paired with a text generation model whose output adheres to the chosen topics.

Fabula Entropy Indexing: Objective Measures of Story Coherence

TLDR
Fabula Entropy Indexing (FEI) is presented, an evaluation method to assess story coherence by measuring the degree to which human participants agree with each other when answering true/false questions about stories.
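
If coherence is operationalized as inter-participant agreement on true/false questions, the entropy of the answer distribution captures it directly. The function below is one plausible reading of that idea, not the paper's exact index:

```python
import math

def answer_entropy(answers):
    # answers: list of True/False responses from participants to one question.
    # 0.0 bits = unanimous agreement; 1.0 bits = maximal disagreement.
    p_true = sum(answers) / len(answers)
    return -sum(p * math.log2(p) for p in (p_true, 1 - p_true) if p > 0)

print(answer_entropy([True, True, True, True]))    # 0.0 (coherent story)
print(answer_entropy([True, False, True, False]))  # 1.0 (incoherent story)
```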

Plug-and-Blend: A Framework for Controllable Story Generation with Blended Control Codes

TLDR
This work describes a Plug-and-Play controllable language generation framework that allows a human user to input multiple control codes (topics), and shows that the framework steers generation toward the given continuous-weighted control codes while keeping the generated sentences fluent, demonstrating strong blending capability.
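
One way to picture continuous-weighted control codes is as per-topic logit offsets mixed into the base model's next-token logits at decoding time. The sketch below assumes that reading; the actual Plug-and-Blend formulation differs in detail:

```python
import numpy as np

def blended_logits(base_logits, topic_logits, weights):
    # base_logits: (V,) from the base LM; topic_logits: topic -> (V,) offsets;
    # weights: topic -> continuous blend weight chosen by the user.
    out = base_logits.copy()
    for topic, weight in weights.items():
        out = out + weight * topic_logits[topic]
    return out

rng = np.random.default_rng(0)
base = rng.normal(size=8)
topics = {"space": rng.normal(size=8), "romance": rng.normal(size=8)}
mixed = blended_logits(base, topics, {"space": 0.7, "romance": 0.3})
next_token = int(np.argmax(mixed))  # greedy pick from the blended distribution
```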

Controllable Text Generation with Focused Variation

TLDR
This work introduces Focused-Variation Network (FVN), a novel model to control language generation by learning disjoint discrete latent spaces for each attribute inside codebooks that allows for both controllability and diversity, while at the same time generating fluent text.
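
The codebook idea can be made concrete with a toy quantizer: each attribute owns its own table of code vectors, keeping the latent spaces disjoint, and an encoder feature is snapped to its nearest code. Class name, sizes, and the nearest-neighbor rule are assumptions for illustration, not FVN's actual architecture:

```python
import numpy as np

class AttributeCodebook:
    def __init__(self, n_codes, dim, seed=0):
        # One independent codebook per attribute keeps the latent spaces disjoint.
        self.codes = np.random.default_rng(seed).normal(size=(n_codes, dim))

    def quantize(self, feature):
        # Snap a continuous feature vector to its nearest discrete code.
        idx = int(np.argmin(np.linalg.norm(self.codes - feature, axis=1)))
        return idx, self.codes[idx]

books = {"persona": AttributeCodebook(16, 8, seed=1),
         "style":   AttributeCodebook(16, 8, seed=2)}
code_id, code_vec = books["style"].quantize(np.zeros(8))
```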

ReadOnce Transformers: Reusable Representations of Text for Transformers

TLDR
The ReadOnce Transformers approach converts a transformer-based model into one that builds an information-capturing, task-independent, and compressed representation of text, yielding a 2x-5x speedup compared to standard text-to-text models and allowing existing language models to handle longer documents without the need to design new pre-trained models.
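
The reusable-representation idea amounts to encoding each document once and caching the compressed result for every downstream task. A minimal caching wrapper, with an assumed encoder callable standing in for the learned compressor, might look like this:

```python
class ReadOnceCache:
    def __init__(self, encoder):
        self.encoder = encoder   # text -> compressed, task-independent representation
        self._cache = {}

    def representation(self, doc_id, text):
        # Encode each document exactly once; downstream tasks reuse the result.
        if doc_id not in self._cache:
            self._cache[doc_id] = self.encoder(text)
        return self._cache[doc_id]

cache = ReadOnceCache(encoder=lambda text: text.split()[:32])  # stand-in encoder
rep = cache.representation("doc-1", "A long document every task would otherwise re-read.")
```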
...

References

Plan-And-Write: Towards Better Automatic Storytelling

TLDR
Experiments show that with explicit storyline planning, the generated stories are more diverse, coherent, and on topic than those generated without creating a full plan, according to both automatic and human evaluations.

Controllable Neural Story Plot Generation via Reward Shaping

TLDR
A reward-shaping technique is presented that analyzes a story corpus and produces intermediate rewards that are backpropagated into a pre-trained LM in order to guide the model towards a given goal.
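
Read literally, the technique derives intermediate rewards from corpus statistics (events closer to the goal earn more) and feeds them into a standard policy-gradient update. The sketch below assumes an exponential shaping function and a REINFORCE surrogate loss; both are illustrative choices, not the paper's exact formulation:

```python
import math

def shaped_reward(event, goal, corpus_distance):
    # corpus_distance(a, b): average number of story steps from event a to
    # event b, estimated from the story corpus (assumed given).
    return math.exp(-corpus_distance(event, goal))  # nearer to goal => larger reward

def reinforce_surrogate(log_probs, rewards):
    # Standard REINFORCE objective: minimize -sum_t r_t * log pi(a_t | s_t).
    return -sum(r * lp for r, lp in zip(rewards, log_probs))
```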

Story Generation with Crowdsourced Plot Graphs

TLDR
A large-scale evaluation shows that stories generated by the system for a previously unknown topic are comparable in quality to simple stories authored by untrained humans.

Event Representations for Automated Story Generation with Deep Neural Nets

TLDR
The question of event representations that provide a mid-level of abstraction between words and sentences in order to retain the semantic information of the original data while minimizing event sparsity is explored.

A Skeleton-Based Model for Promoting Coherence Among Sentences in Narrative Story Generation

TLDR
A skeleton-based model that first generates the most critical phrases, called the skeleton, and then expands the skeleton into a complete and fluent sentence; according to both human and automatic evaluation, it generates significantly more coherent text.
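
The two-stage pipeline reads naturally as: compress the context to its most critical phrases, then expand those phrases into a fluent sentence. The stub below only fixes that interface; both stages would be learned models in the actual system:

```python
def generate_sentence(context, extract_skeleton, expand_skeleton):
    # Stage 1: distill the context into critical phrases (the "skeleton").
    skeleton = extract_skeleton(context)        # e.g. ["knight", "found", "sword"]
    # Stage 2: realize the skeleton as a complete, fluent sentence.
    return expand_skeleton(skeleton, context)
```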

Hierarchical Neural Story Generation

TLDR
This work collects a large dataset of 300K human-written stories paired with writing prompts from an online forum, enabling hierarchical story generation in which the model first generates a premise and then transforms it into a passage of text.

Neural Storyline Extraction Model for Storyline Generation from News Articles

TLDR
A novel neural network based approach to extract structured representations and evolution patterns of storylines without using annotated data is proposed and it outperforms state-of-the-art approaches for storyline generation on both accuracy and efficiency.

Do Massively Pretrained Language Models Make Better Storytellers?

TLDR
It is found that although GPT2-117 conditions more strongly on context, is more sensitive to ordering of events, and uses more unusual words, it is just as likely to produce repetitive and under-diverse text when using likelihood-maximizing decoding algorithms.

Controlling Narrative Generation with Planning Trajectories: The Role of Constraints

TLDR
An approach to planning with trajectory constraints is developed that decomposes the problem into a set of smaller subproblems using the temporal orderings described by the constraints and then solves them incrementally.

Story Generation from Sequence of Independent Short Descriptions

TLDR
A deep recurrent neural network architecture is implemented that encodes sequences of variable-length input descriptions to corresponding latent representations and decodes them to produce well-formed, comprehensive, story-like summaries.