Content Planning for Neural Story Generation with Aristotelian Rescoring

@inproceedings{GoldfarbTarrant2020ContentPF,
  title={Content Planning for Neural Story Generation with Aristotelian Rescoring},
  author={Seraphina Goldfarb-Tarrant and Tuhin Chakrabarty and Ralph M. Weischedel and Nanyun Peng},
  booktitle={Conference on Empirical Methods in Natural Language Processing},
  year={2020}
}
Long-form narrative text generated from large language models manages a fluent impersonation of human writing, but only at the local sentence level, and lacks structure or global cohesion. We posit that many of the problems of story generation can be addressed via high-quality content planning, and present a system that focuses on how to learn good plot structures to guide story generation. We utilize a plot-generation language model along with an ensemble of rescoring models that each… 
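
The paper's core mechanism is candidate rescoring: a plot language model proposes continuations, and an ensemble of rescoring models, each trained to judge one Aristotelian element of plot, reranks them. A minimal sketch of that loop, assuming hypothetical `generate_candidates` and rescorer callables (not the authors' code):

```python
from typing import Callable, List

Rescorer = Callable[[str, str], float]  # (plot_so_far, candidate) -> score

def rescored_step(
    plot_so_far: str,
    generate_candidates: Callable[[str, int], List[str]],
    rescorers: List[Rescorer],
    weights: List[float],
    k: int = 10,
) -> str:
    """Sample k candidate plot continuations from the plot LM and keep
    the one the weighted ensemble of rescorers scores highest."""
    candidates = generate_candidates(plot_so_far, k)

    def ensemble(candidate: str) -> float:
        # Each rescorer judges one plot quality; weights set their influence.
        return sum(w * r(plot_so_far, candidate)
                   for w, r in zip(weights, rescorers))

    return max(candidates, key=ensemble)
```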

Citations

MERMAID: Metaphor Generation with Symbolism and Discriminative Decoding

This paper proposes a method to automatically construct a parallel corpus by transforming a large number of metaphorical sentences from the Gutenberg Poetry corpus to their literal counterparts using recent advances in masked language modeling coupled with commonsense inference.

Plot-guided Adversarial Example Construction for Evaluating Open-domain Story Generation

Experiments show that the evaluation metrics trained on the generated data result in more reliable automatic assessments that correlate remarkably better with human judgments compared to the baselines.

Generating similes effortlessly like a Pro: A Style Transfer Approach for Simile Generation

This paper proposes a method to automatically construct a parallel corpus by transforming a large number of similes collected from Reddit to their literal counterparts using structured common sense knowledge, and to fine-tune a pretrained sequence-to-sequence model, BART, on the literal-simile pairs to gain generalizability.

DiSCoL: Toward Engaging Dialogue Systems through Conversational Line Guided Response Generation

DiSCoL is an open-domain dialogue system that leverages conversational lines (briefly, convlines) as controllable and informative content-planning elements to guide the generation model to produce engaging and informative responses.

Towards Layered Events and Schema Representations in Long Documents

This thesis proposal builds on sequences of event embeddings to form schema embeddings, thereby summarizing sections of documents using a single representation, which will allow for comparisons of different sections of documents and entire literary works.

Generating Syntactically Controlled Paraphrases without Using Annotated Parallel Pairs

This paper proposes the Syntactically controlled Paraphrase Generator (SynPG), an encoder-decoder model that learns to disentangle the semantics and the syntax of a sentence from a collection of unannotated texts; SynPG achieves better syntactic control than unsupervised baselines while keeping the quality of the generated paraphrases competitive.

Moral Stories: Situated Reasoning about Norms, Intents, Actions, and their Consequences

Moral Stories, a crowd-sourced dataset of structured, branching narratives for the study of grounded, goal-oriented social reasoning, is introduced, along with decoding strategies that combine multiple expert models to significantly improve the quality of generated actions, consequences, and norms compared to strong baselines.

EventPlus: A Temporal Event Understanding Pipeline

EventPlus, the first comprehensive temporal event understanding pipeline, provides a convenient tool for users to quickly obtain annotations about events and their temporal information for any user-provided document.

Go Back in Time: Generating Flashbacks in Stories with Event Temporal Prompts

A Plan-and-Write framework enhanced by reinforcement learning generates storylines and stories end-to-end, using structured storylines to encode events and their pairwise temporal relations as temporal prompts that guide how stories should unfold temporally.
...

References

Showing 1-10 of 33 references

Hierarchical Neural Story Generation

This work collects a large dataset of 300K human-written stories paired with writing prompts from an online forum, enabling hierarchical story generation in which the model first generates a premise and then transforms it into a passage of text.

Plan-And-Write: Towards Better Automatic Storytelling

Experiments show that with explicit storyline planning, the generated stories are more diverse, coherent, and on topic than those generated without creating a full plan, according to both automatic and human evaluations.
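
As a sketch of the two-stage plan-and-write idea (names and the `<sep>` convention are hypothetical, not the paper's code): a planning model first produces an explicit storyline, and a writing model then conditions on both the title and that storyline.

```python
def plan_and_write(title: str, plan_model, write_model) -> str:
    """Two-stage generation: plan an explicit storyline, then realize it.
    `plan_model` and `write_model` are hypothetical seq2seq models with a
    text-in/text-out `generate` method."""
    # Stage 1: a compact storyline, e.g. "ring -> nervous -> propose -> yes".
    storyline = plan_model.generate(title)
    # Stage 2: the surface story, conditioned on title and storyline so the
    # output stays on topic and follows the planned events in order.
    return write_model.generate(title + " <sep> " + storyline)
```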

Strategies for Structuring Story Generation

Writers often rely on plans or sketches to write long stories, but most current language models generate word by word from left to right. We explore coarse-to-fine models for creating narrative texts.

Controllable Neural Story Plot Generation via Reward Shaping

A reward-shaping technique is presented that analyzes a story corpus and produces intermediate rewards that are backpropagated into a pre-trained LM in order to guide the model towards a given goal.
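
A sketch of the reward-shaping idea under stated assumptions: corpus statistics estimate how far each generated event is from the goal event, and a REINFORCE-style loss scales each event's log-probability by that shaped reward. The helpers are hypothetical, not the paper's exact objective.

```python
from typing import Callable, List

def shaped_reward(event: str, goal: str,
                  corpus_distance: Callable[[str, str], float]) -> float:
    # Higher reward for events the story corpus places closer to the goal.
    return 1.0 / (1.0 + corpus_distance(event, goal))

def policy_gradient_loss(log_probs: List[float], rewards: List[float]) -> float:
    # REINFORCE: raise the probability of sampled events in proportion
    # to their shaped reward (minimized during fine-tuning of the LM).
    return -sum(lp * r for lp, r in zip(log_probs, rewards))
```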

Learning to Predict Explainable Plots for Neural Story Generation

A latent variable model is proposed for neural story generation that treats an outline, which is a natural language sentence explainable to humans, as a latent variable to represent a high-level plot that bridges the input and output.

Story Realization: Expanding Plot Events into Sentences

An ensemble-based model is presented that generates natural language guided by events, producing more coherent and plausible stories than baseline approaches.

A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation

A knowledge-enhanced pretraining model utilizes commonsense knowledge from external knowledge bases and generates more reasonable stories than state-of-the-art baselines, particularly in terms of logic and global coherence.

Plan, Write, and Revise: an Interactive System for Open-Domain Story Generation

A neural narrative generation system that interacts with humans to generate stories and finds that humans tasked with collaboratively improving a particular characteristic of a story are in fact able to do so, which has implications for future uses of human-in-the-loop systems.

Event Representations for Automated Story Generation with Deep Neural Nets

This work explores event representations that provide a mid-level of abstraction between words and sentences, retaining the semantic information of the original data while minimizing event sparsity.
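
As an illustration of such a mid-level abstraction, lemmatized (subject, verb, object) tuples can be pulled from dependency parses. This is a sketch using spaCy, not the paper's pipeline; real systems further generalize the arguments, e.g. to WordNet classes, which is omitted here.

```python
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed

def extract_events(text: str):
    """Extract lemmatized (subject, verb, object) tuples as coarse events."""
    events = []
    for token in nlp(text):
        if token.pos_ == "VERB":
            subj = next((c.lemma_ for c in token.children
                         if c.dep_ in ("nsubj", "nsubjpass")), None)
            obj = next((c.lemma_ for c in token.children
                        if c.dep_ in ("dobj", "obj")), None)
            if subj:
                events.append((subj, token.lemma_, obj))
    return events

print(extract_events("The knight rescued the princess. She thanked him."))
# e.g. [('knight', 'rescue', 'princess'), ('she', 'thank', 'him')]
```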

Story Ending Generation with Incremental Encoding and Commonsense Knowledge

A novel model for story ending generation adopts an incremental encoding scheme to represent context clues that span the story context, and generates more reasonable story endings than state-of-the-art baselines.