InFillmore: Frame-Guided Language Generation with Bidirectional Context

@inproceedings{Ou2021InFillmoreFL,
  title={InFillmore: Frame-Guided Language Generation with Bidirectional Context},
  author={Jiefu Ou and Nathaniel Weir and Anton Belyy and Felix Yu and Benjamin Van Durme},
  booktitle={STARSEM},
  year={2021}
}
We propose a structured extension to bidirectional-context conditional language generation, or “infilling,” inspired by Frame Semantic theory. Guidance is provided through one of two approaches: (1) model fine-tuning, conditioning directly on observed symbolic frames, and (2) a novel extension to disjunctive lexically constrained decoding that leverages frame semantic lexical units. Automatic and human evaluations confirm that frame-guided generation allows for explicit manipulation of intended… 
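The second guidance mechanism, disjunctive lexically constrained decoding, requires that at least one member of a set of frame-evoking lexical units appear in the generated text. As a hedged sketch (not the authors' implementation), HuggingFace transformers exposes an analogous mechanism: nesting force_words_ids one level deeper triggers a disjunctive constraint during beam search. The frame and lexical units below are illustrative.

```python
# Disjunctive constrained decoding: the nested force_words_ids list says
# "at least one of these alternatives must appear in the output". The
# lexical units stand in for words that evoke a single semantic frame.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

lexical_units = ["walked", "ran", "wandered"]   # hypothetical Motion frame
force_words_ids = [[tokenizer(lu, add_special_tokens=False).input_ids
                    for lu in lexical_units]]

inputs = tokenizer("The traveler <extra_id_0> to the village.",
                   return_tensors="pt")
outputs = model.generate(**inputs, force_words_ids=force_words_ids,
                         num_beams=5, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```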
TaleBrush: Sketching Stories with Generative Pretrained Language Models
TLDR
TaleBrush is introduced, a generative story ideation tool that uses line-sketching interactions with a GPT-based language model for control and sensemaking of a protagonist’s fortune in co-created stories, along with a reflection on how sketching interactions can facilitate the iterative human-AI co-creation process.
Tailor: Generating and Perturbing Text with Semantic Controls
TLDR
This work presents Tailor, a semantically controlled text generation system that builds on a pretrained seq2seq model and produces textual outputs conditioned on control codes derived from semantic representations; it also shows that Tailor perturbations can improve model generalization through data augmentation.
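The "control codes derived from semantic representations" amount to flattening predicate and role information into the seq2seq input. The snippet below is only a hypothetical illustration of that general shape; the actual code inventory and formatting are Tailor's own.

```python
# Hypothetical control-code prefix: semantic roles are serialized into a
# bracketed header on the model input. Format and role names are
# illustrative, not Tailor's exact scheme.
def with_control_codes(codes: dict, blanked_text: str) -> str:
    header = " | ".join(f"{role}: {filler}" for role, filler in codes.items())
    return f"[{header}] {blanked_text}"

print(with_control_codes(
    {"VERB": "serve", "AGENT": "the waiter", "PATIENT": "coffee"},
    "<extra_id_0> in the morning."))
# -> [VERB: serve | AGENT: the waiter | PATIENT: coffee] <extra_id_0> in the morning.
```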

References

Showing 1–10 of 48 references
Enabling Language Models to Fill in the Blanks
TLDR
The approach enables LMs to infill entire sentences effectively in three different domains (short stories, scientific abstracts, and lyrics), and humans are shown to have difficulty identifying sentences infilled by it.
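The underlying technique is the paper's infilling-by-language-modeling (ILM) format: masked spans become [blank] tokens, and the training target reproduces each removed span, terminated by [answer], after a [sep] marker. A minimal sketch of that example construction (character-offset spans are an assumption of this sketch):

```python
# Sketch of the ILM training format: each masked span becomes "[blank]",
# and the target continuation lists the removed spans, each terminated
# by "[answer]". Spans are (start, end) character offsets here.
def make_ilm_example(text: str, spans: list) -> str:
    masked, answers, prev = [], [], 0
    for start, end in spans:
        masked.append(text[prev:start] + "[blank]")
        answers.append(text[start:end] + "[answer]")
        prev = end
    masked.append(text[prev:])
    return "".join(masked) + "[sep]" + "".join(answers)

print(make_ilm_example("She ate leftover pasta for dinner.", [(8, 22), (27, 33)]))
# -> She ate [blank] for [blank].[sep]leftover pasta[answer]dinner[answer]
```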
LOME: Large Ontology Multilingual Extraction
TLDR
This work presents LOME, a system for multilingual information extraction that achieves broad language coverage through multilingual encoders such as XLM-R (Conneau et al., 2020) and by leveraging multilingual training data.
Plan, Write, and Revise: an Interactive System for Open-Domain Story Generation
TLDR
A neural narrative generation system that interacts with humans to generate stories is presented; the study finds that humans tasked with collaboratively improving a particular characteristic of a story are in fact able to do so, which has implications for future uses of human-in-the-loop systems.
Language Models are Unsupervised Multitask Learners
TLDR
It is demonstrated that language models begin to learn tasks such as question answering, translation, and summarization without any explicit supervision when trained on WebText, a new dataset of millions of webpages, suggesting a promising path toward language processing systems that learn to perform tasks from their naturally occurring demonstrations.
Hierarchical Neural Story Generation
TLDR
This work collects a large dataset of 300K human-written stories paired with writing prompts from an online forum, enabling hierarchical story generation in which the model first generates a premise and then transforms it into a passage of text.
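A hedged sketch of that two-stage setup, using gpt2 via the transformers pipeline as a stand-in for the paper's convolutional seq2seq models:

```python
# Two-stage hierarchical generation: sample a short premise first, then
# condition the story continuation on it. gpt2 is a stand-in model; the
# paper itself uses convolutional seq2seq architectures.
from transformers import pipeline

gen = pipeline("text-generation", model="gpt2")
premise = gen("Writing prompt:", max_new_tokens=20,
              do_sample=True)[0]["generated_text"]
story = gen(premise + "\nStory:", max_new_tokens=80,
            do_sample=True)[0]["generated_text"]
print(story)
```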
Guided generation (2020)
COD3S: Diverse Generation with Discrete Semantic Signatures
TLDR
This work presents COD3S, a novel method for generating semantically diverse sentences using neural sequence-to-sequence (seq2seq) models, and applies it to causal generation, the task of predicting a proposition's plausible causes or effects.
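COD3S's signatures are locality-sensitive hashes of sentence embeddings, prefixed onto seq2seq targets so that different codes decode to semantically distinct outputs. A minimal sketch of the random-hyperplane LSH step (dimensions and bit width are illustrative; the paper uses SBERT embeddings):

```python
# Random-hyperplane LSH: one bit per hyperplane, set by which side of
# the hyperplane the embedding falls on. Nearby embeddings tend to
# share bits, so the code acts as a discrete semantic signature.
import numpy as np

rng = np.random.default_rng(0)
DIM, BITS = 384, 16                               # illustrative sizes
hyperplanes = rng.standard_normal((BITS, DIM))    # fixed random projections

def semantic_signature(embedding: np.ndarray) -> str:
    return "".join("1" if v > 0 else "0" for v in hyperplanes @ embedding)

emb = rng.standard_normal(DIM)   # stand-in for an SBERT sentence embedding
print(semantic_signature(emb))   # e.g. "0110..."
```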
Improved Lexically Constrained Decoding for Translation and Monolingual Rewriting
Lexically constrained sequence decoding allows explicit positive or negative phrase-based constraints to be placed on target output strings in generation tasks such as machine translation or monolingual rewriting.
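For the negative side of such constraints, a hedged sketch using HuggingFace's generic bad_words_ids (a stand-in, not this paper's decoder): the listed token sequences are barred from the output.

```python
# Negative phrase constraint: token sequences in bad_words_ids may never
# appear in the generated output. The model and banned phrase are
# illustrative only.
from transformers import AutoTokenizer, AutoModelForCausalLM

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

banned = [tok(" very", add_special_tokens=False).input_ids]
inputs = tok("The movie was", return_tensors="pt")
out = model.generate(**inputs, bad_words_ids=banned, max_new_tokens=15,
                     pad_token_id=tok.eos_token_id)
print(tok.decode(out[0]))
```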
ParaBank: Monolingual Bitext Generation and Sentential Paraphrasing via Lexically-constrained Neural Machine Translation
TLDR
ParaBank is presented, a large-scale English paraphrase dataset that surpasses prior work in both quantity and quality and is used to train a monolingual NMT model with the same support for lexically-constrained decoding for sentence rewriting tasks.
Fast Lexically Constrained Decoding with Dynamic Beam Allocation for Neural Machine Translation
TLDR
This work presents an algorithm for lexically constrained decoding with a complexity of O(1) in the number of constraints, demonstrates the algorithm’s remarkable ability to properly place constraints, and uses it to explore the shaky relationship between model and BLEU scores.
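The constant-per-step cost comes from dynamic beam allocation: hypotheses are grouped into banks by how many constraints they have met, and the fixed beam is split across the banks. A sketch of just the allocation step, under the assumption that slots unused by empty banks are redistributed to banks with surplus candidates:

```python
# Dynamic beam allocation sketch: split beam_size slots evenly over the
# C+1 banks (bank i holds hypotheses satisfying i constraints), then
# reassign slots that under-filled banks cannot use.
def allocate_beam(beam_size: int, candidates_per_bank: list) -> list:
    banks = len(candidates_per_bank)
    alloc = [beam_size // banks] * banks
    for i in range(beam_size % banks):      # spread the remainder
        alloc[i] += 1
    spare = 0
    for i, have in enumerate(candidates_per_bank):
        if have < alloc[i]:                 # bank can't fill its slots
            spare += alloc[i] - have
            alloc[i] = have
    for i, have in enumerate(candidates_per_bank):
        take = min(spare, have - alloc[i])  # hand spares to fuller banks
        alloc[i] += take
        spare -= take
    return alloc

print(allocate_beam(10, [7, 4, 0, 1]))  # -> [6, 3, 0, 1]
```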