Corpus ID: 245131614

Controlled Cue Generation for Play Scripts

Alara Dirik, Hilal Donmez, Pinar Yanardag
In this paper, we propose the novel task of theatrical cue generation from dialogues, using a large-scale dataset of play scripts. Drawing on over one million lines of dialogue and cues, we frame cue generation as a controlled text generation task and show how cues can be used to enhance the impact of dialogue with a language model conditioned on a dialogue/cue discriminator. In addition, we explore the use of topic keywords and emotions for controlled text generation. Extensive…
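The discriminator-conditioned setup in the abstract can be illustrated with a minimal reranking sketch. This is not the authors' implementation: the keyword-based `cue_discriminator` and the example log-probabilities are toy assumptions standing in for a trained classifier and a real language model.

```python
import math

def cue_discriminator(text):
    """Hypothetical cue discriminator: probability that `text` is a stage cue.
    A real system would use a trained classifier; this keyword heuristic is
    purely illustrative."""
    cue_words = {"exits", "enters", "aside", "pause", "lights"}
    hits = sum(w.strip(".,!?") in cue_words for w in text.lower().split())
    return min(0.99, 0.1 + 0.4 * hits)

def rerank(candidates, lm_logprobs, weight=1.0):
    """Pick the candidate maximizing log p_LM(x) + weight * log p_disc(cue | x)."""
    scores = [
        lp + weight * math.log(cue_discriminator(c))
        for c, lp in zip(candidates, lm_logprobs)
    ]
    return candidates[scores.index(max(scores))]

# The discriminator pushes generation toward cue-like continuations even
# when the plain LM score slightly prefers another candidate.
best = rerank(
    ["She smiles warmly.", "He exits, lights dim."],
    lm_logprobs=[-1.0, -1.2],
)
```

In practice the discriminator signal would be applied during decoding (e.g. per-token guidance as in PPLM) rather than by whole-sequence reranking; the combined score above shows the same conditioning idea in its simplest form.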

References

Cue Me In: Content-Inducing Approaches to Interactive Story Generation
This work focuses on the task of interactive story generation, where the user provides the model with mid-level sentence abstractions in the form of cue phrases during the generation process, and presents two content-inducing approaches to effectively incorporate this additional information.
Conditioned Text Generation with Transfer for Closed-Domain Dialogue Systems
This work shows how to optimally train and control the generation of intent-specific sentences using a conditional variational autoencoder and introduces a new protocol called query transfer that allows to leverage a large unlabelled dataset, possibly containing irrelevant queries, to extract relevant information.
Towards Controllable Story Generation
A general framework of analyzing existing story corpora to generate controllable and creative new stories and creates a new interface for humans to interact with computers to generate personalized stories is presented.
Toward Controlled Generation of Text
A new neural generative model is proposed which combines variational auto-encoders and holistic attribute discriminators for the effective imposition of semantic structures in generic generation and manipulation of text.
A Diversity-Promoting Objective Function for Neural Conversation Models
This work proposes using Maximum Mutual Information (MMI) as the objective function in neural models, and demonstrates that the proposed MMI models produce more diverse, interesting, and appropriate responses, yielding substantive gains in BLEU scores on two conversational datasets and in human evaluations.
Hierarchical Multi-Task Natural Language Understanding for Cross-domain Conversational AI: HERMIT NLU
A hierarchical multi-task architecture is developed, which delivers a multi-layer representation of sentence meaning and achieves overall performance higher than state-of-the-art tools such as RASA, Dialogflow, LUIS, and Watson.
Plug and Play Language Models: A Simple Approach to Controlled Text Generation
The Plug and Play Language Model (PPLM) for controllable language generation is proposed, which combines a pretrained LM with one or more simple attribute classifiers that guide text generation without any further training of the LM.
Story Generation from Sequence of Independent Short Descriptions
A deep recurrent neural network architecture is implemented that encodes sequences of variable-length input descriptions to corresponding latent representations and decodes them to produce well-formed, comprehensive, story-like summaries.
CTRL: A Conditional Transformer Language Model for Controllable Generation
CTRL is released, a 1.63 billion-parameter conditional transformer language model, trained to condition on control codes that govern style, content, and task-specific behavior, providing more explicit control over text generation.
Event Representations for Automated Story Generation with Deep Neural Nets
The question of event representations that provide a mid-level of abstraction between words and sentences in order to retain the semantic information of the original data while minimizing event sparsity is explored.
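The Maximum Mutual Information objective cited above ("A Diversity-Promoting Objective Function for Neural Conversation Models") can be sketched as a reranking rule: instead of picking the response T maximizing log p(T|S), subtract a weighted unconditional score log p(T) to demote generic replies. The log-probabilities below are illustrative placeholders, not outputs of a real model.

```python
def mmi_rerank(candidates, lam=0.5):
    """candidates: list of (response, logp_given_source, logp_unconditional).
    Returns the response maximizing log p(T|S) - lam * log p(T)."""
    def score(c):
        _, logp_ts, logp_t = c
        return logp_ts - lam * logp_t
    return max(candidates, key=score)[0]

# A generic response ("I don't know.") has high unconditional probability,
# so the MMI penalty demotes it relative to a more specific reply even
# though its conditional score is slightly higher.
best = mmi_rerank([
    ("I don't know.", -2.0, -1.0),
    ("The rehearsal starts at noon.", -2.4, -6.0),
])
```

Setting `lam=0` recovers plain maximum-likelihood selection; larger values trade fluency for diversity, matching the paper's reported gains in response interestingness.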