Corpus ID: 221586046

Discovering Textual Structures: Generative Grammar Induction using Template Trees

Thomas Winters and Luc De Raedt
Natural language generation provides designers with methods for automatically generating text, e.g. for creating summaries, chatbots and game content. In practice, text generators are often either learned and hard to interpret, or created by hand using techniques such as grammars and templates. In this paper, we introduce a novel grammar induction algorithm for learning interpretable grammars for generative purposes, called Gitta. We also introduce the novel notion of template trees to discover… 
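The template-tree idea can be illustrated with a toy sketch. The code below is not Gitta itself, only a minimal assumption-laden illustration: two example sentences are merged into one template by keeping the tokens they share and turning differing tokens into slots.

```python
# Toy illustration of merging two example sentences into a template:
# shared tokens are kept, differing tokens become slots.
# This is NOT the Gitta algorithm, just a sketch of the template idea.

def merge_into_template(sentence_a, sentence_b):
    tokens_a = sentence_a.split()
    tokens_b = sentence_b.split()
    template, slots = [], []
    # Assumes equal-length sentences for simplicity; Gitta's template
    # trees handle the general case by merging trees of templates.
    for a, b in zip(tokens_a, tokens_b):
        if a == b:
            template.append(a)
        else:
            template.append("[SLOT]")
            slots.append({a, b})
    return " ".join(template), slots

template, slots = merge_into_template("I like cats", "I like dogs")
print(template)  # I like [SLOT]
print(slots)     # [{'cats', 'dogs'}]
```

Repeating such merges over many examples yields a small grammar whose slot values can be recombined to generate new text.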


Generating Playable RPG ROMs for the Game Boy

Isaac Karth, Tamara Duplantis, Max Kreminski, and Sachita Kashyap (University of California, Santa Cruz, USA)

Dutch Humor Detection by Generating Negative Examples

It is found that while other language models perform well when the non-jokes come from completely different domains, RobBERT was the only one able to distinguish jokes from generated negative examples, showing that transformer models are a large step forward in humor detection.



Generating Philosophical Statements using Interpolated Markov Models and Dynamic Templates

Two ways of automatically generating parodies of philosophical statements from examples are presented, and it is shown how these can work in interactive systems as well as in template learning systems.

Identifying Hierarchical Structure in Sequences: A linear-time algorithm

SEQUITUR is an algorithm that infers a hierarchical structure from a sequence of discrete symbols by replacing repeated phrases with a grammatical rule that generates the phrase, and continuing this process recursively.
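A greatly simplified sketch of this repeated-phrase replacement can be written in a few lines. Note this is a quadratic toy version, not the actual linear-time SEQUITUR with its digram-uniqueness and rule-utility bookkeeping:

```python
from collections import Counter

def naive_sequitur(symbols):
    """Repeatedly replace the most frequent repeated digram with a
    fresh rule symbol. A quadratic sketch of SEQUITUR's core idea;
    the real algorithm maintains this invariant incrementally in
    linear time."""
    rules = {}
    next_rule = 0
    seq = list(symbols)
    while True:
        digrams = Counter(zip(seq, seq[1:]))
        digram, count = max(digrams.items(), key=lambda kv: kv[1]) if digrams else (None, 0)
        if count < 2:
            break
        name = f"R{next_rule}"
        next_rule += 1
        rules[name] = list(digram)
        # Rewrite the sequence, replacing each occurrence of the digram.
        new_seq, i = [], 0
        while i < len(seq):
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == digram:
                new_seq.append(name)
                i += 2
            else:
                new_seq.append(seq[i])
                i += 1
        seq = new_seq
    return seq, rules

print(naive_sequitur("abcabc"))  # (['R1', 'R1'], {'R0': ['a', 'b'], 'R1': ['R0', 'c']})
```

On "abcabc" the sketch first extracts the rule R0 → ab, then R1 → R0 c, leaving the hierarchical top-level sequence R1 R1.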

Automatically Extracting Word Relationships as Templates for Pun Generation

T-PEG, a system that utilizes phonetic and semantic linguistic resources to automatically extract word relationships in puns and store the knowledge in template form, resulting in computer-generated puns that received an average score of 2.13 as compared to 2.70 for human-generated puns from user feedback.

Automatic Joke Generation: Learning Humor from Examples

This paper implements a system called Gag, capable of generating jokes using the “I like my X like I like my Y, Z” template, and uses established humor theory and extends computational humor concepts to allow the system to learn the structures of the given jokes and estimate how funny people might find specific instantiations of joke structures.

Modelling Mutually Interactive Fictional Character Conversational Agents

This paper models six semi-independent Twitterbots based on fictional characters from the Belgian children’s TV show Samson & Gert, which are mutually interactive with each other as well as with other Twitter users, and finds that these bots were not only well received by users, but also created many interesting, unexpected positive interactions.

Tracery: An Author-Focused Generative Text Tool

This work identifies the design considerations necessary to serve these new generative text authors, like data portability, modular design, and additive authoring, and illustrates how these considerations informed the design of the Tracery language.
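A minimal Tracery-style expansion loop illustrates the additive, author-focused style such a tool supports. The grammar below is a made-up example and the expander is a sketch of the idea, not Tracery's actual implementation:

```python
import random
import re

# A minimal Tracery-style expander: symbols written as #name# are
# replaced by a randomly chosen expansion from the grammar.
# Hypothetical example grammar, not from the Tracery paper.
grammar = {
    "origin": ["The #animal# #verb# the #animal#."],
    "animal": ["cat", "dog", "llama"],
    "verb": ["greets", "ignores"],
}

def expand(symbol, rng=random):
    rule = rng.choice(grammar[symbol])
    # Recursively expand every #symbol# reference in the chosen rule.
    return re.sub(r"#(\w+)#", lambda m: expand(m.group(1), rng), rule)

print(expand("origin"))  # e.g. "The llama greets the cat."
```

Authors extend such a grammar additively, one symbol at a time, which is exactly the authoring style the design considerations above aim to serve.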

An Implemented Model of Punning Riddles

A model of simple question-answer punning, implemented in a program, JAPE-1, which generates riddles from humour-independent lexical entries; it succeeds in generating pieces of text that are recognizably jokes, though some of them are not very good jokes.

The String-to-String Correction Problem

An algorithm is presented which solves the string-to-string correction problem in time proportional to the product of the lengths of the two strings.
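The quadratic dynamic program behind this result (commonly known as Wagner–Fischer) can be sketched as follows, using a single rolling row to keep memory linear:

```python
def edit_distance(s, t):
    """Minimum number of insertions, deletions, and substitutions
    turning s into t, computed in O(len(s) * len(t)) time."""
    m, n = len(s), len(t)
    # dp[j] holds the distance between the current prefix of s and t[:j].
    dp = list(range(n + 1))
    for i in range(1, m + 1):
        prev_diag, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            prev_diag, dp[j] = dp[j], min(
                dp[j] + 1,         # delete s[i-1]
                dp[j - 1] + 1,     # insert t[j-1]
                prev_diag + cost,  # substitute (or match)
            )
    return dp[n]

print(edit_distance("kitten", "sitting"))  # 3
```

The running time is proportional to the product of the two string lengths, matching the bound stated above.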
