Employing Argumentation Knowledge Graphs for Neural Argument Generation

@inproceedings{Khatib2021EmployingAK,
  title={Employing Argumentation Knowledge Graphs for Neural Argument Generation},
  author={Khalid Al Khatib and Lukas Trautner and Henning Wachsmuth and Yufang Hou and Benno Stein},
  booktitle={ACL},
  year={2021}
}
Generating high-quality arguments, while being challenging, may benefit a wide range of downstream applications, such as writing assistants and argument search engines. Motivated by the effectiveness of utilizing knowledge graphs for supporting general text generation tasks, this paper investigates the usage of argumentation-related knowledge graphs to control the generation of arguments. In particular, we construct and populate three knowledge graphs, employing several compositions of them to… 
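
For illustration only, the following sketch shows one common way to condition a pretrained sequence-to-sequence model on knowledge-graph content by linearizing triples into the input text; the model choice, triple format, and prompt layout are assumptions, not the paper's actual setup.

# Hypothetical sketch: steer argument generation by prepending linearized
# knowledge-graph triples to the input of a pretrained seq2seq model.
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")

# Assumed (subject, relation, object) triples; not the paper's actual graphs.
triples = [
    ("nuclear energy", "has_aspect", "low carbon emissions"),
    ("nuclear energy", "attacked_by", "waste storage risks"),
]
topic = "Should we expand nuclear energy?"

# Linearize the graph fragment and join it with the topic as the model input.
graph_text = " ; ".join(f"{s} | {r} | {o}" for s, r, o in triples)
inputs = tokenizer(f"topic: {topic} graph: {graph_text}", return_tensors="pt")

output_ids = model.generate(**inputs, max_length=96, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))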

Citations

Argumentative Text Generation in Economic Domain

This paper uses translated versions of the Argumentative Microtext, Persuasive Essays, and UKP Sentential corpora to fine-tune a RuBERT model, which is then used to annotate a corpus of economic news with argumentation.

Contextual information integration for stance detection via cross-attention

This work trains a model consisting of dual encoders that exchange information via cross-attention and evaluates context extracted from structured knowledge sources and from prompting large language models; the approach outperforms competitive baselines on a large and diverse stance detection benchmark.
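
As a rough, generic illustration of the dual-encoder / cross-attention idea (an assumed architecture sketch, not the authors' code), two encoded sequences can exchange information by letting each attend over the other's token representations:

# Hypothetical sketch: two encoders whose outputs exchange information via
# cross-attention before a stance classification head (toy dimensions).
import torch
import torch.nn as nn

class DualEncoderWithCrossAttention(nn.Module):
    def __init__(self, dim=256, heads=4, num_labels=3):
        super().__init__()
        self.text_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(dim, heads, batch_first=True), num_layers=2)
        self.context_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(dim, heads, batch_first=True), num_layers=2)
        # Cross-attention in both directions between the two encoders.
        self.text_to_context = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.context_to_text = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.classifier = nn.Linear(2 * dim, num_labels)

    def forward(self, text_emb, context_emb):
        text = self.text_encoder(text_emb)            # (batch, text_len, dim)
        context = self.context_encoder(context_emb)   # (batch, ctx_len, dim)
        text_enriched, _ = self.text_to_context(text, context, context)
        context_enriched, _ = self.context_to_text(context, text, text)
        # Mean-pool both streams and predict the stance label.
        pooled = torch.cat(
            [text_enriched.mean(dim=1), context_enriched.mean(dim=1)], dim=-1)
        return self.classifier(pooled)

# Toy pre-embedded inputs: batch of 2, text length 12, context length 20.
logits = DualEncoderWithCrossAttention()(torch.randn(2, 12, 256), torch.randn(2, 20, 256))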

Report on the 1st workshop on argumentation knowledge graphs (ArgKG 2021) at AKBC 2021

The first workshop on Argumentation Knowledge Graphs (ArgKG) was held virtually at the Automated Knowledge Base Construction (AKBC 2021) conference on October 7, 2021; this report describes several of its findings and insights.

Aphorisms on Epidemiological Modelling

The paratactical approach to the melancholy science is invoked to problematise One Health and the intra-pandemic modelling culture, and to delineate an inkling of the negative in EPIC’s work.

References

Showing 1-10 of 31 references

Neural Argument Generation Augmented with Externally Retrieved Evidence

This work proposes an encoder-decoder neural argument generation model enriched with evidence retrieved from Wikipedia; according to automatic evaluation and human assessments, it constructs arguments with more topic-relevant content than popular sequence-to-sequence generation models.
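
For illustration only, a minimal sketch of the retrieve-then-generate idea (the toy corpus, BM25 retrieval, and the input format are assumptions, not the paper's pipeline):

# Hypothetical sketch: retrieve evidence passages for a topic with BM25 and
# prepend them to the generator input (toy corpus, assumed input format).
from rank_bm25 import BM25Okapi

passages = [
    "Nuclear power plants emit very little carbon dioxide during operation.",
    "Long-term storage of radioactive waste remains an unresolved problem.",
    "Solar panel costs have fallen sharply over the last decade.",
]
bm25 = BM25Okapi([p.lower().split() for p in passages])

topic = "should we expand nuclear energy"
evidence = bm25.get_top_n(topic.split(), passages, n=2)

# The retrieved evidence would then be fed to a seq2seq generator together
# with the topic, e.g. as "topic: ... evidence: ...".
generator_input = f"topic: {topic} evidence: {' '.join(evidence)}"
print(generator_input)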

Aspect-Controlled Neural Argument Generation

Arg-CTRL is a language model for argument generation that can be controlled to generate sentence-level arguments for a given topic, stance, and aspect, and it is applicable to automatic counter-argument generation.

End-to-End Argumentation Knowledge Graph Construction

This paper studies the end-to-end construction of an argumentation knowledge graph that is intended to support argument synthesis, argumentative question answering, or fake news detection, among other applications.

Computational Argumentation Synthesis as a Language Modeling Task

The evaluation suggests that the model can, to some extent, mimic the human synthesis of strategy-specific arguments.

Argument Generation with Retrieval, Planning, and Realization

This paper presents a novel framework, CANDELA, which consists of a powerful retrieval system and a two-step generation model: a text planning decoder first decides on the main talking points and an appropriate language style for each sentence, and a content realization decoder then reflects these decisions to construct an informative paragraph-level argument.

Text Generation from Knowledge Graphs with Graph Transformers

This work addresses the problem of generating coherent multi-sentence texts from the output of an information extraction system, in particular a knowledge graph, by introducing a novel graph-transforming encoder that can leverage the relational structure of such graphs without imposing linearization or hierarchical constraints.

KG-BART: Knowledge Graph-Augmented BART for Generative Commonsense Reasoning

A novel knowledge-graph-augmented pre-trained language generation model, KG-BART, is proposed; it captures the complex relations between concepts through the knowledge graph, produces more logical and natural sentences as output, and leverages graph attention to aggregate rich concept semantics, which enhances generalization to unseen concept sets.
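
As a generic illustration of aggregating concept semantics with graph attention (a stand-in sketch using torch_geometric's GATConv, not KG-BART's actual implementation):

# Hypothetical sketch: enrich concept-node embeddings over a small concept
# graph with graph attention (GAT); a generic stand-in, not KG-BART itself.
import torch
from torch_geometric.nn import GATConv

num_concepts, dim = 5, 64
concept_embeddings = torch.randn(num_concepts, dim)
# Directed edges between concepts as a [2, num_edges] index tensor.
edge_index = torch.tensor([[0, 1, 2, 3, 4, 0],
                           [1, 2, 3, 4, 0, 2]])

gat = GATConv(dim, dim, heads=4, concat=False)
enriched = gat(concept_embeddings, edge_index)  # shape: (5, 64)
print(enriched.shape)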

Feasible Annotation Scheme for Capturing Policy Argument Reasoning using Argument Templates

This work develops a simple yet expressive set of easily annotatable argument templates (ATs) that can represent a majority of writers' reasoning in texts on diverse policy topics while maintaining the computational feasibility of the task.

Claim Synthesis via Predicate Recycling

This paper explores a method to extract the predicates of simple, manually detected claims and attempts to generate novel claims from them, finding that this simple method yields fairly good results.

Investigating Pretrained Language Models for Graph-to-Text Generation

It is suggested that pretrained language models (PLMs) benefit from similar facts seen during pretraining or fine-tuning, such that they perform well even when the input graph is reduced to a simple bag of node and edge labels.
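
To make the contrast concrete, here is an assumed, illustrative reduction of a small graph to a structured linearization and to a bag of node and edge labels, each fed to a text-to-text PLM (T5 is chosen arbitrarily here, not taken from the paper):

# Hypothetical sketch: compare a structured linearization of a tiny graph with
# a plain bag of node and edge labels as input to a text-to-text PLM.
from transformers import T5ForConditionalGeneration, T5Tokenizer

triples = [
    ("Alan Turing", "field", "computer science"),
    ("Alan Turing", "born in", "London"),
]
# Structured linearization vs. an unordered bag of the same labels.
linearized = " ".join(f"<H> {h} <R> {r} <T> {t}" for h, r, t in triples)
bag_of_labels = " ".join(sorted({label for triple in triples for label in triple}))

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")
for variant in (linearized, bag_of_labels):
    input_ids = tokenizer("generate text: " + variant, return_tensors="pt").input_ids
    output_ids = model.generate(input_ids, max_length=48)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))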