NeuroLogic Decoding: (Un)supervised Neural Text Generation with Predicate Logic Constraints

@inproceedings{Lu2021NeuroLogicD,
  title={NeuroLogic Decoding: (Un)supervised Neural Text Generation with Predicate Logic Constraints},
  author={Ximing Lu and Peter West and Rowan Zellers and Ronan Le Bras and Chandra Bhagavatula and Yejin Choi},
  booktitle={NAACL},
  year={2021}
}
Conditional text generation often requires lexical constraints, i.e., which words should or shouldn’t be included in the output text. While the dominant recipe for conditional text generation has been large-scale pretrained language models that are finetuned on the task-specific training data, such models do not learn to follow the underlying constraints reliably, even when supervised with large amounts of task-specific examples. We propose NeuroLogic Decoding, a simple yet effective algorithm… 
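To make the setting concrete, below is a minimal, self-contained sketch of beam search under predicate logic (CNF) constraints in the spirit of NeuroLogic Decoding: each clause is a set of words of which at least one must appear in the output, and hypotheses are ranked first by the number of clauses they satisfy, then by log-probability. The toy_lm stand-in, the tiny vocabulary, and the exact ranking rule are illustrative assumptions, not the paper's precise scoring and pruning scheme.

import math

VOCAB = ["<eos>", "the", "dog", "cat", "runs", "sleeps"]

def toy_lm(prefix):
    # Stand-in language model (assumption): uniform next-token distribution.
    p = 1.0 / len(VOCAB)
    return {tok: p for tok in VOCAB}

def satisfied_clauses(tokens, cnf):
    # cnf is a list of clauses; a clause (a set of words) is satisfied
    # when at least one of its words occurs in the hypothesis.
    return sum(1 for clause in cnf if clause & set(tokens))

def neurologic_beam_search(cnf, beam_size=4, max_len=6):
    beams = [([], 0.0)]  # (tokens, cumulative log-prob)
    for _ in range(max_len):
        candidates = []
        for tokens, score in beams:
            if tokens and tokens[-1] == "<eos>":
                candidates.append((tokens, score))  # finished hypothesis
                continue
            for tok, p in toy_lm(tokens).items():
                candidates.append((tokens + [tok], score + math.log(p)))
        # Rank by (clauses satisfied, log-prob): prefer hypotheses that
        # make progress on the constraints, break ties by fluency.
        candidates.sort(key=lambda c: (satisfied_clauses(c[0], cnf), c[1]),
                        reverse=True)
        beams = candidates[:beam_size]
    return beams[0][0]

# Constraint in CNF: ("dog" OR "cat") AND ("runs").
print(neurologic_beam_search([{"dog", "cat"}, {"runs"}]))

With a real model the log-probabilities come from the LM and the paper additionally prunes and groups candidates; ranking partially satisfied hypotheses ahead of fluent unconstrained ones is the core idea.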

Citations

On-the-Fly Attention Modularization for Neural Generation
TLDR
These findings motivate on-the-fly attention modularization, a simple but effective method for injecting inductive biases into attention computation during inference to yield enhanced diversity and commonsense reasoning while maintaining fluency and coherence.
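As a rough, hypothetical illustration of modularizing attention at inference time (the paper's exact intervention differs and is not reproduced here), the sketch below zeroes out the output of selected heads in a toy multi-head attention layer; per-head projection matrices are omitted and the muted-head choice is arbitrary.

import numpy as np

def attention(q, k, v):
    # Scaled dot-product attention with a numerically stable softmax.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)
    return weights @ v

def multi_head(q, k, v, n_heads, muted_heads=()):
    # Per-head projections omitted for brevity (simplifying assumption).
    d = q.shape[-1] // n_heads
    outs = []
    for h in range(n_heads):
        sl = slice(h * d, (h + 1) * d)
        out = attention(q[:, sl], k[:, sl], v[:, sl])
        if h in muted_heads:
            out = np.zeros_like(out)  # inference-time intervention: silence head h
        outs.append(out)
    return np.concatenate(outs, axis=-1)

rng = np.random.default_rng(0)
q = k = v = rng.normal(size=(5, 8))  # 5 tokens, 8 dims, 2 heads of 4
print(multi_head(q, k, v, n_heads=2, muted_heads={1}).shape)  # (5, 8)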
Neural Rule-Execution Tracking Machine For Transformer-Based Text Generation
TLDR
A novel module named Neural Rule-Execution Tracking Machine (NRETM) is proposed, which can be plugged into various transformer-based generators to leverage multiple rules simultaneously, guiding the neural generation model toward superior generation performance in a unified and scalable way.
NeuroLogic A*esque Decoding: Constrained Text Generation with Lookahead Heuristics
TLDR
This work proposes NeuroLogic A*esque, a decoding algorithm that incorporates heuristic estimates of future cost, and develops lookahead heuristics that are efficient for large-scale language models, making the method a drop-in replacement for common techniques such as beam search and top-k sampling.
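A minimal sketch of the lookahead idea: a candidate token's score is its own log-probability plus a heuristic bonus estimated from a short greedy rollout of the future. The stand-in LM, rollout depth, and keyword bonus below are illustrative assumptions.

import math

VOCAB = ["the", "dog", "runs", "<eos>"]

def toy_lm(prefix):
    # Stand-in LM (assumption): uniform next-token distribution.
    return {tok: 1.0 / len(VOCAB) for tok in VOCAB}

def greedy_rollout(prefix, depth):
    tokens = list(prefix)
    for _ in range(depth):
        dist = toy_lm(tokens)
        tokens.append(max(dist, key=dist.get))
    return tokens

def lookahead_score(prefix, tok, keyword, depth=3, alpha=1.0):
    logp = math.log(toy_lm(prefix)[tok])
    future = greedy_rollout(prefix + [tok], depth)
    # Lookahead heuristic: bonus if the constraint word appears in the
    # estimated continuation.
    bonus = alpha if keyword in future else 0.0
    return logp + bonus

scores = {t: lookahead_score(["the"], t, keyword="dog") for t in VOCAB}
print(max(scores, key=scores.get))  # -> "dog": the bonus steers decoding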
COLD Decoding: Energy-based Constrained Text Generation with Langevin Dynamics
TLDR
This paper presents Energy-based Constrained Decoding with Langevin Dynamics (COLD), a decoding framework that treats constrained generation as specifying constraints through an energy function and then performing differentiable reasoning over those constraints via gradient-based sampling.
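For reference, the Langevin-dynamics step that this kind of energy-based sampling builds on takes the standard form

y^{(t+1)} = y^{(t)} - \eta \, \nabla_y E\!\left(y^{(t)}\right) + \epsilon^{(t)}, \qquad \epsilon^{(t)} \sim \mathcal{N}(0, \sigma^2),

where E is the energy function encoding the constraints, \eta is the step size, and the update is applied to a soft (continuous) representation y of the output sequence that is discretized after sampling.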
Combining Extraction and Generation for Constructing Belief-Consequence Causal Links
In this paper, we introduce and justify a new task, causal link extraction based on beliefs, and do a qualitative analysis of the ability of a large language model, InstructGPT-3, to generate implicit…
DeepTrust: A Reliable Financial Knowledge Retrieval Framework For Explaining Extreme Pricing Anomalies
TLDR
DeepTrust, a reliable financial knowledge retrieval framework for explaining extreme price moves on Twitter at speed while ensuring data veracity with state-of-the-art NLP techniques, is introduced; it paves a promising path towards a scalable commercial solution that assists traders in reaching investment decisions on pricing anomalies with authenticated knowledge from social media platforms in real time.
Hybrid Semantics for Goal-Directed Natural Language Generation
TLDR
This work builds upon an existing goal-directed generation system, S-STRUCT, and develops a hybrid approach that uses distributional semantics to quickly, if imprecisely, add the main elements of the sentence, then uses first-order-logic-based semantics to more slowly add the precise details.
ToxiGen: A Large-Scale Machine-Generated Dataset for Adversarial and Implicit Hate Speech Detection
TLDR
ToxiGen, a new large-scale machine-generated dataset of 274k toxic and benign statements about 13 minority groups, is created, and it is demonstrated that finetuning a toxicity classifier on this data substantially improves its performance on human-written data.
XFBoost: Improving Text Generation with Controllable Decoders
TLDR
A controllable language generation framework called Extract-Finetune-Boost (XFBoost) is proposed, which addresses the problem of inaccurate low-quality inference and is found to produce significantly more descriptive text with higher image relevancy, outperforming baselines and lowering the frequency of factually inaccurate descriptions.
CaPE: Contrastive Parameter Ensembling for Reducing Hallucination in Abstractive Summarization
Hallucination is a known issue for neural abstractive summarization models. Recent work suggests that the degree of hallucination may depend on errors in the training data. In this work, we propose a…

References

SHOWING 1-10 OF 41 REFERENCES
Neural Machine Translation With Noisy Lexical Constraints
TLDR
This article proposes a novel framework capable of improving translation quality even when the constraints are noisy: constraints are encoded by a memory encoder and then leveraged by a memory integrator.
CGMH: Constrained Sentence Generation by Metropolis-Hastings Sampling
TLDR
This paper proposes CGMH, a novel approach using Metropolis-Hastings sampling for constrained sentence generation that allows complicated constraints such as the occurrence of multiple keywords in the target sentences, which cannot be handled in traditional RNN-based approaches.
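A toy sketch of the sampling loop: propose a local word-level edit (replace, insert, or delete) and accept it with the Metropolis-Hastings ratio. The scorer pi below is a placeholder for an LM-based stationary distribution, and the acceptance rule is simplified by treating the proposal as symmetric; both are assumptions.

import random

random.seed(0)
VOCAB = ["the", "a", "dog", "cat", "runs", "fast"]
KEYWORDS = {"dog", "runs"}  # hard constraint: these words must survive

def pi(sent):
    # Placeholder stationary distribution (assumption): rewards short
    # sentences that keep every keyword; a real system uses LM scores.
    if not KEYWORDS <= set(sent):
        return 1e-9  # constraint violated: (near-)zero probability
    return 1.0 / (1 + abs(len(sent) - 3))

def propose(sent):
    sent = list(sent)
    op = random.choice(["replace", "insert", "delete"])
    i = random.randrange(len(sent) + (op == "insert"))
    if op == "replace":
        sent[i] = random.choice(VOCAB)
    elif op == "insert":
        sent.insert(i, random.choice(VOCAB))
    elif len(sent) > 1:
        del sent[i]
    return sent

sent = ["dog", "runs"]  # constraint-satisfying seed
for _ in range(500):
    cand = propose(sent)
    # Simplified MH acceptance (proposal treated as symmetric).
    if random.random() < min(1.0, pi(cand) / pi(sent)):
        sent = cand
print(" ".join(sent))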
Fast Lexically Constrained Decoding with Dynamic Beam Allocation for Neural Machine Translation
TLDR
This work presents an algorithm for lexically constrained decoding with complexity O(1) in the number of constraints, demonstrates the algorithm's remarkable ability to properly place constraints, and uses it to explore the shaky relationship between model and BLEU scores.
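The core allocation idea can be sketched as follows: the beam is split into "banks" indexed by the number of constraints a hypothesis has already met, so partially constrained hypotheses are not crowded out by fluent unconstrained ones. Candidate generation is omitted, and the fixed per-bank quota below is a simplification of the paper's dynamic allocation.

def allocate_beam(candidates, n_constraints, beam_size):
    # candidates: list of (tokens, score, n_met) tuples.
    banks = {k: [] for k in range(n_constraints + 1)}
    for cand in candidates:
        banks[cand[2]].append(cand)
    per_bank = max(1, beam_size // (n_constraints + 1))
    beam = []
    # Take the best few hypotheses from every bank (highest score first).
    for k in range(n_constraints, -1, -1):
        banks[k].sort(key=lambda c: c[1], reverse=True)
        beam.extend(banks[k][:per_bank])
    return beam[:beam_size]

cands = [(["a"], -1.0, 0), (["dog"], -2.0, 1), (["dog", "runs"], -3.5, 2),
         (["the"], -0.5, 0), (["cat"], -2.2, 1)]
print(allocate_beam(cands, n_constraints=2, beam_size=3))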
Training Neural Machine Translation to Apply Terminology Constraints
TLDR
Comparative experiments show that the proposed method is not only more effective than a state-of-the-art implementation of constrained decoding, but is also as fast as constraint-free decoding.
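One way such training is commonly set up (the tagging scheme below is an illustrative assumption, not necessarily the paper's exact format) is to augment source sentences with the required target-language term inline, so the model learns to copy it into the output:

def annotate(source_tokens, terminology):
    # terminology: dict mapping a source term to its required target term.
    out = []
    for tok in source_tokens:
        if tok in terminology:
            # Keep the source term, then append the required target term.
            out += ["<src>", tok, "<tgt>", terminology[tok], "</tgt>"]
        else:
            out.append(tok)
    return out

print(annotate("the patient received treatment".split(),
               {"treatment": "Behandlung"}))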
The Curious Case of Neural Text Degeneration
TLDR
By sampling text from the dynamic nucleus of the probability distribution, which allows for diversity while effectively truncating the less reliable tail of the distribution, the resulting text more closely matches the quality of human text, yielding enhanced diversity without sacrificing fluency and coherence.
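Nucleus (top-p) sampling itself is compact enough to sketch directly; only the toy distribution below is made up:

import random

random.seed(0)

def nucleus_sample(probs, p=0.9):
    # probs: dict token -> probability. Sample only from the smallest
    # set of tokens whose cumulative probability exceeds p.
    items = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    nucleus, total = [], 0.0
    for tok, pr in items:
        nucleus.append((tok, pr))
        total += pr
        if total >= p:          # smallest set with cumulative mass >= p
            break
    toks, weights = zip(*nucleus)
    return random.choices(toks, weights=weights)[0]

dist = {"the": 0.5, "dog": 0.3, "runs": 0.15, "zxq": 0.05}
print(nucleus_sample(dist, p=0.9))  # "zxq" (the tail) is never sampled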
Neural Machine Translation Decoding with Terminology Constraints
TLDR
This work describes an approach to constrained neural decoding based on finite-state machines and multi-stack decoding, which supports target-side constraints as well as constraints with corresponding aligned input text spans, and motivates the need for constrained decoding with attention.
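A toy sketch of tracking one phrasal constraint with a finite-state machine during decoding: the state counts how many tokens of the constraint phrase have been matched so far, and decoding must end in the accepting state. Real implementations also need failure transitions for partial re-matches, which this sketch deliberately omits.

PHRASE = ["new", "york"]  # the constraint phrase to be generated

def step(state, token):
    # State = number of PHRASE tokens matched so far.
    if state == len(PHRASE):
        return state              # accepting state is absorbing
    if token == PHRASE[state]:
        return state + 1          # advance along the phrase
    return 0                      # reset (no partial re-matching here)

state = 0
for tok in ["i", "love", "new", "york", "city"]:
    state = step(state, tok)
print("constraint satisfied:", state == len(PHRASE))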
Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer
TLDR
This systematic study compares pre-training objectives, architectures, unlabeled datasets, transfer approaches, and other factors on dozens of language understanding tasks and achieves state-of-the-art results on many benchmarks covering summarization, question answering, text classification, and more.
Multi-domain Neural Network Language Generation for Spoken Dialogue Systems
TLDR
This paper proposes a procedure to train multi-domain, Recurrent Neural Network-based (RNN) language generators via multiple adaptation steps, and shows that the proposed procedure can achieve competitive performance in terms of BLEU score and slot error rate while significantly reducing the data needed to train generators in new, unseen domains.
Improved Lexically Constrained Decoding for Translation and Monolingual Rewriting
Lexically constrained sequence decoding allows for explicit positive or negative phrase-based constraints to be placed on target output strings in generation tasks such as machine translation or…
Language Models are Unsupervised Multitask Learners
TLDR
It is demonstrated that language models begin to learn these tasks without any explicit supervision when trained on a new dataset of millions of webpages called WebText, suggesting a promising path towards building language processing systems which learn to perform tasks from their naturally occurring demonstrations.