Corpus ID: 211069634

Blank Language Models

@article{Shen2020BlankLM,
  title={Blank Language Models},
  author={Tianxiao Shen and Victor Quach and Regina Barzilay and Tommi S. Jaakkola},
  journal={ArXiv},
  year={2020},
  volume={abs/2002.03079}
}
  • Tianxiao Shen, Victor Quach, Regina Barzilay, Tommi S. Jaakkola
  • Published 2020
  • Computer Science
  • ArXiv
  • We propose Blank Language Model (BLM), a model that generates sequences by dynamically creating and filling in blanks. Unlike previous masked language models or the Insertion Transformer, BLM uses blanks to control which part of the sequence to expand. This fine-grained control of generation is ideal for a variety of text editing and rewriting tasks. The model can start from a single blank or from partially completed text with blanks at specified locations. It iteratively determines which word to place in a blank and whether to insert new blanks, and stops generating when no blanks are left to fill. (A toy sketch of this generation loop follows below.)
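
The generation procedure described in the abstract can be illustrated with a short toy loop. The snippet below is only a minimal sketch, not the paper's implementation: choose_action is a hypothetical stand-in for the trained Transformer policy (here it samples uniformly from a toy vocabulary), and BLANK and VOCAB are illustrative constants. It shows the control flow only: pick a blank, place a word, optionally create new blanks to its left and right, and stop when no blanks remain.

import random

BLANK = "__"
VOCAB = ["they", "ate", "the", "cake", "quietly"]  # toy vocabulary, for illustration only

def choose_action(canvas, blank_index):
    """Hypothetical policy: return (word, make_left_blank, make_right_blank).
    The real BLM scores these choices with a Transformer; this stand-in samples at random."""
    word = random.choice(VOCAB)
    return word, random.random() < 0.4, random.random() < 0.4

def generate(canvas=None, max_steps=20):
    # Start from a single blank, or from partially completed text with blanks at given locations.
    canvas = list(canvas) if canvas else [BLANK]
    for _ in range(max_steps):
        blanks = [i for i, tok in enumerate(canvas) if tok == BLANK]
        if not blanks:
            break                                   # no blanks left: generation is complete
        i = random.choice(blanks)                   # which blank to expand
        word, left, right = choose_action(canvas, i)
        replacement = ([BLANK] if left else []) + [word] + ([BLANK] if right else [])
        canvas[i:i + 1] = replacement               # fill the chosen blank in place
    return " ".join(tok for tok in canvas if tok != BLANK)

if __name__ == "__main__":
    print(generate())                                        # unconditional generation from a single blank
    print(generate(["they", BLANK, "the", "cake", BLANK]))   # filling blanks at specified locations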


    References

    Publications referenced by this paper (showing 6 of 41 references):
    • A Framework for Distributed Computing using Mobile Intelligent Agents
    • Algorave: A Survey of the History, Aesthetics and Technology of Live Performance of Algorithmic Electronic Dance Music (2014)
    • AIP Conf. Proc. 307, Gamma-Ray Bursts (1994)
    • Measuring the accuracy of diagnostic systems.
    • Restoring ancient text using deep learning: a case study on Greek epigraphy
    • Insertion Transformer: Flexible Sequence Generation via Insertion Operations