Corpus ID: 211126663

Transformers as Soft Reasoners over Language

@article{Clark2020TransformersAS,
  title={Transformers as Soft Reasoners over Language},
  author={Peter Clark and Oyvind Tafjord and Kyle Richardson},
  journal={ArXiv},
  year={2020},
  volume={abs/2002.05867}
}

Beginning with McCarthy's Advice Taker (1959), AI has pursued the goal of providing a system with explicit, general knowledge and having the system reason over that knowledge. However, expressing the knowledge in a formal (logical or probabilistic) representation has been a major obstacle to this research. This paper investigates a modern approach to this problem where the facts and rules are provided as natural language sentences, thus bypassing a formal representation. We train transformers…
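
The setup the abstract describes, stating facts and rules as plain English and asking a transformer whether a conclusion holds, can be illustrated with a short sketch. The following is a minimal illustration assuming the HuggingFace transformers library and a RoBERTa encoder (the model family the paper fine-tunes); the checkpoint, toy rulebase, and label convention are illustrative placeholders, not the authors' released model or data:

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical setup: a generic pretrained RoBERTa with a fresh 2-way
# classification head (0 = false, 1 = true). In the paper's setting this
# head would be fine-tuned on synthetically generated rulebases; here it
# is untrained, so the printed probability is meaningless until training.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=2
)
model.eval()

# Facts and rules stated directly in natural language, no logical forms.
context = ("Erin is young. Erin is not kind. "
           "If someone is young and not kind then they are big.")
question = "Erin is big."

# Rulebase and question are packed into a single sentence-pair input.
inputs = tokenizer(context, question, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print("P(true) =", torch.softmax(logits, dim=-1)[0, 1].item())

Fine-tuning this classifier on the paper's synthetic true/false questions is what would produce the soft-reasoning behavior; nothing in the architecture itself encodes the rules.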

    Citations

    Publications citing this paper (2 of 5 shown):

    • Conversational Neuro-Symbolic Commonsense Reasoning
    • Knowledge-Aware Language Model Pretraining

    References

    Publications referenced by this paper (4 of 55 shown):

    • Differentiable Reasoning on Large Knowledge Bases and Natural Language
    • Towards AI-Complete Question Answering: A Set of Prerequisite Toy Tasks (highly influential)
    • Enhanced LSTM for Natural Language Inference (highly influential)
    • A Survey on Semantic Parsing