
Deep Algorithmic Question Answering: Towards a Compositionally Hybrid AI for Algorithmic Reasoning

@article{Nuamah2021DeepAQ,
  title={Deep Algorithmic Question Answering: Towards a Compositionally Hybrid AI for Algorithmic Reasoning},
  author={Kwabena Nuamah},
  journal={ArXiv},
  year={2021},
  volume={abs/2109.08006}
}
An important aspect of artificial intelligence (AI) is the ability to reason in a step-by-step “algorithmic” manner that can be inspected and verified for its correctness. This is especially important in the domain of question answering (QA). We argue that the challenge of algorithmic reasoning in QA can be effectively tackled with a “systems” approach to AI which features a hybrid use of symbolic and sub-symbolic methods including deep neural networks. Additionally, we argue that while neural…
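The abstract describes a hybrid, inspectable QA pipeline: a question is decomposed into symbolic steps, each step is handled by a symbolic or neural component, and the whole reasoning chain remains verifiable. A minimal sketch of that idea follows; all names (`Step`, `solve`, the solver functions) are hypothetical illustrations, not the paper's implementation:

```python
# Minimal sketch (not the paper's system) of a hybrid QA pipeline: a question
# is decomposed into symbolic steps, each answered by a neural or symbolic
# solver, and the full trace of intermediate results stays inspectable.
from dataclasses import dataclass
from typing import Any

@dataclass
class Step:
    op: str          # e.g. "lookup" (symbolic KB) or "estimate" (neural model)
    args: tuple
    result: Any = None

def solve(steps: list, solvers: dict) -> tuple:
    """Execute a symbolic plan; earlier Steps may feed later ones."""
    for step in steps:
        args = tuple(a.result if isinstance(a, Step) else a for a in step.args)
        step.result = solvers[step.op](*args)
    return steps[-1].result, steps   # answer plus a verifiable reasoning trace

# Hypothetical plan for "What is the larger population, X's or Y's?"
pop_x = Step("lookup", ("population", "X"))
pop_y = Step("lookup", ("population", "Y"))
larger = Step("max", (pop_x, pop_y))
solvers = {"lookup": lambda attr, e: {"X": 5, "Y": 9}[e], "max": max}
answer, trace = solve([pop_x, pop_y, larger], solvers)   # answer == 9
```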


References

Showing references 1-10 of 46.
A simple neural network module for relational reasoning
This work shows how a deep learning architecture equipped with a Relation Network (RN) module can implicitly discover and learn to reason about entities and their relations.
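The RN module scores every pair of objects with a shared function g and aggregates with a function f. A minimal PyTorch sketch in that spirit (layer sizes are illustrative assumptions, not the cited paper's configuration):

```python
# Minimal Relation Network sketch: g scores every pair of objects,
# f maps the summed pair representations to an output.
import torch
import torch.nn as nn

class RelationNetwork(nn.Module):
    def __init__(self, obj_dim=32, hidden=64, out_dim=10):
        super().__init__()
        self.g = nn.Sequential(nn.Linear(2 * obj_dim, hidden), nn.ReLU())
        self.f = nn.Sequential(nn.Linear(hidden, out_dim))

    def forward(self, objects):                # objects: (batch, n, obj_dim)
        n = objects.size(1)
        oi = objects.unsqueeze(2).expand(-1, -1, n, -1)   # (batch, n, n, d)
        oj = objects.unsqueeze(1).expand(-1, n, -1, -1)
        pairs = torch.cat([oi, oj], dim=-1)               # all object pairs
        return self.f(self.g(pairs).sum(dim=(1, 2)))      # sum over pairs
```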
Question Answering over Knowledge Bases by Leveraging Semantic Parsing and Neuro-Symbolic Reasoning
A semantic parsing and reasoning-based Neuro-Symbolic Question Answering system that achieves state-of-the-art performance on QALD-9 and LC-QuAD 1.0 and integrates multiple, reusable modules that are trained specifically for their individual tasks and do not require end-to-end training data.
Neural-Symbolic VQA: Disentangling Reasoning from Vision and Language Understanding
This work proposes a neural-symbolic visual question answering system that first recovers a structural scene representation from the image and a program trace from the question, then executes the program on the scene representation to obtain an answer.
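The key move is that once neural parsers produce a symbolic scene and a program, answering becomes deterministic execution. A toy sketch of that final stage (the scene, program, and operation names below are invented for illustration):

```python
# Toy sketch of the execution stage: a (hypothetical) neural scene parser
# and question parser would produce `scene` and `program`; the answer comes
# from deterministically running the program on the scene.
scene = [  # structural scene representation
    {"shape": "cube", "color": "red"},
    {"shape": "sphere", "color": "red"},
    {"shape": "cube", "color": "blue"},
]
program = [("filter_color", "red"), ("filter_shape", "cube"), ("count",)]

def execute(program, scene):
    out = scene
    for op, *args in program:
        if op == "filter_color":
            out = [o for o in out if o["color"] == args[0]]
        elif op == "filter_shape":
            out = [o for o in out if o["shape"] == args[0]]
        elif op == "count":
            out = len(out)
    return out

print(execute(program, scene))  # -> 1 ("How many red cubes are there?")
```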
CLEVR: A Diagnostic Dataset for Compositional Language and Elementary Visual Reasoning
This work presents a diagnostic dataset that tests a range of visual reasoning abilities and uses this dataset to analyze a variety of modern visual reasoning systems, providing novel insights into their abilities and limitations.
Explainable Inference in the FRANK Query Answering System
It is argued that there is a need to combine diverse forms of reasoning in order to generate explanations that span the entire chain of reasoning, not just explanations for the so-called black-box models.
Neural Symbolic Machines: Learning Semantic Parsers on Freebase with Weak Supervision
A Neural Symbolic Machine is introduced, which contains a neural “programmer” that maps language utterances to programs and utilizes a key-variable memory to handle compositionality, and a symbolic “computer”, i.e., a Lisp interpreter that performs program execution and helps find good programs by pruning the search space.
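The programmer/computer split can be illustrated with a tiny interpreter plus a key-variable memory; everything below (the `hop` operation, the KB, the variable names) is a hypothetical stand-in, not the cited system's code:

```python
# Toy sketch of the "programmer + computer" split: a neural programmer
# (stubbed out here) would emit Lisp-like expressions; a symbolic interpreter
# executes them, and a key-variable memory stores results for later reuse.
memory = {}  # key-variable memory: variable name -> execution result

def execute(expr, kb):
    op, *args = expr
    args = [memory.get(a, a) for a in args]       # dereference variables
    if op == "hop":                               # follow a KB relation
        entity, relation = args
        return kb.get((entity, relation), [])

kb = {("USA", "capital"): ["Washington_DC"]}
memory["v0"] = execute(("hop", "USA", "capital"), kb)  # bind result to v0
print(memory["v0"])  # ['Washington_DC']
```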
Learning Dependency-Based Compositional Semantics
A new semantic formalism, dependency-based compositional semantics (DCS), is developed; a log-linear distribution over DCS logical forms is defined, and the system is shown to obtain accuracies comparable to state-of-the-art systems that do require annotated logical forms.
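The log-linear distribution mentioned here has the standard form for semantic parsing (notation assumed, not copied from the paper):

```latex
\[
  p_\theta(z \mid x) \;=\;
  \frac{\exp\big(\theta^\top \phi(x, z)\big)}
       {\sum_{z'} \exp\big(\theta^\top \phi(x, z')\big)}
\]
% x: the utterance; z: a candidate DCS logical form; phi(x, z): a feature
% vector; theta: parameters learned from question-answer pairs alone.
```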
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
A new language representation model, BERT, designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers, which can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks.
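The “one additional output layer” recipe is easy to see with the Hugging Face transformers library (the library and task are our choices for illustration, not part of the cited paper):

```python
# Minimal sketch of BERT fine-tuning: a randomly initialized classification
# head is placed on top of the pre-trained bidirectional encoder.
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)   # the one additional output layer

inputs = tokenizer("Is this review positive?", return_tensors="pt")
logits = model(**inputs).logits          # fine-tune these with a task loss
```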
Symbolic Logic meets Machine Learning: A Brief Survey in Infinite Domains
There is a common misconception that logic is for discrete properties, whereas probability theory and, more generally, machine learning are for continuous properties; results are reported that challenge this view of the limitations of logic and expose the role that logic can play for learning in infinite domains.
Neural-Symbolic Integration: A Compositional Perspective
It is shown that a symbolic module -- with any choice for syntax and semantics, as long as the deduction and abduction methods are exposed -- can be cleanly integrated with a neural module, and facilitate the latter's efficient training, achieving empirical performance that exceeds that of previous work.
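A toy sketch of the interface this describes, with an invented rule base (the rule format and method names are assumptions for illustration): deduction computes what follows from facts, while abduction proposes which facts would explain an observation, and those proposals can supply training targets for a neural module.

```python
# Toy symbolic module exposing deduction and abduction over simple rules
# of the form (premises,) -> conclusion.
class SymbolicModule:
    def __init__(self, rules):
        self.rules = rules               # e.g. {("bird",): "can_fly"}

    def deduce(self, facts):
        """Conclusions entailed by `facts` under the rules."""
        return {c for pre, c in self.rules.items() if set(pre) <= facts}

    def abduce(self, observation):
        """Candidate fact sets that would explain `observation`."""
        return [set(pre) for pre, c in self.rules.items() if c == observation]

sym = SymbolicModule({("bird",): "can_fly", ("plane",): "can_fly"})
print(sym.deduce({"bird"}))    # {'can_fly'}
print(sym.abduce("can_fly"))   # [{'bird'}, {'plane'}] -> targets for training
```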