Corpus ID: 25206297

Finding ReMO (Related Memory Object): A Simple Neural Architecture for Text based Reasoning

@article{Moon2018FindingR,
  title={Finding ReMO (Related Memory Object): A Simple Neural Architecture for Text based Reasoning},
  author={Jihyung Moon and Hyochang Yang and Sungzoon Cho},
  journal={ArXiv},
  year={2018},
  volume={abs/1801.08459}
}
Memory Network-based models have shown remarkable progress on the task of relational reasoning. Recently, a simpler yet powerful neural network module called the Relation Network (RN) was introduced. Despite its architectural simplicity, the time complexity of the Relation Network grows quadratically with the number of memory objects, which limits its application to tasks with a large-scale memory. We introduce the Related Memory Network, an end-to-end neural network architecture exploiting both memory network and…
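
The quadratic cost noted above comes from the RN's pairwise design: with n memory objects, the relation function g is evaluated on all n^2 ordered pairs, while an attention-style readout of the kind used by memory networks scores each object only once. Below is a minimal numpy sketch of that contrast; the functions g and f and all dimensions are illustrative stand-ins, not the paper's actual networks.

```python
import numpy as np

def relation_network_readout(objects, g, f):
    """RN-style readout: O(n^2) evaluations of g over all ordered pairs."""
    n = len(objects)
    pair_sum = sum(g(objects[i], objects[j])
                   for i in range(n) for j in range(n))
    return f(pair_sum)

def attention_readout(objects, query):
    """Attention-style readout: O(n) scores, one per memory object."""
    scores = objects @ query                 # (n,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                 # softmax over objects
    return weights @ objects                 # weighted sum, shape (d,)

# Toy usage: 5 memory objects of dimension 4.
rng = np.random.default_rng(0)
memory = rng.standard_normal((5, 4))
query = rng.standard_normal(4)
g = lambda a, b: np.tanh(a + b)              # stand-in for the pair MLP
f = np.tanh                                  # stand-in for the output MLP
_ = relation_network_readout(memory, g, f)   # 25 pair evaluations
_ = attention_readout(memory, query)         # 5 score evaluations
```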

Citations

Towards Question Answering with Multi-hop Reasoning over Knowledge using a Neural Network Model with External Memories

  • Yuri Murayama, Ichiro Kobayashi
  • Computer Science
    2022 Joint 12th International Conference on Soft Computing and Intelligent Systems and 23rd International Symposium on Advanced Intelligent Systems (SCIS&ISIS)
  • 2022
This work incorporates an architecture for knowledge into DNC models (DNC, rsDNC, and DNC-DMS) to improve the ability to generate correct answers to questions using both contextual information and structured knowledge.

Dialogue over Context and Structured Knowledge using a Neural Network Model with External Memories

This work incorporates an architecture for knowledge into DNC models (DNC, rsDNC, and DNC-DMS) to improve the ability to generate correct responses using both contextual information and structured knowledge.

Recurrent Relational Networks

The recurrent relational network is introduced: a general-purpose module that operates on a graph representation of objects and can augment any neural network model with the capacity for many-step relational reasoning.

Multi-layer relation networks for relational reasoning

This work proposes a multi-layer relation network architecture that enables successive refinement of relational information through multiple layers, and shows on the bAbI 20 QA dataset that the increased depth allows for more complex relational reasoning.
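
As an illustration of the stacking idea, here is a hypothetical relation layer in numpy: each object is refined from the sum of its pairwise messages, and composing layers lets relational information propagate over multiple hops. The update rule, nonlinearity, and shapes are assumptions for the sketch, not the paper's exact formulation.

```python
import numpy as np

def relation_layer(objects, W):
    """Refine each object from the sum of pairwise messages g(o_i, o_j)."""
    n, d = objects.shape
    # All n*n ordered pairs [o_i ; o_j], shape (n*n, 2d).
    pairs = np.concatenate(
        [np.repeat(objects, n, axis=0), np.tile(objects, (n, 1))], axis=1)
    messages = np.tanh(pairs @ W).reshape(n, n, d)  # g applied to every pair
    return objects + messages.sum(axis=1)           # residual-style refinement

rng = np.random.default_rng(1)
objs = rng.standard_normal((6, 8))                  # 6 objects of dimension 8
layers = [0.1 * rng.standard_normal((16, 8)) for _ in range(3)]
for W in layers:                                    # three stacked layers give
    objs = relation_layer(objs, W)                  # three hops of refinement
```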

Robust and Scalable Differentiable Neural Computer for Question Answering

The objective is to keep the general character of the DNC model intact while making its application more reliable and speeding up its training.

End-to-end information extraction from business documents

A recurrent neural network model that can capture long-range context is described and compared to a baseline logistic regression model corresponding to the current CloudScan production system.

References

Showing 1-10 of 16 references

Dynamic Memory Networks for Visual and Textual Question Answering

The new DMN+ model improves the state of the art on both the Visual Question Answering dataset and the bAbI-10k text question-answering dataset without supporting-fact supervision.

Memory Networks

This work describes a new class of learning models called memory networks, which reason with inference components combined with a long-term memory component; they learn how to use these jointly.

A simple neural network module for relational reasoning

This work shows how a deep learning architecture equipped with an RN module can implicitly discover and learn to reason about entities and their relations.

Gated End-to-End Memory Networks

A novel end-to-end memory access regulation mechanism is introduced, inspired by recent progress on the connection short-cutting principle in the field of computer vision.

End-To-End Memory Networks

A neural network with a recurrent attention model over a possibly large external memory is described; it is trained end-to-end and hence requires significantly less supervision during training, making it more generally applicable in realistic settings.
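
The "recurrent attention model over a possibly large external memory" can be pictured as repeated soft lookups: each hop attends over the memory with the current controller state, and the retrieved vector conditions the next hop. The sketch below shares a single memory matrix across the input and output roles for brevity, whereas the paper uses separate embedding matrices per hop; the hop count and shapes are illustrative.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def memory_hops(memory, query, hops=3):
    """Multi-hop soft attention in the spirit of end-to-end memory networks."""
    u = query
    for _ in range(hops):
        p = softmax(memory @ u)   # attention weights over memory slots
        o = p @ memory            # retrieved summary vector
        u = u + o                 # next hop conditions on what was just read
    return u                      # final state, fed to an answer layer

rng = np.random.default_rng(2)
memory = rng.standard_normal((10, 16))  # 10 embedded sentences, dimension 16
query = rng.standard_normal(16)
state = memory_hops(memory, query)
```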

Ask Me Anything: Dynamic Memory Networks for Natural Language Processing

The dynamic memory network (DMN), a neural network architecture which processes input sequences and questions, forms episodic memories, and generates relevant answers, is introduced.

Towards AI-Complete Question Answering: A Set of Prerequisite Toy Tasks

This work argues for the usefulness of a set of proxy tasks that evaluate reading comprehension via question answering, and classify these tasks into skill sets so that researchers can identify (and then rectify) the failings of their systems.

Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling

Advanced recurrent units that implement a gating mechanism, such as the long short-term memory (LSTM) unit and the more recently proposed gated recurrent unit (GRU), are evaluated on sequence modeling tasks; the GRU is found to be comparable to the LSTM.
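
For concreteness, the GRU update that this comparison concerns can be written as a single numpy step; biases are omitted and the weight shapes are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step: gates interpolate between old and candidate states."""
    z = sigmoid(x @ Wz + h @ Uz)                # update gate
    r = sigmoid(x @ Wr + h @ Ur)                # reset gate
    h_cand = np.tanh(x @ Wh + (r * h) @ Uh)     # candidate state
    return (1.0 - z) * h + z * h_cand           # gated state update

rng = np.random.default_rng(3)
d_in, d_h = 5, 8
x, h = rng.standard_normal(d_in), np.zeros(d_h)
Wz, Wr, Wh = (rng.standard_normal((d_in, d_h)) for _ in range(3))
Uz, Ur, Uh = (rng.standard_normal((d_h, d_h)) for _ in range(3))
h = gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh)
```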

Learning End-to-End Goal-Oriented Dialog

It is shown that an end-to-end dialog system based on Memory Networks can reach promising, yet imperfect, performance and learn to perform non-trivial operations; the system is compared to a hand-crafted slot-filling baseline on data from the second Dialog State Tracking Challenge.

Long Short-Term Memory

A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete time steps by enforcing constant error flow through constant error carousels within special units.
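
For comparison with the GRU sketch above, a minimal numpy LSTM step: the additively updated cell state c is what realizes the "constant error carousel", letting error flow across long time lags. Biases are omitted, and this is the common modern form with a forget gate, which the original 1997 formulation did not yet include.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, Wf, Uf, Wi, Ui, Wo, Uo, Wc, Uc):
    """One LSTM step; the cell state c is updated additively (the carousel)."""
    f = sigmoid(x @ Wf + h @ Uf)              # forget gate
    i = sigmoid(x @ Wi + h @ Ui)              # input gate
    o = sigmoid(x @ Wo + h @ Uo)              # output gate
    c = f * c + i * np.tanh(x @ Wc + h @ Uc)  # additive cell-state update
    h = o * np.tanh(c)                        # gated hidden output
    return h, c

rng = np.random.default_rng(4)
d_in, d_h = 5, 8
x = rng.standard_normal(d_in)
h, c = np.zeros(d_h), np.zeros(d_h)
Wf, Wi, Wo, Wc = (rng.standard_normal((d_in, d_h)) for _ in range(4))
Uf, Ui, Uo, Uc = (rng.standard_normal((d_h, d_h)) for _ in range(4))
h, c = lstm_step(x, h, c, Wf, Uf, Wi, Ui, Wo, Uo, Wc, Uc)
```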