Learning task-dependent distributed representations by backpropagation through structure

@article{Goller1996LearningTD,
  title={Learning task-dependent distributed representations by backpropagation through structure},
  author={Christoph Goller and Andreas K{\"u}chler},
  journal={Proceedings of International Conference on Neural Networks (ICNN'96)},
  year={1996},
  volume={1},
  pages={347--352}
}
  • C. Goller, A. Küchler
  • Published 3 June 1996
  • Computer Science
  • Proceedings of International Conference on Neural Networks (ICNN'96)
While neural networks are applied very successfully to the processing of fixed-length vectors and variable-length sequences, the current state of the art does not allow the efficient processing of structured objects of arbitrary shape (like logical terms, trees, or graphs). [...] Key Method: The major difference of our approach compared to others is that the structure representations are tuned exclusively for the intended inference task.
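To make the idea concrete, here is a minimal sketch of backpropagation through structure over a binary tree in plain Python/numpy. The names, dimensions, and root-level classifier are illustrative assumptions, not details taken from the paper: one shared weight matrix folds two child representations into a parent representation, and the error signal is propagated back down the same tree the forward pass walked up, so the representations are shaped only by the task error at the root.

import numpy as np

rng = np.random.default_rng(0)
D = 4                                    # representation width (illustrative)
W = rng.normal(0.0, 0.5, (D, 2 * D))    # shared folding weights for all nodes
b = np.zeros(D)                          # shared folding bias
w_out = rng.normal(0.0, 0.5, D)          # root-level classifier (hypothetical task)

def forward(node):
    # Bottom-up pass: a leaf is a fixed D-dim vector; an internal node is a
    # pair of subtrees folded through the shared weights.
    if isinstance(node, np.ndarray):
        return node, ("leaf",)
    (hl, cl), (hr, cr) = forward(node[0]), forward(node[1])
    h = np.tanh(W @ np.concatenate([hl, hr]) + b)
    return h, ("node", h, hl, hr, cl, cr)

def backward(cache, dh, grads):
    # Top-down pass: push dL/dh through tanh, credit the shared weights,
    # then split the remaining gradient between the two children.
    if cache[0] == "leaf":
        return
    _, h, hl, hr, cl, cr = cache
    dpre = dh * (1.0 - h * h)                       # derivative of tanh
    grads["W"] += np.outer(dpre, np.concatenate([hl, hr]))
    grads["b"] += dpre
    dchild = W.T @ dpre
    backward(cl, dchild[:D], grads)
    backward(cr, dchild[D:], grads)

# One SGD step on a toy tree ((x y) z) with binary target 1.
x, y, z = (rng.normal(size=D) for _ in range(3))
h_root, cache = forward(((x, y), z))
p = 1.0 / (1.0 + np.exp(-(w_out @ h_root)))          # sigmoid output
grads = {"W": np.zeros_like(W), "b": np.zeros_like(b)}
backward(cache, (p - 1.0) * w_out, grads)            # dL/dh_root for target 1
W -= 0.1 * grads["W"]                                # (w_out update omitted)
b -= 0.1 * grads["b"]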
Inductive Learning in Symbolic Domains Using Structure-Driven Recurrent Neural Networks
TLDR
A connectionist architecture is presented, together with a novel supervised learning scheme, which is capable of solving inductive inference tasks on complex symbolic structures of arbitrary size; first results are given from experiments with inductive learning tasks consisting of the classification of logical terms.
Comparing Structures Using a Hopfield-Style Neural Network
TLDR
The earlier approaches of structural matching and constraint relaxation by spreading activation in neural networks are combined with the method of solving optimization tasks using Hopfield-style nets.
Learning Efficiently with Neural Networks: A Theoretical Comparison between Structured and Flat Representations
TLDR
The message of this paper is that, whenever structured representations are available, they should be preferred to "flat" (array-based) representations because they are likely to simplify learning in terms of time complexity.
The loading problem for recursive neural networks
TLDR
This paper presents sufficient conditions which guarantee the absence of local minima of the error function when learning directed acyclic graphs with recursive neural networks, and conceives a reduction algorithm, involving both the information attached to the nodes and the topology, which significantly enlarges the class of problems with a unimodal error function.
A general framework for adaptive processing of data structures
TLDR
The framework described in this paper is an attempt to unify adaptive models like artificial neural nets and belief nets for the problem of processing structured information, in which relations between data variables are expressed by directed acyclic graphs and both numerical and categorical values coexist.
From Hopfield nets to recursive networks to graph machines: Numerical machine learning for structured data
TLDR
It is shown that, despite the apparent diversity, two basic principles underlie the recent approaches: first, use structured machines to learn structured data; second, learn representations instead of handcrafting them. These principles proved very successful for handling structured data, to the point of generating a novel branch of numerical machine learning.
Neural Representation Learning in Linguistic Structured Prediction
Advances in neural network architectures and training algorithms have demonstrated the effectiveness of representation learning in natural language processing. This thesis argues for the importance [...]
Selective Training: A Strategy for Fast Backpropagation on Sentence Embeddings
TLDR
This work presents a method to reduce training time substantially by selecting training instances that provide relevant information for training, based on the similarity of the learned representations over input instances, thus allowing a non-trivial weighting scheme to be learned from multi-dimensional representations.
Recursive Neural Networks Applied to Discourse Representation Theory
TLDR
A novel technique is introduced, combining Discourse Representation Theory (DRT) with Recursive Neural Networks (RNN), in order to yield a neural model capable of discovering properties and relationships among constituents of a knowledge base expressed by natural language sentences.
Neural Networks for Processing Data Structures
  • A. Sperduti
  • Computer Science
  • Summer School on Neural Networks
  • 1997
TLDR
It is not difficult to figure out how to extract tree automata from a neural network for structures, and this would allow the above scheme to work the other way around, with a neural module driven by a symbolic subsystem.

References

Showing 1–10 of 19 references
Recursive Distributed Representations
TLDR
This paper presents a connectionist architecture which automatically develops compact distributed representations for variable-sized recursive data structures, as well as efficient accessing mechanisms for them.
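For orientation, the RAAM idea can be illustrated as follows, with hypothetical names and untrained random weights in Python/numpy: a 2D-to-D encoder compresses two child vectors into one vector of the same width, and a D-to-2D decoder unfolds it again, so a whole binary tree collapses into a single fixed-width code. In the actual model these weights are learned auto-associatively by minimizing reconstruction error over all internal nodes.

import numpy as np

rng = np.random.default_rng(1)
D = 8                                      # code width (illustrative)
W_enc = rng.normal(0.0, 0.3, (D, 2 * D))  # compressor: two children -> parent
W_dec = rng.normal(0.0, 0.3, (2 * D, D))  # reconstructor: parent -> two children

def encode(tree):
    # Fold a binary tree of D-dim leaf vectors into a single D-dim code.
    if isinstance(tree, np.ndarray):
        return tree
    left, right = encode(tree[0]), encode(tree[1])
    return np.tanh(W_enc @ np.concatenate([left, right]))

def decode(code, shape):
    # Unfold a code along a known tree shape. (Deciding *when* to stop
    # decoding, the terminal-test problem, is sidestepped here by passing
    # the shape in explicitly.)
    if shape == "leaf":
        return code
    out = np.tanh(W_dec @ code)
    return (decode(out[:D], shape[0]), decode(out[D:], shape[1]))

x, y, z = (rng.normal(size=D) for _ in range(3))
code = encode(((x, y), z))                        # whole tree -> one vector
approx = decode(code, (("leaf", "leaf"), "leaf")) # one vector -> whole tree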
Learning Distributed Representations for the Classification of Terms
TLDR
The intended applications of the approach described in this paper are hybrid (symbolic/connectionist) systems, where the connectionist part has to solve logic-oriented inductive learning tasks similar to the term-classification problems used in the experiments.
Learning Recursive Distributed Representations for Holistic Computation
TLDR
Two possible forms of holistic inference, transformational inference and confluent inference, are identified and compared, and a dual-ported RAAM architecture based on Pollack's Recursive Auto-Associative Memory is implemented and demonstrated in the domain of natural language translation.
Distributed representations for terms in hybrid reasoning systems
TLDR
The intended applications of the approach are hybrid (symbolic/connectionist) systems, where the connectionist part has to solve logic-oriented inductive learning tasks similar to the term-classification problems used in the experiments.
A Connectionist Parser with Recursive Sentence Structure and Lexical Disambiguation
TLDR
XERIC networks, presented here, are distributed-representation connectionist parsers which can analyze and represent syntactically varied sentences, including ones with recursive phrase-structure constructs.
Backpropagation Through Time: What It Does and How to Do It
TLDR
This paper first reviews basic backpropagation, a simple method which is now widely used in areas like pattern recognition and fault diagnosis, and then describes further extensions of the method to deal with systems other than neural networks, systems involving simultaneous equations, true recurrent networks, and other practical issues which arise with this method.
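Since backpropagation through structure generalizes this reference's backpropagation through time from chains to trees, a minimal numpy sketch of BPTT itself may help; the dimensions, loss, and data below are arbitrary illustrations. The defining step is that the same weight matrices occur at every unrolled time step, so their gradients accumulate across all steps.

import numpy as np

rng = np.random.default_rng(2)
D_in, D_h, T = 3, 5, 6                        # sizes and sequence length (arbitrary)
W_x = rng.normal(0.0, 0.4, (D_h, D_in))       # input weights, shared over time
W_h = rng.normal(0.0, 0.4, (D_h, D_h))        # recurrent weights, shared over time
xs = rng.normal(size=(T, D_in))               # toy input sequence
target = rng.normal(size=D_h)                 # toy target for the final state

# Forward: unroll the recurrence and cache every hidden state.
hs = [np.zeros(D_h)]
for t in range(T):
    hs.append(np.tanh(W_x @ xs[t] + W_h @ hs[-1]))
loss = 0.5 * np.sum((hs[-1] - target) ** 2)

# Backward: walk the unrolled graph in reverse; the shared weights collect
# gradient contributions from every time step.
dW_x, dW_h = np.zeros_like(W_x), np.zeros_like(W_h)
dh = hs[-1] - target                          # dL/dh_T
for t in reversed(range(T)):
    dpre = dh * (1.0 - hs[t + 1] ** 2)        # back through tanh
    dW_x += np.outer(dpre, xs[t])
    dW_h += np.outer(dpre, hs[t])
    dh = W_h.T @ dpre                         # pass the error one step back in time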
Exploring the Symbolic/Subsymbolic Continuum: A case study of RAAM
It is difficult to clearly define the symbolic and subsymbolic paradigms; each is usually described by its tendencies rather than any one definitive property. Symbolic processing is generally [...]
Encoding Labeled Graphs by Labeling RAAM
TLDR
The Labeling RAAM (LRAAM), an extension of Pollack's RAAM, can encode labeled graphs with cycles by representing pointers explicitly; the encoder network of the LRAAM can then be transformed into an analog Hopfield network with hidden units.
Learning internal representations by error propagation
This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion
Structure Sensitivity in Connectionist Models
Annotation: Published in The Proceedings of the 1993 Connectionist Models Summer School, Mozer et al. (Eds.), Lawrence Erlbaum, 1993.