In this paper we present DevLex, a self-organizing neural network model of early lexical development. The network consists of two self-organizing maps (a growing semantic map and a growing phonological map) connected via associative links trained by Hebbian learning. The model captures a number of important phenomena that occur in early lexical development.
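The associative links between the two maps can be illustrated with a small sketch. This is not the DevLex implementation, only a minimal illustration of Hebbian learning between co-active units of two maps; the map sizes, learning rate, and normalization scheme are assumptions.

```python
import numpy as np

n_sem, n_phon = 16, 16          # map sizes (hypothetical)
W = np.zeros((n_sem, n_phon))   # associative links between the two maps

def hebbian_update(W, sem_act, phon_act, eta=0.1):
    """Strengthen links between co-active units on the two maps (Hebbian rule)."""
    W += eta * np.outer(sem_act, phon_act)
    # normalize each row so link strengths stay bounded
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    norms[norms == 0] = 1.0     # leave untouched rows at zero
    return W / norms

# toy co-activation: unit 3 on the semantic map fires together
# with unit 7 on the phonological map
sem = np.zeros(n_sem); sem[3] = 1.0
phon = np.zeros(n_phon); phon[7] = 1.0
W = hebbian_update(W, sem, phon)
```

After the update, the link from semantic unit 3 to phonological unit 7 is the only nonzero entry in its row, so a semantic activation can retrieve the associated phonological pattern (and vice versa, via the transpose).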
We present a model of a bidirectional three-layer neural network with sigmoidal units that can be trained to learn arbitrary mappings. We introduce a bidirectional activation-based learning algorithm (BAL), inspired by O'Reilly's supervised Generalized Recirculation (GeneRec) algorithm, which was designed as a biologically plausible alternative to backpropagation.
Action understanding undoubtedly involves visual representations. However, linking the observed action with the corresponding motor category might facilitate processing and provide a mechanism to “step into the shoes” of the observed agent. Such a principle might also be very useful for a cognitive robot, enabling it to link the observed …
Recently there has been a surge of interest in extending topographic maps of vectorial data to more general data structures, such as sequences or trees. However, there is no general consensus on how best to process sequences with topographic maps, and the topic remains an active focus of neurocomputational research. The representational …
As potential candidates for models of human cognition, connectionist models of sentence processing must learn to behave systematically by generalizing from a small training set. It was recently shown that Elman networks and, to a greater extent, echo state networks (ESNs) possess a limited ability to generalize in artificial language learning tasks. We study this …