Lubica Benusková

This paper presents an on-line training procedure for a hierarchical neural network of integrate-and-fire neurons. The training is done through synaptic plasticity and changes in the network structure. Event-driven computation optimizes processing speed, making it possible to simulate networks with a large number of neurons. The training procedure is applied to the face…
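The event-driven computation mentioned above can be sketched in a few lines: instead of integrating the membrane potential at every time step, the simulator only updates a neuron when an input spike arrives, decaying the potential analytically between events. The function name, parameters, and reset behaviour below are illustrative assumptions, not the paper's actual implementation:

```python
import heapq
import math

def simulate_event_driven(spikes, weights, tau=20.0, threshold=1.0):
    """Event-driven leaky integrate-and-fire neuron (illustrative sketch).

    spikes  : list of (time, source) input spike events
    weights : dict mapping source id -> synaptic weight
    Returns the list of output spike times.
    """
    queue = list(spikes)
    heapq.heapify(queue)             # process events in temporal order
    v, last_t = 0.0, 0.0
    output_spikes = []
    while queue:
        t, src = heapq.heappop(queue)
        v *= math.exp(-(t - last_t) / tau)  # analytic decay since last event
        v += weights.get(src, 0.0)          # integrate the incoming spike
        last_t = t
        if v >= threshold:                  # fire and reset
            output_spikes.append(t)
            v = 0.0
    return output_spikes
```

Because work is done only per spike rather than per time step, sparse activity in a large network costs correspondingly little, which is the speed advantage the abstract refers to.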
In this paper, we elaborate upon the claim that clustering in the recurrent layer of recurrent neural networks (RNNs) reflects meaningful information-processing states even prior to training. By concentrating on activation clusters in RNNs, while not discarding the continuous state-space network dynamics, we extract predictive models that we call neural…
In this paper, we describe and evaluate a new spiking neural network (SNN) architecture and its corresponding learning procedure to perform fast and adaptive multi-view visual pattern recognition. The network is composed of a simplified type of integrate-and-fire neurons arranged hierarchically in four layers of two-dimensional neuronal maps. Using a…
Heterosynaptic long-term depression (LTD) is conventionally defined as occurring at synapses that are inactive during a time when neighboring synapses are activated by high-frequency stimulation. A new model that combines computational properties of both the Bienenstock, Cooper and Munro model and spike timing-dependent plasticity, however, suggests that…
This paper presents a new modular and integrative sensory information system inspired by the way the brain performs information processing, in particular, pattern recognition. Spiking neural networks are used to model human-like visual and auditory pathways. This bimodal system is trained to perform the specific task of person authentication. The two…
We have combined the nearest-neighbour additive spike-timing-dependent plasticity (STDP) rule with the Bienenstock, Cooper and Munro (BCM) sliding modification threshold in a computational model of heterosynaptic plasticity in the hippocampal dentate gyrus. As a result, we can reproduce (1) homosynaptic long-term potentiation of the tetanized input, and (2)…
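One plausible way to couple an additive nearest-neighbour STDP rule to a BCM-style sliding threshold is sketched below. The specific coupling (scaling LTP and LTD by the threshold `theta`), the learning rates, and the time constants are assumptions for illustration only; the abstract does not give the model's published equations:

```python
import math

def stdp_bcm_update(w, dt, theta, a_plus=0.01, a_minus=0.01, tau=20.0):
    """Additive nearest-neighbour STDP modulated by a BCM sliding threshold.

    w     : current synaptic weight
    dt    : t_post - t_pre for the nearest pre/post spike pair (ms)
    theta : BCM modification threshold (running postsynaptic activity)
    """
    if dt > 0:   # pre before post: potentiation, weakened when theta is high
        dw = a_plus * math.exp(-dt / tau) / theta
    else:        # post before (or with) pre: depression, strengthened by theta
        dw = -a_minus * math.exp(dt / tau) * theta
    return w + dw

def update_theta(theta, post_rate, tau_theta=100.0):
    """Slide the threshold toward the square of the recent postsynaptic
    rate, in the spirit of the BCM rule."""
    return theta + (post_rate ** 2 - theta) / tau_theta
```

The sliding threshold makes plasticity activity-dependent: after sustained high postsynaptic activity, `theta` rises, tilting the balance toward depression, which is how a BCM-style mechanism can produce heterosynaptic LTD at inputs that were not themselves tetanized.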
Neuroscience, along with the information and mathematical sciences, has developed a variety of theoretical and computational models of complex brain functions. In parallel, artificial neural networks, computational models that adopt principles from the nervous system, have been developed into powerful tools for learning from data and…
This paper presents a novel on-line learning procedure for biologically realistic networks of integrate-and-fire neurons. The on-line adaptation is based on synaptic plasticity and changes in the network structure. Event-driven computation optimizes processing speed, making it possible to simulate networks with a large number of neurons. The learning method is…
This paper presents a novel system that performs text-independent speaker authentication using new spiking neural network (SNN) architectures. Each speaker is represented by a set of prototype vectors trained with the standard Hebbian rule and a winner-takes-all approach. For every speaker there is a separate spiking network that computes normalized…
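Prototype training with a Hebbian winner-takes-all scheme, as described above, amounts to competitive learning: each sample moves only its nearest prototype toward itself. The function below is a minimal sketch under that assumption; the distance metric, learning rate, and initialization are not taken from the paper:

```python
import random

def train_prototypes(samples, n_prototypes, lr=0.1, epochs=10, seed=0):
    """Winner-takes-all Hebbian training of a speaker's prototype vectors.

    samples      : list of feature vectors (tuples/lists of floats)
    n_prototypes : number of prototype vectors per speaker
    Returns the trained prototype vectors.
    """
    rng = random.Random(seed)
    # initialize prototypes from randomly chosen training samples
    prototypes = [list(rng.choice(samples)) for _ in range(n_prototypes)]
    for _ in range(epochs):
        for x in samples:
            # competition: the closest prototype wins
            winner = min(prototypes,
                         key=lambda p: sum((a - b) ** 2 for a, b in zip(p, x)))
            # Hebbian update: only the winner moves toward the input
            for i in range(len(winner)):
                winner[i] += lr * (x[i] - winner[i])
    return prototypes
```

Because only the winning prototype is updated, the prototypes specialize on different regions of a speaker's feature space, giving the compact per-speaker codebook the abstract describes.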