
- Simei Gomes Wysoski, Lubica Benusková, Nikola K. Kasabov
- ICANN
- 2006

This paper presents an on-line training procedure for a hierarchical neural network of integrate-and-fire neurons. The training is done through synaptic plasticity and changes in the network structure. Event-driven computation optimizes processing speed in order to simulate networks with a large number of neurons. The training procedure is applied to the face…
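The event-driven computation this abstract refers to can be sketched as follows: instead of stepping every neuron at every tick, only spike deliveries are processed from a priority queue, with membrane potentials decayed lazily at update time. All names, parameters, and the unit synaptic delay below are illustrative assumptions, not details taken from the paper.

```python
import heapq

def simulate_event_driven(weights, initial_spikes, threshold=1.0,
                          decay=0.9, t_max=100):
    """Event-driven simulation of a feed-forward integrate-and-fire net.

    weights[(i, j)] is the synaptic weight from neuron i to neuron j;
    initial_spikes is a list of (time, neuron) events.  Only neurons
    that actually receive a spike are updated, which is what keeps
    large networks tractable.
    """
    potential = {}      # membrane potential per neuron
    last_update = {}    # time of last update, for lazy decay
    events = list(initial_spikes)
    heapq.heapify(events)
    fired = []

    while events:
        t, src = heapq.heappop(events)
        if t > t_max:
            break
        fired.append((t, src))
        # deliver the spike only to the neurons src projects to
        for (i, j), w in weights.items():
            if i != src:
                continue
            dt = t - last_update.get(j, t)
            v = potential.get(j, 0.0) * (decay ** dt)  # lazy exponential decay
            v += w
            last_update[j] = t
            if v >= threshold:
                potential[j] = 0.0                     # reset after firing
                heapq.heappush(events, (t + 1, j))     # assumed unit delay
            else:
                potential[j] = v
    return fired
```

With a single supra-threshold synapse, one input spike at t=0 propagates one step; a sub-threshold synapse leaves the potential stored but produces no output spike.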

- Peter Tiño, Michal Cernanský, Lubica Benusková
- IEEE Transactions on Neural Networks
- 2004

In this paper, we elaborate upon the claim that clustering in the recurrent layer of recurrent neural networks (RNNs) reflects meaningful information processing states even prior to training. By concentrating on activation clusters in RNNs, while not throwing away the continuous state space network dynamics, we extract predictive models that we call neural…
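The extraction step described here can be sketched in miniature: once recurrent activations have been grouped into clusters (e.g. by k-means over hidden states), each cluster becomes a predictive state with an empirical next-symbol distribution. The estimator below is an illustrative sketch in that spirit, not the published construction.

```python
from collections import Counter, defaultdict

def build_prediction_machine(state_clusters, symbols):
    """Extract a predictive model from clustered RNN states.

    state_clusters[t] is the cluster label of the recurrent activation
    at time t (assumed precomputed), and symbols[t] is the symbol the
    sequence emits next.  Returns, per cluster, the empirical
    distribution over next symbols.
    """
    counts = defaultdict(Counter)
    for cluster, sym in zip(state_clusters, symbols):
        counts[cluster][sym] += 1
    return {c: {s: n / sum(cnt.values()) for s, n in cnt.items()}
            for c, cnt in counts.items()}
```

For example, if cluster 0 was followed twice by 'a' and once by 'b', its predictive distribution is {'a': 2/3, 'b': 1/3}.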

- Simei Gomes Wysoski, Lubica Benusková, Nikola K. Kasabov
- Neurocomputing
- 2008

In this paper, we describe and evaluate a new spiking neural network (SNN) architecture and its corresponding learning procedure to perform fast and adaptive multi-view visual pattern recognition. The network is composed of a simplified type of integrate-and-fire neurons arranged hierarchically in four layers of two-dimensional neuronal maps. Using a…

- Wickliffe C. Abraham, Barbara J. Logan, Amy R. Wolff, Lubica Benusková
- Journal of Neurophysiology
- 2007

Heterosynaptic long-term depression (LTD) is conventionally defined as occurring at synapses that are inactive during a time when neighboring synapses are activated by high-frequency stimulation. A new model that combines computational properties of both the Bienenstock, Cooper and Munro model and spike timing-dependent plasticity, however, suggests that…

- Simei Gomes Wysoski, Lubica Benusková, Nikola K. Kasabov
- Neural Networks
- 2010

This paper presents a new modular and integrative sensory information system inspired by the way the brain performs information processing, in particular, pattern recognition. Spiking neural networks are used to model human-like visual and auditory pathways. This bimodal system is trained to perform the specific task of person authentication. The two…

- Lubica Benusková, Wickliffe C. Abraham
- Journal of Computational Neuroscience
- 2006

We have combined the nearest neighbour additive spike-timing-dependent plasticity (STDP) rule with the Bienenstock, Cooper and Munro (BCM) sliding modification threshold in a computational model of heterosynaptic plasticity in the hippocampal dentate gyrus. As a result we can reproduce (1) homosynaptic long-term potentiation of the tetanized input, and (2)…
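The combination described in this abstract can be sketched as a single weight update: additive nearest-neighbour STDP whose potentiation and depression amplitudes are modulated by a BCM-style threshold that slides with recent postsynaptic activity. The constants and the exact scaling rule below are illustrative assumptions, not the published parameter values.

```python
import math

def stdp_bcm_update(w, dt, avg_activity, a_plus=0.01, a_minus=0.01,
                    tau=20.0, theta0=1.0):
    """One nearest-neighbour additive STDP update with a BCM-style
    sliding modification threshold.

    dt = t_post - t_pre (ms), for the nearest pre/post spike pair.
    Following the BCM idea, the threshold grows with the running
    average of postsynaptic activity (assumed > 0 here), scaling LTP
    down and LTD up when the cell has recently been very active.
    """
    theta = theta0 * (avg_activity ** 2)   # threshold slides with activity^2
    if dt > 0:                             # pre before post -> potentiation
        dw = (a_plus / theta) * math.exp(-dt / tau)
    else:                                  # post before pre -> depression
        dw = -(a_minus * theta) * math.exp(dt / tau)
    return w + dw
```

With this scaling, the same pre-before-post pairing potentiates less when average postsynaptic activity is high, which is the ingredient that lets a model of this kind produce heterosynaptic depression alongside homosynaptic potentiation.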

Neuroscience, along with the information and mathematical sciences, has developed a variety of theoretical and computational models of complex brain functions. Alongside this development, artificial neural networks (computational models that adopt principles from the nervous system) have developed into powerful tools for learning from data and…

- Lubica Benusková
- Ceskoslovenska fysiologie
- 1988

- Simei Gomes Wysoski, Lubica Benusková, Nikola K. Kasabov
- ACIVS
- 2006

This paper presents a novel on-line learning procedure to be used in biologically realistic networks of integrate-and-fire neurons. The on-line adaptation is based on synaptic plasticity and changes in the network structure. Event-driven computation optimizes processing speed in order to simulate networks with a large number of neurons. The learning method is…

- Simei Gomes Wysoski, Lubica Benusková, Nikola K. Kasabov
- ICANN
- 2007

This paper presents a novel system that performs text-independent speaker authentication using new spiking neural network (SNN) architectures. Each speaker is represented by a set of prototype vectors trained with a standard Hebbian rule and a winner-takes-all approach. For every speaker there is a separate spiking network that computes normalized…