This paper describes the salient features of using a simulated annealing (SA) algorithm in the context of designing digital filters with coefficient values expressed as sums of powers of two. A procedure for linear-phase digital filter design using this algorithm is first presented and tested, yielding results as good as known optimal methods. …
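The sum-of-powers-of-two coefficient representation can be illustrated with a small sketch. This is a hypothetical greedy helper for intuition only, not the paper's SA procedure, which searches such representations globally:

```python
import math

def quantize_spt(x, terms=3, min_exp=-8):
    """Greedily approximate x as a sum of `terms` signed powers of two.

    Returns (approximation, [(sign, exponent), ...]).
    Illustrative helper; the paper optimizes these representations
    with simulated annealing rather than greedily.
    """
    approx = 0.0
    rep = []
    for _ in range(terms):
        r = x - approx
        if r == 0.0:
            break
        # nearest power of two to the residual, exponent clamped to [min_exp, 0]
        e = max(min(round(math.log2(abs(r))), 0), min_exp)
        term = math.copysign(2.0 ** e, r)
        approx += term
        rep.append((1 if r > 0 else -1, e))
    return approx, rep
```

For example, `quantize_spt(0.7)` yields 2^-1 + 2^-2 - 2^-4 = 0.6875; in hardware, each term costs one shift-and-add instead of a multiplication.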
This paper proposes the blind separation of complex signals using a novel neural network architecture based on an adaptive non-linear bi-dimensional activation function; the separation is obtained by maximizing the output joint entropy. To avoid the restriction imposed by Liouville's theorem, the activation function is composed of a couple of bi-dimensional …
Acoustic source localization in the presence of reverberation is a difficult task. Conventional approaches, based on time delay estimation performed by generalized cross correlation (GCC) on a set of microphone pairs, followed by geometric triangulation, are often unsatisfactory. Prefiltering is usually adopted to reduce the spurious peaks due to reflections. …
In this paper we derive two second-order algorithms, based on conjugate gradient, for on-line training of recurrent neural networks. These algorithms use two different techniques to extract second-order information on the Hessian matrix without calculating or storing it and without making numerical approximations. Several simulation results for non-linear …
The aim of this paper is to present a new class of learning models for linear as well as non-linear neural layers, called Orthonormal Strongly-Constrained (SOC or Stiefel). They make it possible to solve problems where orthonormal matrices are involved. After general properties of the learning rules belonging to this new class are shown, examples derived …
In this paper, we study the theoretical properties of a new kind of artificial neural network, which is able to adapt its activation functions by varying the control points of a Catmull-Rom cubic spline. Above all, we are interested in generalization capability, and we show that our architecture presents several advantages. First of all, it can be …
In this paper, a new complex-valued neural network based on adaptive activation functions is proposed. By varying the control points of a pair of Catmull-Rom cubic splines, which are used as an adaptable activation function, this new kind of neural network can be implemented as a very simple structure that is able to improve the generalization capabilities …
This paper focuses on on-line learning procedures for locally recurrent neural networks, with emphasis on the multilayer perceptron (MLP) with infinite impulse response (IIR) synapses and its variations, which include generalized output and activation feedback multilayer networks (MLNs). We propose a new gradient-based procedure called recursive backpropagation …
In this paper, a new adaptive spline activation function neural network (ASNN) is presented. Due to the ASNN's high representation capabilities, networks with a small number of interconnections can be trained to solve both pattern recognition and real-time data processing problems. The main idea is to use a Catmull-Rom cubic spline as the neuron's …
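The Catmull-Rom spline activation shared by the last few abstracts can be sketched in one dimension as follows. The grid range and helper names are made up for illustration; in the papers, the control-point ordinates are the trainable quantities:

```python
import math

def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate the Catmull-Rom cubic spanning p1..p2 at t in [0, 1]."""
    return 0.5 * (2.0 * p1
                  + (p2 - p0) * t
                  + (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * t * t
                  + (3.0 * p1 - p0 - 3.0 * p2 + p3) * t ** 3)

def spline_activation(x, cps, x_min=-2.0, x_max=2.0):
    """Adaptive activation: interpolate the control ordinates `cps`,
    assumed sampled on a uniform grid over [x_min, x_max]."""
    n = len(cps)
    dx = (x_max - x_min) / (n - 1)
    u = (x - x_min) / dx
    i = int(max(1, min(n - 3, math.floor(u))))  # interior segment index
    t = u - i                                   # local coordinate; edges extrapolate
    return catmull_rom(cps[i - 1], cps[i], cps[i + 1], cps[i + 2], t)
```

Because the curve interpolates its control points and has linear precision, initializing `cps` from samples of a conventional sigmoid recovers that sigmoid exactly, after which learning deforms it freely.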
Multilayer perceptrons (MLPs) with weight values restricted to powers of two or sums of powers of two are introduced. In a digital implementation, these neural networks do not need multipliers but only shift registers when computing in forward mode, thus saving chip area and computation time. A learning procedure, based on backpropagation, is presented for …
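The multiplier-free forward mode can be seen in a toy integer sketch (illustrative only, with made-up names; a sum-of-powers-of-two weight simply contributes one shift-and-add per term):

```python
def po2_dot(xs, exps):
    """Dot product of integer inputs `xs` with weights 2**e for e in `exps`,
    computed with shifts instead of multiplications.  Negative exponents
    use an arithmetic right shift, which truncates the product."""
    acc = 0
    for x, e in zip(xs, exps):
        acc += (x << e) if e >= 0 else (x >> -e)
    return acc
```

For instance, `po2_dot([3, 5], [2, -1])` computes (3 << 2) + (5 >> 1) = 14, versus the exact 3*4 + 5*0.5 = 14.5; the truncation from the right shift is one source of the quantization effects the learning procedure must cope with.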