Aurelio Uncini

This paper focuses on on-line learning procedures for locally recurrent neural networks, with emphasis on the multilayer perceptron (MLP) with infinite impulse response (IIR) synapses and its variations, which include generalized output and activation feedback multilayer networks (MLNs). We propose a new gradient-based procedure called recursive backpropagation …
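As a hedged illustration of what an IIR synapse computes (the recursive-backpropagation training procedure itself is not sketched here), the synapse can be written as a direct-form difference equation; the function name and the normalized `a[0] = 1` convention below are assumptions for illustration, not details from the paper:

```python
def iir_synapse(x, b, a):
    """IIR synapse output: y[n] = sum_k b[k]*x[n-k] - sum_{k>=1} a[k]*y[n-k],
    assuming a[0] = 1 (normalized). The feedback through past outputs y is
    what makes a network built from such synapses locally recurrent."""
    y = []
    for n in range(len(x)):
        # feed-forward (moving-average) part
        acc = sum(b[k] * x[n - k] for k in range(len(b)) if n - k >= 0)
        # feedback (autoregressive) part
        acc -= sum(a[k] * y[n - k] for k in range(1, len(a)) if n - k >= 0)
        y.append(acc)
    return y
```

For example, with `b = [1.0]` and `a = [1.0, -0.5]` the impulse response decays geometrically, showing the infinite memory a single synapse contributes.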
This paper describes the salient features of using a simulated annealing (SA) algorithm in the context of designing digital filters with coefficient values expressed as sums of powers of two. A procedure for linear-phase digital filter design using this algorithm is first presented and tested, yielding results as good as known optimal methods. The …
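A generic simulated annealing loop, for orientation only, looks as follows; the cooling schedule, move set, and function names are illustrative assumptions, not the paper's actual design procedure:

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, alpha=0.99, iters=2000, seed=0):
    """Minimal SA sketch: always accept improving moves, accept worsening
    moves with probability exp(-delta/T), and cool T geometrically."""
    rng = random.Random(seed)
    x, c = x0, cost(x0)
    best_x, best_c = x, c
    t = t0
    for _ in range(iters):
        y = neighbor(x, rng)
        cy = cost(y)
        if cy < c or rng.random() < math.exp(-(cy - c) / t):
            x, c = y, cy
            if c < best_c:
                best_x, best_c = x, c
        t *= alpha  # geometric cooling
    return best_x, best_c
```

In the filter-design setting, `neighbor` would perturb one quantized coefficient and `cost` would measure the frequency-response error; here both are left abstract.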
Multilayer perceptrons (MLPs) with weight values restricted to powers of two or sums of powers of two are introduced. In a digital implementation, these neural networks do not need multipliers but only shift registers when computing in forward mode, thus saving chip area and computation time. A learning procedure, based on backpropagation, is presented for …
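To make the weight restriction concrete, here is a minimal sketch of greedy residual quantization of a real weight to a sum of signed powers of two (the function names and the number of terms are illustrative assumptions); multiplying by such a weight then reduces to shifts and adds in hardware:

```python
import math

def nearest_power_of_two(x):
    """Closest signed power of two to x (zero maps to zero)."""
    if x == 0:
        return 0.0
    sign = 1.0 if x > 0 else -1.0
    return sign * 2.0 ** round(math.log2(abs(x)))

def quantize_sum_of_powers(x, terms=2):
    """Greedy residual quantization: approximate x by `terms` signed
    powers of two, repeatedly subtracting the closest power of two."""
    total, residual = 0.0, x
    for _ in range(terms):
        p = nearest_power_of_two(residual)
        total += p
        residual -= p
    return total
```

With two terms, 0.6 quantizes to 0.5 + 0.125 = 0.625, i.e. a right-shift by one plus a right-shift by three.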
In this paper, neural networks based on an adaptive nonlinear function, suitable for both blind complex time-domain signal separation and blind frequency-domain signal deconvolution, are presented. This activation function, whose shape is modified during learning, is based on a pair of spline functions, one for the real and one for the imaginary part of …
This paper introduces a novel independent component analysis (ICA) approach to the separation of nonlinear convolutive mixtures. The proposed model is an extension of the well-known post nonlinear (PNL) mixing model and consists of the convolutive mixing of PNL mixtures. Theoretical proof of existence and uniqueness of the solution under proper assumptions …
In this paper, a new complex-valued neural network based on adaptive activation functions is proposed. By varying the control points of a pair of Catmull–Rom cubic splines, which are used as an adaptable activation function, this new kind of neural network can be implemented as a very simple structure that is able to improve the generalization capabilities …
In this paper, a new adaptive spline activation function neural network (ASNN) is presented. Due to the ASNN's high representation capabilities, networks with a small number of interconnections can be trained to solve both real-time pattern recognition and data processing problems. The main idea is to use a Catmull-Rom cubic spline as the neuron's …
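A hedged sketch of how such an activation can be evaluated follows: a uniform Catmull-Rom spline through adaptable control-point ordinates, with the input mapped to a span index and a local coordinate. The sampling interval, function name, and clamping behavior are illustrative assumptions, not the ASNN's exact construction:

```python
import numpy as np

def catmull_rom_activation(x, control_y, x_min=-2.0, x_max=2.0):
    """Evaluate a uniform Catmull-Rom spline through control ordinates
    control_y (the adaptable parameters). Each interior span needs four
    consecutive control points; inputs outside [x_min, x_max] are clamped."""
    n = len(control_y)
    span = (x_max - x_min) / (n - 3)  # width of one interior span
    t = float(np.clip((x - x_min) / span, 0.0, (n - 3) - 1e-9))
    i = int(t)        # span index
    u = t - i         # local coordinate in [0, 1)
    p0, p1, p2, p3 = control_y[i:i + 4]
    # standard Catmull-Rom basis
    return 0.5 * ((2 * p1)
                  + (-p0 + p2) * u
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * u ** 2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * u ** 3)
```

Because Catmull-Rom splines interpolate their control points, initializing `control_y` from samples of a sigmoid gives a conventional activation that learning can then reshape by moving the ordinates.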
In this paper a new class of nonlinear adaptive filters, consisting of a linear combiner followed by a flexible memoryless function, is presented. The nonlinear function involved in the adaptation process is based on a spline function that can be modified during learning. The spline control points are adaptively changed using gradient-based techniques. …
In this paper, we consider the joint task of simultaneously optimizing (i) the weights of a deep neural network, (ii) the number of neurons for each hidden layer, and (iii) the subset of active input features (i.e., feature selection). While these problems are generally dealt with separately, we present a simple regularized formulation that allows solving all …
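One plausible ingredient for such a formulation, sketched here purely for illustration and not claimed to be the paper's exact regularizer, is a group-sparse (l2,1) penalty with one group per row of a weight matrix: driving a whole row to zero removes the corresponding input feature, and the same penalty on a hidden layer's outgoing weights prunes entire neurons.

```python
import numpy as np

def group_sparse_penalty(W, lam=1e-3):
    """Illustrative group-Lasso penalty: lam * sqrt(group size) * sum of
    the Euclidean norms of the rows of W. Unlike an elementwise L1 penalty,
    it zeroes rows as a unit, which is what enables structured pruning."""
    group_scale = np.sqrt(W.shape[1])  # standard group-size weighting
    return lam * group_scale * np.linalg.norm(W, axis=1).sum()
```

Added to the task loss, this term is minimized alongside the weights by the same gradient-based training loop, which is what lets one objective handle weights, layer widths, and feature selection together.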
Given a single sound source in a non-reverberant environment, an estimate of the Time Difference Of Arrival (TDOA) between microphones can be obtained by observing the time value at which the cross-correlation of the two microphone signals displays a maximum. In the presence of reverberation, however, the cross-correlation function displays a great number …
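The non-reverberant baseline described above can be sketched in a few lines; the function name and lag convention are assumptions for illustration:

```python
import numpy as np

def estimate_tdoa(sig_a, sig_b, fs):
    """Delay of sig_a relative to sig_b, in seconds, taken as the lag at
    which the full cross-correlation of the two microphone signals peaks."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    # index 0 of the full correlation corresponds to lag -(len(sig_b) - 1)
    lag = int(np.argmax(corr)) - (len(sig_b) - 1)
    return lag / fs
```

Under reverberation this single-peak picking fails in exactly the way the abstract describes: reflections add many competing correlation peaks, so the global maximum no longer reliably marks the direct-path delay.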