Paolo Campolucci

This paper focuses on on-line learning procedures for locally recurrent neural networks, with emphasis on the multilayer perceptron (MLP) with infinite impulse response (IIR) synapses and its variations, which include generalized output and activation feedback multilayer networks (MLNs). We propose a new gradient-based procedure called recursive backpropagation …
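The abstract gives no formulas, but the core idea is that each connection of a standard MLP is replaced by a small recursive filter rather than a scalar weight. Below is a minimal sketch of such a forward pass; the filter orders, coefficient names (b for the feedforward taps, a for the feedback taps), and the input signals are assumptions for illustration, not the paper's exact notation.

```python
import numpy as np

def iir_synapse(x, b, a):
    """One IIR synapse: y[n] = sum_k b[k]*x[n-k] + sum_j a[j]*y[n-1-j]."""
    y = np.zeros(len(x))
    for n in range(len(x)):
        acc = sum(b[k] * x[n - k] for k in range(len(b)) if n - k >= 0)
        acc += sum(a[j] * y[n - 1 - j] for j in range(len(a)) if n - 1 - j >= 0)
        y[n] = acc
    return y

# A neuron sums its synapse outputs and applies a static nonlinearity.
x1 = np.sin(0.3 * np.arange(50))
x2 = np.cos(0.7 * np.arange(50))
out = np.tanh(iir_synapse(x1, [0.5, 0.2], [0.3]) + iir_synapse(x2, [0.4], [0.1]))
```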
In this paper, a new complex-valued neural network based on adaptive activation functions is proposed. By varying the control points of a pair of Catmull–Rom cubic splines, which are used as an adaptable activation function, this new kind of neural network can be implemented as a very simple structure that is able to improve the generalization capabilities …
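As a rough illustration of the idea (not the paper's exact complex-valued construction), a Catmull-Rom spline activation is evaluated by locating the span containing the input and blending four neighbouring control points, which are the trainable parameters. The grid placement, boundary clipping, and initialization below are all assumptions of this sketch.

```python
import numpy as np

class SplineActivation:
    """Activation shaped by trainable Catmull-Rom control points (sketch)."""
    def __init__(self, x_min=-2.0, x_max=2.0, n_points=11):
        self.x_min = x_min
        self.dx = (x_max - x_min) / (n_points - 1)
        # Control points start on a sigmoid-like curve; training would
        # adapt them along with the network weights.
        self.q = np.tanh(np.linspace(x_min, x_max, n_points))

    def __call__(self, x):
        # Span index i and local coordinate u in [0, 1); inputs outside
        # the grid are clipped (one possible boundary policy).
        t = np.clip((x - self.x_min) / self.dx, 1.0, len(self.q) - 2.001)
        i, u = int(t), t - int(t)
        p0, p1, p2, p3 = self.q[i - 1:i + 3]
        return 0.5 * (2 * p1 + (-p0 + p2) * u
                      + (2 * p0 - 5 * p1 + 4 * p2 - p3) * u**2
                      + (-p0 + 3 * p1 - 3 * p2 + p3) * u**3)

phi = SplineActivation()
print(phi(0.4), phi(-3.0))   # -3.0 is clipped to the grid edge
```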
In this paper we derive two second-order algorithms, based on conjugate gradient, for on-line training of recurrent neural networks. These algorithms use two different techniques to extract second-order information on the Hessian matrix without calculating or storing it and without making numerical approximations. Several simulation results for non-linear …
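One well-known way to use exact curvature information without forming the Hessian is to compute Hessian-vector products directly. The sketch below does this on a linear least-squares toy problem, where Hv = Xᵀ(Xv)/N is exact, inside a standard conjugate-gradient loop; it illustrates only the principle the abstract mentions, not the paper's two algorithms for recurrent networks.

```python
import numpy as np

def grad(w, X, t):
    """Gradient of the mean squared error 0.5*||Xw - t||^2 / N."""
    return X.T @ (X @ w - t) / len(t)

def hvp(v, X):
    """Exact Hessian-vector product H v = X^T (X v) / N.

    The d x d Hessian is never formed or stored; two matrix-vector
    products suffice, and no numerical differencing is involved.
    """
    return X.T @ (X @ v) / len(X)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))
t = X @ rng.normal(size=50) + 0.1 * rng.normal(size=200)

w = np.zeros(50)
d = -grad(w, X, t)                       # initial search direction
for _ in range(20):
    g = grad(w, X, t)
    Hd = hvp(d, X)
    alpha = -(g @ d) / (d @ Hd)          # exact line minimum along d
    w += alpha * d
    g_new = grad(w, X, t)
    beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # Polak-Ribiere
    d = -g_new + beta * d
```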
This paper is focused on learning algorithms for dynamic multilayer perceptron neural networks in which each neuron synapse is modelled by an infinite impulse response (IIR) filter (IIR MLP). In particular, the Backpropagation Through Time (BPTT) algorithm and its less demanding approximated on-line versions are considered. In fact, it is known that the …
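The truncation idea behind the approximated on-line versions can be shown on a toy single recurrent neuron (names and signals here are assumptions; the paper's IIR MLP case is considerably more elaborate): the gradient recursion is simply cut off a fixed number of steps in the past.

```python
import numpy as np

# Toy recurrent neuron y[k] = tanh(w*x[k] + a*y[k-1]).
w, a = 0.8, 0.5
x = np.sin(0.2 * np.arange(100))
y = np.zeros(len(x))
for k in range(len(x)):
    y[k] = np.tanh(w * x[k] + a * (y[k - 1] if k > 0 else 0.0))

def truncated_dy_da(y, a, n, h):
    """d y[n]/d a with the recursion unrolled only h steps back.

    Full BPTT applies d y[k]/d a = (1 - y[k]**2) * (y[k-1] + a*d y[k-1]/d a)
    all the way to k = 0; truncation starts the recursion from zero
    h steps in the past, trading accuracy for constant on-line cost.
    """
    g = 0.0
    for k in range(max(0, n - h + 1), n + 1):   # oldest to newest
        y_prev = y[k - 1] if k > 0 else 0.0
        g = (1.0 - y[k] ** 2) * (y_prev + a * g)
    return g

print(truncated_dy_da(y, a, n=99, h=5), truncated_dy_da(y, a, n=99, h=50))
```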
This paper concerns dynamic neural networks for signal processing: architectural issues are considered, but the paper focuses on learning algorithms that work on-line. Locally recurrent neural networks, namely MLPs with IIR synapses and generalizations of Local Feedback Multi-Layered Networks (LF MLN), are compared to more traditional neural networks, i.e. …
In this paper, we derive a new general method for both on-line and off-line backward gradient computation of a system output, or cost function, with respect to system parameters, using a circuit-theoretic approach. The system can be any causal, in general non-linear and time-variant, dynamic system represented by a Signal Flow Graph, in particular any …
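A deliberately tiny, static (memoryless) illustration of the adjoint idea: reverse the branches of the graph and replace each branch by its local derivative, then sweep once backward. The graph, node names, and encoding below are hypothetical, and the paper's method also covers dynamic, time-variant branches, which this sketch does not.

```python
import math

# Toy static SFG: node value = sum of incoming branch outputs, each
# branch a one-input, one-output map, as in an electrical circuit.
branches = [            # (src, dst, f, dfdx), in topological order
    ("u", "a", math.sin, math.cos),
    ("a", "y", lambda v: v * v, lambda v: 2 * v),
    ("u", "y", lambda v: 3 * v, lambda v: 3.0),
]

def forward(u):
    val = {"u": u, "a": 0.0, "y": 0.0}
    for src, dst, f, _ in branches:
        val[dst] += f(val[src])
    return val

def backward(val):
    # Adjoint sweep: inject 1 at the output node, propagate through the
    # reversed branches weighted by their local derivatives.
    adj = {k: 0.0 for k in val}
    adj["y"] = 1.0
    for src, dst, _, dfdx in reversed(branches):
        adj[src] += dfdx(val[src]) * adj[dst]
    return adj

val = forward(0.7)
print(backward(val)["u"])   # dy/du = 2*sin(u)*cos(u) + 3
```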
In this paper we propose a new learning algorithm for locally recurrent neural networks, called Truncated Recursive Back Propagation, which can be easily implemented on-line with good performance. Moreover, it generalises the algorithm proposed by Waibel et al. for TDNNs, and includes the Back and Tsoi algorithm as well as BPS and standard on-line Back …
In this paper, we study the properties of neural networks based on adaptive spline activation functions (ASNN). Using the results of regularization theory, we show how the proposed architecture is able to produce smooth approximations of unknown functions; to reduce hardware complexity, a particular implementation of the kernels expected by the theory is …
A large class of nonlinear dynamic adaptive systems, such as dynamic recurrent neural networks, can be effectively represented by signal flow graphs (SFGs). By this method, complex systems are described as a general connection of many simple components, each of them implementing a simple one-input, one-output transformation, as in an electrical circuit. Even …
Neural networks with internal temporal dynamics can be applied to non-linear DSP problems. The classical fully connected recurrent architectures can be replaced by less complex neural networks based on the well-known MultiLayer Perceptron (MLP), where the temporal dynamic is modelled by replacing each synapse either with a FIR filter or with an IIR filter. …
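For the FIR case the synapse reduces to a plain causal convolution (the IIR case adds feedback, as in the sketch after the first abstract above). A minimal sketch with assumed tap values and input signals:

```python
import numpy as np

def fir_synapse(x, b):
    """Causal FIR synapse: y[n] = sum_k b[k] * x[n-k]."""
    return np.convolve(x, b)[:len(x)]

# Dynamic neuron: static nonlinearity over the sum of filtered inputs.
x1 = np.sin(0.1 * np.arange(64))
x2 = np.cos(0.25 * np.arange(64))
out = np.tanh(fir_synapse(x1, [0.4, 0.3, 0.1]) + fir_synapse(x2, [0.2, -0.1]))
```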