A Signal-Flow-Graph Approach to On-line Gradient Calculation
@article{Campolucci2000ASA,
  title={A Signal-Flow-Graph Approach to On-line Gradient Calculation},
  author={Paolo Campolucci and Aurelio Uncini and Francesco Piazza},
  journal={Neural Computation},
  year={2000},
  volume={12},
  pages={1901-1927}
}
A large class of nonlinear dynamic adaptive systems such as dynamic recurrent neural networks can be effectively represented by signal flow graphs (SFGs). By this method, complex systems are described as a general connection of many simple components, each of them implementing a simple one-input, one-output transformation, as in an electrical circuit. Even if graph representations are popular in the neural network community, they are often used for qualitative description rather than for…
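As an illustrative sketch only (not the paper's algorithm), the SFG view of a system as a connection of simple one-input, one-output components, with the gradient of the output with respect to an internal parameter obtained by a backward traversal of the graph, can be expressed as:

```python
import math

# Minimal sketch, assuming a plain chain-structured SFG: each component
# implements a one-input, one-output transformation, and sensitivities
# are propagated backwards through the graph (reverse mode).

class Gain:
    """Branch y = w * x; dy/dx = w, dy/dw = x."""
    def __init__(self, w):
        self.w = w
    def forward(self, x):
        self.x = x            # store the input for the backward pass
        return self.w * x
    def backward(self, dy):
        self.dw = dy * self.x  # gradient w.r.t. the branch parameter
        return dy * self.w     # gradient w.r.t. the branch input

class Tanh:
    """Nonlinear branch y = tanh(x)."""
    def forward(self, x):
        self.y = math.tanh(x)
        return self.y
    def backward(self, dy):
        return dy * (1.0 - self.y ** 2)

graph = [Gain(0.5), Tanh(), Gain(2.0)]

x = 1.0
for comp in graph:             # forward pass along the graph
    x = comp.forward(x)

dy = 1.0
for comp in reversed(graph):   # backward pass (adjoint direction)
    dy = comp.backward(dy)

print(graph[0].dw)  # dOutput/dw of the first gain, ~ 1.5729
```

The class names and the chain topology are assumptions for illustration; the paper's SFG framework handles general (including recurrent) interconnections.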
11 Citations
A general approach to gradient based learning in multirate systems and neural networks
- Computer Science · Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium
- 2000
A new algorithm to estimate the derivative of the output with respect to an internal parameter has been proposed in the literature for discrete-time systems and can be employed for gradient-based learning of general multirate circuits, such as the new "multirate" neural networks.
A novel signal flow graph based solution for calculation of first and second derivatives of dynamical nonlinear systems
- Mathematics · 2004 12th European Signal Processing Conference
- 2004
An attempt of solution by means of signal flow graph (SFG) techniques is proposed and the ability of the proposed method to determine all Hessian matrix entries in a complete automatic way is highlighted.
Learning in linear and nonlinear multirate digital systems by signal flow graphs
- Computer Science · ISCAS 2001. The 2001 IEEE International Symposium on Circuits and Systems (Cat. No.01CH37196)
- 2001
The proposed method is based on the concept of adjoint graph, which allows one to estimate the derivative of the output with respect to an internal parameter at different rates (time-instant), and can be employed for the gradient-based adaptation of general multirate circuits.
Higher-order differentiation of network functions using signal flow graphs
- Mathematics · Int. J. Circuit Theory Appl.
- 2012
The application of signal flow graphs (SFG) to the calculation of higher-order derivatives (sensitivities) of linear circuit functions is presented, showing that the SFG approach to sensitivity calculation greatly reduces the complexity of the calculations.
Calculation of non-mixed second derivatives in multirate systems through signal flow graph techniques
- Engineering · 2004 IEEE International Symposium on Circuits and Systems (IEEE Cat. No.04CH37512)
- 2004
The overall algorithm represents a useful tool for the determination of Jacobian- and Hessian-based information in learning systems, as was already done in other related but less general contributions in the literature.
Processing of audio signals by neural networks
- Computer Science
- 2003
In this tutorial, neural networks of low structural and computational complexity, suitable for real-time DSP applications, are presented, and several real-world audio applications are discussed.
A Neural Abstract Machine
- Computer Science · J. Univers. Comput. Sci.
- 2001
This work defines a parameterized Neural Abstract Machine (NAM) in such a way that the major neural networks in the literature can be described as natural extensions or refinements of the NAM.
Modelling in the Neural Abstract Machine Framework
- Computer Science
- 2001
This work proposes to use Abstract State Machines to define a Neural Abstract Machine, a machine able to manipulate the basic objects and functions which constitute the essence of neural computation.
Audio signal processing by neural networks
- Computer Science
- 2003
A review of architectures suitable for nonlinear real-time audio signal processing is presented, and applications in the fields of audio signal recovery, speech quality enhancement, and learning-based pseudo-physical sound synthesis are presented and discussed.
References
Showing 1-10 of 36 references
Signal-flow-graph derivation of on-line gradient learning algorithms
- Computer Science · Proceedings of International Conference on Neural Networks (ICNN'97)
- 1997
A new general method for backward gradient computation of a system output or cost function with respect to past (or present) system parameters is derived, using the signal-flow-graph representation and its known properties.
A general approach to gradient based learning in multirate systems and neural networks
- Computer Science · Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium
- 2000
A new algorithm to estimate the derivative of the output with respect to an internal parameter has been proposed in the literature for discrete-time systems and can be employed for gradient-based learning of general multirate circuits, such as the new "multirate" neural networks.
Dynamical systems learning by a circuit theoretic approach
- Computer Science, Mathematics · ISCAS '98. Proceedings of the 1998 IEEE International Symposium on Circuits and Systems (Cat. No.98CH36187)
- 1998
A new general method is derived for both on-line and off-line backward gradient computation of a system output, or cost function, with respect to system parameters, using a circuit theoretic approach.
Learning in linear and nonlinear multirate digital systems by signal flow graphs
- Computer Science · ISCAS 2001. The 2001 IEEE International Symposium on Circuits and Systems (Cat. No.01CH37196)
- 2001
The proposed method is based on the concept of adjoint graph, which allows one to estimate the derivative of the output with respect to an internal parameter at different rates (time-instant), and can be employed for the gradient-based adaptation of general multirate circuits.
A learning algorithm for analog, fully recurrent neural networks
- Computer Science · International 1989 Joint Conference on Neural Networks
- 1989
A learning algorithm for recurrent neural networks is derived. This algorithm allows a network to learn specified trajectories in state space in response to various input sequences. The network…
Signal flow graphs-Computer-aided system analysis and sensitivity calculations
- Computer Science
- 1974
The introduction of two matrices, the summing matrix S and the branching matrix B, which completely describe the topology of a signal flow graph leads to a formulation of system equations in terms of submatrices of the S- and B-matrices suitable for digital-computer programming.
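The cited paper's S- and B-matrix formulation is more refined, but the underlying idea of encoding an SFG's topology in a matrix and solving the node equations numerically can be sketched as follows (the transmittance matrix T below is an assumption for illustration, not the paper's exact submatrix structure):

```python
import numpy as np

# Illustrative sketch: for a linear SFG, the node signals x satisfy
# x = T @ x + b * u, where T[i, j] is the branch gain from node j to
# node i and u is the input. Solving (I - T) x = b u yields all node
# signals, including any feedback loops, in one linear solve.

T = np.array([[0.0, 0.0, 0.0],
              [2.0, 0.0, 0.3],   # node 1 -> node 2 (gain 2), node 3 -> node 2 (feedback 0.3)
              [0.0, 0.5, 0.0]])  # node 2 -> node 3 (gain 0.5)
b = np.array([1.0, 0.0, 0.0])    # input u enters at node 1

u = 1.0
x = np.linalg.solve(np.eye(3) - T, b * u)
print(x[2])  # signal at the output node, ~ 1.1765
```

This is the standard transmittance-matrix view of a linear SFG; the S (summing) and B (branching) matrices of the reference factor this topology description further for computer-aided analysis.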
An Efficient Gradient-Based Algorithm for On-Line Training of Recurrent Network Trajectories
- Computer Science · Neural Computation
- 1990
A novel variant of the familiar backpropagation-through-time approach to training recurrent networks is described. This algorithm is intended to be used on arbitrary recurrent networks that run…
Backpropagation Through Time: What It Does and How to Do It
- Computer Science, Mathematics
- 1990
This paper first reviews basic backpropagation, a simple method now widely used in areas like pattern recognition and fault diagnosis, and then describes extensions of the method to systems other than neural networks, to systems involving simultaneous equations or true recurrent networks, and to other practical issues that arise with it.
Back propagation through adjoints for the identification of nonlinear dynamic systems using recurrent neural models
- Computer Science · IEEE Trans. Neural Networks
- 1994
Backpropagation is reinvestigated for an efficient evaluation of the gradient in arbitrary interconnections of recurrent subsystems, and it is shown that backpropagating as many time steps as the order of the system is sufficient for convergence.
Gradient methods for the optimization of dynamical systems containing neural networks
- Computer Science · IEEE Trans. Neural Networks
- 1991
An extension of the backpropagation method, termed dynamic backpropagation, which can be applied in a straightforward manner for the optimization of the weights (parameters) of multilayer neural…