Multiscale optimization in neural nets: preliminary report

@inproceedings{Mjolsness1990MultiscaleOI,
  title={Multiscale optimization in neural nets: preliminary report},
  author={Eric Mjolsness and Charles Garrett and Willard L. Miranker},
  booktitle={1990 IJCNN International Joint Conference on Neural Networks},
  year={1990},
  pages={781--786 vol.3}
}
A multiscale optimization method for neural networks governed by quite general objective functions is presented. At the coarse scale, there is a smaller, approximating neural net. Like the original net, it is nonlinear and has a nonquadratic objective function, so the coarse-scale net is a more accurate approximation than a quadratic objective would be. The transitions and information flow from fine to coarse scale and back do not disrupt the optimization. The problem need not involve any…
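The two-scale scheme the abstract describes can be illustrated with a minimal gradient-descent sketch. This is not the paper's algorithm: the objective function, the restriction operator (pair averaging), and the prolongation operator (duplication) below are illustrative assumptions, chosen only to show fine-to-coarse transitions that preserve a nonquadratic objective on the smaller coarse net.

```python
def objective(x):
    # Illustrative nonquadratic objective (assumed, not from the paper):
    # a quartic double-well per unit plus a weak neighbor coupling.
    e = sum(0.25 * v**4 - 0.5 * v**2 for v in x)
    e += 0.1 * sum((x[i + 1] - x[i])**2 for i in range(len(x) - 1))
    return e

def grad(x):
    g = [v**3 - v for v in x]
    for i in range(len(x) - 1):
        d = 0.2 * (x[i + 1] - x[i])
        g[i + 1] += d
        g[i] -= d
    return g

def descend(x, g_fn, steps, lr):
    # Plain gradient descent at one scale.
    for _ in range(steps):
        g = g_fn(x)
        x = [v - lr * gv for v, gv in zip(x, g)]
    return x

def restrict(x):
    # Fine -> coarse: aggregate pairs of fine units into one coarse unit.
    return [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]

def prolong(c):
    # Coarse -> fine: duplicate each coarse unit.
    return [v for v in c for _ in range(2)]

def multiscale_optimize(x0, steps=50, lr=0.05):
    x = descend(x0, grad, steps, lr)          # pre-smoothing at fine scale
    c0 = restrict(x)                          # move state to the coarse net
    # The coarse net keeps the nonquadratic objective: its gradient is the
    # restricted fine gradient evaluated on the prolonged coarse state.
    coarse_grad = lambda c: restrict(grad(prolong(c)))
    c = descend(c0, coarse_grad, steps, lr)   # optimize the smaller coarse net
    dc = [cv - c0v for cv, c0v in zip(c, c0)]
    x = [xv + dv for xv, dv in zip(x, prolong(dc))]  # coarse correction
    return descend(x, grad, steps, lr)        # post-smoothing at fine scale

x0 = [0.1, 0.05, 0.2, 0.15, 0.1, 0.05, 0.3, 0.2]
x1 = multiscale_optimize(x0)
# the fine-scale objective decreases after the multiscale cycle
```

The round trip mirrors the abstract's claim: the coarse net is itself nonlinear with a nonquadratic objective (rather than a quadratic Taylor model), and the fine-to-coarse-to-fine transitions are plain descent steps, so they do not disrupt the overall optimization.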


Group updates and multiscaling: an efficient neural network approach to combinatorial optimization
TLDR
Experimental results indicate that the multiscale approach is very effective in exploring the state-space of the problem and providing feasible solutions of acceptable quality, while at the same time offering a significant acceleration.
The design of neural networks using a priori knowledge
TLDR
It is shown that using a priori knowledge for the design of neural networks helps to solve some basic difficulties encountered in practice: inefficient training and poor generalization of neural networks.

References

Algebraic transformations of objective functions
Neurons with graded response have collective computational properties like those of two-state neurons.
  • J. Hopfield
  • Biology
    Proceedings of the National Academy of Sciences of the United States of America
  • 1984
TLDR
A model for a large network of "neurons" with a graded response (or sigmoid input-output relation) is studied; its collective properties are in very close correspondence with those of the earlier stochastic model based on McCulloch–Pitts neurons.
Acceleration by aggregation of successive approximation methods
Statistical Coding and Short-Term Synaptic Plasticity: A Scheme for Knowledge Representation in the Brain
TLDR
It is suggested that the scheme of a synaptic pattern may be more adapted than the classical cell-assembly notion for explaining cognitive abilities such as generalization and categorization, which pertain to the notion of invariance.
Collective Computation With Continuous Variables
TLDR
The model on which collective computation in neural networks was originally shown to produce a content-addressable memory was based on neurons with two discrete states; the role played by exchange in the magnetic system is played by the connections between neurons in the biological system.