The geometrical learning of binary neural networks

Abstract

In this paper, a learning algorithm called expand-and-truncate learning (ETL) is proposed to train multilayer binary neural networks (BNNs) with guaranteed convergence for any binary-to-binary mapping. The most significant contribution of this paper is the development of a learning algorithm for three-layer BNNs that guarantees convergence while automatically determining the required number of neurons in the hidden layer. Furthermore, the proposed ETL algorithm learns much faster than the backpropagation learning algorithm in a binary field. Neurons in the proposed BNN employ a hard-limiter activation function, with only integer weights and integer thresholds. This greatly facilitates hardware implementation of the proposed BNN using currently available digital VLSI technology.
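The neuron model described in the abstract can be sketched as follows. This is a minimal illustration of a hard-limiter neuron with integer weights and an integer threshold; the function names and the parameter values (realizing a 2-input AND) are hypothetical choices for exposition, not taken from the paper.

```python
def hard_limiter(net: int) -> int:
    """Hard-limiter activation: 1 if the net input is nonnegative, else 0."""
    return 1 if net >= 0 else 0

def binary_neuron(inputs, weights, threshold):
    """One BNN neuron: binary inputs/output, integer weights, integer threshold.

    Fires (outputs 1) exactly when the integer weighted sum meets the threshold,
    so the whole computation stays in integer arithmetic -- the property that
    makes digital VLSI implementation straightforward.
    """
    net = sum(w * x for w, x in zip(weights, inputs)) - threshold
    return hard_limiter(net)

# Illustrative parameters: weights (1, 1) and threshold 2 realize a 2-input AND.
assert binary_neuron((1, 1), (1, 1), 2) == 1
assert binary_neuron((1, 0), (1, 1), 2) == 0
```

Because every quantity is an integer and the activation is a simple comparison, such a neuron maps directly onto adders and comparators in digital hardware, which is the implementation advantage the abstract points to.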

DOI: 10.1109/72.363432


87 Citations

Semantic Scholar estimates that this publication has 87 citations based on the available data.


Cite this paper

@article{Kim1995TheGL,
  title={The geometrical learning of binary neural networks},
  author={Jung H. Kim and Sung-Kwon Park},
  journal={IEEE Transactions on Neural Networks},
  year={1995},
  volume={6},
  number={1},
  pages={237-247}
}