An Optimal Parallel Perceptron Learning Algorithm for a Large Training Set

Abstract

In [2], a parallel perceptron learning algorithm on the single-channel broadcast communication model was proposed to speed up the learning of the weights of perceptrons [3]. The results in [2] showed that, given n training examples, the average speedup is about 1.48n/log n with n processors. Here, we explain how the parallelization may be modified so that it is applicable…
DOI: 10.1016/S0167-8191(06)80017-7
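As context only, the sketch below shows the sequential perceptron learning rule [3] whose per-example misclassification checks are the work that a parallel scheme such as the one in [2] distributes across processors. The function name, toy data, and epoch limit are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch of the classical (sequential) perceptron learning rule
# that the paper parallelizes; the names and toy data are illustrative
# assumptions, not taken from the paper.
import numpy as np

def perceptron_train(examples, labels, max_epochs=100):
    """Learn a weight vector w so that sign(w . x) matches each label (+1/-1)."""
    n, d = examples.shape
    w = np.zeros(d)
    for _ in range(max_epochs):
        updated = False
        for x, y in zip(examples, labels):
            # Misclassified example: move the weights toward (label +1) or
            # away from (label -1) the input vector x.
            if y * np.dot(w, x) <= 0:
                w = w + y * x
                updated = True
        if not updated:  # all n training examples classified correctly
            return w
    return w

# Toy usage: a linearly separable set of 4 points in 2-D, with the bias
# folded in as a constant third input.
X = np.array([[1.0, 2.0, 1.0], [2.0, 3.0, 1.0], [-1.0, -1.5, 1.0], [-2.0, -1.0, 1.0]])
y = np.array([1, 1, -1, -1])
w = perceptron_train(X, y)
print("learned weights:", w)
```

Each pass over the data touches all n examples, and it is this per-iteration cost over a large training set that the broadcast-model parallelization is meant to reduce.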

Cite this paper

@article{Hong1994AnOP,
  title   = {An Optimal Parallel Perceptron Learning Algorithm for a Large Training Set},
  author  = {Tzung-Pei Hong and Shian-Shyong Tseng},
  journal = {Parallel Computing},
  year    = {1994},
  volume  = {20},
  pages   = {347--352}
}