Parallel growing and training of neural networks using output parallelism

  title={Parallel growing and training of neural networks using output parallelism},
  author={Steven Guan and Shanchun Li},
  journal={IEEE Transactions on Neural Networks},
  volume={13},
  number={3},
To find an appropriate architecture for a large-scale real-world application automatically and efficiently, a natural approach is to divide the original problem into a set of subproblems. In this paper, we propose a simple neural-network task decomposition method based on output parallelism. With this method, a problem can be divided flexibly into any chosen number of subproblems, each composed of the whole input vector and a fraction of the output vector. Each module (for…
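The decomposition described above can be sketched in a few lines. This is a minimal illustration under simplifying assumptions, not the paper's method: the constructively grown sub-networks are replaced here by plain linear modules trained with gradient descent, and the function names (`split_outputs`, `train_module`) are illustrative. The point it demonstrates is the core idea of output parallelism: every module sees the full input, owns only a slice of the output vector, and is trained independently of the others, so the modules can be trained in parallel and their outputs concatenated to recover the full solution.

```python
import numpy as np

def split_outputs(Y, k):
    """Split the output matrix column-wise into k subproblems (output parallelism).
    Each subproblem keeps the whole input but only a fraction of the outputs."""
    return np.array_split(Y, k, axis=1)

def train_module(X, Y_sub, epochs=500, lr=0.1):
    """Train one independent linear module on the full input and its output slice.
    (A stand-in for the paper's constructively grown sub-network.)"""
    W = np.zeros((X.shape[1], Y_sub.shape[1]))
    for _ in range(epochs):
        grad = X.T @ (X @ W - Y_sub) / len(X)  # mean-squared-error gradient
        W -= lr * grad
    return W

# Toy problem: 4 inputs, 3 outputs, split into 3 single-output modules.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
Y = X @ rng.normal(size=(4, 3))

# Each module is trained independently; this loop could run in parallel.
modules = [train_module(X, Y_sub) for Y_sub in split_outputs(Y, 3)]

# Recombine the sub-solutions into the full output vector.
Y_hat = np.hstack([X @ W for W in modules])
print("max abs error:", np.abs(Y_hat - Y).max())
```

Because no module depends on another's weights or outputs, the list comprehension over `train_module` calls is trivially parallelizable, e.g. with `multiprocessing.Pool.map`.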
This paper has 64 citations.