Parallel growing and training of neural networks using output parallelism

@article{Guan2002ParallelGA,
  title={Parallel growing and training of neural networks using output parallelism},
  author={S. Guan and Shanchun Li},
  journal={IEEE Transactions on Neural Networks},
  year={2002},
  volume={13},
  number={3},
  pages={542-550}
}
  • S. Guan, Shanchun Li
  • Published 2002
  • Medicine, Computer Science
  • IEEE Transactions on Neural Networks
  • In order to find an appropriate architecture for a large-scale real-world application automatically and efficiently, a natural method is to divide the original problem into a set of subproblems. In this paper, we propose a simple neural-network task decomposition method based on output parallelism. By using this method, a problem can be divided flexibly into several subproblems as chosen, each of which is composed of the whole input vector and a fraction of the output vector. Each module (for…
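
A minimal sketch of the output-parallelism idea summarized in the abstract: the output vector is split into disjoint fractions, and each module maps the full input vector to its own fraction, so the modules can be trained independently of one another (and therefore in parallel). The PyTorch framework, the fixed two-layer module, and the contiguous output split below are illustrative assumptions; the paper's constructive growing procedure is not reproduced here.

import torch
import torch.nn as nn

def split_outputs(num_outputs, num_modules):
    # Partition output indices into contiguous, roughly equal groups.
    sizes = [num_outputs // num_modules + (1 if i < num_outputs % num_modules else 0)
             for i in range(num_modules)]
    groups, start = [], 0
    for s in sizes:
        groups.append(list(range(start, start + s)))
        start += s
    return groups

class OutputModule(nn.Module):
    # One sub-network: full input vector -> one fraction of the output vector.
    def __init__(self, in_dim, out_dim, hidden=16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hidden), nn.Sigmoid(),
                                 nn.Linear(hidden, out_dim))
    def forward(self, x):
        return self.net(x)

def train_modules(X, Y, num_modules=3, epochs=200, lr=0.1):
    groups = split_outputs(Y.shape[1], num_modules)
    modules = []
    for idx in groups:                      # each loop iteration is independent,
        m = OutputModule(X.shape[1], len(idx))  # so modules can run in parallel
        opt = torch.optim.SGD(m.parameters(), lr=lr)
        loss_fn = nn.MSELoss()
        for _ in range(epochs):
            opt.zero_grad()
            loss = loss_fn(m(X), Y[:, idx])  # fit only this fraction of the outputs
            loss.backward()
            opt.step()
        modules.append((idx, m))
    return modules

def predict(modules, X):
    # Reassemble the full output vector from the modules' partial predictions.
    out = torch.empty(X.shape[0], sum(len(idx) for idx, _ in modules))
    for idx, m in modules:
        out[:, idx] = m(X).detach()
    return out

if __name__ == "__main__":
    torch.manual_seed(0)
    X = torch.rand(128, 8)
    Y = torch.rand(128, 6)                  # 6 outputs split across 3 modules
    mods = train_modules(X, Y)
    print(predict(mods, X).shape)           # torch.Size([128, 6])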
