From Maxout to Channel-Out: Encoding Information on Sparse Pathways

@inproceedings{Wang2014FromMT,
  title={From Maxout to Channel-Out: Encoding Information on Sparse Pathways},
  author={Q. Wang and J. J{\'a}J{\'a}},
  booktitle={ICANN},
  year={2014}
}
  • Motivated by an important insight from neuroscience, we propose a new framework for understanding the success of the recently proposed "maxout" networks. The framework is based on encoding information on sparse pathways and recognizing the correct pathway at inference time. Elaborating further on this insight, we propose a novel deep network architecture, called the "channel-out" network, which takes much better advantage of sparse pathway encoding. In channel-out networks, pathways are not…
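
The two activations contrasted in the abstract can be sketched briefly. This is a minimal NumPy illustration under my own reading of the abstract, not the paper's reference implementation: maxout collapses each group of k units to its maximum, while channel-out keeps the winning value in place and zeroes the rest, so the identity of the winning channel (the "pathway") is preserved for the next layer. Function names and the grouping convention are illustrative assumptions.

```python
import numpy as np

def maxout(x, k):
    # x: (batch, features), features divisible by k.
    # Maxout keeps only the max over each group of k units,
    # collapsing the group to a single output unit.
    b, f = x.shape
    return x.reshape(b, f // k, k).max(axis=2)

def channel_out(x, k):
    # Channel-out (as sketched here) keeps the max value *in place*
    # within each group of k channels and zeroes the others, so the
    # group width is preserved and the position of the winner encodes
    # which sparse pathway was taken. Ties pass multiple channels.
    b, f = x.shape
    g = x.reshape(b, f // k, k)
    mask = g == g.max(axis=2, keepdims=True)
    return (g * mask).reshape(b, f)
```

For example, with groups of k=2, the input [1, 3, 2, 0] maps to [3, 2] under maxout but to [0, 3, 2, 0] under channel-out: both select the same winners, but only channel-out records where they were.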
    14 Citations
    • Batch-normalized Maxout Network in Network
    • New architectures for very deep learning
    • A survey of regularization strategies for deep models
    • Dissecting the Winning Solution of the HiggsML Challenge

    References

    • Sparse deep belief net model for visual area V2
    • An Analysis of Single-Layer Networks in Unsupervised Feature Learning
    • Discriminative Learning of Sum-Product Networks
    • Sparse Feature Learning for Deep Belief Networks
    • Learning Deep Architectures for AI
    • Maxout Networks
    • Regularization of Neural Networks using DropConnect
    • Learning Smooth Pooling Regions for Visual Recognition