A compact network learning model for distribution regression
@article{Kou2019ACN,
  title   = {A compact network learning model for distribution regression},
  author  = {Connie Khor Li Kou and Hwee Kuan Lee and T. K. Ng},
  journal = {Neural Networks},
  year    = {2019},
  volume  = {110},
  pages   = {199--212}
}
Despite the superior performance of deep learning in many applications, challenges remain in the area of regression on function spaces. In particular, neural networks are unable to encode function inputs compactly, as each node encodes just a real value. We propose a novel idea to address this shortcoming: to encode an entire function in a single network node. To that end, we design a compact network representation that encodes and propagates functions in single nodes for the distribution…
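The shortcoming the abstract describes can be made concrete with a short sketch. A conventional network must flatten a distribution input into many real-valued nodes (one per discretization bin), whereas the paper's idea is to hold the entire distribution in one node. The `DistributionNode` class below is a hypothetical illustration of that single-node encoding, not the authors' actual implementation; the bin count `q` and the toy layer sizes are assumptions for the example.

```python
import numpy as np

# Conventional encoding (the shortcoming): a distribution input is
# discretized into q bins, consuming q input nodes, one real value each.
q = 100
xs = np.linspace(-5.0, 5.0, q)
pdf = np.exp(-xs**2 / 2.0)
pdf /= pdf.sum()  # discretized standard normal: q real values

# A fully connected layer then needs a q-dimensional weight vector per
# hidden unit just to read one function input.
rng = np.random.default_rng(0)
W = rng.standard_normal((10, q))  # 10 hidden units -> 10*q parameters
hidden = np.tanh(W @ pdf)

# In contrast, a compact representation stores the whole distribution in
# a single node (hypothetical container, for illustration only).
class DistributionNode:
    def __init__(self, support, probs):
        self.support = support            # bin centers
        self.probs = probs / probs.sum()  # one node = one entire pdf

    def mean(self):
        return float(self.support @ self.probs)

node = DistributionNode(xs, pdf)
print(node.mean())  # close to 0 for this symmetric pdf
```

The point of the contrast is parameter count: the conventional layer spends `10 * q` weights reading a single function input, while the single-node view keeps the function intact as one object that the network can propagate.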
8 Citations
- Predicting time-varying distributions with limited training data (2018)
- Theoretical and experimental analysis on the generalizability of distribution regression network (Neurocomputing, 2020)
- Enhancing Transformation-based Defenses using a Distribution Classifier (arXiv, 2019)
- Enhancing Transformation-Based Defenses Against Adversarial Attacks with a Distribution Classifier (ICLR, 2020)
- Anomaly Detection at Scale: The Case for Deep Distributional Time Series Models (arXiv, 2020)
- Machine Learning for Observables: Reactant to Product State Distributions for Atom-Diatom Collisions (The Journal of Physical Chemistry A, 2020)