Multi-Activation Hidden Units for Neural Networks with Random Weights
@article{Patrikar2020MultiActivationHU,
  title={Multi-Activation Hidden Units for Neural Networks with Random Weights},
  author={A. Patrikar},
  journal={ArXiv},
  year={2020},
  volume={abs/2009.08932}
}
Single layer feedforward networks with random weights are successful in a variety of classification and regression problems. These networks are known for their non-iterative and fast training algorithms. A major drawback of these networks is that they require a large number of hidden units. In this paper, we propose the use of multi-activation hidden units. Such units increase the number of tunable parameters and enable formation of complex decision surfaces, without increasing the number of…
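The idea in the abstract can be sketched in code. Below is a minimal, illustrative NumPy implementation of a random-weight network whose hidden units each emit several activations of the same random pre-activation, so the linear readout sees more tunable parameters without more hidden units. The function names, the choice of activations (tanh, ReLU, sigmoid), and the least-squares readout are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def _multi_activation_features(Z, n_act):
    # Each hidden unit contributes several activations of the same
    # pre-activation, multiplying readout features without adding units.
    acts = [np.tanh(Z), np.maximum(Z, 0.0), 1.0 / (1.0 + np.exp(-Z))]
    return np.concatenate(acts[:n_act], axis=1)

def fit_multi_activation_rwnn(X, y, n_hidden=50, n_act=3):
    """Sketch of a random-weight network with multi-activation hidden units.

    Input weights stay random and untrained; only the output weights are
    solved, non-iteratively, by least squares over the expanded features.
    """
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random, fixed
    b = rng.standard_normal(n_hidden)
    H = _multi_activation_features(X @ W + b, n_act)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)     # non-iterative fit
    return W, b, beta

def predict_multi_activation_rwnn(X, W, b, beta, n_act=3):
    H = _multi_activation_features(X @ W + b, n_act)
    return H @ beta

# Usage: fit a toy regression problem
X = rng.standard_normal((200, 4))
y = np.sin(X).sum(axis=1)
W, b, beta = fit_multi_activation_rwnn(X, y)
pred = predict_multi_activation_rwnn(X, W, b, beta)
```

With `n_act=3`, the readout solves for `3 * n_hidden` output weights instead of `n_hidden`, which is the parameter-count gain the abstract describes.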