Corpus ID: 221802368

Multi-Activation Hidden Units for Neural Networks with Random Weights

A. Patrikar
Single-layer feedforward networks with random weights are successful in a variety of classification and regression problems and are known for their fast, non-iterative training algorithms. A major drawback of these networks is that they require a large number of hidden units. In this paper, we propose the use of multi-activation hidden units. Such units increase the number of tunable parameters and enable the formation of complex decision surfaces, without increasing the number of…
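The training scheme the abstract describes — fixed random input weights plus a closed-form solve for the output weights — can be sketched as below. This is a minimal illustration, not the paper's exact formulation: all names and the ridge regularizer are assumptions, and the multi-activation idea is approximated by letting each hidden unit emit one feature per activation function, so the enlarged hidden layer still trains non-iteratively by least squares.

```python
import numpy as np

def random_weight_net(X, y, n_hidden=100, activations=(np.tanh,),
                      reg=1e-6, seed=0):
    """Fit a single-layer network with random, frozen input weights.

    Each hidden unit's pre-activation is passed through every function in
    `activations`, so the feature matrix has n_hidden * len(activations)
    columns; only the output weights are learned, in closed form.
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.normal(size=n_hidden)                # random biases
    Z = X @ W + b
    H = np.concatenate([f(Z) for f in activations], axis=1)
    # Non-iterative training: ridge-regularized least squares for output weights.
    beta = np.linalg.solve(H.T @ H + reg * np.eye(H.shape[1]), H.T @ y)
    return W, b, beta

def predict(X, W, b, beta, activations=(np.tanh,)):
    Z = X @ W + b
    H = np.concatenate([f(Z) for f in activations], axis=1)
    return H @ beta
```

Passing several activations (e.g. tanh plus a rectifier) multiplies the tunable output parameters per hidden unit without adding hidden units, which is the trade-off the abstract points at; with a single activation the sketch reduces to the standard random-weight network.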

References

  • W. Schmidt, M. Kraaijveld, R. Duin. Feedforward neural networks with random weights. Proceedings, 11th IAPR International Conference on Pattern Recognition, Vol. II, Conference B: Pattern Recognition Methodology and Systems, 1992.
  • A review on neural networks with random weights.
  • Universal approximation using incremental constructive feedforward networks with random hidden nodes.
  • A survey of randomized algorithms for training neural networks.
  • A comprehensive evaluation of random vector functional link networks.
  • Convex incremental extreme learning machine.
  • G. Huang, Qin-Yu Zhu, C. Siew. Extreme learning machine: a new learning scheme of feedforward neural networks. 2004 IEEE International Joint Conference on Neural Networks (IEEE Cat. No.04CH37541), 2004.
  • Mark Harmon, D. Klabjan. Activation Ensembles for Deep Neural Networks. 2019 IEEE International Conference on Big Data (Big Data), 2019.
  • Randomness in neural networks: an overview.
  • Extreme learning machine: Theory and applications.