A non-iterative method for pruning hidden neurons in neural networks with random weights

Pablo A. Henríquez, Gonzalo A. Ruz

Research output: Contribution to journal › Article › peer-review

42 Scopus citations


Neural networks with random weights have the advantage of fast computational time in both training and testing. However, one of the main challenges of single-layer feedforward neural networks is the selection of the optimal number of neurons in the hidden layer, since too few neurons lead to underfitting and too many to overfitting. Adapting Garson's algorithm, this paper introduces a new, efficient, and fast non-iterative algorithm for the selection of neurons in the hidden layer of randomization-based neural networks. The proposed approach is divided into three steps: (1) train the network with h hidden neurons, (2) apply Garson's algorithm to the matrix of the hidden layer, and (3) prune hidden layer neurons based on the harmonic mean. Our experiments on regression and classification problems confirmed that combining the pruning technique with these types of neural networks improved their predictive performance in terms of mean square error and accuracy. Additionally, we tested our proposed pruning method with neural networks trained under sequential learning algorithms, where the Random Vector Functional Link network obtained, in general, the best predictive performance compared to online sequential versions of extreme learning machines and single-hidden-layer neural networks with random weights.
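The three steps described in the abstract can be sketched in code. The sketch below is a minimal, hypothetical reading of the approach, not the paper's implementation: it assumes a tanh hidden layer with fixed random weights, output weights solved by least squares, a Garson-style importance score computed from normalized absolute weight contributions, and pruning of neurons whose importance falls below the harmonic mean of all importances.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative only).
X = rng.normal(size=(200, 5))
y = (np.sin(X[:, 0]) + 0.5 * X[:, 1]).reshape(-1, 1)

# Step 1: train a single-hidden-layer network with h random hidden neurons.
h = 50
W = rng.normal(size=(X.shape[1], h))   # random input weights (kept fixed)
b = rng.normal(size=h)                 # random biases (kept fixed)
H = np.tanh(X @ W + b)                 # hidden-layer output matrix
beta = np.linalg.pinv(H) @ y           # output weights via least squares

# Step 2: Garson-style relative importance of each hidden neuron
# (our assumed adaptation: absolute weight contributions through the
# network, normalized per input and then across all neurons).
contrib = np.abs(W) * np.abs(beta).sum(axis=1)          # (inputs x h)
contrib = contrib / contrib.sum(axis=1, keepdims=True)  # normalize per input
importance = contrib.sum(axis=0) / contrib.sum()        # sums to 1 over neurons

# Step 3: prune neurons whose importance is below the harmonic mean,
# then re-solve the output weights for the surviving neurons.
hmean = h / np.sum(1.0 / importance)
keep = importance >= hmean
beta_pruned = np.linalg.pinv(H[:, keep]) @ y

print(f"kept {keep.sum()} of {h} hidden neurons")
```

Because the harmonic mean never exceeds the maximum importance, at least one neuron always survives; the whole procedure stays non-iterative, requiring only one extra least-squares solve after pruning.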

Original language: English
Pages (from-to): 1109-1121
Number of pages: 13
Journal: Applied Soft Computing Journal
State: Published - Sep 2018
Externally published: Yes


Keywords:
  • Classification
  • Garson's algorithm
  • Neural networks
  • Non-iterative learning
  • Pruning
  • Random weights
  • Regression


