A non-iterative method for pruning hidden neurons in neural networks with random weights

Pablo A. Henríquez, Gonzalo A. Ruz

Research output: Contribution to a journal › Article › peer-review

42 Citations (Scopus)


Neural networks with random weights have the advantage of fast computational time in both training and testing. However, one of the main challenges of single-layer feedforward neural networks is the selection of the optimal number of neurons in the hidden layer, since too few/too many neurons lead to underfitting/overfitting. Adapting Garson's algorithm, this paper introduces a new, efficient, and fast non-iterative algorithm for the selection of hidden-layer neurons in randomization-based neural networks. The proposed approach is divided into three steps: (1) train the network with h hidden neurons, (2) apply Garson's algorithm to the matrix of the hidden layer, and (3) prune the hidden layer, reducing its neurons based on the harmonic mean. Our experiments on regression and classification problems confirmed that combining the pruning technique with these types of neural networks improved their predictive performance in terms of mean square error and accuracy. Additionally, we tested the proposed pruning method with neural networks trained under sequential learning algorithms, where the Random Vector Functional Link network obtained, in general, the best predictive performance compared to online sequential versions of extreme learning machines and the single-hidden-layer neural network with random weights.
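The three steps in the abstract can be sketched in Python. This is a minimal, hypothetical reading of the procedure, not the authors' implementation: the random-weight network is an ELM-style least-squares fit, the Garson-style importance score (the product of absolute input-to-hidden and hidden-to-output weight magnitudes, normalized) is an assumed adaptation, and the harmonic mean of those scores is used as the pruning threshold as the abstract suggests.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_rwnn(X, y, h):
    # Step 1: train a network with h hidden neurons.
    # Input weights and biases are random and fixed; only the
    # output weights are solved by least squares.
    W = rng.standard_normal((X.shape[1], h))
    b = rng.standard_normal(h)
    H = np.tanh(X @ W + b)            # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y      # output weights
    return W, b, beta

def garson_importance(W, beta):
    # Step 2: Garson-style relative importance of each hidden neuron
    # (assumed formulation: |input weights| summed per neuron times
    # |output weight|, normalized to sum to 1).
    c = np.abs(W).sum(axis=0) * np.abs(beta).ravel()
    return c / c.sum()

def prune_rwnn(X, y, h=50):
    # Step 3: keep neurons whose importance reaches the harmonic
    # mean of all importances, then re-solve the output weights.
    W, b, beta = train_rwnn(X, y, h)
    imp = garson_importance(W, beta)
    hm = len(imp) / np.sum(1.0 / imp)  # harmonic-mean threshold
    keep = imp >= hm
    Hk = np.tanh(X @ W[:, keep] + b[keep])
    beta_k = np.linalg.pinv(Hk) @ y
    return W[:, keep], b[keep], beta_k
```

Because both training and pruning reduce to a single least-squares solve each, the whole procedure remains non-iterative, which is the point of the method.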

Original language: English
Pages (from-to): 1109-1121
Number of pages: 13
Journal: Applied Soft Computing Journal
State: Published - Sep. 2018
Published externally: Yes


