Extreme learning machine with a deterministic assignment of hidden weights in two parallel layers

Pablo A. Henríquez, Gonzalo A. Ruz

Research output: Contribution to journal › Article › peer-review

20 Citations (Scopus)

Abstract

Extreme learning machine (ELM) is a competitive machine learning technique based on the single-hidden-layer feedforward neural network (SLFN). However, traditional ELM and its variants rely on a random assignment of the hidden weights, typically drawn from a uniform distribution, followed by the computation of the output weights with the least-squares method. This paper proposes a new architecture in which one non-linear hidden layer is placed in parallel with another non-linear hidden layer, each receiving the inputs through independent weights. We explore a deterministic assignment of the hidden weight values using low-discrepancy sequences (LDSs); the simulations are performed with Halton and Sobol sequences. The results on regression and classification problems confirm the advantages of the proposed method, called the PL-ELM algorithm, with the deterministic assignment of hidden weights. Moreover, the deterministic generation of hidden weights using LDSs can be extended to other modified ELM algorithms.
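To make the procedure described above concrete (deterministic hidden weights drawn from a low-discrepancy sequence, two parallel non-linear hidden layers with independent input weights, and output weights fitted by least squares), the following is a minimal sketch, not the authors' implementation. The sigmoid activation, the [-1, 1] scaling of the sequence points, the bias handling, the layer sizes, and the use of scipy.stats.qmc samplers with a pseudo-inverse solve are illustrative assumptions.

import numpy as np
from scipy.stats import qmc


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def train_pl_elm(X, y, n_hidden=32, sequence="sobol"):
    """Fit a two-parallel-layer ELM with deterministic hidden weights.

    Hidden weights (and biases) for both parallel layers are taken from a
    single low-discrepancy sequence (Halton or Sobol) scaled to [-1, 1];
    only the output weights are learned, via least squares.
    """
    n_features = X.shape[1]
    sampler_cls = qmc.Sobol if sequence == "sobol" else qmc.Halton
    sampler = sampler_cls(d=n_features + 1, scramble=False)   # +1 column for the bias term
    points = 2.0 * sampler.random(2 * n_hidden) - 1.0          # deterministic points in [-1, 1]
    W1, W2 = points[:n_hidden], points[n_hidden:]              # independent weights per parallel layer

    Xb = np.hstack([X, np.ones((X.shape[0], 1))])              # append bias column to inputs
    H = np.hstack([sigmoid(Xb @ W1.T), sigmoid(Xb @ W2.T)])    # concatenate the two parallel layer outputs
    beta = np.linalg.pinv(H) @ y                               # least-squares output weights
    return (W1, W2), beta


def predict_pl_elm(X, hidden_weights, beta):
    W1, W2 = hidden_weights
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    H = np.hstack([sigmoid(Xb @ W1.T), sigmoid(Xb @ W2.T)])
    return H @ beta


if __name__ == "__main__":
    # Toy regression example: recover a noisy sine curve.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
    hidden_weights, beta = train_pl_elm(X, y, n_hidden=32, sequence="halton")
    y_hat = predict_pl_elm(X, hidden_weights, beta)
    print("train MSE:", np.mean((y - y_hat) ** 2))

Only the output weights beta are fitted; the hidden weights never change after being read off the sequence, which is what distinguishes the deterministic LDS assignment from the usual uniform random initialization of ELM.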

Original language: English
Pages (from-to): 109-116
Number of pages: 8
Journal: Neurocomputing
Volume: 226
DOI
Status: Published - 22 Feb 2017

