An empirical study of the hidden matrix rank for neural networks with random weights

Pablo A. Henriquez, Gonzalo A. Ruz

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer review

5 Citations (Scopus)

Abstract

Neural networks with random weights can be regarded as feed-forward neural networks built with a specific randomized algorithm, i.e., the input weights and biases are randomly assigned and fixed during the training phase, and the output weights are evaluated analytically by the least squares method. This paper presents an empirical study of the hidden matrix rank for neural networks with random weights. We study the impact of the range of the random parameters on the model's performance, and show that assigning the input weights in the range [-1,1] is misleading. Experiments were conducted using two types of neural networks, yielding insights not only into the input weights but also into how these relate to different architectures.
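For readers unfamiliar with this class of models, the following is a minimal sketch of the setup the abstract describes: input weights and biases drawn uniformly from [-1, 1] and kept fixed, the hidden matrix formed by the hidden-layer activations, its rank computed empirically, and the output weights obtained by least squares. The single-hidden-layer architecture, sigmoid activation, synthetic data, and all variable names are illustrative assumptions, not the authors' code or experimental settings.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data (illustrative only).
n_samples, n_features, n_hidden = 200, 10, 50
X = rng.standard_normal((n_samples, n_features))
y = rng.standard_normal(n_samples)

# Input weights and biases drawn uniformly from [-1, 1] and fixed during training.
W = rng.uniform(-1.0, 1.0, size=(n_features, n_hidden))
b = rng.uniform(-1.0, 1.0, size=n_hidden)

# Hidden matrix H: one row per sample, one column per hidden neuron (sigmoid activation assumed).
H = 1.0 / (1.0 + np.exp(-(X @ W + b)))

# Empirical rank of the hidden matrix, the quantity studied in the paper.
print("rank(H) =", np.linalg.matrix_rank(H), "out of", min(H.shape))

# Output weights evaluated analytically by least squares.
beta, *_ = np.linalg.lstsq(H, y, rcond=None)
y_hat = H @ beta
print("training MSE:", np.mean((y - y_hat) ** 2))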

Original language: English
Host publication title: Proceedings - 16th IEEE International Conference on Machine Learning and Applications, ICMLA 2017
Editors: Xuewen Chen, Bo Luo, Feng Luo, Vasile Palade, M. Arif Wani
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 883-888
Number of pages: 6
ISBN (electronic): 9781538614174
DOI
Status: Published - 2017
Event: 16th IEEE International Conference on Machine Learning and Applications, ICMLA 2017 - Cancun, México
Duration: 18 Dec 2017 - 21 Dec 2017

Publication series

Name: Proceedings - 16th IEEE International Conference on Machine Learning and Applications, ICMLA 2017
Volume: 2017-December

Conference

Conference: 16th IEEE International Conference on Machine Learning and Applications, ICMLA 2017
Country/Territory: México
City: Cancun
Period: 18/12/17 - 21/12/17
