An empirical study of the hidden matrix rank for neural networks with random weights

Pablo A. Henriquez, Gonzalo A. Ruz

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

4 Scopus citations

Abstract

Neural networks with random weights can be regarded as feed-forward neural networks built with a specific randomized algorithm: the input weights and biases are randomly assigned and kept fixed during the training phase, while the output weights are evaluated analytically by the least-squares method. This paper presents an empirical study of the hidden matrix rank for neural networks with random weights. We study the impact of the scope of the random parameters on the model's performance, and show that assigning the input weights in the range [-1,1] is misleading. Experiments were conducted using two types of neural networks, obtaining insights not only into the input weights but also into how these relate to different architectures.
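The construction the abstract describes can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the data, activation function, hidden-layer size, and the `scale` parameter (standing in for the "scope of random parameters" the paper varies) are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (hypothetical, for illustration only)
X = rng.uniform(-1.0, 1.0, size=(200, 4))
y = np.sin(X.sum(axis=1))

n_hidden = 50
scale = 1.0  # scope of the random parameters; the paper studies varying this range

# Input weights and biases are drawn once and kept fixed during training
W = rng.uniform(-scale, scale, size=(X.shape[1], n_hidden))
b = rng.uniform(-scale, scale, size=n_hidden)

# Hidden matrix H: one row per sample, one column per hidden unit
H = np.tanh(X @ W + b)

# Output weights solved analytically by least squares
beta, *_ = np.linalg.lstsq(H, y, rcond=None)

# The quantity studied empirically: the rank of the hidden matrix
rank_H = np.linalg.matrix_rank(H)
```

If `H` is rank-deficient (its rank falls below `min(n_samples, n_hidden)`), the least-squares system is ill-conditioned and the effective capacity of the network is reduced, which is why the range of the random input weights matters.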

Original language: English
Title of host publication: Proceedings - 16th IEEE International Conference on Machine Learning and Applications, ICMLA 2017
Editors: Xuewen Chen, Bo Luo, Feng Luo, Vasile Palade, M. Arif Wani
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 883-888
Number of pages: 6
ISBN (Electronic): 9781538614174
DOIs
State: Published - 2017
Event: 16th IEEE International Conference on Machine Learning and Applications, ICMLA 2017 - Cancun, Mexico
Duration: 18 Dec 2017 - 21 Dec 2017

Publication series

Name: Proceedings - 16th IEEE International Conference on Machine Learning and Applications, ICMLA 2017
Volume: 2017-December

Conference

Conference: 16th IEEE International Conference on Machine Learning and Applications, ICMLA 2017
Country/Territory: Mexico
City: Cancun
Period: 18/12/17 - 21/12/17

Keywords

  • Matrix rank
  • Neural networks
  • Random weights
  • Regression

