Dynamical and complexity results for high order neural networks.

E. Goles, M. Matamala

Research output: Contribution to journal › Article › peer-review



We present dynamical results concerning neural networks with high-order arguments. More precisely, we study the family of block-sequential iterations of neural networks with polynomial arguments. In this context, we prove that, under a symmetry hypothesis, the sequential iteration is the only member of this family that converges to fixed points. The other iteration modes exhibit highly complex dynamical behavior: unbounded cycles and the simulation of arbitrary non-symmetric linear neural networks. We also study a high-order memory iteration scheme that admits an energy functional and whose cycles are bounded in length by the number of memory steps.
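The convergence result for the sequential mode can be illustrated with a minimal sketch (not code from the paper, and restricted to the familiar second-order case rather than general polynomial arguments): a threshold network with symmetric weights and zero diagonal, updated one neuron at a time, settles into a fixed point. All names here (`sequential_iterate`, the weight matrix `W`) are hypothetical.

```python
import numpy as np

def sequential_iterate(W, x, max_sweeps=100):
    """Sequentially update neurons x_i = sign(W[i] @ x) until no state changes.

    With symmetric W and zero diagonal, the standard energy argument
    guarantees convergence to a fixed point.
    """
    x = x.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(x)):
            new = 1 if W[i] @ x >= 0 else -1
            if new != x[i]:
                x[i] = new
                changed = True
        if not changed:
            return x  # fixed point reached
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
W = (A + A.T) / 2            # symmetric weights: the theorem's hypothesis
np.fill_diagonal(W, 0.0)
x0 = rng.choice([-1, 1], size=5)
xf = sequential_iterate(W, x0)
# One more full sweep leaves xf unchanged, i.e. xf is a fixed point.
assert np.array_equal(sequential_iterate(W, xf), xf)
```

Per the abstract, this convergence is special to the sequential mode: other block-sequential update orders of the same symmetric network can cycle rather than converge.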

Original language: English
Pages (from-to): 241-252
Number of pages: 12
Journal: International Journal of Neural Systems
Issue number: 3
State: Published - 1994
Externally published: Yes


