Data augmentation based on dynamical systems for the classification of brain states

Yonatan Sanz Perl, Carla Pallavicini, Ignacio Perez Ipiña, Morten Kringelbach, Gustavo Deco, Helmut Laufs, Enzo Tagliazucchi

Research output: Contribution to journal › Article › peer-review

12 Scopus citations


The application of machine learning algorithms to neuroimaging data shows great promise for the classification of physiological and pathological brain states. However, classifiers trained on high-dimensional data are prone to overfitting, especially when the number of training samples is low. We describe the use of whole-brain computational models for data augmentation in brain state classification. Our low-dimensional model is based on nonlinear oscillators coupled by the empirical structural connectivity of the brain. We use this model to enhance a dataset consisting of functional magnetic resonance imaging recordings acquired during all stages of the human wake-sleep cycle. After fitting the model to the average functional connectivity of each state, we show that the synthetic data generated by the model yield classification accuracies comparable to those obtained from the empirical data. We also show that models fitted to individual subjects generate surrogates with enough information to train classifiers that present significant transfer learning accuracy to the whole sample. Whole-brain computational modeling represents a useful tool to produce large synthetic datasets for data augmentation in the classification of certain brain states, with potential applications to computer-assisted diagnosis and prognosis of neuropsychiatric disorders.
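To make the modeling approach concrete, below is a minimal sketch of the kind of low-dimensional whole-brain model the abstract describes: nonlinear (Stuart-Landau/Hopf normal-form) oscillators coupled through a structural connectivity matrix, integrated with noise to generate surrogate BOLD-like time series. The parameter names (`a`, `G`, `sigma`), the diffusive coupling form, and the 0.05 Hz node frequency are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

def simulate_hopf(C, a=-0.02, G=0.5, sigma=0.02, dt=0.1, n_steps=2000, seed=0):
    """Simulate N coupled Stuart-Landau (Hopf) oscillators on a
    structural connectivity matrix C (N x N, zero diagonal).

    Returns an (n_steps, N) array of the real part of each node's
    complex state, a stand-in for a BOLD-like signal.
    Parameter values here are illustrative, not fitted.
    """
    rng = np.random.default_rng(seed)
    N = C.shape[0]
    omega = 2 * np.pi * 0.05 * np.ones(N)  # ~0.05 Hz, typical BOLD band
    z = 0.1 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
    out = np.empty((n_steps, N))
    for t in range(n_steps):
        # Diffusive coupling through the structural connectome
        coupling = G * (C @ z - C.sum(axis=1) * z)
        # Hopf normal form: bifurcation parameter a, cubic saturation
        dz = (a + 1j * omega) * z - np.abs(z) ** 2 * z + coupling
        noise = sigma * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
        z = z + dt * dz + np.sqrt(dt) * noise  # Euler-Maruyama step
        out[t] = z.real
    return out

# Surrogate functional connectivity from one simulated run: in the
# augmentation scheme, model parameters would be tuned so this matrix
# approximates the empirical FC of a given brain state, and many such
# runs would serve as synthetic training samples for a classifier.
C = np.ones((5, 5)) - np.eye(5)          # toy fully-connected structure
ts = simulate_hopf(C, n_steps=1000)
fc_surrogate = np.corrcoef(ts.T)          # (5, 5) synthetic FC matrix
```

Each run with a different noise seed yields a new surrogate sample, which is what makes the model usable for data augmentation.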

Original language: English
Article number: 110069
Journal: Chaos, Solitons and Fractals
State: Published - Oct 2020
Externally published: Yes


  • Brain states
  • Data augmentation
  • Dynamical systems
  • Machine learning
  • Neuroimaging


