TY - JOUR
T1 - Do We View Robots as We Do Ourselves? Examining Robotic Face Processing Using EEG
AU - Pérez-Arenas, Xaviera
AU - Rivera-Rei, Álvaro A.
AU - Huepe, David
AU - Soto, Vicente
N1 - Publisher Copyright:
© 2025 by the authors.
PY - 2026/1
Y1 - 2026/1
N2 - Background/Objectives: The ability to perceive and process emotional faces quickly and efficiently is essential for human social interaction. In recent years, humans have begun to interact more regularly with robotic faces, whether virtual or embodied in real-world robots. Neurophysiological research on how the brain decodes robotic faces relative to human ones remains scarce, warranting further work to explore these mechanisms and their social implications. Methods: This study used event-related potentials (ERPs) to examine the neural correlates of an emotional face categorization task involving human and robotic stimuli. We examined differences in brain activity elicited by viewing robotic and human faces expressing happy and neutral emotions. ERP waveform amplitudes for the P100, N170, P300, and P600 components were calculated and compared. In addition, a mass univariate analysis of the ERP waveforms was carried out to explore effects beyond the brain regions previously reported in the literature. Results: Robotic faces evoked larger waveform amplitudes at the early components (P100 and N170) as well as at the later P300 component. Furthermore, only the mid-latency and late cortical components (P300 and P600) showed amplitude differences driven by emotional valence, consistent with dual-stage models of face processing. Conclusions: These results advance our understanding of face processing during human–robot interaction and of the brain mechanisms engaged when viewing social robots, raising new considerations for their use in brain health settings and their broader cognitive impact.
AB - Background/Objectives: The ability to perceive and process emotional faces quickly and efficiently is essential for human social interaction. In recent years, humans have begun to interact more regularly with robotic faces, whether virtual or embodied in real-world robots. Neurophysiological research on how the brain decodes robotic faces relative to human ones remains scarce, warranting further work to explore these mechanisms and their social implications. Methods: This study used event-related potentials (ERPs) to examine the neural correlates of an emotional face categorization task involving human and robotic stimuli. We examined differences in brain activity elicited by viewing robotic and human faces expressing happy and neutral emotions. ERP waveform amplitudes for the P100, N170, P300, and P600 components were calculated and compared. In addition, a mass univariate analysis of the ERP waveforms was carried out to explore effects beyond the brain regions previously reported in the literature. Results: Robotic faces evoked larger waveform amplitudes at the early components (P100 and N170) as well as at the later P300 component. Furthermore, only the mid-latency and late cortical components (P300 and P600) showed amplitude differences driven by emotional valence, consistent with dual-stage models of face processing. Conclusions: These results advance our understanding of face processing during human–robot interaction and of the brain mechanisms engaged when viewing social robots, raising new considerations for their use in brain health settings and their broader cognitive impact.
KW - electroencephalography
KW - emotional facial expressions
KW - event-related potentials
KW - face processing
KW - robotic faces
KW - visual decoding
UR - https://www.scopus.com/pages/publications/105028623651
U2 - 10.3390/brainsci16010009
DO - 10.3390/brainsci16010009
M3 - Article
AN - SCOPUS:105028623651
SN - 2076-3425
VL - 16
JO - Brain Sciences
JF - Brain Sciences
IS - 1
M1 - 9
ER -