Action molds the perception of facial expressions

Gerbino, Walter; Rigutti, Sara; Fantoni, Carlo
2014-01-01

Abstract

Current models of the perception of facial expressions of emotion focus on visual information. However, facial expressions are typically experienced while observers process multisensory information arising from their active interaction with the environment. Since bodily interaction with everyday objects within peripersonal space has been shown to produce powerful effects on perception, we hypothesized and found that the comfort/discomfort of motor actions modulates the perception of facial expressions, thus playing a pivotal role in human cognition and communication. Using MAMIP (a novel Motor Action Mood-Induction Procedure), we adapted participants to comfortable/uncomfortable visually guided reaches and obtained consistent mood-congruency effects on facial emotion identification under different experimental paradigms. Using the method of constant stimuli, we found that comfortable actions made a neutral face appear happy and a slightly angry face appear neutral, while uncomfortable actions induced the opposite effect. Using signal detection theory, we found that sensitivity to facial expressions improved when they were congruent, rather than incongruent, with the action-induced mood. Our results suggest that the bias revealed by the method of constant stimuli has an (at least partially) perceptual origin: the action-induced mood affects the valence of facial expressions through an emotional modulation of visual processing.
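The sensitivity measure mentioned in the abstract is the standard signal-detection index d', computed as z(hit rate) − z(false-alarm rate). A minimal sketch of that computation (the rates below are hypothetical illustrations, not data from the paper):

```python
from statistics import NormalDist


def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Signal detection sensitivity: d' = z(H) - z(FA),
    where z is the inverse of the standard normal CDF."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)


# Hypothetical example: higher hits and fewer false alarms for
# mood-congruent expressions would yield a larger d'.
congruent = d_prime(hit_rate=0.85, fa_rate=0.20)
incongruent = d_prime(hit_rate=0.70, fa_rate=0.30)
print(round(congruent, 3), round(incongruent, 3))
```

Rates of exactly 0 or 1 make the z-transform infinite; in practice a correction (e.g. replacing 0 and 1 with 1/(2N) and 1 − 1/(2N)) is applied before computing d'.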

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11368/2833598
