Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Body Actions Change the Appearance of Facial Expressions

Internal identifier: 003321 (Ncbi/Merge); previous: 003320; next: 003322


Authors: Carlo Fantoni [Italy]; Walter Gerbino [Italy]

Source:

RBID: PMC:4176726

Abstract

Perception, cognition, and emotion do not operate along segregated pathways; rather, their adaptive interaction is supported by various sources of evidence. For instance, the aesthetic appraisal of powerful mood inducers like music can bias the facial expression of emotions towards mood congruency. In four experiments we showed similar mood-congruency effects elicited by the comfort/discomfort of body actions. Using a novel Motor Action Mood Induction Procedure, we let participants perform comfortable/uncomfortable visually-guided reaches and tested them in a facial emotion identification task. Through the alleged mediation of motor action induced mood, action comfort enhanced the quality of the participant’s global experience (a neutral face appeared happy and a slightly angry face neutral), while action discomfort made a neutral face appear angry and a slightly happy face neutral. Furthermore, uncomfortable (but not comfortable) reaching improved the sensitivity for the identification of emotional faces and reduced the identification time of facial expressions, as a possible effect of hyper-arousal from an unpleasant bodily experience.


URL:
DOI: 10.1371/journal.pone.0108211
PubMed: 25251882
PubMed Central: 4176726

Links to previous steps (curation, corpus...)


Links to Exploration step

PMC:4176726

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Body Actions Change the Appearance of Facial Expressions</title>
<author>
<name sortKey="Fantoni, Carlo" sort="Fantoni, Carlo" uniqKey="Fantoni C" first="Carlo" last="Fantoni">Carlo Fantoni</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<addr-line>Department of Life Sciences, Psychology Unit “Gaetano Kanizsa”, University of Trieste, Trieste, Italy</addr-line>
</nlm:aff>
<country xml:lang="fr">Italie</country>
<wicri:regionArea>Department of Life Sciences, Psychology Unit “Gaetano Kanizsa”, University of Trieste, Trieste</wicri:regionArea>
<wicri:noRegion>Trieste</wicri:noRegion>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="aff2">
<addr-line>Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto, Italy</addr-line>
</nlm:aff>
<country xml:lang="fr">Italie</country>
<wicri:regionArea>Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto</wicri:regionArea>
<wicri:noRegion>Rovereto</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Gerbino, Walter" sort="Gerbino, Walter" uniqKey="Gerbino W" first="Walter" last="Gerbino">Walter Gerbino</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<addr-line>Department of Life Sciences, Psychology Unit “Gaetano Kanizsa”, University of Trieste, Trieste, Italy</addr-line>
</nlm:aff>
<country xml:lang="fr">Italie</country>
<wicri:regionArea>Department of Life Sciences, Psychology Unit “Gaetano Kanizsa”, University of Trieste, Trieste</wicri:regionArea>
<wicri:noRegion>Trieste</wicri:noRegion>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">25251882</idno>
<idno type="pmc">4176726</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4176726</idno>
<idno type="RBID">PMC:4176726</idno>
<idno type="doi">10.1371/journal.pone.0108211</idno>
<date when="2014">2014</date>
<idno type="wicri:Area/Pmc/Corpus">000344</idno>
<idno type="wicri:Area/Pmc/Curation">000344</idno>
<idno type="wicri:Area/Pmc/Checkpoint">000D89</idno>
<idno type="wicri:Area/Ncbi/Merge">003321</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Body Actions Change the Appearance of Facial Expressions</title>
<author>
<name sortKey="Fantoni, Carlo" sort="Fantoni, Carlo" uniqKey="Fantoni C" first="Carlo" last="Fantoni">Carlo Fantoni</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<addr-line>Department of Life Sciences, Psychology Unit “Gaetano Kanizsa”, University of Trieste, Trieste, Italy</addr-line>
</nlm:aff>
<country xml:lang="fr">Italie</country>
<wicri:regionArea>Department of Life Sciences, Psychology Unit “Gaetano Kanizsa”, University of Trieste, Trieste</wicri:regionArea>
<wicri:noRegion>Trieste</wicri:noRegion>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="aff2">
<addr-line>Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto, Italy</addr-line>
</nlm:aff>
<country xml:lang="fr">Italie</country>
<wicri:regionArea>Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto</wicri:regionArea>
<wicri:noRegion>Rovereto</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Gerbino, Walter" sort="Gerbino, Walter" uniqKey="Gerbino W" first="Walter" last="Gerbino">Walter Gerbino</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<addr-line>Department of Life Sciences, Psychology Unit “Gaetano Kanizsa”, University of Trieste, Trieste, Italy</addr-line>
</nlm:aff>
<country xml:lang="fr">Italie</country>
<wicri:regionArea>Department of Life Sciences, Psychology Unit “Gaetano Kanizsa”, University of Trieste, Trieste</wicri:regionArea>
<wicri:noRegion>Trieste</wicri:noRegion>
</affiliation>
</author>
</analytic>
<series>
<title level="j">PLoS ONE</title>
<idno type="eISSN">1932-6203</idno>
<imprint>
<date when="2014">2014</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>Perception, cognition, and emotion do not operate along segregated pathways; rather, their adaptive interaction is supported by various sources of evidence. For instance, the aesthetic appraisal of powerful mood inducers like music can bias the facial expression of emotions towards mood congruency. In four experiments we showed similar mood-congruency effects elicited by the
<italic>comfort/discomfort</italic>
of body actions. Using a novel
<italic>Motor Action Mood Induction Procedure</italic>
, we let participants perform comfortable/uncomfortable visually-guided reaches and tested them in a facial emotion identification task. Through the alleged mediation of motor action induced mood, action comfort enhanced the quality of the participant’s global experience (a neutral face appeared happy and a slightly angry face neutral), while action discomfort made a neutral face appear angry and a slightly happy face neutral. Furthermore, uncomfortable (but not comfortable) reaching improved the sensitivity for the identification of emotional faces and reduced the identification time of facial expressions, as a possible effect of hyper-arousal from an unpleasant bodily experience.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Becchio, C" uniqKey="Becchio C">C Becchio</name>
</author>
<author>
<name sortKey="Sartori, L" uniqKey="Sartori L">L Sartori</name>
</author>
<author>
<name sortKey="Bulgheroni, M" uniqKey="Bulgheroni M">M Bulgheroni</name>
</author>
<author>
<name sortKey="Castiello, U" uniqKey="Castiello U">U Castiello</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sartori, L" uniqKey="Sartori L">L Sartori</name>
</author>
<author>
<name sortKey="Becchio, C" uniqKey="Becchio C">C Becchio</name>
</author>
<author>
<name sortKey="Bara, Bg" uniqKey="Bara B">BG Bara</name>
</author>
<author>
<name sortKey="Castiello, U" uniqKey="Castiello U">U Castiello</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cardinali, L" uniqKey="Cardinali L">L Cardinali</name>
</author>
<author>
<name sortKey="Frassinetti, F" uniqKey="Frassinetti F">F Frassinetti</name>
</author>
<author>
<name sortKey="Brozzoli, C" uniqKey="Brozzoli C">C Brozzoli</name>
</author>
<author>
<name sortKey="Urquizar, C" uniqKey="Urquizar C">C Urquizar</name>
</author>
<author>
<name sortKey="Roy, Ac" uniqKey="Roy A">AC Roy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Volcic, R" uniqKey="Volcic R">R Volcic</name>
</author>
<author>
<name sortKey="Fantoni, C" uniqKey="Fantoni C">C Fantoni</name>
</author>
<author>
<name sortKey="Caudek, C" uniqKey="Caudek C">C Caudek</name>
</author>
<author>
<name sortKey="Assad, J" uniqKey="Assad J">J Assad</name>
</author>
<author>
<name sortKey="Domini, F" uniqKey="Domini F">F Domini</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Higuchi, T" uniqKey="Higuchi T">T Higuchi</name>
</author>
<author>
<name sortKey="Imanaka, K" uniqKey="Imanaka K">K Imanaka</name>
</author>
<author>
<name sortKey="Hatayama, T" uniqKey="Hatayama T">T Hatayama</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Knight, C" uniqKey="Knight C">C Knight</name>
</author>
<author>
<name sortKey="Haslam, Sa" uniqKey="Haslam S">SA Haslam</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Warren, Wh" uniqKey="Warren W">WH Warren</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Niedenthal, Pm" uniqKey="Niedenthal P">PM Niedenthal</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Niedenthal, Pm" uniqKey="Niedenthal P">PM Niedenthal</name>
</author>
<author>
<name sortKey="Barsalou, Lw" uniqKey="Barsalou L">LW Barsalou</name>
</author>
<author>
<name sortKey="Winkielman, P" uniqKey="Winkielman P">P Winkielman</name>
</author>
<author>
<name sortKey="Krauth Gruber, S" uniqKey="Krauth Gruber S">S Krauth-Gruber</name>
</author>
<author>
<name sortKey="Ric, F" uniqKey="Ric F">F Ric</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Conway, Ft" uniqKey="Conway F">FT Conway</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mark, Ls" uniqKey="Mark L">LS Mark</name>
</author>
<author>
<name sortKey="Nemeth, K" uniqKey="Nemeth K">K Nemeth</name>
</author>
<author>
<name sortKey="Gardner, D" uniqKey="Gardner D">D Gardner</name>
</author>
<author>
<name sortKey="Dainoff, Mj" uniqKey="Dainoff M">MJ Dainoff</name>
</author>
<author>
<name sortKey="Paasche, J" uniqKey="Paasche J">J Paasche</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Woods, Aj" uniqKey="Woods A">AJ Woods</name>
</author>
<author>
<name sortKey="Philbeck, Jw" uniqKey="Philbeck J">JW Philbeck</name>
</author>
<author>
<name sortKey="Wirtz, P" uniqKey="Wirtz P">P Wirtz</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Phelps, L" uniqKey="Phelps L">L Phelps</name>
</author>
<author>
<name sortKey="Carrasco, M" uniqKey="Carrasco M">M Carrasco</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Russell, Ja" uniqKey="Russell J">JA Russell</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Rolls, Et" uniqKey="Rolls E">ET Rolls</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Proffitt, Dr" uniqKey="Proffitt D">DR Proffitt</name>
</author>
<author>
<name sortKey="Bhalla, M" uniqKey="Bhalla M">M Bhalla</name>
</author>
<author>
<name sortKey="Gossweiler, R" uniqKey="Gossweiler R">R Gossweiler</name>
</author>
<author>
<name sortKey="Midgett, J" uniqKey="Midgett J">J Midgett</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Yap, Aj" uniqKey="Yap A">AJ Yap</name>
</author>
<author>
<name sortKey="Wazlawek, As" uniqKey="Wazlawek A">AS Wazlawek</name>
</author>
<author>
<name sortKey="Lucas, Bj" uniqKey="Lucas B">BJ Lucas</name>
</author>
<author>
<name sortKey="Cuddy, Ajc" uniqKey="Cuddy A">AJC Cuddy</name>
</author>
<author>
<name sortKey="Carney, Dr" uniqKey="Carney D">DR Carney</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gasper, K" uniqKey="Gasper K">K Gasper</name>
</author>
<author>
<name sortKey="Clore, Gl" uniqKey="Clore G">GL Clore</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wild, B" uniqKey="Wild B">B Wild</name>
</author>
<author>
<name sortKey="Erb, M" uniqKey="Erb M">M Erb</name>
</author>
<author>
<name sortKey="Bartels, M" uniqKey="Bartels M">M Bartels</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Caudek, C" uniqKey="Caudek C">C Caudek</name>
</author>
<author>
<name sortKey="Monni, A" uniqKey="Monni A">A Monni</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pollak, Sd" uniqKey="Pollak S">SD Pollak</name>
</author>
<author>
<name sortKey="Kistler, Dj" uniqKey="Kistler D">DJ Kistler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Klatzky, Rl" uniqKey="Klatzky R">RL Klatzky</name>
</author>
<author>
<name sortKey="Abramowicz, A" uniqKey="Abramowicz A">A Abramowicz</name>
</author>
<author>
<name sortKey="Hamilton, C" uniqKey="Hamilton C">C Hamilton</name>
</author>
<author>
<name sortKey="Lederman, Sj" uniqKey="Lederman S">SJ Lederman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jeong, Jw" uniqKey="Jeong J">JW Jeong</name>
</author>
<author>
<name sortKey="Diwadkar, Va" uniqKey="Diwadkar V">VA Diwadkar</name>
</author>
<author>
<name sortKey="Chugani, Cd" uniqKey="Chugani C">CD Chugani</name>
</author>
<author>
<name sortKey="Sinsoongsud, P" uniqKey="Sinsoongsud P">P Sinsoongsud</name>
</author>
<author>
<name sortKey="Muzik, O" uniqKey="Muzik O">O Muzik</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jolij, J" uniqKey="Jolij J">J Jolij</name>
</author>
<author>
<name sortKey="Meurs, M" uniqKey="Meurs M">M Meurs</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Niedenthal, Pm" uniqKey="Niedenthal P">PM Niedenthal</name>
</author>
<author>
<name sortKey="Halberstadt, Jb" uniqKey="Halberstadt J">JB Halberstadt</name>
</author>
<author>
<name sortKey="Innes Ker, Ah" uniqKey="Innes Ker A">AH Innes-Ker</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bouhuys, Al" uniqKey="Bouhuys A">AL Bouhuys</name>
</author>
<author>
<name sortKey="Bloem, Gm" uniqKey="Bloem G">GM Bloem</name>
</author>
<author>
<name sortKey="Groothuis, Tgg" uniqKey="Groothuis T">TGG Groothuis</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kucera, D" uniqKey="Kucera D">D Kucera</name>
</author>
<author>
<name sortKey="Haviger, J" uniqKey="Haviger J">J Haviger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jallais, C" uniqKey="Jallais C">C Jallais</name>
</author>
<author>
<name sortKey="Gilet, A" uniqKey="Gilet A">A Gilet</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Yerkes, Rm" uniqKey="Yerkes R">RM Yerkes</name>
</author>
<author>
<name sortKey="Dodson, Jd" uniqKey="Dodson J">JD Dodson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bezdudnaya, Tcm" uniqKey="Bezdudnaya T">TCM Bezdudnaya</name>
</author>
<author>
<name sortKey="Bereshpolova, Y" uniqKey="Bereshpolova Y">Y Bereshpolova</name>
</author>
<author>
<name sortKey="Stoelzel, Cr" uniqKey="Stoelzel C">CR Stoelzel</name>
</author>
<author>
<name sortKey="Alanso, Jm" uniqKey="Alanso J">JM Alanso</name>
</author>
<author>
<name sortKey="Swadlow, Ha" uniqKey="Swadlow H">HA Swadlow</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Neill, Cm" uniqKey="Neill C">CM Neill</name>
</author>
<author>
<name sortKey="Stryker, Mp" uniqKey="Stryker M">MP Stryker</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Woods, Aj" uniqKey="Woods A">AJ Woods</name>
</author>
<author>
<name sortKey="Mennemeier, M" uniqKey="Mennemeier M">M Mennemeier</name>
</author>
<author>
<name sortKey="Garcia Rill, E" uniqKey="Garcia Rill E">E Garcia-Rill</name>
</author>
<author>
<name sortKey="Huitt, T" uniqKey="Huitt T">T Huitt</name>
</author>
<author>
<name sortKey="Chelette, Kc" uniqKey="Chelette K">KC Chelette</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gilad, S" uniqKey="Gilad S">S Gilad</name>
</author>
<author>
<name sortKey="Meng, M" uniqKey="Meng M">M Meng</name>
</author>
<author>
<name sortKey="Sinha, P" uniqKey="Sinha P">P Sinha</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Huang, J" uniqKey="Huang J">J Huang</name>
</author>
<author>
<name sortKey="Chan, Rck" uniqKey="Chan R">RCK Chan</name>
</author>
<author>
<name sortKey="Lu, X" uniqKey="Lu X">X Lu</name>
</author>
<author>
<name sortKey="Tong, Z" uniqKey="Tong Z">Z Tong</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Haralanova, E" uniqKey="Haralanova E">E Haralanova</name>
</author>
<author>
<name sortKey="Haralanov, S" uniqKey="Haralanov S">S Haralanov</name>
</author>
<author>
<name sortKey="Beraldi, A" uniqKey="Beraldi A">A Beraldi</name>
</author>
<author>
<name sortKey="Moller, Hj" uniqKey="Moller H">HJ Möller</name>
</author>
<author>
<name sortKey="Hennig Fast, K" uniqKey="Hennig Fast K">K Hennig-Fast</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Masanobu, A" uniqKey="Masanobu A">A Masanobu</name>
</author>
<author>
<name sortKey="Choshi, K" uniqKey="Choshi K">K Choshi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bar Haim, Y" uniqKey="Bar Haim Y">Y Bar-Haim</name>
</author>
<author>
<name sortKey="Lamy, D" uniqKey="Lamy D">D Lamy</name>
</author>
<author>
<name sortKey="Pergamin, L" uniqKey="Pergamin L">L Pergamin</name>
</author>
<author>
<name sortKey="Bakermans Kranenburg, Mj" uniqKey="Bakermans Kranenburg M">MJ Bakermans-Kranenburg</name>
</author>
<author>
<name sortKey="Van Ijzendoorn, Mh" uniqKey="Van Ijzendoorn M">MH van Ijzendoorn</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Richards, Hj" uniqKey="Richards H">HJ Richards</name>
</author>
<author>
<name sortKey="Hadwin, Ja" uniqKey="Hadwin J">JA Hadwin</name>
</author>
<author>
<name sortKey="Benson, V" uniqKey="Benson V">V Benson</name>
</author>
<author>
<name sortKey="Wenger, Mj" uniqKey="Wenger M">MJ Wenger</name>
</author>
<author>
<name sortKey="Donnelly, N" uniqKey="Donnelly N">N Donnelly</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sacharin, V" uniqKey="Sacharin V">V Sacharin</name>
</author>
<author>
<name sortKey="Sander, D" uniqKey="Sander D">D Sander</name>
</author>
<author>
<name sortKey="Scherer, Kr" uniqKey="Scherer K">KR Scherer</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Becker, Dv" uniqKey="Becker D">DV Becker</name>
</author>
<author>
<name sortKey="Neel, R" uniqKey="Neel R">R Neel</name>
</author>
<author>
<name sortKey="Srinivasan, N" uniqKey="Srinivasan N">N Srinivasan</name>
</author>
<author>
<name sortKey="Neufeld, S" uniqKey="Neufeld S">S Neufeld</name>
</author>
<author>
<name sortKey="Kumar, D" uniqKey="Kumar D">D Kumar</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Marinetti, C" uniqKey="Marinetti C">C Marinetti</name>
</author>
<author>
<name sortKey="Mesquita, B" uniqKey="Mesquita B">B Mesquita</name>
</author>
<author>
<name sortKey="Yik, M" uniqKey="Yik M">M Yik</name>
</author>
<author>
<name sortKey="Cragwall, C" uniqKey="Cragwall C">C Cragwall</name>
</author>
<author>
<name sortKey="Gallagher, Ah" uniqKey="Gallagher A">AH Gallagher</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sloan, Dm" uniqKey="Sloan D">DM Sloan</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Langner, O" uniqKey="Langner O">O Langner</name>
</author>
<author>
<name sortKey="Dotsch, R" uniqKey="Dotsch R">R Dotsch</name>
</author>
<author>
<name sortKey="Bijlstra, G" uniqKey="Bijlstra G">G Bijlstra</name>
</author>
<author>
<name sortKey="Wigboldus, Dhj" uniqKey="Wigboldus D">DHJ Wigboldus</name>
</author>
<author>
<name sortKey="Hawk, St" uniqKey="Hawk S">ST Hawk</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Benson, Pj" uniqKey="Benson P">PJ Benson</name>
</author>
<author>
<name sortKey="Perrett, Di" uniqKey="Perrett D">DI Perrett</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Marneweck, M" uniqKey="Marneweck M">M Marneweck</name>
</author>
<author>
<name sortKey="Loftus, A" uniqKey="Loftus A">A Loftus</name>
</author>
<author>
<name sortKey="Hammond, G" uniqKey="Hammond G">G Hammond</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ellermeier, W" uniqKey="Ellermeier W">W Ellermeier</name>
</author>
<author>
<name sortKey="Westphal, W" uniqKey="Westphal W">W Westphal</name>
</author>
<author>
<name sortKey="Heidenfelder, M" uniqKey="Heidenfelder M">M Heidenfelder</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Carello, C" uniqKey="Carello C">C Carello</name>
</author>
<author>
<name sortKey="Grosofsky, A" uniqKey="Grosofsky A">A Grosofsky</name>
</author>
<author>
<name sortKey="Reichel, Fd" uniqKey="Reichel F">FD Reichel</name>
</author>
<author>
<name sortKey="Solomon, Hy" uniqKey="Solomon H">HY Solomon</name>
</author>
<author>
<name sortKey="Turvey, Mt" uniqKey="Turvey M">MT Turvey</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Choi, Hj" uniqKey="Choi H">HJ Choi</name>
</author>
<author>
<name sortKey="Mark, Ls" uniqKey="Mark L">LS Mark</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sun, G" uniqKey="Sun G">G Sun</name>
</author>
<author>
<name sortKey="Zhu, C" uniqKey="Zhu C">C Zhu</name>
</author>
<author>
<name sortKey="Kramer, Mh" uniqKey="Kramer M">MH Kramer</name>
</author>
<author>
<name sortKey="Yang, Ss" uniqKey="Yang S">SS Yang</name>
</author>
<author>
<name sortKey="Song, W" uniqKey="Song W">W Song</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Vonesh, Ef" uniqKey="Vonesh E">EF Vonesh</name>
</author>
<author>
<name sortKey="Chinchilli, Vm" uniqKey="Chinchilli V">VM Chinchilli</name>
</author>
<author>
<name sortKey="Pu, K" uniqKey="Pu K">K Pu</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hansen, Ch" uniqKey="Hansen C">CH Hansen</name>
</author>
<author>
<name sortKey="Hansen, Rd" uniqKey="Hansen R">RD Hansen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Landy, Ms" uniqKey="Landy M">MS Landy</name>
</author>
<author>
<name sortKey="Maloney, Lt" uniqKey="Maloney L">LT Maloney</name>
</author>
<author>
<name sortKey="Johnston, Eb" uniqKey="Johnston E">EB Johnston</name>
</author>
<author>
<name sortKey="Young, M" uniqKey="Young M">M Young</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Caudek, C" uniqKey="Caudek C">C Caudek</name>
</author>
<author>
<name sortKey="Fantoni, C" uniqKey="Fantoni C">C Fantoni</name>
</author>
<author>
<name sortKey="Domini, F" uniqKey="Domini F">F Domini</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gregory, Rl" uniqKey="Gregory R">RL Gregory</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Oosterwijk, S" uniqKey="Oosterwijk S">S Oosterwijk</name>
</author>
<author>
<name sortKey="Lindquist, Ka" uniqKey="Lindquist K">KA Lindquist</name>
</author>
<author>
<name sortKey="Anderson, E" uniqKey="Anderson E">E Anderson</name>
</author>
<author>
<name sortKey="Dautoff, R" uniqKey="Dautoff R">R Dautoff</name>
</author>
<author>
<name sortKey="Moriguchi, Y" uniqKey="Moriguchi Y">Y Moriguchi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Clark, Dm" uniqKey="Clark D">DM Clark</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Velten, E" uniqKey="Velten E">E Velten</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mayberg, Hs" uniqKey="Mayberg H">HS Mayberg</name>
</author>
<author>
<name sortKey="Liotti, M" uniqKey="Liotti M">M Liotti</name>
</author>
<author>
<name sortKey="Brannan, Sk" uniqKey="Brannan S">SK Brannan</name>
</author>
<author>
<name sortKey="Mcginnis, S" uniqKey="Mcginnis S">S McGinnis</name>
</author>
<author>
<name sortKey="Mahurin, Rk" uniqKey="Mahurin R">RK Mahurin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Robinson, O" uniqKey="Robinson O">O Robinson</name>
</author>
<author>
<name sortKey="Grillon, C" uniqKey="Grillon C">C Grillon</name>
</author>
<author>
<name sortKey="Sahakian, B" uniqKey="Sahakian B">B Sahakian</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kersten, D" uniqKey="Kersten D">D Kersten</name>
</author>
<author>
<name sortKey="Mamassian, P" uniqKey="Mamassian P">P Mamassian</name>
</author>
<author>
<name sortKey="Yuille, A" uniqKey="Yuille A">A Yuille</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Goodale, Ma" uniqKey="Goodale M">MA Goodale</name>
</author>
<author>
<name sortKey="Milner, Ad" uniqKey="Milner A">AD Milner</name>
</author>
<author>
<name sortKey="Jakobson, Ls" uniqKey="Jakobson L">LS Jakobson</name>
</author>
<author>
<name sortKey="Carey, Dp" uniqKey="Carey D">DP Carey</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Derryberry, D" uniqKey="Derryberry D">D Derryberry</name>
</author>
<author>
<name sortKey="Reed, Ma" uniqKey="Reed M">MA Reed</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jeffries, Lm" uniqKey="Jeffries L">LM Jeffries</name>
</author>
<author>
<name sortKey="Smilek, D" uniqKey="Smilek D">D Smilek</name>
</author>
<author>
<name sortKey="Eich, E" uniqKey="Eich E">E Eich</name>
</author>
<author>
<name sortKey="Enns, Jt" uniqKey="Enns J">JT Enns</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Martin, D" uniqKey="Martin D">D Martin</name>
</author>
<author>
<name sortKey="Slessor, G" uniqKey="Slessor G">G Slessor</name>
</author>
<author>
<name sortKey="Allen, R" uniqKey="Allen R">R Allen</name>
</author>
<author>
<name sortKey="Phillips, Lh" uniqKey="Phillips L">LH Phillips</name>
</author>
<author>
<name sortKey="Darling, S" uniqKey="Darling S">S Darling</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Franz, Vh" uniqKey="Franz V">VH Franz</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Foster, R" uniqKey="Foster R">R Foster</name>
</author>
<author>
<name sortKey="Fantoni, C" uniqKey="Fantoni C">C Fantoni</name>
</author>
<author>
<name sortKey="Caudek, C" uniqKey="Caudek C">C Caudek</name>
</author>
<author>
<name sortKey="Domini, F" uniqKey="Domini F">F Domini</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">PLoS One</journal-id>
<journal-id journal-id-type="iso-abbrev">PLoS ONE</journal-id>
<journal-id journal-id-type="publisher-id">plos</journal-id>
<journal-id journal-id-type="pmc">plosone</journal-id>
<journal-title-group>
<journal-title>PLoS ONE</journal-title>
</journal-title-group>
<issn pub-type="epub">1932-6203</issn>
<publisher>
<publisher-name>Public Library of Science</publisher-name>
<publisher-loc>San Francisco, USA</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">25251882</article-id>
<article-id pub-id-type="pmc">4176726</article-id>
<article-id pub-id-type="publisher-id">PONE-D-14-24461</article-id>
<article-id pub-id-type="doi">10.1371/journal.pone.0108211</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Research Article</subject>
</subj-group>
<subj-group subj-group-type="Discipline-v2">
<subject>Biology and Life Sciences</subject>
<subj-group>
<subject>Anatomy</subject>
<subj-group>
<subject>Nervous System</subject>
<subj-group>
<subject>Motor System</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group>
<subject>Neuroscience</subject>
<subj-group>
<subject>Cognitive Neuroscience</subject>
<subj-group>
<subject>Motor Reactions</subject>
<subj-group>
<subject>Postural Control</subject>
</subj-group>
</subj-group>
<subj-group>
<subject>Reaction Time</subject>
</subj-group>
</subj-group>
<subj-group>
<subject>Cognitive Science</subject>
<subj-group>
<subject>Cognition</subject>
<subj-group>
<subject>Memory</subject>
<subj-group>
<subject>Face Recognition</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group>
<subject>Cognitive Psychology</subject>
<subj-group>
<subject>Attention</subject>
<subject>Perception</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group>
<subject>Sensory Perception</subject>
<subj-group>
<subject>Psychophysics</subject>
<subject>Vision</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group>
<subject>Psychology</subject>
<subj-group>
<subject>Behavior</subject>
<subj-group>
<subject>Human Movement</subject>
<subject>Human Performance</subject>
</subj-group>
</subj-group>
<subj-group>
<subject>Applied Psychology</subject>
<subject>Emotions</subject>
<subject>Experimental Psychology</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group subj-group-type="Discipline-v2">
<subject>Research and Analysis Methods</subject>
<subj-group>
<subject>Research Design</subject>
<subj-group>
<subject>Experimental Design</subject>
<subj-group>
<subject>Factorial Design</subject>
</subj-group>
</subj-group>
<subj-group>
<subject>Empirical Methods</subject>
<subject>Quantitative Analysis</subject>
</subj-group>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Body Actions Change the Appearance of Facial Expressions</article-title>
<alt-title alt-title-type="running-head">Action Affects Perceived Emotions</alt-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Fantoni</surname>
<given-names>Carlo</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
<xref ref-type="corresp" rid="cor1">
<sup>*</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Gerbino</surname>
<given-names>Walter</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
</contrib>
</contrib-group>
<aff id="aff1">
<label>1</label>
<addr-line>Department of Life Sciences, Psychology Unit “Gaetano Kanizsa”, University of Trieste, Trieste, Italy</addr-line>
</aff>
<aff id="aff2">
<label>2</label>
<addr-line>Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto, Italy</addr-line>
</aff>
<contrib-group>
<contrib contrib-type="editor">
<name>
<surname>Urgesi</surname>
<given-names>Cosimo</given-names>
</name>
<role>Editor</role>
<xref ref-type="aff" rid="edit1"></xref>
</contrib>
</contrib-group>
<aff id="edit1">
<addr-line>University of Udine, Italy</addr-line>
</aff>
<author-notes>
<corresp id="cor1">* E-mail:
<email>cfantoni@units.it</email>
</corresp>
<fn fn-type="conflict">
<p>
<bold>Competing Interests: </bold>
The authors have declared that no competing interests exist.</p>
</fn>
<fn fn-type="con">
<p>Conceived and designed the experiments: CF WG. Performed the experiments: CF. Analyzed the data: CF. Contributed reagents/materials/analysis tools: CF. Wrote the paper: CF WG.</p>
</fn>
</author-notes>
<pub-date pub-type="collection">
<year>2014</year>
</pub-date>
<pub-date pub-type="epub">
<day>24</day>
<month>9</month>
<year>2014</year>
</pub-date>
<volume>9</volume>
<issue>9</issue>
<elocation-id>e108211</elocation-id>
<history>
<date date-type="received">
<day>8</day>
<month>6</month>
<year>2014</year>
</date>
<date date-type="accepted">
<day>21</day>
<month>8</month>
<year>2014</year>
</date>
</history>
<permissions>
<copyright-year>2014</copyright-year>
<copyright-holder>Fantoni, Gerbino</copyright-holder>
<license>
<license-p>This is an open-access article distributed under the terms of the
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution License</ext-link>
, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.</license-p>
</license>
</permissions>
<abstract>
<p>Perception, cognition, and emotion do not operate along segregated pathways; rather, their adaptive interaction is supported by various sources of evidence. For instance, the aesthetic appraisal of powerful mood inducers like music can bias the facial expression of emotions towards mood congruency. In four experiments we showed similar mood-congruency effects elicited by the
<italic>comfort/discomfort</italic>
of body actions. Using a novel
<italic>Motor Action Mood Induction Procedure</italic>
, we let participants perform comfortable/uncomfortable visually-guided reaches and tested them in a facial emotion identification task. Through the alleged mediation of motor action induced mood, action comfort enhanced the quality of the participant’s global experience (a neutral face appeared happy and a slightly angry face neutral), while action discomfort made a neutral face appear angry and a slightly happy face neutral. Furthermore, uncomfortable (but not comfortable) reaching improved the sensitivity for the identification of emotional faces and reduced the identification time of facial expressions, as a possible effect of hyper-arousal from an unpleasant bodily experience.</p>
</abstract>
<funding-group>
<funding-statement>This work was supported by the Italian Ministry of Economic Development (Industria 2015, Ecoautobus Grant to WG) and by the University of Trieste (FRA-2013 Grant to CF). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.</funding-statement>
</funding-group>
<counts>
<page-count count="12"></page-count>
</counts>
<custom-meta-group>
<custom-meta id="data-availability">
<meta-name>Data Availability</meta-name>
<meta-value>The authors confirm that all data underlying the findings are fully available without restriction. All relevant data are within the paper.</meta-value>
</custom-meta>
</custom-meta-group>
</article-meta>
<notes>
<title>Data Availability</title>
<p>The authors confirm that all data underlying the findings are fully available without restriction. All relevant data are within the paper.</p>
</notes>
</front>
<body>
<sec id="s1">
<title>Introduction</title>
<p>Bodily interaction with everyday objects within the peripersonal space has powerful effects. It can specify social
<xref rid="pone.0108211-Becchio1" ref-type="bibr">[1]</xref>
and communicative intentions
<xref rid="pone.0108211-Sartori1" ref-type="bibr">[2]</xref>
, the morphology of body schema
<xref rid="pone.0108211-Cardinali1" ref-type="bibr">[3]</xref>
, as well as object depth, object shape, and tactile sensitivity
<xref rid="pone.0108211-Volcic1" ref-type="bibr">[4]</xref>
. Furthermore, hand movement kinematics has been found to depend on subjective well-being
<xref rid="pone.0108211-Higuchi1" ref-type="bibr">[5]</xref>
, which suggests a link between action comfort and workplace productivity
<xref rid="pone.0108211-Knight1" ref-type="bibr">[6]</xref>
. Here, we take a step further by examining the impact of
<italic>comfortable/uncomfortable</italic>
reaches on the perception of facial expression of emotions.</p>
<p>Even simple activities include complex sequences of goal-directed reaches involved in the correct picking up of objects. Though reaching is an essential and pervasive component of everyday actions, people are almost blind to the motor effort involved in body motion, and largely ignore biodynamic components such as muscular strength and the number of involved joints
<xref rid="pone.0108211-Warren1" ref-type="bibr">[7]</xref>
. Though subtle, postural shifts associated with reaching can have a strong impact on perception and performance
<xref rid="pone.0108211-Damasio1" ref-type="bibr">[8</xref>
<xref rid="pone.0108211-Niedenthal2" ref-type="bibr">10]</xref>
.</p>
<p>Central to our study are two apparently unrelated findings. First, it has been found that the subjective state of
<italic>comfort</italic>
/
<italic>discomfort</italic>
is related to the psychological mood state
<xref rid="pone.0108211-Conway1" ref-type="bibr">[11]</xref>
and to the individual reaching mode, with perceived
<italic>discomfort</italic>
increasing as the number of body parts (muscles, joints) engaged in reaching increases
<xref rid="pone.0108211-Mark1" ref-type="bibr">[12]</xref>
. In particular, it has been shown that beyond a critical distance (corresponding on average to 90% of the maximal arm extension) reaching for an object becomes uncomfortable and negative mood states can arise
<xref rid="pone.0108211-Mark1" ref-type="bibr">[12]</xref>
. Second, hyper-arousal from sensory stimulation (i.e., a higher level of arousal than in the normal awake state, induced by exposure to cold) can improve stereoacuity and contrast sensitivity
<xref rid="pone.0108211-Woods1" ref-type="bibr">[13]</xref>
, confirming that perceived emotions can potentiate the benefits of attention on sensory discrimination
<xref rid="pone.0108211-Phelps1" ref-type="bibr">[14]</xref>
. Similarly, hyper-arousal from action (relative to inaction) might improve the detection of subtle variations in the facial expression of emotions.</p>
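For illustration only (not part of the original record or the authors' code), a minimal Python sketch of the critical-distance criterion described above, under the assumption that a reach counts as uncomfortable once the target depth exceeds roughly 90% of the maximal arm extension; the function name and values are hypothetical:

COMFORT_FRACTION = 0.90  # critical fraction of maximal arm extension reported in [12]

def reach_is_comfortable(target_depth_cm, max_arm_extension_cm,
                         comfort_fraction=COMFORT_FRACTION):
    """Return True if the reach target lies within the comfortable reaching range."""
    return target_depth_cm <= comfort_fraction * max_arm_extension_cm

# Example: with a 60 cm maximal arm extension, a 57 cm reach (95%) is uncomfortable.
print(reach_is_comfortable(57.0, 60.0))  # False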
<p>Two related questions are at the focus of our study.
<italic>Can the comfort/discomfort of previously performed reaches systematically bias the perception of facial expressions towards a positive (happiness) vs. negative (anger) valence? Can sensitivity to facial expressions be improved by the previous engagement in reaching?</italic>
</p>
<sec id="s1a">
<title>Action, emotion, and facial expressions</title>
<p>A large body of research on object perception and representation refers to the processing of information within a given sensory modality and to its interaction with primitives, schemata, and other types of mental entities. For instance, current models of perceived facial expression of emotions are focused on visual information. One influential approach to the recognition of facial expression of emotions is based on the identification of sets of local and global image features matching with characteristics common to a given emotion category
<xref rid="pone.0108211-Ekman1" ref-type="bibr">[15</xref>
<xref rid="pone.0108211-Rolls1" ref-type="bibr">17]</xref>
. However, in ordinary conditions facial expressions of emotions are perceived while observers process a multitude of internal and external stimuli resulting from their active interactions with the environment. Consistent with the role classically attributed to action in the acquisition of object knowledge, the integration of information obtained during the perception-action cycle attracts a growing body of research
<xref rid="pone.0108211-Santos1" ref-type="bibr">[18]</xref>
. Despite this growing interest, the effects of body actions on the perception of emotions from facial expression represent a largely unexplored territory.</p>
<p>Since bodily interaction with everyday objects within the peripersonal space has been shown to have powerful effects on perception, it is reasonable to expect that action modulates the perception of facial expressions, thus playing a pivotal role in human communication and cognition.
<xref rid="pone.0108211-Proffitt1" ref-type="bibr">[19]</xref>
showed that the perception of spatial layout is influenced by the bodily state of the observer: hills may appear steeper and distances farther away to participants who are old, fatigued, or wearing a heavy backpack.
<xref rid="pone.0108211-Yap1" ref-type="bibr">[20]</xref>
found that endorsing an expansive rather than contractive posture of the body can increase dishonest behavior.
<xref rid="pone.0108211-Volcic1" ref-type="bibr">[4]</xref>
found that depth perception can be modulated by the arm representation induced by visuomotor adaptation, in which participants execute reaching movements with the visual feedback of their reaching finger displaced farther in depth, as if they had a longer arm. Among other effects, these findings show that the brain integrates sensory signals from the body, quickly adapting to newly established body postures and using them as flexible anchors that lead the observer to a vivid impression of three-dimensionality and valence.</p>
<p>Furthermore, research investigating possible links between emotion and cognition suggests that emotional states can influence seemingly unrelated domains such as the hierarchical organization of vision
<xref rid="pone.0108211-Gasper1" ref-type="bibr">[21]</xref>
. Emotions are pervasive as well as contagious, and can be evoked while viewing or mimicking emotionally expressive faces
<xref rid="pone.0108211-Niedenthal1" ref-type="bibr">[9</xref>
,
<xref rid="pone.0108211-Wild1" ref-type="bibr">22]</xref>
. The categorical perception and representation of emotionally expressive faces depend on mood
<xref rid="pone.0108211-Caudek1" ref-type="bibr">[23]</xref>
, through mediating factors such as past experience
<xref rid="pone.0108211-Pollak1" ref-type="bibr">[24]</xref>
, neutral faces
<xref rid="pone.0108211-Klatzky1" ref-type="bibr">[25]</xref>
, and music
<xref rid="pone.0108211-Jeong1" ref-type="bibr">[26</xref>
,
<xref rid="pone.0108211-Jolij1" ref-type="bibr">27]</xref>
. Such effects are consistent with the emotional response categorization theory
<xref rid="pone.0108211-Niedenthal3" ref-type="bibr">[28]</xref>
, implying that humans are tuned to perceive things that are congruent with their emotional state. For instance,
<xref rid="pone.0108211-Bouhuys1" ref-type="bibr">[29]</xref>
found that music alters the perception of facial expression of emotions in a mood-congruent direction: the amount of rejection/sadness perceived in a neutral expression largely increased after participants were exposed to sad music. Here, in a similar vein, we hypothesize that the temporary mood induced by comfort/discomfort associated with goal-directed actions can bias the perceived expression of emotional faces.</p>
<p>Our identification task required observers to classify a face as “happy” or “angry” after a novel
<italic>Motor Action Mood Induction Procedure</italic>
(MAMIP) based on performing a series of comfortable/uncomfortable goal-directed reaching actions. According to
<xref rid="pone.0108211-Mark1" ref-type="bibr">[12]</xref>
we manipulated the comfort/discomfort of actions by varying the depth extent of goal-directed reaches. In every identification trial a face displayed an expression corresponding to a randomly selected position along a happy-to-angry morph continuum. If motor action is an effective mood inducer, identification should then be biased in a mood-congruent direction:
<italic>comfortable</italic>
actions should increase the probability that a neutral face appears to display a
<italic>positive</italic>
emotion (happiness), because of the positive mood induced by the positive action valence. Conversely,
<italic>uncomfortable</italic>
actions should increase the probability that a neutral face appears to display a
<italic>negative</italic>
emotion (anger), because of the negative mood induced by the negative action valence. The effectiveness of the MAMIP was thus tested using an objective measure based on facial emotion identification rather than a subjective measure based on self-description, to avoid well known problems related to the self-referential assessment of internal mood states; i.e., to “emotional self-awareness”
<xref rid="pone.0108211-Kucera1" ref-type="bibr">[30</xref>
,
<xref rid="pone.0108211-Jallais1" ref-type="bibr">31]</xref>
. In the present study the effect of action on mood was thus assessed through an implicit, rather than an explicit, measure based on the biased identification of facial emotions contingent on reaching comfort/discomfort. If mood affects performance then the direction of the bias should be similar to the one observed using other types of mood inducers (e.g., music), being positive when preceded by an inducer with positive valence (i.e., comfortable actions) vs. negative when preceded by an inducer with negative valence (i.e., uncomfortable actions).</p>
<p>Furthermore, it is known that performance is affected by arousal
<xref rid="pone.0108211-Yerkes1" ref-type="bibr">[32]</xref>
. Increases in arousal have been shown to: (1) modulate the responsiveness of neurons in the early mouse visual system
<xref rid="pone.0108211-Bezdudnaya1" ref-type="bibr">[33</xref>
,
<xref rid="pone.0108211-Neill1" ref-type="bibr">34]</xref>
; (2) facilitate attentional mechanisms in tasks requiring sustained performance
<xref rid="pone.0108211-Woods2" ref-type="bibr">[35]</xref>
; (3) improve stereo as well as contrast sensitivity in humans
<xref rid="pone.0108211-Woods1" ref-type="bibr">[13]</xref>
. Luminance contrast on its own is known to provide important information for the recognition of facial expressions and identity
<xref rid="pone.0108211-Gilad1" ref-type="bibr">[36]</xref>
. A further direct link between the perception of facial expression of emotions and arousal has been recently revealed by studies on emotion perception abnormalities.
<xref rid="pone.0108211-Huang1" ref-type="bibr">[37]</xref>
found that schizophrenic patients were more sensitive to angry facial expressions than control observers when processing facial expressions along the happy-to-angry morph continuum. In addition, the tendency of schizophrenic patients to assign emotional salience to neutral social stimuli has been found to correlate with their higher level of emotional arousal
<xref rid="pone.0108211-Haralanova1" ref-type="bibr">[38]</xref>
.</p>
<p>Based on such evidence we expected the precision in facial emotion identification to be higher when the task is preceded by reaching (relative to an inaction
<italic>baseline</italic>
condition without reaching) and, in addition, to be higher after uncomfortable reaching (requiring a high level of motor activation/
<italic>arousal</italic>
) than after comfortable reaching (requiring a low level of motor activation/
<italic>arousal</italic>
). A similar response time asymmetry along the comfort-discomfort continuum was also expected in the facial emotion identification task, given that in general responses are faster at higher arousal levels
<xref rid="pone.0108211-Welford1" ref-type="bibr">[39</xref>
,
<xref rid="pone.0108211-Masanobu1" ref-type="bibr">40]</xref>
. With specific regards to the perception of facial expressions, personality types with higher arousal levels (e.g., individuals with high subclinical anxiety or with anxiety disorder) generally show a stronger anger superiority effect, with faster reaction times to threatening/angry faces
<xref rid="pone.0108211-BarHaim1" ref-type="bibr">[41]</xref>
and an improved capacity to quickly process more threatening faces at once
<xref rid="pone.0108211-Richards1" ref-type="bibr">[42]</xref>
, compared to low trait-anxiety individuals.</p>
</sec>
</sec>
<sec id="s2">
<title>Experiments</title>
<sec id="s2a">
<title>Rationale & Expectations</title>
<p>We tested our hypothesis that body action comfort/discomfort affects the perception of facial expression of emotions in four experiments. In Experiments 1 and 2 action comfort/discomfort was systematically manipulated during visually guided reaching movements under unrestrained body conditions, following the expectation that action valence during motor interaction induces a positive/negative mood that shifts perceived facial expressions in a congruent direction. We tested participants individually in a facial emotion identification task. In two successive blocks, distinguished by reaches of opposite valence, we measured the average Response Time (RT) to 6 levels of morphed expressions, as well as two indices of categorical perception along the happy-to-angry morphed face continuum: (i) the Point of Subjective Neutrality (PSN; i.e., the categorical boundary corresponding to the facial expression that led to equal probabilities of “happy” and “angry” responses) and (ii) the Just Noticeable Difference (JND, defined as half the morph interval between 16 and 84 per cent “angry” responses). In Experiment 1 participants performed 50 comfortable reaches (followed by the emotion identification block) and then 50 uncomfortable reaches (followed by another emotion identification block). The ordering of action type was reversed in Experiment 2, given that mood induction might have a long duration and the perception of changing facial expressions is affected by hysteresis
<xref rid="pone.0108211-Sacharin1" ref-type="bibr">[43]</xref>
.</p>
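As a hedged illustration of how the PSN and JND defined above can be estimated (a sketch under assumptions, not the authors' analysis code; the response proportions below are hypothetical), one can fit a cumulative Gaussian to the proportion of "angry" responses at each morph level: the 50% point of the fit is the PSN, and half the 16-84% morph interval equals the fitted standard deviation, i.e. the JND.

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

morph_angry = np.array([25, 35, 45, 55, 65, 75])           # per cent anger in each morph
p_angry = np.array([0.05, 0.15, 0.40, 0.65, 0.90, 0.97])   # hypothetical "angry" response rates

def cum_gauss(x, mu, sigma):
    # Cumulative Gaussian psychometric function.
    return norm.cdf(x, loc=mu, scale=sigma)

(mu, sigma), _ = curve_fit(cum_gauss, morph_angry, p_angry, p0=[50.0, 10.0])
psn = mu      # Point of Subjective Neutrality: 50% "angry" responses
jnd = sigma   # half of the 16-84% morph interval for a cumulative Gaussian
print(round(psn, 1), round(jnd, 1))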
<p>The following hypotheses were considered:</p>
<list list-type="bullet">
<list-item>
<p>
<italic>H1)</italic>
In both experiments individual PSNs should be shifted in the direction opposite to action valence (for instance, after an uncomfortable action the PSN should correspond to a morphed face containing more happiness than anger relative to the PSN obtained after the comfortable action).</p>
</list-item>
<list-item>
<p>
<italic>H2)</italic>
As an effect of hysteresis PSNs should be globally shifted towards happiness in Experiment 2, relative to Experiment 1, given that in Experiment 2 initial reaching acts were uncomfortable, possibly inducing a negative mood that biased the whole session, making slightly happy faces look neutral.</p>
</list-item>
<list-item>
<p>
<italic>H3</italic>
) In both experiments we expected JNDs and RTs to be smaller after uncomfortable than comfortable reaches.</p>
</list-item>
<list-item>
<p>Facial expressions of happiness and anger are known to have different hedonic impact
<xref rid="pone.0108211-Becker1" ref-type="bibr">[44</xref>
,
<xref rid="pone.0108211-Marinetti1" ref-type="bibr">45]</xref>
. A 2D morphing procedure like the one we used generates an image resulting from the linear interpolation of image features. Therefore, a 50 per cent morph (in which a fully happy expression and a fully angry expression of the same person are present in equal proportions) may not necessarily correspond to a facial expression experienced as neutral. One major aim of Experiment 3 was thus to identify the
<italic>baseline</italic>
values of PSN and JND by measuring accuracy and precision in the same facial emotion identification task utilized in Experiments 1 and 2, but in the absence of previously performed actions. Hence, the following hypotheses were included:</p>
</list-item>
<list-item>
<p>
<italic>H4</italic>
) If goal-directed reaches have an arousing effect on performance, then the average JNDs obtained in Experiments 1 and 2 should be smaller than the
<italic>baseline</italic>
JND in Experiment 3.</p>
</list-item>
<list-item>
<p>
<italic>H5)</italic>
If comfortable reaches empower our sense of motor skillfulness, thus contributing to the establishment of a more positive mood than the neutral mood experienced in the absence of action (Experiment 3 -
<italic>baseline</italic>
condition), then average PSNs after comfortable reaches in Experiment 1 should be shifted toward anger relative to the
<italic>baseline</italic>
PSN in Experiment 3. This hypothesis is based on the general idea that, relative to inaction, action is rewarding, if executed within the comfort range. Vice versa, PSNs after uncomfortable reaches in Experiment 2 should be shifted toward happiness relative to the
<italic>baseline</italic>
PSN in Experiment 3, since reaching outside the natural grasping range would induce a negative mood, as a direct product of discomfort or as an effect of experiential avoidance
<xref rid="pone.0108211-Sloan1" ref-type="bibr">[46]</xref>
). It should be stressed that the expectation of a positive effect of comfortable reaches (relative to the baseline measured in Experiment 3) critically follows from the idea that engagement in comfortable actions is more pleasant than the comfort associated with inaction.</p>
</list-item>
</list>
<p>Finally, Experiment 4 was run to validate our happy-to-angry morph continuum, allowing us to extract another group
<italic>baseline</italic>
PSN using a different task and a different experimental setting: a large group of participants were asked to position every emotional face belonging to the morph set used in Experiments 1–3 on a 1–17 graphic rating scale (from happy to angry in
<italic>version A</italic>
and vice versa in
<italic>version B</italic>
).</p>
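Purely as an illustration of how a group baseline PSN could be read off the 1–17 rating data described above (an assumption about the analysis, not the authors' procedure; the mean ratings below are hypothetical), one can interpolate the morph level whose mean rating falls at the neutral midpoint of the scale:

import numpy as np

morph_angry = np.array([25, 35, 45, 55, 65, 75])            # per cent anger in each morph
mean_rating = np.array([4.2, 6.1, 8.0, 10.3, 12.6, 14.8])   # hypothetical group means, version A (1 = happy, 17 = angry)

baseline_psn = np.interp(9.0, mean_rating, morph_angry)      # morph level rated at the neutral midpoint
print(round(float(baseline_psn), 1))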
<sec id="s2a1">
<title>Participants</title>
<p>Two groups of undergraduates (total number = 119) of the University of Trieste participated in the experiments. All had normal or corrected-to-normal vision and were naïve to the purpose of the experiment. Students in the first group (n = 30; women = 21, median age = 22, all right handed) were randomly assigned to Experiments 1–3 (Experiments 1 and 2, 9 participants each; Experiment 3, 12 participants) and received class credit for participation. The data of Experiment 4 were gathered in two classroom meetings with 19 (version A) and 70 (version B) psychology students (women = 64; median age = 20), who took part in a 90-min collective session.</p>
<p>The study was approved by the Research Ethics Committee of the University of Trieste (approval number 52) in compliance with national legislation, the Ethical Code of the Italian Association of Psychology, and the Code of Ethical Principles for Medical Research Involving Human Subjects of the World Medical Association (Declaration of Helsinki). Participants in Experiments 1–3 provided their written informed consent prior to inclusion in the study. Participants in Experiment 4 provided their oral informed consent before a data collection session included in lecture hours of an “introduction to perception” course. The request for oral consent formulated by the instructor (co-author WG) made explicit that people not willing to participate in the session should simply not accept the response sheet, without any consequence for the evaluation of their course attendance. The instructor specified that the required oral consent was a confirmation of the general agreement (included in the information about psychology undergraduate courses) that lectures would include classroom demonstrations and participation in short experiments, as an important part of activities directed to the fulfilment of standard learning outcomes. In Experiment 4 data were collected in a group session, so written consent (implying identification of every respondent) was redundant. Age and gender were the only elements of personal information included in the response sheet, reinforcing the emphasis on the anonymous treatment of data which was part of the group instructions at the beginning of the session. All students present in the classrooms accepted the response sheet and therefore behaved as active participants in the data collection sessions of Experiment 4. Response sheets were filed as raw documents. The Ethics Committee of the University of Trieste approved the participation of regularly enrolled students in data collection sessions connected to this specific study. The Ethics Committee of the University of Trieste thus approved both the written informed consent used for Experiments 1–3 and the oral informed consent used for Experiment 4. The dataset is available as (
<xref ref-type="supplementary-material" rid="pone.0108211.s001">Data S1</xref>
).</p>
</sec>
<sec id="s2a2">
<title>Apparatus & Stimuli</title>
<p>In Experiments 1–3 participants were seated in a dark laboratory in front of a high-quality, front-silvered 40×30 cm mirror, slanted at 45° relative to the participant’s sagittal body mid-line and reflecting images displayed on a Sony Trinitron Color Graphic Display GDM-F520 CRT monitor (19″; 1024×768 pixels; 85 Hz refresh rate), placed at the left of the mirror (
<xref ref-type="fig" rid="pone-0108211-g001">Figure 1b, c</xref>
). For consistent vergence and accommodative information, the position of the monitor, attached to a linear positioning stage (Velmex Inc., Bloomfield, NY, USA), was adjusted on a trial-by-trial basis to equal the distance from the participant’s eyes to the virtual/real object to be reached during the reaching block. To generate 3D visual displays we used a frame interlacing technique in conjunction with liquid crystal FE-1 goggles (Cambridge Research Systems, Cambridge, UK) synchronized with the monitor's frame rate. Head and index movements were acquired on-line with sub-millimeter resolution by using an Optotrak Certus motion capture system with two position sensors (Northern Digital Inc., Waterloo, Ontario, Canada). Head movements updated the participant’s viewpoint to present the correct geometrical projection of the stimulus in real time. The position of the index tip was calculated during the system calibration phase with respect to three infrared-emitting diodes attached to the distal phalanx. A custom C++ program was used for stimulus presentation as well as for the recording of response types (left/right keys of the computer keyboard) and RTs.</p>
<fig id="pone-0108211-g001" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0108211.g001</object-id>
<label>Figure 1</label>
<caption>
<title>Random dot rod, action settings and facial stimulus set.</title>
<p>A stereogram representing a frontal view of the random dot rod used in our reaching blocks, together with the red sphere used to provide visual feedback of the index finger (cross-fuse), is shown in (a). Panels (b) and (c) sketch the action settings used in comfortable and uncomfortable reaching blocks, respectively. The facial stimulus set is illustrated in (d): the top row shows the 6 faces of the happy-angry continuum (including percentages of extreme anger in the 25–75 per cent range, and complementary percentages of extreme happiness) and the fully happy (left) and fully angry (right) expressions used to generate the morph continuum, belonging to the fourth character of the bottom row; the bottom row shows the 8 characters selected from the Radboud database, displaying the “neutral” expression obtained by morphing the fully happy and fully angry expressions in equal percentages (50 per cent each).</p>
</caption>
<graphic xlink:href="pone.0108211.g001"></graphic>
</fig>
<p>High-contrast random-dot visual stimuli were rendered in stereo, simulating a vertically oriented rod with a dot density of 30 per cent and a visible back surface (
<xref ref-type="fig" rid="pone-0108211-g001">Figure 1a</xref>
). The rod radius was 7.5 mm and the height 65 mm. The simulated egocentric depth of the rod axis along the line of sight was randomly chosen in the 0.65–0.75 range (
<xref ref-type="fig" rid="pone-0108211-g001">Figure 1b</xref>
), in the Comfortable block, and in the 0.90–1.00 range, in the Uncomfortable block (
<xref ref-type="fig" rid="pone-0108211-g001">Figure 1c</xref>
), relative to the arm length of each participant. A physical rod (equal in shape to the virtual one), placed behind the mirror and completely occluded from the participant, was attached to a linear positioning stage (Velmex Inc., Bloomfield, NY, USA) and adjusted on a trial-by-trial basis so as to align it perfectly with the virtual stimulus.</p>
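<p>To give a concrete sense of the depth manipulation, the sketch below (in R) converts the two relative ranges into centimetres for a hypothetical arm length; the 70 cm value is an assumption introduced only for illustration, not a measured quantity, and the resulting difference between block midpoints is consistent with the average depth difference of 17.74 cm reported in the Results.</p>
<preformat>
## Illustrative sketch (R): relative reach depths expressed in centimetres
## for a hypothetical arm length of 70 cm (an assumed value, not a datum).
arm_cm        = 70
comfortable   = c(0.65, 0.75) * arm_cm   # about 45.5 to 52.5 cm
uncomfortable = c(0.90, 1.00) * arm_cm   # about 63.0 to 70.0 cm
mean(uncomfortable) - mean(comfortable)  # about 17.5 cm between block midpoints
</preformat>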
<p>For our facial stimulus set (
<xref ref-type="fig" rid="pone-0108211-g001">Figure 1d</xref>
), we selected 8 characters (four Caucasian males and four Caucasian females) from the Radboud University Nijmegen set
<xref rid="pone.0108211-Langner1" ref-type="bibr">[47]</xref>
. The colored photographs displayed facial expressions of two basic emotions, happiness and anger, all producing a high agreement of their intended expressions in the validation study. A happy-to-angry continuum was generated for each of the 8 characters, morphing the fully happy face and the fully angry face in variable proportions, in 5 per cent steps, using MATLAB software adapted from open source programs. Given two facial images and about 75 key points, the software generates a synthetic image that contains a specified mixture of the original faces, using a sophisticated morphing algorithm that implements the principles described by
<xref rid="pone.0108211-Benson1" ref-type="bibr">[48]</xref>
. As in
<xref rid="pone.0108211-Marneweck1" ref-type="bibr">[49]</xref>
, we identified corresponding points in the two faces, with more points around areas of greater change with increasing emotional intensity (pupils, eyelids, eyebrows, and lips). For every character 6 morph intensities were selected along the happy-to-angry continuum, from 25 per cent angry ( = 75 per cent happiness) to 75 per cent angry ( = 25 per cent happy). All images were aligned for facial landmarks and masked by an oval vignette hiding hair and ears, presented on a black surround. The vignette was centered on the screen and had a size of 6.5×9.4 cm, corresponding to 7.5°×10.7° at the average viewing distance of 50 cm. Facial images used in each experimental trial were randomly extracted from this set of 48 stimuli (8 characters×6 facial expressions).</p>
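<p>As a check on the reported stimulus geometry, the following sketch (in R) recomputes the visual angle subtended by the 6.5×9.4 cm vignette at the 50 cm viewing distance with the standard formula 2·atan(size/(2·distance)); the function name visual_angle_deg is ours and is introduced only for illustration.</p>
<preformat>
## Sketch (R): visual angle of the oval vignette at the lab viewing distance.
visual_angle_deg = function(size_cm, distance_cm) {
  2 * atan(size_cm / (2 * distance_cm)) * 180 / pi
}
visual_angle_deg(6.5, 50)   # about 7.4 deg (width)
visual_angle_deg(9.4, 50)   # about 10.7 deg (height)
</preformat>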
<p>In Experiment 4 the same stimulus set was presented in a predefined pseudo-random order using PowerPoint through a high-resolution MARCA video projector connected to the graphic output of a MAC-PRO (3D graphic accelerator). Participants were comfortably seated in a dimly lit classroom while facing the projection screen at an average distance of 12.25 m. The average visual angle subtended by the classroom displays was similar to the visual angle in Experiments 1–3, given that they were 35 times larger than the stimuli displayed on the lab CRT and the participant’s distance from the projection screen was about 35 times the viewing distance in the lab. Every participant was provided with a response form containing 48 numbered line segments, each with 17 equally spaced ticks (the two extreme ticks and the central tick marked in bold). Above the two extreme ticks two verbal labels were displayed: “happy” (left) and “angry” (right) for
<italic>version A,</italic>
and vice versa for
<italic>version B.</italic>
This manipulation was intended to control for possible effects of the spatial orientation of the rating scale.</p>
</sec>
<sec id="s2a3">
<title>Procedure</title>
<p>
<italic>Reaching blocks</italic>
(Experiments 1 and 2): The participant started a right-hand movement from a fixed, out-of-view position, shifted by about 25 cm from the sagittal plane and 15 cm from the coronal plane. The tip of his/her index finger, marked by a virtual red sphere (
<xref ref-type="fig" rid="pone-0108211-g001">Figure 1a</xref>
), was constantly visible from the moment the finger entered in the participant’s visual field. The task was to reach and touch the simulated random dot rod (
<xref ref-type="fig" rid="pone-0108211-g001">Figure 1a</xref>
) positioned along the line of sight (
<xref ref-type="fig" rid="pone-0108211-g001">Figure 1b, c</xref>
). Each successful reach was accompanied by haptic feedback (
<xref ref-type="fig" rid="pone-0108211-g001">Figure 1b, c</xref>
, red floating rod) and followed by acoustic feedback. Each block comprised 50 reaches, with the depth extent of each reach randomly selected in a range below (0.65–0.75 of arm length, Comfortable block) or above (0.90–1.00 of arm length, Uncomfortable block) the individual preferred critical boundary for one-degree-of-freedom visually guided reaching
<xref rid="pone.0108211-Mark1" ref-type="bibr">[12]</xref>
, corresponding to the distance beyond which actors must introduce additional degrees of freedom to reach an object, beyond those associated with arm movements alone (
<xref ref-type="fig" rid="pone-0108211-g001">Figure 1b, c</xref>
).</p>
<p>The ranges of depth used for comfortable vs. uncomfortable actions were established empirically on the basis of the results of a preliminary experiment, in which 12 randomly selected students (6 women; median age = 23) of the University of Trieste were asked to perform 50 reaches toward the same random-dot cylinder used in Experiments 1 and 2, whose depth was randomly varied across trials over the entire 0.65–1.00 range of arm length (the same experimental setting of Experiments 1 and 2 was used). After each reach the participant was asked to rate the discomfort of the performed action on a 0–50 discomfort scale adapted from the pain scale of [50] (0 = reach felt completely natural; 25 = reach felt slightly unnatural, causing moderate discomfort; 50 = reach felt completely unnatural, causing severe discomfort).
<xref ref-type="fig" rid="pone-0108211-g002">Figure 2</xref>
illustrates the average ratio between the rating and the maximum value of the scale (over 7 equal intervals of relative reaching distance), together with the best-fitting sigmoid function, whose parameters were extracted by modelling the whole set of individual responses using a generalized linear model based on a
<italic>Cauchy link</italic>
function with a variable slope and intercept for every participant.</p>
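<p>A minimal sketch of this type of fit is given below in R, assuming a data frame d with hypothetical columns rel_dist (reaching distance relative to arm length), rel_rating (rating divided by the scale maximum of 50), and participant (a factor); the quasibinomial family is our choice for accommodating proportion data and is not necessarily the exact call used for the published fit.</p>
<preformat>
## Minimal sketch (R): Cauchy-link GLM for the relative discomfort ratings.
## Data frame and column names (d, rel_dist, rel_rating, participant) are hypothetical.
fit = glm(rel_rating ~ rel_dist * participant,       # participant-specific slope and intercept
          family = quasibinomial(link = "cauchit"),  # Cauchy link for proportion responses
          data = d)
summary(fit)
</preformat>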
<fig id="pone-0108211-g002" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0108211.g002</object-id>
<label>Figure 2</label>
<caption>
<title>Subjective estimate of action discomfort increases with reaching distance.</title>
<p>Average relative rating of action discomfort as a function of reaching distance (measured relative to individual arm length) collected in the preliminary experiment. Small dots represent individual color-coded average ratings for 7 equal intervals of relative reaching distance. The larger red dots represent the global average ratings ± SEM. The black line is the generalized linear model regression curve and the shaded region represents ± standard error of the regression.</p>
</caption>
<graphic xlink:href="pone.0108211.g002"></graphic>
</fig>
<p>Two main results emerged: (1) the entire range of depths used to manipulate reaching comfort/discomfort (0.65–1.00 of arm length) produced a sizable effect on the subjective estimate of action discomfort, which increased monotonically with reaching distance for all tested participants (r
<sup>2</sup>
 = 0.86, slope = −8.53±1.15, intercept = 9.73±1.31,
<italic>df</italic>
 = 554,
<italic>z</italic>
 = 7.37,
<italic>p</italic>
 = 0.0001); (2) the average distance (0.88±0.020) at which the cumulative function crosses the 0.5 response level was close to the preferred critical boundary for one degree of freedom visually guided reaching found by
<xref rid="pone.0108211-Mark1" ref-type="bibr">[12]</xref>
. These preliminary results were in agreement with previous findings showing that, during reaching, the smaller the amount of compensatory body movements other than those of the arm (such as shoulder or trunk movements), the larger the action
<italic>comfort</italic>
<xref rid="pone.0108211-Carello1" ref-type="bibr">[51</xref>
,
<xref rid="pone.0108211-Choi1" ref-type="bibr">52</xref>
,
<xref rid="pone.0108211-Mark1" ref-type="bibr">12]</xref>
. According to such results, a person is in a state of postural comfort if there is no (possibly unaware) desire or need for compensatory motions of other body parts, and none is likely to arise
<xref rid="pone.0108211-Warren1" ref-type="bibr">[7]</xref>
. Furthermore, the results demonstrated that in our setup visually guided reaches were felt as comfortable in the 0.65–0.75 depth range and uncomfortable in the 0.90–1.00 depth range, thus setting the optimal conditions for the occurrence of opposite biases in the perception of facial expressions.</p>
<p>The procedure included: a session in which the participant’s arm length at rest (i.e., the effective maximum reach) was carefully measured following a procedure similar to the one used by
<xref rid="pone.0108211-Mark1" ref-type="bibr">[12]</xref>
(see Appendix 1A in
<xref rid="pone.0108211-Mark1" ref-type="bibr">[12]</xref>
), instructions, a training session with 15 reaches randomly extracted across the entire depth range used in the experiment (0.65–1.00 of arm length), and the experimental session.</p>
<p>
<italic>Facial emotion identification task</italic>
(Experiments 1–3): In Experiments 1 and 2 the participant performed the required reaches and then the facial emotion identification task, lasting 48 trials (approximately 10 minutes). In Experiment 3 the participant performed only the 48-trial facial emotion identification task, not preceded by reaching actions. Compared to Experiment 3, the facial emotion identification task in Experiments 1 and 2 thus involved more physical constraints (that might slow down responses): the participant had to identify facial expressions right after the MAMIP, when his/her movements were still limited by infra-red markers, and his/her left hand and fingers had to be positioned on the response pad by the experimenter. The 48 experimental displays resulted from the combination of 8 characters (4 actors and 4 actresses)×6 morph levels (from 25 to 75 per cent anger). The psychophysical method of constant stimuli was used in order to measure, for every participant, the PSN and JND for each of the 8 morph continua. Each facial emotion identification trial included the following: (1) a 30-pixel-wide green circle was displayed at the center of the screen for about 300 ms; (2) the face stimulus was displayed for 500 ms; (3) a blank screen followed (if the response was provided during the face presentation, the blank screen lasted 200 ms); (4) the blank screen remained until the participant pressed one of the two response keys with his/her left hand (left key for “happy” vs. right key for “angry”); (5) the next trial followed. The left hand was used for responses to the identification task given that in Experiments 1 and 2 the right hand, wearing markers, was used for the reaching task.</p>
<p>The experiments were run in a dark room allowing for dark adaptation. The participant was seated 50 cm away from the screen reflected in the mirror. The procedure included instructions, a training session in which the stimuli for the facial emotion identification task were the fully happy and fully angry faces of the 8 characters, presented twice in random order, and the experimental session.</p>
<p>
<italic>Rating scale task</italic>
(Experiment 4): The procedure was the same as in Experiment 3, except that participants were instructed to perform a different task on emotional face stimuli. Specifically, participants were carefully instructed to rate the amount of happiness/anger of each emotional face by crossing out the tick that marked the position along the happy-to-angry continuum corresponding to the displayed face.</p>
</sec>
</sec>
</sec>
<sec id="s3">
<title>Results and Discussion</title>
<sec id="s3a">
<title>Statistical analysis</title>
<p>In Experiments 1–3, indices of individual facial emotion identification performance were calculated by fitting a psychometric curve to individual data; i.e., to the percentage of “angry” responses as a function of the percentage of full anger in the 6 sets of morphed faces (each including 4 males and 4 females). Curve fitting followed the procedure indicated by [53]. We modelled the whole set of binary responses using a generalized linear model with a
<italic>probit link</italic>
function with variable slope (β
<sub>1</sub>
) and intercept (β
<sub>0</sub>
) for every combination of participant, reaching block, and experiment. Then, we reparametrized each individual Gaussian function fit in terms of its mean (−β
<sub>0</sub>
/β
<sub>1</sub>
) and standard deviation (1/β
<sub>1</sub>
). The mean defined the PSN along the happy-to-angry continuum, corresponding to equal probabilities of obtaining “happy” and “angry” responses (i.e., to maximum uncertainty). The standard deviation defined the JND.</p>
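<p>For illustration, the probit fit and its reparametrization can be sketched in R as follows, here fitted separately for a single participant and reaching block; the data frame d and its columns resp (1 = “angry”, 0 = “happy”) and anger (per cent anger in the morph) are hypothetical names, not those of the original analysis script.</p>
<preformat>
## Sketch (R): probit psychometric function for one participant in one reaching block.
fit = glm(resp ~ anger, family = binomial(link = "probit"), data = d)
b0  = coef(fit)[1]    # intercept (beta_0)
b1  = coef(fit)[2]    # slope (beta_1)
PSN = -b0 / b1        # per cent anger yielding 50% "angry" responses
JND = 1 / b1          # SD of the underlying cumulative Gaussian
</preformat>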
<p>Panels a, b in
<xref ref-type="fig" rid="pone-0108211-g003">Figure 3</xref>
illustrate the average percentage of “angry” responses together with the best fitting cumulative Gaussian as a function of per cent anger for comfortable (red) vs. uncomfortable (blue) actions, for the two orderings of reaching blocks: comfortable-uncomfortable (panel a) vs. uncomfortable-comfortable (panel b). As an index of identification precision we used the JND, corresponding to the standard deviation of the best fitting Gaussian model (smaller JND indicating higher identification precision). To provide an additional converging measure of the possible effect of action-induced mood on facial identification performance we also analyzed individual RTs (taking as valid RTs those between 200 and 4000 ms, which led to the removal of 44 out of 2592 values collected over Experiments 1–3) averaged for each of the 6 morph levels (c and d panels in
<xref ref-type="fig" rid="pone-0108211-g003">Figure 3</xref>
, for Experiments 1 and 2, respectively).</p>
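<p>The RT pre-processing amounts to a simple validity filter followed by per-condition averaging; a sketch in R is given below, with hypothetical data frame and column names (rts, RT in ms, participant, block, anger).</p>
<preformat>
## Sketch (R): keep RTs between 200 and 4000 ms, then average per morph level.
valid  = subset(rts, RT >= 200 & RT <= 4000)
meanRT = aggregate(RT ~ participant + block + anger, data = valid, FUN = mean)
</preformat>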
<fig id="pone-0108211-g003" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0108211.g003</object-id>
<label>Figure 3</label>
<caption>
<title>Distributions of percentages of “angry” responses and RTs.</title>
<p>The 4 panels depict the average percentages of “angry” responses (a, b panels) and RTs (c, d panels) [± SEM] as a function of per cent anger, after the comfortable/uncomfortable (red/blue symbols, respectively) reaching blocks and in the absence of action (green symbols). Red and blue curves in a, b panels are the best average cumulative Gaussian fits of response percentages, with shaded bands indicating ± standard error of regression. Green curves represent the average distributions, ± SEM, obtained in Experiment 3. The pink line represents the average PSN, ± SEM, obtained in Experiment 4. Data in the left panels (a, c) refer to Experiment 1 (comfortable-uncomfortable order); data in the right panels (b, d) refer to Experiment 2 (opposite order).</p>
</caption>
<graphic xlink:href="pone.0108211.g003"></graphic>
</fig>
<p>
<xref ref-type="fig" rid="pone-0108211-g004">Figure 4</xref>
shows the average PSNs and JNDs for the two reaching blocks in Experiments 1 (comfortable block first) and 2 (uncomfortable block first), relative to baseline values obtained in Experiments 3 and 4. We analyzed PSNs and JNDs using a linear mixed-effect (
<italic>lme</italic>
) model with participants as random effects, and reaching block (comfortable vs. uncomfortable) and Experiment (1 vs. 2) as fixed effects
<xref rid="pone.0108211-Bates1" ref-type="bibr">[54</xref>
,
<xref rid="pone.0108211-Bates2" ref-type="bibr">55]</xref>
. A similar
<italic>lme</italic>
analysis was applied to RTs, using the per cent anger in the morph as a fixed factor to manage the intrinsic nonlinearity between RT and morph intensity. Data of Experiment 4 were first converted into a –50 (fully happy) to 50 (fully angry) scale and then analyzed using a
<italic>lme</italic>
model with both participant and actor as random effects, and per cent anger in our stimulus set and the version of the rating scale (A vs. B) as fixed effects. We used Type-III-like two-tailed
<italic>p</italic>
-values, adjusting the denominator degrees of freedom of the
<italic>F</italic>
-tests with the Kenward-Roger approximation implemented in the KRmodcomp function of the R package pbkrtest
<xref rid="pone.0108211-Halekoh1" ref-type="bibr">[56</xref>
,
<xref rid="pone.0108211-Halekoh2" ref-type="bibr">57]</xref>
. Among the indices that have been proposed as reliable measures of the predictive power and of the goodness of fit for
<italic>lme</italic>
models (e.g.,
<xref rid="pone.0108211-Sun1" ref-type="bibr">[58]</xref>
) we selected the concordance correlation coefficient,
<italic>r
<sub>c</sub>
</italic>
, providing a measure of the degree of agreement between the observed values and the predicted values, in the –1 to 1 range
<xref rid="pone.0108211-Vonesh1" ref-type="bibr">[59]</xref>
. Post-hoc tests were performed using two-tailed
<italic>t</italic>
-tests and Cohen's
<italic>d</italic>
as a measure of effect size for significant effects.</p>
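<p>A compact sketch of this modelling pipeline, based on the lme4 and pbkrtest packages mentioned above, is reported below for the PSN analysis; the data frame and column names are hypothetical, the Kenward-Roger F-test is shown for the Reaching × Experiment interaction as an example, and the concordance correlation coefficient is computed with its standard formula.</p>
<preformat>
## Sketch (R): linear mixed-effects model with a Kenward-Roger F-test and the
## concordance correlation coefficient. Data frame and column names are hypothetical.
library(lme4)
library(pbkrtest)

full    = lmer(PSN ~ reaching * experiment + (1 | participant), data = psn)
reduced = lmer(PSN ~ reaching + experiment + (1 | participant), data = psn)
KRmodcomp(full, reduced)   # Kenward-Roger approximated F-test for the interaction

## Concordance correlation coefficient between observed and fitted values.
ccc = function(x, y) {
  2 * cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2)
}
ccc(psn$PSN, fitted(full))
</preformat>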
<fig id="pone-0108211-g004" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0108211.g004</object-id>
<label>Figure 4</label>
<caption>
<title>Action comfort/discomfort biases the perception of facial emotions.</title>
<p>Average PSNs (a) and JNDs (b), ± SEM, for the comfortable (red) and uncomfortable (blue) reaching blocks in Experiments 1 (comfortable → uncomfortable) and 2 (uncomfortable → comfortable) as coded along the x-axis. Horizontal green and violet lines represent the baseline scores, ± SEM, obtained in Experiments 3 and 4. In (a) these scores are the reference for evaluating the biasing effects of action comfort/discomfort, with PSNs larger than the baseline indicating an overall happiness superiority, and PSNs smaller than the baseline indicating an anger superiority. In (b) values below the green line indicate a precision improvement induced by the reaching block. (c) Individual PSN difference between uncomfortable and comfortable reaching sessions in Experiments 1 (light grey) and 2 (dark grey). A negative value represents an increased likelihood of perceiving a facial expression as being angry after the uncomfortable block. (d) Individual JND difference between uncomfortable and comfortable reaching sessions in Experiments 1 (light grey) and 2 (dark grey). A negative value represents a stronger improvement in facial expression sensitivity after the uncomfortable (rather than comfortable) block.</p>
</caption>
<graphic xlink:href="pone.0108211.g004"></graphic>
</fig>
</sec>
<sec id="s3b">
<title>Biasing the perception of facial emotion through action comfort/discomfort</title>
<p>Average PSNs shown in
<xref ref-type="fig" rid="pone-0108211-g004">Figure 4a</xref>
were in strong agreement with
<italic>H1</italic>
: the PSN was indeed biased in opposite directions after comfortable (towards anger) vs. uncomfortable (towards happiness) reaching blocks in Experiments 1 and 2. In Experiment 1, the likelihood of interpreting a facial expression as angry increased by about 130 per cent (odds ratio) after participants were adapted to uncomfortable reaching acts, with average PSNs measuring 50.9±0.97 per cent anger and 47.7±0.83 per cent anger (
<italic>F</italic>
<sub>1,8</sub>
 = 12.31,
<italic>p</italic>
 = 0.007), after comfortable and uncomfortable reaching blocks, respectively. The effect was strikingly similar in Experiment 2, where the odds of an “angry” response after the uncomfortable reaching block outperformed those after the comfortable reaching block by 116 per cent, with average PSNs measuring 43.4±1.94 per cent anger and 42.0±1.83 per cent anger (
<italic>F</italic>
<sub>1,8</sub>
 = 5.5,
<italic>p</italic>
 = 0.04), after comfortable and uncomfortable reaching blocks, respectively. Consistently with the effectiveness of our MAMIP and with perceptual hysteresis (
<italic>H2</italic>
), we found a lower PSN in the uncomfortable-comfortable reaching condition (Experiment 2, 42.71 per cent anger) than in the comfortable-uncomfortable reaching condition (Experiment 1, 49.31 per cent anger).</p>
<p>The above-described effects of motor action mood induction on the PSN were confirmed by the
<italic>lme</italic>
model with Experiment as a fixed effect, which revealed significant main effects of Reaching (F
<sub>1,16</sub>
 = 17.62,
<italic>p</italic>
 = 0.0007) and Experiment (F
<sub>1,16</sub>
 = 10.58,
<italic>p</italic>
 = 0.005), but not their interaction (F
<sub>1,16</sub>
 = 2.95,
<italic>p</italic>
 = 0.104). Only 50 reaching acts distributed over 10 min, differing only slightly in depth extent (average depth difference between comfortable and uncomfortable reaches = 17.74±0.19 cm), produced dramatic changes in the perception of facial expressions.</p>
<p>However, a baseline
<italic>lme</italic>
model revealed a systematic bias in identification performance towards anger in Experiments 1 and 2, with an estimated PSN (averaged across experiments) of 46.02±1.26 per cent anger (
<italic>t</italic>
 = 36.26). Given such a bias, we wondered whether it was due to our MAMIP or whether it was in line with a well known phenomenon in the emotion perception literature: angry faces “pop out” of crowds [60]. To address this question we contrasted the average PSN from Experiments 1 and 2 with that obtained in Experiment 3 (45.5±1.7 per cent anger), where a similar anger superiority effect was found even in the absence of previously performed reaches (Welch Two Sample
<italic>t</italic>
 = 0.29,
<italic>df</italic>
 = 17.51,
<italic>p</italic>
 = 0.77). A similar result was also found in Experiment 4, where we used a different measurement method (rating scale task) and performed the experiment in the field (classroom), rather than in the laboratory. Average PSNs as extracted from an
<italic>lme</italic>
model with the per cent anger in our stimulus set as the only continuous predictor (
<italic>slope</italic>
 = 1.21,
<italic>F</italic>
<sub>1,3993</sub>
 = 7534,
<italic>p</italic>
 = 0.000,
<italic>r
<sub>c</sub>
 = </italic>
0.84) revealed no effect of the ordering of the rating scale (
<italic>F</italic>
<sub>1,87</sub>
 = 0.77,
<italic>p</italic>
 = 0.38). A similar bias toward anger was observed with both orientations of the response scale (version A: 46.19±0.67 per cent; version B: 47.2±0.97 per cent). Again, the magnitude of the anger superiority effect revealed by Experiment 4 was about the same as the one obtained in Experiments 1 and 2 (PSN = 46.40±0.49 per cent, Welch Two Sample
<italic>t</italic>
 = 0.56,
<italic>df</italic>
 = 82.98,
<italic>p</italic>
 = 0.57).</p>
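<p>The baseline contrasts reported above are ordinary Welch two-sample t-tests, which is the default behaviour of t.test in R (unequal variances assumed); the vector names below are hypothetical, and the pooled-SD Cohen’s d function is one common variant, added only for illustration.</p>
<preformat>
## Sketch (R): Welch two-sample t-test contrasting PSNs pooled over Experiments 1-2
## with the Experiment 3 baseline. Vector names (psn_exp12, psn_exp3) are hypothetical.
t.test(psn_exp12, psn_exp3)   # var.equal = FALSE by default (Welch)

## Pooled-SD Cohen's d (one common variant) for significant contrasts.
cohens_d = function(x, y) {
  sp = sqrt(((length(x) - 1) * var(x) + (length(y) - 1) * var(y)) /
            (length(x) + length(y) - 2))
  (mean(x) - mean(y)) / sp
}
cohens_d(psn_exp12, psn_exp3)
</preformat>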
<p>In summary, the present results reveal a symmetric bias in the perception of facial expressions, induced by comfortable/uncomfortable reaches. Consistently with
<italic>H5</italic>
, a sequence of comfortable reaches performed before the facial emotion identification task induced an increased likelihood of interpreting a facial expression as happy relative to the baseline. By contrast, uncomfortable reaches induced an increased likelihood of interpreting a facial expression as angry.</p>
</sec>
<sec id="s3c">
<title>Improving precision through action comfort/discomfort</title>
<p>To assess the impact of
<italic>hyper-arousal from reaching</italic>
on the ability to identify subtle facial expressions of emotion, we analyzed the JNDs and RTs in the absence of reaching blocks (Experiment 3) and immediately after them (Experiments 1 and 2). Three plausible patterns of results were considered:
<list list-type="order">
<list-item>
<p>Consistent with
<italic>H4</italic>
: JNDs and RTs in Experiments 1 and 2 smaller than those in Experiment 3;</p>
</list-item>
<list-item>
<p>Consistent with
<italic>H3</italic>
: JNDs and RTs after the uncomfortable reaching block smaller than JNDs and RTs after the comfortable reaching block;</p>
</list-item>
<list-item>
<p>Inconsistent with both
<italic>H3</italic>
and
<italic>H4</italic>
: Neither JNDs nor RTs smaller after an uncomfortable reaching block (inducing hyper-arousal).</p>
</list-item>
</list>
<p>The first pattern of results would suggest that goal-directed reaches can influence arousal, triggering an arousal-based improvement in emotional face processing, revealed by an increased sensitivity to facial expression differences (measured by the JND in the classification task) and by a reduction of the degree of uncertainty in emotion classification (measured by RTs). The second pattern of results would suggest that arousal can be modulated continuously by the nature of goal-directed reaches, whether comfortable or uncomfortable. In contrast, the last pattern of results would suggest that reaching actions provide little or no benefit to arousal states. Average JNDs shown in panel b of
<xref ref-type="fig" rid="pone-0108211-g004">Figure 4</xref>
are in good agreement with hypotheses
<italic>H3</italic>
and
<italic>H4</italic>
: participants’ sensitivity to subtle facial expression differences improved after both reaching blocks, but the improvement was larger after the uncomfortable than after the comfortable sequence of reaches. The distributions of average RTs depicted in panels c and d of
<xref ref-type="fig" rid="pone-0108211-g003">Figure 3</xref>
provide converging evidence in support of hypothesis
<italic>H3</italic>
: participants indeed responded more quickly, thus showing an increased degree of certainty in performing the emotion identification task, after the uncomfortable sequence of reaches than after the comfortable one.</p>
<p>In Experiment 1, the JND after adaptation to uncomfortable reaches was about half of that after comfortable reaches (from 10.22±0.5 per cent anger to 6.15±1.12 per cent anger;
<italic>F</italic>
<sub>1,8</sub>
 = 11.41,
<italic>p</italic>
 = 0.009). A similar although smaller effect was found in Experiment 2 in which the JND decreased by about 16 per cent after uncomfortable rather than comfortable reaches (from 10.7±0.76 per cent anger to 9.0±1.11 per cent anger;
<italic>F</italic>
<sub>1,8</sub>
 = 5.1,
<italic>p</italic>
 = 0.048).</p>
<p>In Experiment 1, RTs were similarly affected by both the mood induced by body action (
<italic>F</italic>
<sub>1,88</sub>
 = 9.30,
<italic>p</italic>
 = 0.003) and by the per cent anger in the morph (
<italic>F</italic>
<sub>5,88</sub>
 = 5.08,
<italic>p</italic>
 = 0.0004), with faster RTs after the uncomfortable (929±42 ms) than after the comfortable (1103±63 ms) reaching block. RTs followed an inverted U-shaped function of per cent anger, reaching a maximum (1273±115 ms) at 45 per cent anger, which is close to the average point of maximal response uncertainty. This was confirmed by post-hoc paired
<italic>t</italic>
-tests: RTs decreased by about 445 ms (paired
<italic>t</italic>
 = −5.2,
<italic>df</italic>
 = 17,
<italic>p</italic>
 = 0.000,
<italic>d</italic>
 = 1.12) as the per cent anger deviated from 45 per cent towards happiness, and by about 378 ms (paired
<italic>t</italic>
 = −4.2,
<italic>df</italic>
 = 17,
<italic>p</italic>
 = 0.0005,
<italic>d</italic>
 = 0.86) as the per cent anger deviated from 45 per cent towards anger. In Experiment 2, we found a similar, though not significant (
<italic>F</italic>
<sub>1,88</sub>
 = 0.6,
<italic>p</italic>
 = 0.50), tendency of uncomfortable reaching in reducing RTs (921±36 ms vs. 897±28 ms after comfortable vs. uncomfortable reaches), and a similarly strong modulation of RTs by the per cent anger in the morph (
<italic>F</italic>
<sub>5,88</sub>
 = 5.98,
<italic>p</italic>
 = 0.000).</p>
<p>The different effect sizes in Experiments 1 and 2 were likely due to the unbalanced temporal ordering of reaching blocks. In Experiment 2 our participants were more experienced with the experimental task after the comfortable than after the uncomfortable block, and vice versa in Experiment 1. The effects of action comfort and learning thus acted in opposite directions in Experiment 2, reducing the performance difference induced by the two reaching blocks, and in the same direction in Experiment 1, enhancing it.</p>
<p>An arousal-based improvement in emotional face processing induced by reaching discomfort was further demonstrated by the results of the
<italic>lme</italic>
model comparing the JNDs and RTs in Experiments 1 and 2. The model on JNDs revealed a significant main effect of Reaching (
<italic>F</italic>
<sub>1,16</sub>
 = 16.27,
<italic>p</italic>
 = 0.001); while neither the effect of Experiment (
<italic>F</italic>
<sub>1,16</sub>
 = 2.40,
<italic>p</italic>
 = 0.14) nor the Reaching × Experiment interaction (
<italic>F</italic>
<sub>1,16</sub>
 = 2.86,
<italic>p</italic>
 = 0.11) were significant. Similar results were obtained on RTs, in which Reaching (RT after comfortable = 1012±37 ms; RT after uncomfortable = 913±25 ms;
<italic>F</italic>
<sub>1,176</sub>
 = 9.19,
<italic>p</italic>
 = 0.003) and per cent anger in the morph (
<italic>F</italic>
<sub>5,176</sub>
 = 9.05,
<italic>p</italic>
 = 0.0000) were the only significant main effects; other effects were not statistically significant.</p>
<p>Consistent with the idea that arousal is mainly influenced by uncomfortable reaches, we found that the
<italic>baseline</italic>
JND obtained in Experiment 3 (11.14±1.46 per cent), in which performance was measured at the normal awake arousal state, was larger than the JNDs of the uncomfortable reaching condition averaged across Experiments 1 and 2 (7.59±0.84 per cent, Welch Two Sample
<italic>t</italic>
 = −2.10, df = 18.2,
<italic>p</italic>
 = 0.049), but not of the comfortable reaching condition (10.46±0.45 per cent, Welch Two Sample
<italic>t</italic>
 = −0.44, df = 13.1,
<italic>p</italic>
 = 0.66). Analogously, despite the larger number of physical constraints to which the observer was subjected in Experiments 1 and 2 relative to Experiment 3, which should have created an imbalance in favor of Experiment 3, RTs after uncomfortable reaches were statistically indistinguishable from those observed in Experiment 3 (913±25 ms vs. 864±22 ms; Welch Two Sample
<italic>t</italic>
 = 1.47,
<italic>df</italic>
 = 176.7,
<italic>p</italic>
 = 0.14), while those after comfortable reaches (1012±37 ms) were larger (Welch Two Sample
<italic>t</italic>
 = 3.43,
<italic>df</italic>
 = 166.2,
<italic>p</italic>
 = 0.0007,
<italic>d</italic>
 = 0.47).</p>
<p>In summary, we obtained three findings: (a) comfort/discomfort associated with goal-directed reaching biased the identification of facial emotions towards mood congruency; (b) discomfort (but not comfort) improved the precision of emotion identification; (c) discomfort sped up the processing of facial expressions of emotion by reducing RTs and response uncertainty in our emotion identification task.</p>
</sec>
</sec>
<sec id="s4">
<title>Discussion</title>
<p>The present study demonstrates that
<italic>comfort/discomfort</italic>
of goal-directed reaching affects the perception of facial expression of emotions. Uncomfortable actions modified the perception of emotional expressions along the happy-to-angry continuum, making a neutral face appear angry and a slightly happy face neutral, and improving the identification of facial expressions. Comfortable reaching induced an opposite shift of the perceived midpoint of the happy-to-angry continuum, making a neutral face appear happy and a slightly angry face neutral, but without improving the identification of facial expressions.</p>
<p>Such biasing effects of action comfort/discomfort are challenging for the current approach to sensory integration, which is based on optimal cue integration
<xref rid="pone.0108211-Landy1" ref-type="bibr">[61</xref>
<xref rid="pone.0108211-Caudek1" ref-type="bibr">63]</xref>
and on a view of the brain as a Bayesian inference system
<xref rid="pone.0108211-vonHelmholtz1" ref-type="bibr">[64</xref>
,
<xref rid="pone.0108211-Gregory1" ref-type="bibr">65]</xref>
. According to such an approach, the brain is continuously predicting the most likely interpretation of new visual inputs on the basis of expectations and beliefs about the environment, providing priors that are optimally combined with sensory evidence. But knowledge-based priors and sensory inputs are not enough: our results demonstrate that affective components cannot be ignored when considering the process of sensory integration.</p>
<p>Our results show that body feelings impact perception too, which is also consistent with recent findings on the effect of body posture on behavior
<xref rid="pone.0108211-Yap1" ref-type="bibr">[20]</xref>
and the constructionist hypothesis by
<xref rid="pone.0108211-Oosterwijk1" ref-type="bibr">[66]</xref>
. In particular, perceived affordances depend on body capabilities that are defined by the geometry (e.g., arm length) and biodynamics (e.g., muscular strength, joint mobility) of relevant parts of the actor's body. In the case of reaching, beyond a critical distance the arm is no longer sufficient; to reach farther, actors must activate other body segments, by either leaning forward or twisting their bodies to extend their shoulders towards the object. Above such a critical distance reaching becomes uncomfortable
<xref rid="pone.0108211-Mark1" ref-type="bibr">[12]</xref>
and negative mood states arise
<xref rid="pone.0108211-Conway1" ref-type="bibr">[11]</xref>
, setting the stage for mood-congruency effects in emotion perception. On the other hand, the positive effect of comfortable reaches relative to the inaction condition measured in Experiment 3 can be interpreted as a by-product of the empowerment of motor skillfulness. Remarkably, our effect suggests that comfortable/uncomfortable actions can be conceived as a new powerful mood inducer. Hence, our Motor Action Mood Induction Procedure, MAMIP, should be added to the list including the Musical Mood Induction Technique, MMIT
<xref rid="pone.0108211-Clark1" ref-type="bibr">[67]</xref>
, the Velten Mood Induction Procedure, VMIP
<xref rid="pone.0108211-Velten1" ref-type="bibr">[68]</xref>
, and the self-referential mood induction
<xref rid="pone.0108211-Mayberg1" ref-type="bibr">[69]</xref>
, to name only a few procedures used in controlled settings.</p>
<p>Similar mood-congruency effects have been previously shown to occur using other mood-inducing procedures
<xref rid="pone.0108211-Robinson1" ref-type="bibr">[70]</xref>
. Our MAMIP is apparently new as an experimental setting (despite being implicit in all uses of relaxation as a route to well-being) and possibly more basic than others (given that listening to music, a powerful mood inducer, evokes motor actions). Note also that music, verbal descriptions, and personal memories may be explicitly related to social perception, whereas the motor actions used as mood inducers in our study (i.e., reaches with slightly different depth extents) have no direct link with social perception, yet still produce a strong effect on emotion identification: reaching comfort/discomfort, as defined by the amount of compensatory body movements other than those of the arm, affects the individual mood state, which in turn influences the perceptual processing of facial expressions.</p>
<p>There are two ways of looking at the mood-congruency effects we demonstrated in our study. Action-induced mood might affect only post-perceptual processing, by modifying the response criterion and decision thresholds, or it might affect perceived valence through a top-down modulation of visual processing, in which perception is directly influenced by the observer’s psychological state
<xref rid="pone.0108211-Kersten1" ref-type="bibr">[71]</xref>
. Although our study is compatible with both hypotheses, we suggest that the second is more intriguing as it sheds light on new links between perception and action. Classic research focused on the role of vision in the control of fundamental motor actions that humans perform with great dexterity, such as reaching and grasping
<xref rid="pone.0108211-Goodale1" ref-type="bibr">[72]</xref>
. On the other hand, important work has been conducted on visuomotor adaptation showing how hand proprioception might alter basic perceived object properties, such as shape, position, and size
<xref rid="pone.0108211-Volcic1" ref-type="bibr">[4]</xref>
. Our study provides the first evidence that expressive qualities of the social environment can be altered by subjective feelings associated with motor actions.</p>
<p>Our results are consistent with the pioneering idea that muscular and somatic states might constitute hard representations used in high level cognition
<xref rid="pone.0108211-Zajonc1" ref-type="bibr">[73]</xref>
. If the motor system is representational in nature, then performing an
<italic>uncomfortable</italic>
action is likely to evoke facial expressions with negative valence, thus selectively tuning the perceiver towards face stimuli with an expression that is congruent with the one activated by the action itself.</p>
<p>However, given that no traditional explicit measures of subjective mood were collected in the present study (see
<xref rid="pone.0108211-Kucera1" ref-type="bibr">[30]</xref>
for a review), it is possible that action comfort/discomfort could have biased the perceived facial expressions without influencing mood. This seems unlikely, however, as the behavioural effects of our action-based induction were similar to those of other mood inducers (e.g., music). An interesting issue for further research is thus to clarify the mediating role of variables such as mood, experiential avoidance, sense of reward, and sense of motor skillfulness.</p>
<p>Furthermore, the improvement of emotion identification performance induced by action comfort/discomfort suggests that one way action might affect the perceptual system is through arousal, which can prompt vision and attention, enhancing detection capabilities. This finding is in line with the evidence that hyper-arousal from sensory stimulation can influence aspects of human visual perception
<xref rid="pone.0108211-Woods1" ref-type="bibr">[13</xref>
,
<xref rid="pone.0108211-Woods2" ref-type="bibr">35]</xref>
. One way in which arousal might have affected the performance in our task is through a modulation of attention, which is known to be linked to emotion and, in particular, to mood
<xref rid="pone.0108211-Derryberry1" ref-type="bibr">[74</xref>
,
<xref rid="pone.0108211-Jeffries1" ref-type="bibr">75]</xref>
. Mood was shown to affect attention by determining the focus of processing of visual stimuli
<xref rid="pone.0108211-Gasper1" ref-type="bibr">[21]</xref>
favoring a local processing strategy under a negative mood state (i.e., uncomfortable block) vs. a global processing strategy under a positive mood state (i.e., comfortable block). The improvement of performance in the uncomfortable relative to the comfortable block, revealed by our study, is thus in line with recent findings showing that observers primed with local processing performed both significantly faster and more accurately on emotion recognition tasks than when they were primed with global processing
<xref rid="pone.0108211-Martin1" ref-type="bibr">[76]</xref>
.</p>
<p>In summary, models of perception-action interaction should include emotion to predict, in particular, arousal-based changes of identification performance. Our results also suggest a challenge for the interpretation of the numerous studies comparing perception-based vs. action-based estimates of size
<xref rid="pone.0108211-Franz1" ref-type="bibr">[77]</xref>
. For instance, the finding that depth estimated with the index-to-thumb span is larger when the observer is asked to actively reach for and grasp a target object rather than to indicate the depth of the object while holding their hand away from it
<xref rid="pone.0108211-Foster1" ref-type="bibr">[78]</xref>
, could be a by-product of an enhancement of stereo sensitivity caused by the increased arousal induced by visually guided reaches.</p>
<p>Our findings have practical implications for the interior design of houses and workplaces, and exemplify a causal effect of action on perception relevant for
<italic>emotional design</italic>
<xref rid="pone.0108211-Norman1" ref-type="bibr">[79]</xref>
. The mood induced by comfortable/uncomfortable actions on/with daily objects affects the valence and discriminability of the expressive features of external objects, including conspecifics. Consider workplaces where actions are constrained by the physical structure of the environment. Comfortable artefacts at an easy-to-reach distance would induce a positive mood, which in turn would enhance the global experience of pleasantness, as revealed by a bias in perceiving faces as pleasant (happy) rather than unpleasant (angry). Among other undesirable effects, body discomfort induced by bad interior design degrades our social environment.</p>
</sec>
<sec sec-type="supplementary-material" id="s5">
<title>Supporting Information</title>
<supplementary-material content-type="local-data" id="pone.0108211.s001">
<label>Data S1</label>
<caption>
<p>
<bold>Data from Experiments 1–4.</bold>
Two worksheets are included in the file: (1) RAW_DATA_EXP12&3, with the entire dataset of Experiments 1–3, and (2) RAW_DATA_EXP4, with the entire dataset of Experiment 4.</p>
<p>(XLS)</p>
</caption>
<media xlink:href="pone.0108211.s001.xls">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
</sec>
</body>
<back>
<ack>
<p>We thank Robert Volcic for useful comments on a former version of the manuscript and Matteo Manzini for helping with data collection.</p>
</ack>
<ref-list>
<title>References</title>
<ref id="pone.0108211-Becchio1">
<label>1</label>
<mixed-citation publication-type="journal">
<name>
<surname>Becchio</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Sartori</surname>
<given-names>L</given-names>
</name>
,
<name>
<surname>Bulgheroni</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Castiello</surname>
<given-names>U</given-names>
</name>
(
<year>2008</year>
)
<article-title>Both your intention and mine are reflected in the kinematics of my reach to grasp movement</article-title>
.
<source>Cognition</source>
<volume>106</volume>
:
<fpage>894</fpage>
<lpage>912</lpage>
.
<pub-id pub-id-type="pmid">17585893</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Sartori1">
<label>2</label>
<mixed-citation publication-type="journal">
<name>
<surname>Sartori</surname>
<given-names>L</given-names>
</name>
,
<name>
<surname>Becchio</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Bara</surname>
<given-names>BG</given-names>
</name>
,
<name>
<surname>Castiello</surname>
<given-names>U</given-names>
</name>
(
<year>2009</year>
)
<article-title>Does the intention to communicate affect action kinematics?</article-title>
<source>Consciousness and Cognition</source>
<volume>18</volume>
:
<fpage>766</fpage>
<lpage>72</lpage>
.
<pub-id pub-id-type="pmid">19632134</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Cardinali1">
<label>3</label>
<mixed-citation publication-type="journal">
<name>
<surname>Cardinali</surname>
<given-names>L</given-names>
</name>
,
<name>
<surname>Frassinetti</surname>
<given-names>F</given-names>
</name>
,
<name>
<surname>Brozzoli</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Urquizar</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Roy</surname>
<given-names>AC</given-names>
</name>
,
<etal>et al</etal>
(
<year>2009</year>
)
<article-title>Tool-use induces morphological updating of the body schema</article-title>
.
<source>Current Biology</source>
<volume>19</volume>
:
<fpage>R478</fpage>
<lpage>R479</lpage>
.
<pub-id pub-id-type="pmid">19549491</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Volcic1">
<label>4</label>
<mixed-citation publication-type="journal">
<name>
<surname>Volcic</surname>
<given-names>R</given-names>
</name>
,
<name>
<surname>Fantoni</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Caudek</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Assad</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Domini</surname>
<given-names>F</given-names>
</name>
(
<year>2013</year>
)
<article-title>Visuomotor adaptation changes stereoscopic depth perception and tactile discrimination</article-title>
.
<source>Journal of Neuroscience</source>
<volume>33</volume>
:
<fpage>17081</fpage>
<lpage>17088</lpage>
.
<pub-id pub-id-type="pmid">24155312</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Higuchi1">
<label>5</label>
<mixed-citation publication-type="journal">
<name>
<surname>Higuchi</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Imanaka</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Hatayama</surname>
<given-names>T</given-names>
</name>
(
<year>2002</year>
)
<article-title>Freezing degrees of freedom under stress: Kinematic evidence of constrained movement strategies</article-title>
.
<source>Human Movement Science</source>
<volume>21</volume>
:
<fpage>831</fpage>
<lpage>846</lpage>
.
<pub-id pub-id-type="pmid">12620722</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Knight1">
<label>6</label>
<mixed-citation publication-type="journal">
<name>
<surname>Knight</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Haslam</surname>
<given-names>SA</given-names>
</name>
(
<year>2010</year>
)
<article-title>The relative merits of lean, enriched, and empowered offices: An experimental examination of the impact of workspace management strategies on well-being and productivity</article-title>
.
<source>Journal of Experimental Psychology: Applied</source>
<volume>16</volume>
:
<fpage>158</fpage>
<lpage>172</lpage>
.
<pub-id pub-id-type="pmid">20565201</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Warren1">
<label>7</label>
<mixed-citation publication-type="journal">
<name>
<surname>Warren</surname>
<given-names>WH</given-names>
</name>
(
<year>1984</year>
)
<article-title>Perceiving affordances: Visual guidance of stair climbing</article-title>
.
<source>Journal of Experimental Psychology: Human Perception and Performance</source>
<volume>10</volume>
:
<fpage>683</fpage>
<lpage>703</lpage>
.
<pub-id pub-id-type="pmid">6238127</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Damasio1">
<label>8</label>
<mixed-citation publication-type="other">Damasio A (1994) Descartes’ error. New York, NY: Grosset/Putnam.</mixed-citation>
</ref>
<ref id="pone.0108211-Niedenthal1">
<label>9</label>
<mixed-citation publication-type="journal">
<name>
<surname>Niedenthal</surname>
<given-names>PM</given-names>
</name>
(
<year>2007</year>
)
<article-title>Embodying emotion</article-title>
.
<source>Science</source>
<volume>316</volume>
:
<fpage>1002</fpage>
<lpage>1005</lpage>
.
<pub-id pub-id-type="pmid">17510358</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Niedenthal2">
<label>10</label>
<mixed-citation publication-type="journal">
<name>
<surname>Niedenthal</surname>
<given-names>PM</given-names>
</name>
,
<name>
<surname>Barsalou</surname>
<given-names>LW</given-names>
</name>
,
<name>
<surname>Winkielman</surname>
<given-names>P</given-names>
</name>
,
<name>
<surname>Krauth-Gruber</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Ric</surname>
<given-names>F</given-names>
</name>
(
<year>2005</year>
)
<article-title>Embodiment in attitudes, social perception, and emotion</article-title>
.
<source>Personality and Social Psychology Review</source>
<volume>9</volume>
:
<fpage>184</fpage>
<lpage>211</lpage>
.
<pub-id pub-id-type="pmid">16083360</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Conway1">
<label>11</label>
<mixed-citation publication-type="journal">
<name>
<surname>Conway</surname>
<given-names>FT</given-names>
</name>
(
<year>1999</year>
)
<article-title>Psychological mood state, psychosocial aspects of work, and musculoskeletal discomfort in intensive Video Display Terminal (VDT) work</article-title>
.
<source>International Journal of Human-Computer Interaction</source>
<volume>11</volume>
:
<fpage>95</fpage>
<lpage>107</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0108211-Mark1">
<label>12</label>
<mixed-citation publication-type="journal">
<name>
<surname>Mark</surname>
<given-names>LS</given-names>
</name>
,
<name>
<surname>Nemeth</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Gardner</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Dainoff</surname>
<given-names>MJ</given-names>
</name>
,
<name>
<surname>Paasche</surname>
<given-names>J</given-names>
</name>
,
<etal>et al</etal>
(
<year>1997</year>
)
<article-title>Postural dynamics and the preferred critical boundary for visually guided reaching</article-title>
.
<source>Journal of Experimental Psychology: Human Perception and Performance</source>
<volume>23</volume>
:
<fpage>1365</fpage>
<lpage>1379</lpage>
.
<pub-id pub-id-type="pmid">9336957</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Woods1">
<label>13</label>
<mixed-citation publication-type="journal">
<name>
<surname>Woods</surname>
<given-names>AJ</given-names>
</name>
,
<name>
<surname>Philbeck</surname>
<given-names>JW</given-names>
</name>
,
<name>
<surname>Wirtz</surname>
<given-names>P</given-names>
</name>
(
<year>2013</year>
)
<article-title>Hyper-arousal decreases human visual thresholds</article-title>
.
<source>PLoS ONE</source>
,
<volume>8</volume>
,
<comment>e61415. doi:
<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1371/journal.pone.0061415">10.1371/journal.pone.0061415</ext-link>
</comment>
</mixed-citation>
</ref>
<ref id="pone.0108211-Phelps1">
<label>14</label>
<mixed-citation publication-type="journal">
<name>
<surname>Phelps</surname>
<given-names>L</given-names>
</name>
,
<name>
<surname>Carrasco</surname>
<given-names>M</given-names>
</name>
(
<year>2006</year>
)
<article-title>Emotion facilitates perception and potentiates the perceptual benefits of attention</article-title>
.
<source>Psychological Science</source>
<volume>17</volume>
:
<fpage>292</fpage>
<lpage>299</lpage>
.
<pub-id pub-id-type="pmid">16623685</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Ekman1">
<label>15</label>
<mixed-citation publication-type="other">Ekman P, Rosenberg EL (2005) What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System (FACS). Oxford University Press: New York, 2nd edition.</mixed-citation>
</ref>
<ref id="pone.0108211-Russell1">
<label>16</label>
<mixed-citation publication-type="journal">
<name>
<surname>Russell</surname>
<given-names>JA</given-names>
</name>
(
<year>2003</year>
)
<article-title>Core affect and the psychological construction of emotion</article-title>
.
<source>Psychological Review</source>
<volume>110</volume>
:
<fpage>145</fpage>
<lpage>172</lpage>
.
<pub-id pub-id-type="pmid">12529060</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Rolls1">
<label>17</label>
<mixed-citation publication-type="journal">
<name>
<surname>Rolls</surname>
<given-names>ET</given-names>
</name>
(
<year>1990</year>
)
<article-title>A theory of emotion, and its application to understanding the neural basis of emotion</article-title>
.
<source>Cognition and Emotion</source>
<volume>4</volume>
:
<fpage>161</fpage>
<lpage>190</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0108211-Santos1">
<label>18</label>
<mixed-citation publication-type="other">Santos LR, Hood BM (2009) Object representation as a central issue in cognitive science. In: Hood BM, Santos LR, editors. The Origins of Object Knowledge. Oxford: Oxford University Press. pp. 2–24.</mixed-citation>
</ref>
<ref id="pone.0108211-Proffitt1">
<label>19</label>
<mixed-citation publication-type="journal">
<name>
<surname>Proffitt</surname>
<given-names>DR</given-names>
</name>
,
<name>
<surname>Bhalla</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Gossweiler</surname>
<given-names>R</given-names>
</name>
,
<name>
<surname>Midgett</surname>
<given-names>J</given-names>
</name>
(
<year>1995</year>
)
<article-title>Perceiving geographical slant</article-title>
.
<source>Psychonomic Bulletin & Review</source>
<volume>2</volume>
:
<fpage>409</fpage>
<lpage>428</lpage>
.
<pub-id pub-id-type="pmid">24203782</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Yap1">
<label>20</label>
<mixed-citation publication-type="journal">
<name>
<surname>Yap</surname>
<given-names>AJ</given-names>
</name>
,
<name>
<surname>Wazlawek</surname>
<given-names>AS</given-names>
</name>
,
<name>
<surname>Lucas</surname>
<given-names>BJ</given-names>
</name>
,
<name>
<surname>Cuddy</surname>
<given-names>AJC</given-names>
</name>
,
<name>
<surname>Carney</surname>
<given-names>DR</given-names>
</name>
(
<year>2013</year>
)
<article-title>The ergonomics of dishonesty: The effect of incidental posture on stealing, cheating, and traffic violations</article-title>
.
<source>Psychological Science</source>
<volume>24</volume>
:
<fpage>2281</fpage>
<lpage>2289</lpage>
.
<pub-id pub-id-type="pmid">24068113</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Gasper1">
<label>21</label>
<mixed-citation publication-type="journal">
<name>
<surname>Gasper</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Clore</surname>
<given-names>GL</given-names>
</name>
(
<year>2002</year>
)
<article-title>Attending to the big picture: Mood and global versus local processing of visual information</article-title>
.
<source>Psychological Science</source>
<volume>13</volume>
:
<fpage>34</fpage>
<lpage>40</lpage>
.
<pub-id pub-id-type="pmid">11892776</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Wild1">
<label>22</label>
<mixed-citation publication-type="journal">
<name>
<surname>Wild</surname>
<given-names>B</given-names>
</name>
,
<name>
<surname>Erb</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Bartels</surname>
<given-names>M</given-names>
</name>
(
<year>2001</year>
)
<article-title>Are emotions contagious? Evoked emotions while viewing emotionally expressive faces: quality, quantity, time course and gender differences</article-title>
.
<source>Psychiatry Research</source>
<volume>102</volume>
:
<fpage>109</fpage>
<lpage>124</lpage>
.
<pub-id pub-id-type="pmid">11408051</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Caudek1">
<label>23</label>
<mixed-citation publication-type="journal">
<name>
<surname>Caudek</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Monni</surname>
<given-names>A</given-names>
</name>
(
<year>2013</year>
)
<article-title>Do you remember your sad face? The roles of negative cognitive style and sad mood</article-title>
.
<source>Memory</source>
<volume>21</volume>
:
<fpage>891</fpage>
<lpage>903</lpage>
.
<pub-id pub-id-type="pmid">23383597</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Pollak1">
<label>24</label>
<mixed-citation publication-type="journal">
<name>
<surname>Pollak</surname>
<given-names>SD</given-names>
</name>
,
<name>
<surname>Kistler</surname>
<given-names>DJ</given-names>
</name>
(
<year>2002</year>
)
<article-title>Early experience is associated with the development of categorical representations for facial expressions of emotion</article-title>
.
<source>Proceedings of the National Academy of Sciences of the United States of America</source>
<volume>99</volume>
:
<fpage>9072</fpage>
<lpage>9076</lpage>
.
<pub-id pub-id-type="pmid">12072570</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Klatzky1">
<label>25</label>
<mixed-citation publication-type="journal">
<name>
<surname>Klatzky</surname>
<given-names>RL</given-names>
</name>
,
<name>
<surname>Abramowicz</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Hamilton</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Lederman</surname>
<given-names>SJ</given-names>
</name>
(
<year>2011</year>
)
<article-title>Irrelevant visual faces influence haptic identification of facial expressions of emotion</article-title>
.
<source>Attention, Perception & Psychophysics</source>
<volume>73</volume>
:
<fpage>521</fpage>
<lpage>530</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0108211-Jeong1">
<label>26</label>
<mixed-citation publication-type="journal">
<name>
<surname>Jeong</surname>
<given-names>JW</given-names>
</name>
,
<name>
<surname>Diwadkar</surname>
<given-names>VA</given-names>
</name>
,
<name>
<surname>Chugani</surname>
<given-names>CD</given-names>
</name>
,
<name>
<surname>Sinsoongsud</surname>
<given-names>P</given-names>
</name>
,
<name>
<surname>Muzik</surname>
<given-names>O</given-names>
</name>
,
<etal>et al</etal>
(
<year>2011</year>
)
<article-title>Congruence of happy and sad emotion in music and faces modifies cortical audiovisual activation</article-title>
.
<source>NeuroImage</source>
<volume>54</volume>
:
<fpage>2973</fpage>
<lpage>2982</lpage>
.
<pub-id pub-id-type="pmid">21073970</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Jolij1">
<label>27</label>
<mixed-citation publication-type="journal">
<name>
<surname>Jolij</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Meurs</surname>
<given-names>M</given-names>
</name>
(
<year>2011</year>
)
<article-title>Music alters visual perception</article-title>
.
<source>PLoS ONE</source>
,
<comment>e18861, doi:
<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1371/journal.pone.0018861">10.1371/journal.pone.0018861</ext-link>
</comment>
</mixed-citation>
</ref>
<ref id="pone.0108211-Niedenthal3">
<label>28</label>
<mixed-citation publication-type="journal">
<name>
<surname>Niedenthal</surname>
<given-names>PM</given-names>
</name>
,
<name>
<surname>Halberstadt</surname>
<given-names>JB</given-names>
</name>
,
<name>
<surname>Innes-Ker</surname>
<given-names>AH</given-names>
</name>
(
<year>1999</year>
)
<article-title>Emotional response categorization</article-title>
.
<source>Psychological Review</source>
<volume>106</volume>
:
<fpage>337</fpage>
<lpage>361</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0108211-Bouhuys1">
<label>29</label>
<mixed-citation publication-type="journal">
<name>
<surname>Bouhuys</surname>
<given-names>AL</given-names>
</name>
,
<name>
<surname>Bloem</surname>
<given-names>GM</given-names>
</name>
,
<name>
<surname>Groothuis</surname>
<given-names>TGG</given-names>
</name>
(
<year>1995</year>
)
<article-title>Induction of depressed and elated mood by music influences the perception of facial emotional expressions in healthy subjects</article-title>
.
<source>Journal of Affective Disorders</source>
<volume>33</volume>
:
<fpage>215</fpage>
<lpage>225</lpage>
.
<pub-id pub-id-type="pmid">7790675</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Kucera1">
<label>30</label>
<mixed-citation publication-type="journal">
<name>
<surname>Kucera</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Haviger</surname>
<given-names>J</given-names>
</name>
(
<year>2012</year>
)
<article-title>Using mood induction procedures in psychological research</article-title>
.
<source>Procedia - Social and Behavioral Sciences</source>
<volume>69</volume>
:
<fpage>31</fpage>
<lpage>40</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0108211-Jallais1">
<label>31</label>
<mixed-citation publication-type="journal">
<name>
<surname>Jallais</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Gilet</surname>
<given-names>A</given-names>
</name>
(
<year>2010</year>
)
<article-title>Inducing changes in arousal and valence: Comparison of two mood induction procedures</article-title>
.
<source>Behavior Research Methods</source>
<volume>42</volume>
:
<fpage>318</fpage>
<lpage>325</lpage>
.
<pub-id pub-id-type="pmid">20160311</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Yerkes1">
<label>32</label>
<mixed-citation publication-type="journal">
<name>
<surname>Yerkes</surname>
<given-names>RM</given-names>
</name>
,
<name>
<surname>Dodson</surname>
<given-names>JD</given-names>
</name>
(
<year>1908</year>
)
<article-title>The relation of strength of stimulus to rapidity of habit-formation</article-title>
.
<source>Journal of Comparative Neurology and Psychology</source>
<volume>18</volume>
:
<fpage>459</fpage>
<lpage>482</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0108211-Bezdudnaya1">
<label>33</label>
<mixed-citation publication-type="journal">
<name>
<surname>Bezdudnaya</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Cano</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Bereshpolova</surname>
<given-names>Y</given-names>
</name>
,
<name>
<surname>Stoelzel</surname>
<given-names>CR</given-names>
</name>
,
<name>
<surname>Alonso</surname>
<given-names>JM</given-names>
</name>
,
<name>
<surname>Swadlow</surname>
<given-names>HA</given-names>
</name>
(
<year>2006</year>
)
<article-title>Thalamic burst mode and inattention in the awake LGNd</article-title>
.
<source>Neuron</source>
<volume>49</volume>
:
<fpage>421</fpage>
<lpage>432</lpage>
.
<pub-id pub-id-type="pmid">16446145</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Neill1">
<label>34</label>
<mixed-citation publication-type="journal">
<name>
<surname>Niell</surname>
<given-names>CM</given-names>
</name>
,
<name>
<surname>Stryker</surname>
<given-names>MP</given-names>
</name>
(
<year>2010</year>
)
<article-title>Modulation of visual responses by behavioral state in mouse visual cortex</article-title>
.
<source>Neuron</source>
<volume>65</volume>
:
<fpage>472</fpage>
<lpage>479</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0108211-Woods2">
<label>35</label>
<mixed-citation publication-type="journal">
<name>
<surname>Woods</surname>
<given-names>AJ</given-names>
</name>
,
<name>
<surname>Mennemeier</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Garcia-Rill</surname>
<given-names>E</given-names>
</name>
,
<name>
<surname>Huitt</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Chelette</surname>
<given-names>KC</given-names>
</name>
,
<etal>et al</etal>
(
<year>2012</year>
)
<article-title>Improvement in arousal, visual neglect, and perception of stimulus intensity following cold pressor stimulation</article-title>
.
<source>Neurocase</source>
<volume>18</volume>
:
<fpage>115</fpage>
<lpage>122</lpage>
.
<pub-id pub-id-type="pmid">22013983</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Gilad1">
<label>36</label>
<mixed-citation publication-type="journal">
<name>
<surname>Gilad</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Meng</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Sinha</surname>
<given-names>P</given-names>
</name>
(
<year>2009</year>
)
<article-title>Role of ordinal contrast relationships in face encoding</article-title>
.
<source>Proceedings of the National Academy of Sciences of the United States of America</source>
<volume>106</volume>
:
<fpage>5353</fpage>
<lpage>5358</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0108211-Huang1">
<label>37</label>
<mixed-citation publication-type="journal">
<name>
<surname>Huang</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Chan</surname>
<given-names>RCK</given-names>
</name>
,
<name>
<surname>Lu</surname>
<given-names>X</given-names>
</name>
,
<name>
<surname>Tong</surname>
<given-names>Z</given-names>
</name>
(
<year>2009</year>
)
<article-title>Emotion categorization perception in schizophrenia in conversations with different social contexts</article-title>
.
<source>The Australian and New Zealand Journal of Psychiatry</source>
<volume>43</volume>
:
<fpage>438</fpage>
<lpage>445</lpage>
.
<pub-id pub-id-type="pmid">19373705</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Haralanova1">
<label>38</label>
<mixed-citation publication-type="journal">
<name>
<surname>Haralanova</surname>
<given-names>E</given-names>
</name>
,
<name>
<surname>Haralanov</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Beraldi</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Möller</surname>
<given-names>HJ</given-names>
</name>
,
<name>
<surname>Hennig-Fast</surname>
<given-names>K</given-names>
</name>
(
<year>2012</year>
)
<article-title>Subjective emotional over-arousal to neutral social scenes in paranoid schizophrenia</article-title>
.
<source>European Archives of Psychiatry and Clinical Neuroscience</source>
<volume>262</volume>
:
<fpage>59</fpage>
<lpage>68</lpage>
.
<pub-id pub-id-type="pmid">21792533</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Welford1">
<label>39</label>
<mixed-citation publication-type="other">Welford AT (1980) Choice reaction time: Basic concepts. In AT Welford (Ed.), Reaction Times New York: Academic Press, 73–128.</mixed-citation>
</ref>
<ref id="pone.0108211-Masanobu1">
<label>40</label>
<mixed-citation publication-type="journal">
<name>
<surname>Masanobu</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Choshi</surname>
<given-names>K</given-names>
</name>
(
<year>2006</year>
)
<article-title>Contingent muscular tension during a choice reaction task</article-title>
.
<source>Perceptual and Motor Skills</source>
<volume>102</volume>
:
<fpage>736</fpage>
<lpage>747</lpage>
.
<pub-id pub-id-type="pmid">16916152</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-BarHaim1">
<label>41</label>
<mixed-citation publication-type="journal">
<name>
<surname>Bar-Haim</surname>
<given-names>Y</given-names>
</name>
,
<name>
<surname>Lamy</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Pergamin</surname>
<given-names>L</given-names>
</name>
,
<name>
<surname>Bakermans-Kranenburg</surname>
<given-names>MJ</given-names>
</name>
,
<name>
<surname>van Ijzendoorn</surname>
<given-names>MH</given-names>
</name>
(
<year>2007</year>
)
<article-title>Threat-related attentional bias in anxious and non-anxious individuals: A meta-analytic study</article-title>
.
<source>Psychological Bulletin</source>
<volume>133</volume>
:
<fpage>1</fpage>
<lpage>24</lpage>
.
<pub-id pub-id-type="pmid">17201568</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Richards1">
<label>42</label>
<mixed-citation publication-type="journal">
<name>
<surname>Richards</surname>
<given-names>HJ</given-names>
</name>
,
<name>
<surname>Hadwin</surname>
<given-names>JA</given-names>
</name>
,
<name>
<surname>Benson</surname>
<given-names>V</given-names>
</name>
,
<name>
<surname>Wenger</surname>
<given-names>MJ</given-names>
</name>
,
<name>
<surname>Donnelly</surname>
<given-names>N</given-names>
</name>
(
<year>2011</year>
)
<article-title>The influence of anxiety on processing capacity for threat detection</article-title>
.
<source>Psychonomic Bulletin and Review</source>
<volume>18</volume>
:
<fpage>883</fpage>
<lpage>889</lpage>
.
<pub-id pub-id-type="pmid">21748420</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Sacharin1">
<label>43</label>
<mixed-citation publication-type="journal">
<name>
<surname>Sacharin</surname>
<given-names>V</given-names>
</name>
,
<name>
<surname>Sander</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Scherer</surname>
<given-names>KR</given-names>
</name>
(
<year>2012</year>
)
<article-title>The perception of changing emotion expressions</article-title>
.
<source>Cognition & Emotion</source>
<volume>26</volume>
:
<fpage>1273</fpage>
<lpage>1300</lpage>
.
<pub-id pub-id-type="pmid">22550942</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Becker1">
<label>44</label>
<mixed-citation publication-type="journal">
<name>
<surname>Becker</surname>
<given-names>DV</given-names>
</name>
,
<name>
<surname>Neel</surname>
<given-names>R</given-names>
</name>
,
<name>
<surname>Srinivasan</surname>
<given-names>N</given-names>
</name>
,
<name>
<surname>Neufeld</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Kumar</surname>
<given-names>D</given-names>
</name>
,
<etal>et al</etal>
(
<year>2012</year>
)
<article-title>The vividness of happiness in dynamic facial displays of emotion</article-title>
.
<source>PLoS ONE</source>
<volume>7</volume>
,
<comment>e26551. doi:
<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1371/journal.pone.0026551">10.1371/journal.pone.0026551</ext-link>
</comment>
</mixed-citation>
</ref>
<ref id="pone.0108211-Marinetti1">
<label>45</label>
<mixed-citation publication-type="journal">
<name>
<surname>Marinetti</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Mesquita</surname>
<given-names>B</given-names>
</name>
,
<name>
<surname>Yik</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Cragwall</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Gallagher</surname>
<given-names>AH</given-names>
</name>
(
<year>2012</year>
)
<article-title>Threat advantage: perception of angry and happy dynamic faces across cultures</article-title>
.
<source>Cognition & Emotion</source>
<volume>26</volume>
:
<fpage>1326</fpage>
<lpage>1334</lpage>
.
<pub-id pub-id-type="pmid">22414192</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Sloan1">
<label>46</label>
<mixed-citation publication-type="journal">
<name>
<surname>Sloan</surname>
<given-names>DM</given-names>
</name>
(
<year>2004</year>
)
<article-title>Emotion regulation in action: emotional reactivity in experiential avoidance</article-title>
.
<source>Behaviour Research & Therapy</source>
<volume>42</volume>
:
<fpage>1257</fpage>
<lpage>1270</lpage>
.
<pub-id pub-id-type="pmid">15381437</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Langner1">
<label>47</label>
<mixed-citation publication-type="journal">
<name>
<surname>Langner</surname>
<given-names>O</given-names>
</name>
,
<name>
<surname>Dotsch</surname>
<given-names>R</given-names>
</name>
,
<name>
<surname>Bijlstra</surname>
<given-names>G</given-names>
</name>
,
<name>
<surname>Wigboldus</surname>
<given-names>DHJ</given-names>
</name>
,
<name>
<surname>Hawk</surname>
<given-names>ST</given-names>
</name>
,
<etal>et al</etal>
(
<year>2010</year>
)
<article-title>Presentation and validation of the Radboud Faces Database</article-title>
.
<source>Cognition & Emotion</source>
<volume>24</volume>
:
<fpage>1377</fpage>
<lpage>1388</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0108211-Benson1">
<label>48</label>
<mixed-citation publication-type="journal">
<name>
<surname>Benson</surname>
<given-names>PJ</given-names>
</name>
,
<name>
<surname>Perrett</surname>
<given-names>DI</given-names>
</name>
(
<year>1993</year>
)
<article-title>Extracting prototypical facial images from exemplars</article-title>
.
<source>Perception</source>
<volume>22</volume>
:
<fpage>257</fpage>
<lpage>262</lpage>
.
<pub-id pub-id-type="pmid">8316513</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Marneweck1">
<label>49</label>
<mixed-citation publication-type="journal">
<name>
<surname>Marneweck</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Loftus</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Hammond</surname>
<given-names>G</given-names>
</name>
(
<year>2013</year>
)
<article-title>Psychophysical measures of sensitivity to facial expression of emotion</article-title>
.
<source>Frontiers in Psychology</source>
<volume>4</volume>
,
<comment>doi:
<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.3389/fpsyg.2013.00063">10.3389/fpsyg.2013.00063</ext-link>
</comment>
</mixed-citation>
</ref>
<ref id="pone.0108211-Ellermeier1">
<label>50</label>
<mixed-citation publication-type="journal">
<name>
<surname>Ellermeier</surname>
<given-names>W</given-names>
</name>
,
<name>
<surname>Westphal</surname>
<given-names>W</given-names>
</name>
,
<name>
<surname>Heidenfelder</surname>
<given-names>M</given-names>
</name>
(
<year>1991</year>
)
<article-title>On the “absoluteness” of category and magnitude scales of pain</article-title>
.
<source>Perception & Psychophysics</source>
<volume>49</volume>
:
<fpage>159</fpage>
<lpage>166</lpage>
.
<pub-id pub-id-type="pmid">2017352</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Carello1">
<label>51</label>
<mixed-citation publication-type="journal">
<name>
<surname>Carello</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Grosofsky</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Reichel</surname>
<given-names>FD</given-names>
</name>
,
<name>
<surname>Solomon</surname>
<given-names>HY</given-names>
</name>
,
<name>
<surname>Turvey</surname>
<given-names>MT</given-names>
</name>
(
<year>1989</year>
)
<article-title>Visually perceiving what is reachable</article-title>
.
<source>Ecological Psychology</source>
<volume>1</volume>
:
<fpage>27</fpage>
<lpage>54</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0108211-Choi1">
<label>52</label>
<mixed-citation publication-type="journal">
<name>
<surname>Choi</surname>
<given-names>HJ</given-names>
</name>
,
<name>
<surname>Mark</surname>
<given-names>LS</given-names>
</name>
(
<year>2004</year>
)
<article-title>Scaling affordances for human reach actions</article-title>
.
<source>Human Movement Science</source>
<volume>23</volume>
:
<fpage>785</fpage>
<lpage>806</lpage>
.
<pub-id pub-id-type="pmid">15664673</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Knoblauch1">
<label>53</label>
<mixed-citation publication-type="other">Knoblauch K, Maloney LT (2012) Modeling Psychophysical Data in R. New York: Springer.</mixed-citation>
</ref>
<ref id="pone.0108211-Bates1">
<label>54</label>
<mixed-citation publication-type="other">Bates D (2010). lme4: Mixed-Effects Modeling with R. New York: Springer.</mixed-citation>
</ref>
<ref id="pone.0108211-Bates2">
<label>55</label>
<mixed-citation publication-type="other">Bates D, Mechler M (2014) Linear mixed-effects models using Eigen and S4. Cran. R project website. Available:
<ext-link ext-link-type="uri" xlink:href="http://cran.r-project.org/web/packages/lme4/lme4.pdf">http://cran.r-project.org/web/packages/lme4/lme4.pdf</ext-link>
. Accessed 2014 September 2.</mixed-citation>
</ref>
<ref id="pone.0108211-Halekoh1">
<label>56</label>
<mixed-citation publication-type="other">Halekoh U., Højsgaard S (2014) A Kenward-Roger Approximation and Parametric Bootstrap Methods for Tests in Linear Mixed Models - the R Package pbkrtest. Journal of Statistical Software 59, 1–30.</mixed-citation>
</ref>
<ref id="pone.0108211-Halekoh2">
<label>57</label>
<mixed-citation publication-type="other">Halekoh U, Højsgaard S (2013) Parametric bootstrap and Kenward Roger based methods for mixed model comparison. Cran. R project website. Available:
<ext-link ext-link-type="uri" xlink:href="http://cran.r-project.org/web/packages/pbkrtest/pbkrtest.pdf">http://cran.r-project.org/web/packages/pbkrtest/pbkrtest.pdf</ext-link>
. Accessed 2014 September 2.</mixed-citation>
</ref>
<ref id="pone.0108211-Sun1">
<label>58</label>
<mixed-citation publication-type="journal">
<name>
<surname>Sun</surname>
<given-names>G</given-names>
</name>
,
<name>
<surname>Zhu</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Kramer</surname>
<given-names>MH</given-names>
</name>
,
<name>
<surname>Yang</surname>
<given-names>SS</given-names>
</name>
,
<name>
<surname>Song</surname>
<given-names>W</given-names>
</name>
,
<etal>et al</etal>
(
<year>2010</year>
)
<article-title>Variation explained in mixed-model association mapping</article-title>
.
<source>Heredity</source>
<volume>105</volume>
:
<fpage>333</fpage>
<lpage>340</lpage>
. Available:
<ext-link ext-link-type="uri" xlink:href="http://www.biomedcentral.com/sfx_links?ui=1471-2229-11-52&bibl=B29">http://www.biomedcentral.com/sfx_links?ui=1471-2229-11-52&bibl=B29</ext-link>
. Accessed 2014 September 2.
<pub-id pub-id-type="pmid">20145669</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Vonesh1">
<label>59</label>
<mixed-citation publication-type="journal">
<name>
<surname>Vonesh</surname>
<given-names>EF</given-names>
</name>
,
<name>
<surname>Chinchilli</surname>
<given-names>VM</given-names>
</name>
,
<name>
<surname>Pu</surname>
<given-names>K</given-names>
</name>
(
<year>1996</year>
)
<article-title>Goodness-of-fit in generalized nonlinear mixed-effects models</article-title>
.
<source>Biometrics</source>
<volume>52</volume>
:
<fpage>572</fpage>
<lpage>587</lpage>
.
<pub-id pub-id-type="pmid">10766504</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Hansen1">
<label>60</label>
<mixed-citation publication-type="journal">
<name>
<surname>Hansen</surname>
<given-names>CH</given-names>
</name>
,
<name>
<surname>Hansen</surname>
<given-names>RD</given-names>
</name>
(
<year>1988</year>
)
<article-title>Finding the face in the crowd: an anger superiority effect</article-title>
.
<source>Journal of Personality and Social Psychology</source>
<volume>54</volume>
:
<fpage>917</fpage>
<lpage>924</lpage>
.
<pub-id pub-id-type="pmid">3397866</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Landy1">
<label>61</label>
<mixed-citation publication-type="journal">
<name>
<surname>Landy</surname>
<given-names>MS</given-names>
</name>
,
<name>
<surname>Maloney</surname>
<given-names>LT</given-names>
</name>
,
<name>
<surname>Johnston</surname>
<given-names>EB</given-names>
</name>
,
<name>
<surname>Young</surname>
<given-names>M</given-names>
</name>
(
<year>1995</year>
)
<article-title>Measurement and modeling of depth cue combination: In defense of weak fusion</article-title>
.
<source>Vision Research</source>
<volume>35</volume>
:
<fpage>389</fpage>
<lpage>412</lpage>
.
<pub-id pub-id-type="pmid">7892735</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Landy2">
<label>62</label>
<mixed-citation publication-type="other">Landy MS, Banks M, Knill D (2011) Ideal-observer models of cue integration. In: Trommershuser J, Landy M, Koerding K, eds. Sensory cue integration. New York: Oxford University Press, pp. 5–29.</mixed-citation>
</ref>
<ref id="pone.0108211-Caudek2">
<label>63</label>
<mixed-citation publication-type="journal">
<name>
<surname>Caudek</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Fantoni</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Domini</surname>
<given-names>F</given-names>
</name>
(
<year>2011</year>
)
<article-title>Bayesian modeling of perceived surface slant from actively-generated and passively-observed optic flow</article-title>
.
<source>PLoS ONE</source>
<volume>6(4)</volume>
:
<fpage>e18731</fpage>
<comment>doi:
<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1371/journal.pone.0018731">10.1371/journal.pone.0018731</ext-link>
</comment>
<pub-id pub-id-type="pmid">21533197</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-vonHelmholtz1">
<label>64</label>
<mixed-citation publication-type="other">von Helmholtz H (1866) Concerning the perceptions in general, 3rd edn. Treatise on Physiological Optics, Vol. III (translated by Southall JPC 1925 Opt. Soc. Am. Section reprinted New York: Dover, 1962).</mixed-citation>
</ref>
<ref id="pone.0108211-Gregory1">
<label>65</label>
<mixed-citation publication-type="journal">
<name>
<surname>Gregory</surname>
<given-names>RL</given-names>
</name>
(
<year>1968</year>
)
<article-title>Perceptual illusions and brain models</article-title>
.
<source>Proceedings of the Royal Society B</source>
<volume>171</volume>
:
<fpage>179</fpage>
<lpage>196</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0108211-Oosterwijk1">
<label>66</label>
<mixed-citation publication-type="journal">
<name>
<surname>Oosterwijk</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Lindquist</surname>
<given-names>KA</given-names>
</name>
,
<name>
<surname>Anderson</surname>
<given-names>E</given-names>
</name>
,
<name>
<surname>Dautoff</surname>
<given-names>R</given-names>
</name>
,
<name>
<surname>Moriguchi</surname>
<given-names>Y</given-names>
</name>
,
<etal>et al</etal>
(
<year>2012</year>
)
<article-title>States of mind: Emotions, body feelings, and thoughts share distributed neural networks</article-title>
.
<source>NeuroImage</source>
<volume>62</volume>
:
<fpage>2110</fpage>
<lpage>2128</lpage>
.
<pub-id pub-id-type="pmid">22677148</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Clark1">
<label>67</label>
<mixed-citation publication-type="journal">
<name>
<surname>Clark</surname>
<given-names>DM</given-names>
</name>
(
<year>1983</year>
)
<article-title>On the induction of depressed mood in the laboratory: Evaluation and comparison of the Velten and musical procedures</article-title>
.
<source>Advances in Behaviour Research and Therapy</source>
<volume>5</volume>
:
<fpage>27</fpage>
<lpage>49</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0108211-Velten1">
<label>68</label>
<mixed-citation publication-type="journal">
<name>
<surname>Velten</surname>
<given-names>E</given-names>
</name>
(
<year>1968</year>
)
<article-title>A laboratory task for induction of mood states</article-title>
.
<source>Behavioural Research and Therapy</source>
<volume>6</volume>
:
<fpage>473</fpage>
<lpage>482</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0108211-Mayberg1">
<label>69</label>
<mixed-citation publication-type="journal">
<name>
<surname>Mayberg</surname>
<given-names>HS</given-names>
</name>
,
<name>
<surname>Liotti</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Brannan</surname>
<given-names>SK</given-names>
</name>
,
<name>
<surname>McGinnis</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Mahurin</surname>
<given-names>RK</given-names>
</name>
,
<etal>et al</etal>
(
<year>1999</year>
)
<article-title>Reciprocal limbic-cortical function and negative mood: Converging PET findings in depression and normal sadness</article-title>
.
<source>American Journal of Psychiatry</source>
<volume>156</volume>
:
<fpage>675</fpage>
<lpage>682</lpage>
.
<pub-id pub-id-type="pmid">10327898</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Robinson1">
<label>70</label>
<mixed-citation publication-type="journal">
<name>
<surname>Robinson</surname>
<given-names>O</given-names>
</name>
,
<name>
<surname>Grillon</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Sahakian</surname>
<given-names>B</given-names>
</name>
(
<year>2012</year>
)
<article-title>The mood induction task: A standardized, computerized laboratory procedure for altering mood state in humans</article-title>
.
<source>Protocol Exchange</source>
<fpage>1</fpage>
<lpage>17</lpage>
, doi:
<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1038/protex.2012.007">10.1038/protex.2012.007</ext-link>
.</mixed-citation>
</ref>
<ref id="pone.0108211-Kersten1">
<label>71</label>
<mixed-citation publication-type="journal">
<name>
<surname>Kersten</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Mamassian</surname>
<given-names>P</given-names>
</name>
,
<name>
<surname>Yuille</surname>
<given-names>A</given-names>
</name>
(
<year>2004</year>
)
<article-title>Object perception as Bayesian inference</article-title>
.
<source>Annual Review of Psychology</source>
<volume>55</volume>
:
<fpage>271</fpage>
<lpage>304</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0108211-Goodale1">
<label>72</label>
<mixed-citation publication-type="journal">
<name>
<surname>Goodale</surname>
<given-names>MA</given-names>
</name>
,
<name>
<surname>Milner</surname>
<given-names>AD</given-names>
</name>
,
<name>
<surname>Jakobson</surname>
<given-names>LS</given-names>
</name>
,
<name>
<surname>Carey</surname>
<given-names>DP</given-names>
</name>
(
<year>1991</year>
)
<article-title>A neurological dissociation between perceiving objects and grasping them</article-title>
.
<source>Nature</source>
<volume>349</volume>
:
<fpage>154</fpage>
<lpage>156</lpage>
.
<pub-id pub-id-type="pmid">1986306</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Zajonc1">
<label>73</label>
<mixed-citation publication-type="other">Zajonc RB, Markus H (1984) Affect and cognition: The hard interface. In: Izard C, Kagan J, Zajonc RB editors. Emotion, cognition, and behavior. Cambridge: Cambridge University Press, pp. 73–102.</mixed-citation>
</ref>
<ref id="pone.0108211-Derryberry1">
<label>74</label>
<mixed-citation publication-type="journal">
<name>
<surname>Derryberry</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Reed</surname>
<given-names>MA</given-names>
</name>
(
<year>1998</year>
)
<article-title>Anxiety and attentional focusing: Trait, state and hemispheric influences</article-title>
.
<source>Personality and Individual Differences</source>
<volume>25</volume>
:
<fpage>745</fpage>
<lpage>761</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0108211-Jeffries1">
<label>75</label>
<mixed-citation publication-type="journal">
<name>
<surname>Jeffries</surname>
<given-names>LM</given-names>
</name>
,
<name>
<surname>Smilek</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Eich</surname>
<given-names>E</given-names>
</name>
,
<name>
<surname>Enns</surname>
<given-names>JT</given-names>
</name>
(
<year>2008</year>
)
<article-title>Emotional valence and arousal interact in the attentional blink</article-title>
.
<source>Psychological Science</source>
<volume>19</volume>
:
<fpage>290</fpage>
<lpage>295</lpage>
.
<pub-id pub-id-type="pmid">18315803</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Martin1">
<label>76</label>
<mixed-citation publication-type="journal">
<name>
<surname>Martin</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Slessor</surname>
<given-names>G</given-names>
</name>
,
<name>
<surname>Allen</surname>
<given-names>R</given-names>
</name>
,
<name>
<surname>Phillips</surname>
<given-names>LH</given-names>
</name>
,
<name>
<surname>Darling</surname>
<given-names>S</given-names>
</name>
(
<year>2012</year>
)
<article-title>Processing orientation and emotion recognition</article-title>
.
<source>Emotion</source>
<volume>12</volume>
:
<fpage>39</fpage>
<lpage>43</lpage>
.
<pub-id pub-id-type="pmid">21842989</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Franz1">
<label>77</label>
<mixed-citation publication-type="journal">
<name>
<surname>Franz</surname>
<given-names>VH</given-names>
</name>
(
<year>2003</year>
)
<article-title>Manual size estimation: A neuropsychological measure of perception?</article-title>
<source>Experimental Brain Research</source>
<volume>151</volume>
:
<fpage>471</fpage>
<lpage>477</lpage>
.
<pub-id pub-id-type="pmid">12851803</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Foster1">
<label>78</label>
<mixed-citation publication-type="journal">
<name>
<surname>Foster</surname>
<given-names>R</given-names>
</name>
,
<name>
<surname>Fantoni</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Caudek</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Domini</surname>
<given-names>F</given-names>
</name>
(
<year>2011</year>
)
<article-title>Integration of disparity and velocity information for haptic and perceptual judgments of object depth</article-title>
.
<source>Acta Psychologica</source>
<volume>136</volume>
:
<fpage>300</fpage>
<lpage>310</lpage>
<comment>doi:
<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1016/j.actpsy.2010.12.003">10.1016/j.actpsy.2010.12.003</ext-link>
</comment>
<pub-id pub-id-type="pmid">21237442</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0108211-Norman1">
<label>79</label>
<mixed-citation publication-type="other">Norman D. (2005). Emotional Design. Cambridge, MA: Basic Books.</mixed-citation>
</ref>
</ref-list>
</back>
</pmc>
<affiliations>
<list>
<country>
<li>Italie</li>
</country>
</list>
<tree>
<country name="Italie">
<noRegion>
<name sortKey="Fantoni, Carlo" sort="Fantoni, Carlo" uniqKey="Fantoni C" first="Carlo" last="Fantoni">Carlo Fantoni</name>
</noRegion>
<name sortKey="Fantoni, Carlo" sort="Fantoni, Carlo" uniqKey="Fantoni C" first="Carlo" last="Fantoni">Carlo Fantoni</name>
<name sortKey="Gerbino, Walter" sort="Gerbino, Walter" uniqKey="Gerbino W" first="Walter" last="Gerbino">Walter Gerbino</name>
</country>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Ncbi/Merge
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 003321 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd -nk 003321 | SxmlIndent | more

To add a link to this page within the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Ncbi
   |étape=   Merge
   |type=    RBID
   |clé=     PMC:4176726
   |texte=   Body Actions Change the Appearance of Facial Expressions
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/RBID.i   -Sk "pubmed:25251882" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 
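
A minimal variant of the same pipeline, assuming you want to keep the generated wiki text in a local file for review before publishing; only the shell redirection and the output file name (hypothetical) are added to the commands shown above.

HfdIndexSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/RBID.i   -Sk "pubmed:25251882" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 > 003321.wiki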

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024