Irrelevant visual faces influence haptic identification of facial expressions of emotion.
Internal identifier: 000F58 (PubMed/Curation); previous: 000F57; next: 000F59
Authors: Roberta L. Klatzky [United States]; Aneta Abramowicz; Cheryl Hamilton; Susan J. Lederman
Source:
- Attention, perception & psychophysics [1943-393X]; 2011.
English descriptors
- KwdEn: Adolescent; Attention; Emotions; Facial Expression; Female; Humans; Male; Pattern Recognition, Visual; Stereognosis; Young Adult
- MESH: Adolescent; Attention; Emotions; Facial Expression; Female; Humans; Male; Pattern Recognition, Visual; Stereognosis; Young Adult
Abstract
This study demonstrates that when people attempt to identify a facial expression of emotion (FEE) by haptically exploring a 3D facemask, they are affected by viewing a simultaneous, task-irrelevant visual FEE portrayed by another person. In comparison to a control condition, where visual noise was presented, the visual FEE facilitated haptic identification when congruent (visual and haptic FEEs same category). When the visual and haptic FEEs were incongruent, haptic identification was impaired, and error responses shifted toward the visually depicted emotion. In contrast, visual emotion labels that matched or mismatched the haptic FEE category produced no such effects. The findings indicate that vision and touch interact in FEE recognition at a level where featural invariants of the emotional category (cf. precise facial geometry or general concepts) are processed, even when the visual and haptic FEEs are not attributable to a common source. Processing mechanisms behind these effects are considered.
DOI: 10.3758/s13414-010-0038-x
PubMed: 21264726
Links toward previous steps (curation, corpus...)
- To the PubMed stream, Corpus step. To go to this record in the Curation step: 000F58
Links to Exploration step
- pubmed:21264726

The document in XML format
<record><TEI><teiHeader><fileDesc><titleStmt><title xml:lang="en">Irrelevant visual faces influence haptic identification of facial expressions of emotion.</title>
<author><name sortKey="Klatzky, Roberta L" sort="Klatzky, Roberta L" uniqKey="Klatzky R" first="Roberta L" last="Klatzky">Roberta L. Klatzky</name>
<affiliation wicri:level="1"><nlm:affiliation>Department of Psychology, Carnegie Mellon University, Pittsburgh, PA 15213-3890, USA. klatzky@cmu.edu</nlm:affiliation>
<country xml:lang="fr">États-Unis</country>
<wicri:regionArea>Department of Psychology, Carnegie Mellon University, Pittsburgh, PA 15213-3890</wicri:regionArea>
</affiliation>
</author>
<author><name sortKey="Abramowicz, Aneta" sort="Abramowicz, Aneta" uniqKey="Abramowicz A" first="Aneta" last="Abramowicz">Aneta Abramowicz</name>
</author>
<author><name sortKey="Hamilton, Cheryl" sort="Hamilton, Cheryl" uniqKey="Hamilton C" first="Cheryl" last="Hamilton">Cheryl Hamilton</name>
</author>
<author><name sortKey="Lederman, Susan J" sort="Lederman, Susan J" uniqKey="Lederman S" first="Susan J" last="Lederman">Susan J. Lederman</name>
</author>
</titleStmt>
<publicationStmt><idno type="wicri:source">PubMed</idno>
<date when="2011">2011</date>
<idno type="doi">10.3758/s13414-010-0038-x</idno>
<idno type="RBID">pubmed:21264726</idno>
<idno type="pmid">21264726</idno>
<idno type="wicri:Area/PubMed/Corpus">000F58</idno>
<idno type="wicri:Area/PubMed/Curation">000F58</idno>
</publicationStmt>
<sourceDesc><biblStruct><analytic><title xml:lang="en">Irrelevant visual faces influence haptic identification of facial expressions of emotion.</title>
<author><name sortKey="Klatzky, Roberta L" sort="Klatzky, Roberta L" uniqKey="Klatzky R" first="Roberta L" last="Klatzky">Roberta L. Klatzky</name>
<affiliation wicri:level="1"><nlm:affiliation>Department of Psychology, Carnegie Mellon University, Pittsburgh, PA 15213-3890, USA. klatzky@cmu.edu</nlm:affiliation>
<country xml:lang="fr">États-Unis</country>
<wicri:regionArea>Department of Psychology, Carnegie Mellon University, Pittsburgh, PA 15213-3890</wicri:regionArea>
</affiliation>
</author>
<author><name sortKey="Abramowicz, Aneta" sort="Abramowicz, Aneta" uniqKey="Abramowicz A" first="Aneta" last="Abramowicz">Aneta Abramowicz</name>
</author>
<author><name sortKey="Hamilton, Cheryl" sort="Hamilton, Cheryl" uniqKey="Hamilton C" first="Cheryl" last="Hamilton">Cheryl Hamilton</name>
</author>
<author><name sortKey="Lederman, Susan J" sort="Lederman, Susan J" uniqKey="Lederman S" first="Susan J" last="Lederman">Susan J. Lederman</name>
</author>
</analytic>
<series><title level="j">Attention, perception &amp; psychophysics</title>
<idno type="eISSN">1943-393X</idno>
<imprint><date when="2011" type="published">2011</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc><textClass><keywords scheme="KwdEn" xml:lang="en"><term>Adolescent</term>
<term>Attention</term>
<term>Emotions</term>
<term>Facial Expression</term>
<term>Female</term>
<term>Humans</term>
<term>Male</term>
<term>Pattern Recognition, Visual</term>
<term>Stereognosis</term>
<term>Young Adult</term>
</keywords>
<keywords scheme="MESH" xml:lang="en"><term>Adolescent</term>
<term>Attention</term>
<term>Emotions</term>
<term>Facial Expression</term>
<term>Female</term>
<term>Humans</term>
<term>Male</term>
<term>Pattern Recognition, Visual</term>
<term>Stereognosis</term>
<term>Young Adult</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front><div type="abstract" xml:lang="en">This study demonstrates that when people attempt to identify a facial expression of emotion (FEE) by haptically exploring a 3D facemask, they are affected by viewing a simultaneous, task-irrelevant visual FEE portrayed by another person. In comparison to a control condition, where visual noise was presented, the visual FEE facilitated haptic identification when congruent (visual and haptic FEEs same category). When the visual and haptic FEEs were incongruent, haptic identification was impaired, and error responses shifted toward the visually depicted emotion. In contrast, visual emotion labels that matched or mismatched the haptic FEE category produced no such effects. The findings indicate that vision and touch interact in FEE recognition at a level where featural invariants of the emotional category (cf. precise facial geometry or general concepts) are processed, even when the visual and haptic FEEs are not attributable to a common source. Processing mechanisms behind these effects are considered.</div>
</front>
</TEI>
<pubmed><MedlineCitation Owner="NLM" Status="MEDLINE"><PMID Version="1">21264726</PMID>
<DateCreated><Year>2011</Year>
<Month>02</Month>
<Day>11</Day>
</DateCreated>
<DateCompleted><Year>2011</Year>
<Month>06</Month>
<Day>03</Day>
</DateCompleted>
<Article PubModel="Print"><Journal><ISSN IssnType="Electronic">1943-393X</ISSN>
<JournalIssue CitedMedium="Internet"><Volume>73</Volume>
<Issue>2</Issue>
<PubDate><Year>2011</Year>
<Month>Feb</Month>
</PubDate>
</JournalIssue>
<Title>Attention, perception &amp; psychophysics</Title>
<ISOAbbreviation>Atten Percept Psychophys</ISOAbbreviation>
</Journal>
<ArticleTitle>Irrelevant visual faces influence haptic identification of facial expressions of emotion.</ArticleTitle>
<Pagination><MedlinePgn>521-30</MedlinePgn>
</Pagination>
<ELocationID EIdType="doi" ValidYN="Y">10.3758/s13414-010-0038-x</ELocationID>
<Abstract><AbstractText>This study demonstrates that when people attempt to identify a facial expression of emotion (FEE) by haptically exploring a 3D facemask, they are affected by viewing a simultaneous, task-irrelevant visual FEE portrayed by another person. In comparison to a control condition, where visual noise was presented, the visual FEE facilitated haptic identification when congruent (visual and haptic FEEs same category). When the visual and haptic FEEs were incongruent, haptic identification was impaired, and error responses shifted toward the visually depicted emotion. In contrast, visual emotion labels that matched or mismatched the haptic FEE category produced no such effects. The findings indicate that vision and touch interact in FEE recognition at a level where featural invariants of the emotional category (cf. precise facial geometry or general concepts) are processed, even when the visual and haptic FEEs are not attributable to a common source. Processing mechanisms behind these effects are considered.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y"><Author ValidYN="Y"><LastName>Klatzky</LastName>
<ForeName>Roberta L</ForeName>
<Initials>RL</Initials>
<AffiliationInfo><Affiliation>Department of Psychology, Carnegie Mellon University, Pittsburgh, PA 15213-3890, USA. klatzky@cmu.edu</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y"><LastName>Abramowicz</LastName>
<ForeName>Aneta</ForeName>
<Initials>A</Initials>
</Author>
<Author ValidYN="Y"><LastName>Hamilton</LastName>
<ForeName>Cheryl</ForeName>
<Initials>C</Initials>
</Author>
<Author ValidYN="Y"><LastName>Lederman</LastName>
<ForeName>Susan J</ForeName>
<Initials>SJ</Initials>
</Author>
</AuthorList>
<Language>eng</Language>
<GrantList CompleteYN="Y"><Grant><GrantID>1R01EY016817-01A1</GrantID>
<Acronym>EY</Acronym>
<Agency>NEI NIH HHS</Agency>
<Country>United States</Country>
</Grant>
<Grant><GrantID>MOP 74645</GrantID>
<Agency>Canadian Institutes of Health Research</Agency>
<Country>Canada</Country>
</Grant>
</GrantList>
<PublicationTypeList><PublicationType UI="D016428">Journal Article</PublicationType>
<PublicationType UI="D052061">Research Support, N.I.H., Extramural</PublicationType>
<PublicationType UI="D013485">Research Support, Non-U.S. Gov't</PublicationType>
</PublicationTypeList>
</Article>
<MedlineJournalInfo><Country>United States</Country>
<MedlineTA>Atten Percept Psychophys</MedlineTA>
<NlmUniqueID>101495384</NlmUniqueID>
<ISSNLinking>1943-3921</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>IM</CitationSubset>
<MeshHeadingList><MeshHeading><DescriptorName MajorTopicYN="N" UI="D000293">Adolescent</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="Y" UI="D001288">Attention</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="Y" UI="D004644">Emotions</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="Y" UI="D005149">Facial Expression</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D005260">Female</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D006801">Humans</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D008297">Male</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="Y" UI="D010364">Pattern Recognition, Visual</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="Y" UI="D013236">Stereognosis</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D055815">Young Adult</DescriptorName>
</MeshHeading>
</MeshHeadingList>
</MedlineCitation>
<PubmedData><History><PubMedPubDate PubStatus="entrez"><Year>2011</Year>
<Month>1</Month>
<Day>26</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed"><Year>2011</Year>
<Month>1</Month>
<Day>26</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline"><Year>2011</Year>
<Month>6</Month>
<Day>4</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>ppublish</PublicationStatus>
<ArticleIdList><ArticleId IdType="doi">10.3758/s13414-010-0038-x</ArticleId>
<ArticleId IdType="pubmed">21264726</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
</record>
To manipulate this document under Unix (Dilib)
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PubMed/Curation
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000F58 | SxmlIndent | more
Or
HfdSelect -h $EXPLOR_AREA/Data/PubMed/Curation/biblio.hfd -nk 000F58 | SxmlIndent | more
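When only single fields are needed from the indented output, a plain sed extraction can be piped in instead of paging through the whole record. The snippet below is a minimal, self-contained sketch: the inline `record` variable stands in for the real `HfdSelect ... | SxmlIndent` output, and the sed patterns are assumptions based on the `<idno>` layout shown above, not part of Dilib.

```shell
# Extract the DOI and PMID from the <idno> elements of a record.
# The inline sample replaces the real `HfdSelect ... | SxmlIndent` output.
record='<idno type="doi">10.3758/s13414-010-0038-x</idno>
<idno type="pmid">21264726</idno>'

doi=$(printf '%s\n' "$record" | sed -n 's/.*type="doi">\([^<]*\)<.*/\1/p')
pmid=$(printf '%s\n' "$record" | sed -n 's/.*type="pmid">\([^<]*\)<.*/\1/p')

echo "DOI: $doi"
echo "PMID: $pmid"
```

In a live area the assignment would become `record=$(HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000F58)`, with the rest of the pipeline unchanged.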
To link to this page within the Wicri network
{{Explor lien |wiki= Ticri/CIDE |area= HapticV1 |flux= PubMed |étape= Curation |type= RBID |clé= pubmed:21264726 |texte= Irrelevant visual faces influence haptic identification of facial expressions of emotion. }}
To generate wiki pages
HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Curation/RBID.i -Sk "pubmed:21264726" \
  | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Curation/biblio.hfd \
  | NlmPubMed2Wicri -a HapticV1
This area was generated with Dilib version V0.6.23.