Exploration server for haptic devices

Warning: this site is under development!
Warning: this site was generated automatically from raw corpora.
The information has therefore not been validated.

Haptic recognition of static and dynamic expressions of emotion in the live face.

Internal identifier: 001645 (PubMed/Corpus); previous: 001644; next: 001646

Authors: S J Lederman; R L Klatzky; A. Abramowicz; K. Salsman; R. Kitada; C. Hamilton

Source: Psychological science, 2007, 18(2), 158-64.

RBID: pubmed:17425537

Abstract

If humans can detect the wealth of tactile and haptic information potentially available in live facial expressions of emotion (FEEs), they should be capable of haptically recognizing the six universal expressions of emotion (anger, disgust, fear, happiness, sadness, and surprise) at levels well above chance. We tested this hypothesis in the experiments reported here. With minimal training, subjects' overall mean accuracy was 51% for static FEEs (Experiment 1) and 74% for dynamic FEEs (Experiment 2). All FEEs except static fear were successfully recognized above the chance level of 16.7%. Complementing these findings, overall confidence and information transmission were higher for dynamic than for corresponding static faces. Our performance measures (accuracy and confidence ratings, plus response latency in Experiment 2 only) confirmed that happiness, sadness, and surprise were all highly recognizable, and anger, disgust, and fear less so.

DOI: 10.1111/j.1467-9280.2007.01866.x
PubMed: 17425537

Links to Exploration step

pubmed:17425537

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Haptic recognition of static and dynamic expressions of emotion in the live face.</title>
<author>
<name sortKey="Lederman, S J" sort="Lederman, S J" uniqKey="Lederman S" first="S J" last="Lederman">S J Lederman</name>
<affiliation>
<nlm:affiliation>Queen's University, Kingston, Ontario, Canada. lederman@post.queensu.ca</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Klatzky, R L" sort="Klatzky, R L" uniqKey="Klatzky R" first="R L" last="Klatzky">R L Klatzky</name>
</author>
<author>
<name sortKey="Abramowicz, A" sort="Abramowicz, A" uniqKey="Abramowicz A" first="A" last="Abramowicz">A. Abramowicz</name>
</author>
<author>
<name sortKey="Salsman, K" sort="Salsman, K" uniqKey="Salsman K" first="K" last="Salsman">K. Salsman</name>
</author>
<author>
<name sortKey="Kitada, R" sort="Kitada, R" uniqKey="Kitada R" first="R" last="Kitada">R. Kitada</name>
</author>
<author>
<name sortKey="Hamilton, C" sort="Hamilton, C" uniqKey="Hamilton C" first="C" last="Hamilton">C. Hamilton</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="2007">2007</date>
<idno type="doi">10.1111/j.1467-9280.2007.01866.x</idno>
<idno type="RBID">pubmed:17425537</idno>
<idno type="pmid">17425537</idno>
<idno type="wicri:Area/PubMed/Corpus">001645</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Haptic recognition of static and dynamic expressions of emotion in the live face.</title>
<author>
<name sortKey="Lederman, S J" sort="Lederman, S J" uniqKey="Lederman S" first="S J" last="Lederman">S J Lederman</name>
<affiliation>
<nlm:affiliation>Queen's University, Kingston, Ontario, Canada. lederman@post.queensu.ca</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Klatzky, R L" sort="Klatzky, R L" uniqKey="Klatzky R" first="R L" last="Klatzky">R L Klatzky</name>
</author>
<author>
<name sortKey="Abramowicz, A" sort="Abramowicz, A" uniqKey="Abramowicz A" first="A" last="Abramowicz">A. Abramowicz</name>
</author>
<author>
<name sortKey="Salsman, K" sort="Salsman, K" uniqKey="Salsman K" first="K" last="Salsman">K. Salsman</name>
</author>
<author>
<name sortKey="Kitada, R" sort="Kitada, R" uniqKey="Kitada R" first="R" last="Kitada">R. Kitada</name>
</author>
<author>
<name sortKey="Hamilton, C" sort="Hamilton, C" uniqKey="Hamilton C" first="C" last="Hamilton">C. Hamilton</name>
</author>
</analytic>
<series>
<title level="j">Psychological science</title>
<idno type="ISSN">0956-7976</idno>
<imprint>
<date when="2007" type="published">2007</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Adolescent</term>
<term>Adult</term>
<term>Affect</term>
<term>Facial Expression</term>
<term>Female</term>
<term>Humans</term>
<term>Male</term>
<term>Reaction Time</term>
<term>Recognition (Psychology)</term>
<term>Visual Perception</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Adolescent</term>
<term>Adult</term>
<term>Affect</term>
<term>Facial Expression</term>
<term>Female</term>
<term>Humans</term>
<term>Male</term>
<term>Reaction Time</term>
<term>Recognition (Psychology)</term>
<term>Visual Perception</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">If humans can detect the wealth of tactile and haptic information potentially available in live facial expressions of emotion (FEEs), they should be capable of haptically recognizing the six universal expressions of emotion (anger, disgust, fear, happiness, sadness, and surprise) at levels well above chance. We tested this hypothesis in the experiments reported here. With minimal training, subjects' overall mean accuracy was 51% for static FEEs (Experiment 1) and 74% for dynamic FEEs (Experiment 2). All FEEs except static fear were successfully recognized above the chance level of 16.7%. Complementing these findings, overall confidence and information transmission were higher for dynamic than for corresponding static faces. Our performance measures (accuracy and confidence ratings, plus response latency in Experiment 2 only) confirmed that happiness, sadness, and surprise were all highly recognizable, and anger, disgust, and fear less so.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Owner="NLM" Status="MEDLINE">
<PMID Version="1">17425537</PMID>
<DateCreated>
<Year>2007</Year>
<Month>04</Month>
<Day>11</Day>
</DateCreated>
<DateCompleted>
<Year>2007</Year>
<Month>07</Month>
<Day>03</Day>
</DateCompleted>
<DateRevised>
<Year>2011</Year>
<Month>05</Month>
<Day>20</Day>
</DateRevised>
<Article PubModel="Print">
<Journal>
<ISSN IssnType="Print">0956-7976</ISSN>
<JournalIssue CitedMedium="Print">
<Volume>18</Volume>
<Issue>2</Issue>
<PubDate>
<Year>2007</Year>
<Month>Feb</Month>
</PubDate>
</JournalIssue>
<Title>Psychological science</Title>
<ISOAbbreviation>Psychol Sci</ISOAbbreviation>
</Journal>
<ArticleTitle>Haptic recognition of static and dynamic expressions of emotion in the live face.</ArticleTitle>
<Pagination>
<MedlinePgn>158-64</MedlinePgn>
</Pagination>
<Abstract>
<AbstractText>If humans can detect the wealth of tactile and haptic information potentially available in live facial expressions of emotion (FEEs), they should be capable of haptically recognizing the six universal expressions of emotion (anger, disgust, fear, happiness, sadness, and surprise) at levels well above chance. We tested this hypothesis in the experiments reported here. With minimal training, subjects' overall mean accuracy was 51% for static FEEs (Experiment 1) and 74% for dynamic FEEs (Experiment 2). All FEEs except static fear were successfully recognized above the chance level of 16.7%. Complementing these findings, overall confidence and information transmission were higher for dynamic than for corresponding static faces. Our performance measures (accuracy and confidence ratings, plus response latency in Experiment 2 only) confirmed that happiness, sadness, and surprise were all highly recognizable, and anger, disgust, and fear less so.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Lederman</LastName>
<ForeName>S J</ForeName>
<Initials>SJ</Initials>
<AffiliationInfo>
<Affiliation>Queen's University, Kingston, Ontario, Canada. lederman@post.queensu.ca</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Klatzky</LastName>
<ForeName>R L</ForeName>
<Initials>RL</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Abramowicz</LastName>
<ForeName>A</ForeName>
<Initials>A</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Salsman</LastName>
<ForeName>K</ForeName>
<Initials>K</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Kitada</LastName>
<ForeName>R</ForeName>
<Initials>R</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Hamilton</LastName>
<ForeName>C</ForeName>
<Initials>C</Initials>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
<PublicationType UI="D013485">Research Support, Non-U.S. Gov't</PublicationType>
</PublicationTypeList>
</Article>
<MedlineJournalInfo>
<Country>United States</Country>
<MedlineTA>Psychol Sci</MedlineTA>
<NlmUniqueID>9007542</NlmUniqueID>
<ISSNLinking>0956-7976</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>IM</CitationSubset>
<MeshHeadingList>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D000293">Adolescent</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D000328">Adult</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D000339">Affect</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D005149">Facial Expression</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D005260">Female</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D006801">Humans</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D008297">Male</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D011930">Reaction Time</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D021641">Recognition (Psychology)</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D014796">Visual Perception</DescriptorName>
</MeshHeading>
</MeshHeadingList>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="pubmed">
<Year>2007</Year>
<Month>4</Month>
<Day>12</Day>
<Hour>9</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2007</Year>
<Month>7</Month>
<Day>4</Day>
<Hour>9</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez">
<Year>2007</Year>
<Month>4</Month>
<Day>12</Day>
<Hour>9</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>ppublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="pii">PSCI1866</ArticleId>
<ArticleId IdType="doi">10.1111/j.1467-9280.2007.01866.x</ArticleId>
<ArticleId IdType="pubmed">17425537</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
</record>

To work with this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PubMed/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 001645 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 001645 | SxmlIndent | more
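
These pipelines print the record as indented XML. As a minimal follow-on sketch (an assumption, not part of the site's documentation: it relies only on SxmlIndent emitting the <idno type="doi"> element shown in the record above, plus standard grep and sed), a single field such as the DOI can be pulled out of the stream:

# Extract the DOI from the indented XML record
HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 001645 \
       | SxmlIndent \
       | grep -o '<idno type="doi">[^<]*' \
       | sed 's/^.*>//'     # prints: 10.1111/j.1467-9280.2007.01866.x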

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PubMed
   |étape=   Corpus
   |type=    RBID
   |clé=     pubmed:17425537
   |texte=   Haptic recognition of static and dynamic expressions of emotion in the live face.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i   -Sk "pubmed:17425537" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 
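
The same pipeline can be looped to regenerate pages for several records at once. A hedged sketch, reusing the commands above unchanged (pubmed:NNNNNNNN is a hypothetical placeholder for another RBID from this corpus, not a real key):

# Batch-generate wiki pages for a list of RBID keys
for key in "pubmed:17425537" "pubmed:NNNNNNNN"; do
    HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i -Sk "$key" \
           | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd \
           | NlmPubMed2Wicri -a HapticV1
done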

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024