Vision holds a greater share in visuo-haptic object recognition than touch.
Internal identifier: 001704 (Main/Exploration); previous: 001703; next: 001705
Authors: Tanja Kassuba [Denmark]; Corinna Klinge; Cordula Hölig; Brigitte Röder; Hartwig R. Siebner
Source:
- NeuroImage [1095-9572]; 2013.
English descriptors
- KwdEn:
- MESH:
- physiology : Brain, Recognition (Psychology), Touch Perception, Visual Perception.
- Adult, Brain Mapping, Female, Humans, Magnetic Resonance Imaging, Male, Young Adult.
Abstract
The integration of visual and haptic input can facilitate object recognition. Yet, vision might dominate visuo-haptic interactions as it is more effective than haptics in processing several object features in parallel and recognizing objects outside of reaching space. The maximum likelihood approach of multisensory integration would predict that haptics as the less efficient sense for object recognition gains more from integrating additional visual information than vice versa. To test for asymmetries between vision and touch in visuo-haptic interactions, we measured regional changes in brain activity using functional magnetic resonance imaging while healthy individuals performed a delayed-match-to-sample task. We manipulated identity matching of sample and target objects: We hypothesized that only coherent visual and haptic object features would activate unified object representations. The bilateral object-specific lateral occipital cortex, fusiform gyrus, and intraparietal sulcus showed increased activation to crossmodal compared to unimodal matching but only for congruent object pairs. Critically, the visuo-haptic interaction effects in these regions depended on the sensory modality which processed the target object, being more pronounced for haptic than visual targets. This preferential response of visuo-haptic regions indicates a modality-specific asymmetry in crossmodal matching of visual and haptic object features, suggesting a functional primacy of vision over touch in visuo-haptic object recognition.
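The "maximum likelihood approach" the abstract refers to is the standard reliability-weighted model of cue integration: each modality's estimate is weighted by its inverse variance, so the noisier sense gains proportionally more from fusion. A minimal sketch of that prediction, with illustrative numbers that are assumptions rather than values from this study:

```python
# Minimal sketch (not from the paper) of maximum-likelihood cue
# integration: each modality's estimate is weighted by its inverse
# variance (its reliability), so the less reliable sense gains more
# from combining the two.
def fuse(est_v, var_v, est_h, var_h):
    """Reliability-weighted fusion of a visual and a haptic estimate."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_h)
    w_h = 1 - w_v
    fused_est = w_v * est_v + w_h * est_h
    # Fused variance is always below the smaller unimodal variance.
    fused_var = (var_v * var_h) / (var_v + var_h)
    return fused_est, fused_var

# Illustrative (hypothetical) numbers: vision is the more reliable
# sense here, so the fused estimate sits closer to the visual one.
est, var = fuse(est_v=10.0, var_v=1.0, est_h=12.0, var_h=4.0)
# Haptic variance drops from 4.0 to 0.8 (a factor of 5), while the
# visual variance only drops from 1.0 to 0.8 -- the asymmetric gain
# the maximum-likelihood account predicts.
```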
DOI: 10.1016/j.neuroimage.2012.09.054
PubMed: 23032487
Affiliations:
Links toward previous steps (curation, corpus...)
- to stream PubMed, to step Corpus: 000B25
- to stream PubMed, to step Curation: 000B25
- to stream PubMed, to step Checkpoint: 000745
- to stream Ncbi, to step Merge: 002268
- to stream Ncbi, to step Curation: 002268
- to stream Ncbi, to step Checkpoint: 002268
- to stream Main, to step Merge: 001715
- to stream Main, to step Curation: 001704
The document in XML format
<record><TEI><teiHeader><fileDesc><titleStmt><title xml:lang="en">Vision holds a greater share in visuo-haptic object recognition than touch.</title>
<author><name sortKey="Kassuba, Tanja" sort="Kassuba, Tanja" uniqKey="Kassuba T" first="Tanja" last="Kassuba">Tanja Kassuba</name>
<affiliation wicri:level="1"><nlm:affiliation>Danish Research Centre for Magnetic Resonance, Copenhagen University Hospital Hvidovre, Hvidovre, Denmark. kassuba@princeton.edu</nlm:affiliation>
<country xml:lang="fr">Danemark</country>
<wicri:regionArea>Danish Research Centre for Magnetic Resonance, Copenhagen University Hospital Hvidovre, Hvidovre</wicri:regionArea>
<wicri:noRegion>Hvidovre</wicri:noRegion>
</affiliation>
</author>
<author><name sortKey="Klinge, Corinna" sort="Klinge, Corinna" uniqKey="Klinge C" first="Corinna" last="Klinge">Corinna Klinge</name>
</author>
<author><name sortKey="Holig, Cordula" sort="Holig, Cordula" uniqKey="Holig C" first="Cordula" last="Hölig">Cordula Hölig</name>
</author>
<author><name sortKey="Roder, Brigitte" sort="Roder, Brigitte" uniqKey="Roder B" first="Brigitte" last="Röder">Brigitte Röder</name>
</author>
<author><name sortKey="Siebner, Hartwig R" sort="Siebner, Hartwig R" uniqKey="Siebner H" first="Hartwig R" last="Siebner">Hartwig R. Siebner</name>
</author>
</titleStmt>
<publicationStmt><idno type="wicri:source">PubMed</idno>
<date when="2013">2013</date>
<idno type="doi">10.1016/j.neuroimage.2012.09.054</idno>
<idno type="RBID">pubmed:23032487</idno>
<idno type="pmid">23032487</idno>
<idno type="wicri:Area/PubMed/Corpus">000B25</idno>
<idno type="wicri:Area/PubMed/Curation">000B25</idno>
<idno type="wicri:Area/PubMed/Checkpoint">000745</idno>
<idno type="wicri:Area/Ncbi/Merge">002268</idno>
<idno type="wicri:Area/Ncbi/Curation">002268</idno>
<idno type="wicri:Area/Ncbi/Checkpoint">002268</idno>
<idno type="wicri:Area/Main/Merge">001715</idno>
<idno type="wicri:Area/Main/Curation">001704</idno>
<idno type="wicri:Area/Main/Exploration">001704</idno>
</publicationStmt>
<sourceDesc><biblStruct><analytic><title xml:lang="en">Vision holds a greater share in visuo-haptic object recognition than touch.</title>
<author><name sortKey="Kassuba, Tanja" sort="Kassuba, Tanja" uniqKey="Kassuba T" first="Tanja" last="Kassuba">Tanja Kassuba</name>
<affiliation wicri:level="1"><nlm:affiliation>Danish Research Centre for Magnetic Resonance, Copenhagen University Hospital Hvidovre, Hvidovre, Denmark. kassuba@princeton.edu</nlm:affiliation>
<country xml:lang="fr">Danemark</country>
<wicri:regionArea>Danish Research Centre for Magnetic Resonance, Copenhagen University Hospital Hvidovre, Hvidovre</wicri:regionArea>
<wicri:noRegion>Hvidovre</wicri:noRegion>
</affiliation>
</author>
<author><name sortKey="Klinge, Corinna" sort="Klinge, Corinna" uniqKey="Klinge C" first="Corinna" last="Klinge">Corinna Klinge</name>
</author>
<author><name sortKey="Holig, Cordula" sort="Holig, Cordula" uniqKey="Holig C" first="Cordula" last="Hölig">Cordula Hölig</name>
</author>
<author><name sortKey="Roder, Brigitte" sort="Roder, Brigitte" uniqKey="Roder B" first="Brigitte" last="Röder">Brigitte Röder</name>
</author>
<author><name sortKey="Siebner, Hartwig R" sort="Siebner, Hartwig R" uniqKey="Siebner H" first="Hartwig R" last="Siebner">Hartwig R. Siebner</name>
</author>
</analytic>
<series><title level="j">NeuroImage</title>
<idno type="eISSN">1095-9572</idno>
<imprint><date when="2013" type="published">2013</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc><textClass><keywords scheme="KwdEn" xml:lang="en"><term>Adult</term>
<term>Brain (physiology)</term>
<term>Brain Mapping</term>
<term>Female</term>
<term>Humans</term>
<term>Magnetic Resonance Imaging</term>
<term>Male</term>
<term>Recognition (Psychology) (physiology)</term>
<term>Touch Perception (physiology)</term>
<term>Visual Perception (physiology)</term>
<term>Young Adult</term>
</keywords>
<keywords scheme="MESH" qualifier="physiology" xml:lang="en"><term>Brain</term>
<term>Recognition (Psychology)</term>
<term>Touch Perception</term>
<term>Visual Perception</term>
</keywords>
<keywords scheme="MESH" xml:lang="en"><term>Adult</term>
<term>Brain Mapping</term>
<term>Female</term>
<term>Humans</term>
<term>Magnetic Resonance Imaging</term>
<term>Male</term>
<term>Young Adult</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front><div type="abstract" xml:lang="en">The integration of visual and haptic input can facilitate object recognition. Yet, vision might dominate visuo-haptic interactions as it is more effective than haptics in processing several object features in parallel and recognizing objects outside of reaching space. The maximum likelihood approach of multisensory integration would predict that haptics as the less efficient sense for object recognition gains more from integrating additional visual information than vice versa. To test for asymmetries between vision and touch in visuo-haptic interactions, we measured regional changes in brain activity using functional magnetic resonance imaging while healthy individuals performed a delayed-match-to-sample task. We manipulated identity matching of sample and target objects: We hypothesized that only coherent visual and haptic object features would activate unified object representations. The bilateral object-specific lateral occipital cortex, fusiform gyrus, and intraparietal sulcus showed increased activation to crossmodal compared to unimodal matching but only for congruent object pairs. Critically, the visuo-haptic interaction effects in these regions depended on the sensory modality which processed the target object, being more pronounced for haptic than visual targets. This preferential response of visuo-haptic regions indicates a modality-specific asymmetry in crossmodal matching of visual and haptic object features, suggesting a functional primacy of vision over touch in visuo-haptic object recognition.</div>
</front>
</TEI>
<affiliations><list><country><li>Danemark</li>
</country>
</list>
<tree><noCountry><name sortKey="Holig, Cordula" sort="Holig, Cordula" uniqKey="Holig C" first="Cordula" last="Hölig">Cordula Hölig</name>
<name sortKey="Klinge, Corinna" sort="Klinge, Corinna" uniqKey="Klinge C" first="Corinna" last="Klinge">Corinna Klinge</name>
<name sortKey="Roder, Brigitte" sort="Roder, Brigitte" uniqKey="Roder B" first="Brigitte" last="Röder">Brigitte Röder</name>
<name sortKey="Siebner, Hartwig R" sort="Siebner, Hartwig R" uniqKey="Siebner H" first="Hartwig R" last="Siebner">Hartwig R. Siebner</name>
</noCountry>
<country name="Danemark"><noRegion><name sortKey="Kassuba, Tanja" sort="Kassuba, Tanja" uniqKey="Kassuba T" first="Tanja" last="Kassuba">Tanja Kassuba</name>
</noRegion>
</country>
</tree>
</affiliations>
</record>
To manipulate this document under Unix (Dilib)
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Main/Exploration
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 001704 | SxmlIndent | more
Or
HfdSelect -h $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd -nk 001704 | SxmlIndent | more
To link to this page from the Wicri network
{{Explor lien |wiki= Ticri/CIDE |area= HapticV1 |flux= Main |étape= Exploration |type= RBID |clé= pubmed:23032487 |texte= Vision holds a greater share in visuo-haptic object recognition than touch. }}
To generate wiki pages
HfdIndexSelect -h $EXPLOR_AREA/Data/Main/Exploration/RBID.i -Sk "pubmed:23032487" \
  | HfdSelect -Kh $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd \
  | NlmPubMed2Wicri -a HapticV1
This area was generated with Dilib version V0.6.23.