Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Resolving multisensory conflict: a strategy for balancing the costs and benefits of audio-visual integration

Internal identifier: 005D57 (Main/Merge); previous: 005D56; next: 005D58


Authors: Neil W. Roach [United Kingdom]; James Heron [United Kingdom]; Paul V. Mcgraw [United Kingdom]

Source:

RBID : PMC:1635528

Abstract

In order to maintain a coherent, unified percept of the external environment, the brain must continuously combine information encoded by our different sensory systems. Contemporary models suggest that multisensory integration produces a weighted average of sensory estimates, where the contribution of each system to the ultimate multisensory percept is governed by the relative reliability of the information it provides (maximum-likelihood estimation). In the present study, we investigate interactions between auditory and visual rate perception, where observers are required to make judgments in one modality while ignoring conflicting rate information presented in the other. We show a gradual transition between partial cue integration and complete cue segregation with increasing inter-modal discrepancy that is inconsistent with mandatory implementation of maximum-likelihood estimation. To explain these findings, we implement a simple Bayesian model of integration that is also able to predict observer performance with novel stimuli. The model assumes that the brain takes into account prior knowledge about the correspondence between auditory and visual rate signals, when determining the degree of integration to implement. This provides a strategy for balancing the benefits accrued by integrating sensory estimates arising from a common source, against the costs of conflating information relating to independent objects or events.
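The weighted-average rule the abstract refers to (maximum-likelihood estimation) can be sketched in a few lines: each modality's estimate is weighted by its reliability, i.e. the inverse of its variance. This is a minimal illustrative sketch of that general rule, not code or data from the paper; the function name and the example rates and noise levels below are assumptions chosen for illustration.

```python
def mle_combine(estimates, sigmas):
    """Combine unimodal estimates using inverse-variance (reliability) weights."""
    reliabilities = [1.0 / s**2 for s in sigmas]
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]
    # Weighted average: more reliable cues contribute more.
    combined = sum(w * e for w, e in zip(weights, estimates))
    # The combined estimate is less variable than either cue alone.
    combined_sigma = (1.0 / total) ** 0.5
    return combined, combined_sigma

# Illustrative values: auditory rate 10 Hz (sigma 1.0), visual rate 12 Hz (sigma 2.0).
rate, sigma = mle_combine([10.0, 12.0], [1.0, 2.0])
# The auditory cue is four times more reliable, so the combined
# estimate (10.4 Hz) falls much closer to it.
```

Note that this mandatory averaging always fuses the cues regardless of how discrepant they are; the study's finding of a transition to cue segregation at large inter-modal conflicts is exactly what this simple rule fails to capture, motivating the Bayesian model with a prior on audio-visual correspondence.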


Url:
DOI: 10.1098/rspb.2006.3578
PubMed: 16901835
PubMed Central: 1635528

Links to previous steps (curation, corpus, ...)


Links to Exploration step

PMC:1635528

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Resolving multisensory conflict: a strategy for balancing the costs and benefits of audio-visual integration</title>
<author>
<name sortKey="Roach, Neil W" sort="Roach, Neil W" uniqKey="Roach N" first="Neil W" last="Roach">Neil W. Roach</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<institution>Visual Neuroscience Group, School of Psychology, The University of Nottingham</institution>
<addr-line>Nottingham NG7 2RD, UK</addr-line>
</nlm:aff>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea>Nottingham NG7 2RD</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Heron, James" sort="Heron, James" uniqKey="Heron J" first="James" last="Heron">James Heron</name>
<affiliation wicri:level="1">
<nlm:aff id="aff2">
<institution>Department of Optometry, University of Bradford</institution>
<addr-line>Bradford BD7 1DP, UK</addr-line>
</nlm:aff>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea>Bradford BD7 1DP</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Mcgraw, Paul V" sort="Mcgraw, Paul V" uniqKey="Mcgraw P" first="Paul V" last="Mcgraw">Paul V. Mcgraw</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<institution>Visual Neuroscience Group, School of Psychology, The University of Nottingham</institution>
<addr-line>Nottingham NG7 2RD, UK</addr-line>
</nlm:aff>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea>Nottingham NG7 2RD</wicri:regionArea>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">16901835</idno>
<idno type="pmc">1635528</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1635528</idno>
<idno type="RBID">PMC:1635528</idno>
<idno type="doi">10.1098/rspb.2006.3578</idno>
<date when="2006">2006</date>
<idno type="wicri:Area/Pmc/Corpus">002456</idno>
<idno type="wicri:Area/Pmc/Curation">002456</idno>
<idno type="wicri:Area/Pmc/Checkpoint">002519</idno>
<idno type="wicri:Area/Ncbi/Merge">000965</idno>
<idno type="wicri:Area/Ncbi/Curation">000965</idno>
<idno type="wicri:Area/Ncbi/Checkpoint">000965</idno>
<idno type="wicri:doubleKey">0962-8452:2006:Roach N:resolving:multisensory:conflict</idno>
<idno type="wicri:Area/Main/Merge">005D57</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Resolving multisensory conflict: a strategy for balancing the costs and benefits of audio-visual integration</title>
<author>
<name sortKey="Roach, Neil W" sort="Roach, Neil W" uniqKey="Roach N" first="Neil W" last="Roach">Neil W. Roach</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<institution>Visual Neuroscience Group, School of Psychology, The University of Nottingham</institution>
<addr-line>Nottingham NG7 2RD, UK</addr-line>
</nlm:aff>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea>Nottingham NG7 2RD</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Heron, James" sort="Heron, James" uniqKey="Heron J" first="James" last="Heron">James Heron</name>
<affiliation wicri:level="1">
<nlm:aff id="aff2">
<institution>Department of Optometry, University of Bradford</institution>
<addr-line>Bradford BD7 1DP, UK</addr-line>
</nlm:aff>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea>Bradford BD7 1DP</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Mcgraw, Paul V" sort="Mcgraw, Paul V" uniqKey="Mcgraw P" first="Paul V" last="Mcgraw">Paul V. Mcgraw</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<institution>Visual Neuroscience Group, School of Psychology, The University of Nottingham</institution>
<addr-line>Nottingham NG7 2RD, UK</addr-line>
</nlm:aff>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea>Nottingham NG7 2RD</wicri:regionArea>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Proceedings of the Royal Society B: Biological Sciences</title>
<idno type="ISSN">0962-8452</idno>
<idno type="eISSN">1471-2954</idno>
<imprint>
<date when="2006">2006</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>In order to maintain a coherent, unified percept of the external environment, the brain must continuously combine information encoded by our different sensory systems. Contemporary models suggest that multisensory integration produces a weighted average of sensory estimates, where the contribution of each system to the ultimate multisensory percept is governed by the relative reliability of the information it provides (maximum-likelihood estimation). In the present study, we investigate interactions between auditory and visual rate perception, where observers are required to make judgments in one modality while ignoring conflicting rate information presented in the other. We show a gradual transition between partial cue integration and complete cue segregation with increasing inter-modal discrepancy that is inconsistent with mandatory implementation of maximum-likelihood estimation. To explain these findings, we implement a simple Bayesian model of integration that is also able to predict observer performance with novel stimuli. The model assumes that the brain takes into account prior knowledge about the correspondence between auditory and visual rate signals, when determining the degree of integration to implement. This provides a strategy for balancing the benefits accrued by integrating sensory estimates arising from a common source, against the costs of conflating information relating to independent objects or events.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
</listBibl>
</div1>
</back>
</TEI>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Main/Merge
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 005D57 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Main/Merge/biblio.hfd -nk 005D57 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Main
   |étape=   Merge
   |type=    RBID
   |clé=     PMC:1635528
   |texte=   Resolving multisensory conflict: a strategy for balancing the costs and benefits of audio-visual integration
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Main/Merge/RBID.i   -Sk "pubmed:16901835" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Main/Merge/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024