Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Inverse Effectiveness and Multisensory Interactions in Visual Event-Related Potentials with Audiovisual Speech

Internal identifier: 001F01 (Ncbi/Checkpoint); previous: 001F00; next: 001F02


Authors: Ryan A. Stevenson; Maxim Bushmakin; Sunah Kim; Mark T. Wallace; Aina Puce; Thomas W. James

Source :

RBID: PMC:3789520

Abstract

In recent years, it has become evident that neural responses previously considered to be unisensory can be modulated by sensory input from other modalities. In this regard, visual neural activity elicited to viewing a face is strongly influenced by concurrent incoming auditory information, particularly speech. Here, we applied an additive-factors paradigm aimed at quantifying the impact that auditory speech has on visual event-related potentials (ERPs) elicited to visual speech. These multisensory interactions were measured across parametrically varied stimulus salience, quantified in terms of signal to noise, to provide novel insights into the neural mechanisms of audiovisual speech perception. First, we measured a monotonic increase of the amplitude of the visual P1-N1-P2 ERP complex during a spoken-word recognition task with increases in stimulus salience. ERP component amplitudes varied directly with stimulus salience for visual, audiovisual, and summed unisensory recordings. Second, we measured changes in multisensory gain across salience levels. During audiovisual speech, the P1 and P1-N1 components exhibited less multisensory gain relative to the summed unisensory components with reduced salience, while N1-P2 amplitude exhibited greater multisensory gain as salience was reduced, consistent with the principle of inverse effectiveness. The amplitude interactions were correlated with behavioral measures of multisensory gain across salience levels as measured by response times, suggesting that change in multisensory gain associated with unisensory salience modulations reflects an increased efficiency of visual speech processing.


URL:
DOI: 10.1007/s10548-012-0220-7
PubMed: 22367585
PubMed Central: 3789520


Affiliations:


Links to previous steps (curation, corpus, ...)


Links to Exploration step

PMC:3789520

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Inverse Effectiveness and Multisensory Interactions in Visual Event-Related Potentials with Audiovisual Speech</title>
<author>
<name sortKey="Stevenson, Ryan A" sort="Stevenson, Ryan A" uniqKey="Stevenson R" first="Ryan A." last="Stevenson">Ryan A. Stevenson</name>
</author>
<author>
<name sortKey="Bushmakin, Maxim" sort="Bushmakin, Maxim" uniqKey="Bushmakin M" first="Maxim" last="Bushmakin">Maxim Bushmakin</name>
</author>
<author>
<name sortKey="Kim, Sunah" sort="Kim, Sunah" uniqKey="Kim S" first="Sunah" last="Kim">Sunah Kim</name>
</author>
<author>
<name sortKey="Wallace, Mark T" sort="Wallace, Mark T" uniqKey="Wallace M" first="Mark T." last="Wallace">Mark T. Wallace</name>
</author>
<author>
<name sortKey="Puce, Aina" sort="Puce, Aina" uniqKey="Puce A" first="Aina" last="Puce">Aina Puce</name>
</author>
<author>
<name sortKey="James, Thomas W" sort="James, Thomas W" uniqKey="James T" first="Thomas W." last="James">Thomas W. James</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">22367585</idno>
<idno type="pmc">3789520</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3789520</idno>
<idno type="RBID">PMC:3789520</idno>
<idno type="doi">10.1007/s10548-012-0220-7</idno>
<date when="2012">2012</date>
<idno type="wicri:Area/Pmc/Corpus">001A40</idno>
<idno type="wicri:Area/Pmc/Curation">001A40</idno>
<idno type="wicri:Area/Pmc/Checkpoint">001696</idno>
<idno type="wicri:Area/Ncbi/Merge">001F01</idno>
<idno type="wicri:Area/Ncbi/Curation">001F01</idno>
<idno type="wicri:Area/Ncbi/Checkpoint">001F01</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Inverse Effectiveness and Multisensory Interactions in Visual Event-Related Potentials with Audiovisual Speech</title>
<author>
<name sortKey="Stevenson, Ryan A" sort="Stevenson, Ryan A" uniqKey="Stevenson R" first="Ryan A." last="Stevenson">Ryan A. Stevenson</name>
</author>
<author>
<name sortKey="Bushmakin, Maxim" sort="Bushmakin, Maxim" uniqKey="Bushmakin M" first="Maxim" last="Bushmakin">Maxim Bushmakin</name>
</author>
<author>
<name sortKey="Kim, Sunah" sort="Kim, Sunah" uniqKey="Kim S" first="Sunah" last="Kim">Sunah Kim</name>
</author>
<author>
<name sortKey="Wallace, Mark T" sort="Wallace, Mark T" uniqKey="Wallace M" first="Mark T." last="Wallace">Mark T. Wallace</name>
</author>
<author>
<name sortKey="Puce, Aina" sort="Puce, Aina" uniqKey="Puce A" first="Aina" last="Puce">Aina Puce</name>
</author>
<author>
<name sortKey="James, Thomas W" sort="James, Thomas W" uniqKey="James T" first="Thomas W." last="James">Thomas W. James</name>
</author>
</analytic>
<series>
<title level="j">Brain topography</title>
<idno type="ISSN">0896-0267</idno>
<idno type="eISSN">1573-6792</idno>
<imprint>
<date when="2012">2012</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p id="P1">In recent years, it has become evident that neural responses previously considered to be unisensory can be modulated by sensory input from other modalities. In this regard, visual neural activity elicited to viewing a face is strongly influenced by concurrent incoming auditory information, particularly speech. Here, we applied an additive-factors paradigm aimed at quantifying the impact that auditory speech has on visual event-related potentials (ERPs) elicited to visual speech. These multisensory interactions were measured across parametrically varied stimulus salience, quantified in terms of signal to noise, to provide novel insights into the neural mechanisms of audiovisual speech perception. First, we measured a monotonic increase of the amplitude of the visual P1-N1-P2 ERP complex during a spoken-word recognition task with increases in stimulus salience. ERP component amplitudes varied directly with stimulus salience for visual, audiovisual, and summed unisensory recordings. Second, we measured changes in multisensory gain across salience levels. During audiovisual speech, the P1 and P1-N1 components exhibited less multisensory gain relative to the summed unisensory components with reduced salience, while N1-P2 amplitude exhibited greater multisensory gain as salience was reduced, consistent with the principle of inverse effectiveness. The amplitude interactions were correlated with behavioral measures of multisensory gain across salience levels as measured by response times, suggesting that change in multisensory gain associated with unisensory salience modulations reflects an increased efficiency of visual speech processing.</p>
</div>
</front>
</TEI>
<affiliations>
<list></list>
<tree>
<noCountry>
<name sortKey="Bushmakin, Maxim" sort="Bushmakin, Maxim" uniqKey="Bushmakin M" first="Maxim" last="Bushmakin">Maxim Bushmakin</name>
<name sortKey="James, Thomas W" sort="James, Thomas W" uniqKey="James T" first="Thomas W." last="James">Thomas W. James</name>
<name sortKey="Kim, Sunah" sort="Kim, Sunah" uniqKey="Kim S" first="Sunah" last="Kim">Sunah Kim</name>
<name sortKey="Puce, Aina" sort="Puce, Aina" uniqKey="Puce A" first="Aina" last="Puce">Aina Puce</name>
<name sortKey="Stevenson, Ryan A" sort="Stevenson, Ryan A" uniqKey="Stevenson R" first="Ryan A." last="Stevenson">Ryan A. Stevenson</name>
<name sortKey="Wallace, Mark T" sort="Wallace, Mark T" uniqKey="Wallace M" first="Mark T." last="Wallace">Mark T. Wallace</name>
</noCountry>
</tree>
</affiliations>
</record>

To work with this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Ncbi/Checkpoint
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 001F01 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Ncbi/Checkpoint/biblio.hfd -nk 001F01 | SxmlIndent | more
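The `HfdSelect` pipeline above prints the TEI record as indented XML. As a minimal, self-contained sketch of what one might do with that output, the snippet below extracts the DOI and PMC identifiers from a record using only POSIX tools (`sed`); it does not require Dilib itself, and the inlined sample record stands in for the real `HfdSelect ... | SxmlIndent` output:

```shell
#!/bin/sh
# Extract the DOI and PMC id from a Dilib-style TEI record using only
# POSIX sed. A minimal sample record is inlined here; in practice the
# input would come from the HfdSelect pipeline shown above.
cat > record.xml <<'EOF'
<record>
  <idno type="doi">10.1007/s10548-012-0220-7</idno>
  <idno type="pmc">3789520</idno>
</record>
EOF

# -n with the p flag prints only lines where the substitution matched.
doi=$(sed -n 's/.*<idno type="doi">\([^<]*\)<\/idno>.*/\1/p' record.xml)
pmc=$(sed -n 's/.*<idno type="pmc">\([^<]*\)<\/idno>.*/\1/p' record.xml)
echo "DOI: $doi"
echo "PMC: $pmc"
```

Note that this line-oriented extraction only works because Dilib emits each `<idno>` on its own line after `SxmlIndent`; for arbitrary XML a real parser would be needed.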

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Ncbi
   |étape=   Checkpoint
   |type=    RBID
   |clé=     PMC:3789520
   |texte=   Inverse Effectiveness and Multisensory Interactions in Visual Event-Related Potentials with Audiovisual Speech
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Ncbi/Checkpoint/RBID.i   -Sk "pubmed:22367585" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Ncbi/Checkpoint/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024