Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions
Internal identifier: 000129 (PascalFrancis/Corpus)
Authors: Avril Treille; Camille Cordeboeuf; Coriandre Vilain; Marc Sato
Source:
- Neuropsychologia [0028-3932]; 2014.
Abstract
Speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker's face. In the present electro-encephalographic study, early cross-modal interactions were investigated by comparing auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in dyadic interactions between a listener and a speaker. In line with previous studies, early auditory evoked responses were attenuated and speeded up during audio-visual compared to auditory speech perception. Crucially, shortened latencies of early auditory evoked potentials were also observed during audio-haptic speech perception. Altogether, these results suggest early bimodal interactions during live face-to-face and hand-to-face speech perception in dyadic interactions.
Record in standard format (ISO 2709)
See the documentation on the Inist Standard format.
Inist format (server)
| Field | Value |
|---|---|
| NO | FRANCIS 14-0138017 INIST |
| ET | Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions |
| AU | TREILLE (Avril); CORDEBOEUF (Camille); VILAIN (Coriandre); SATO (Marc) |
| AF | GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université/Grenoble/France (1 aut., 2 aut., 3 aut., 4 aut.) |
| DT | Publication en série; Niveau analytique |
| SO | Neuropsychologia; ISSN 0028-3932; Coden NUPSA6; Royaume-Uni; Da. 2014; Vol. 57; Pp. 71-77; Bibl. 1/2 p. |
| LA | Anglais |
| EA | Speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker's face. In the present electro-encephalographic study, early cross-modal interactions were investigated by comparing auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in dyadic interactions between a listener and a speaker. In line with previous studies, early auditory evoked responses were attenuated and speeded up during audio-visual compared to auditory speech perception. Crucially, shortened latencies of early auditory evoked potentials were also observed during audio-haptic speech perception. Altogether, these results suggest early bimodal interactions during live face-to-face and hand-to-face speech perception in dyadic interactions. |
| CC | 770B09C |
| FD | Sensibilité tactile; Vision; Etude expérimentale; Audition; Traitement information; Parole; Relation dyadique; Perception verbale; Perception intermodale; Electroencéphalographie; Homme |
| FG | Interaction sociale; Cognition; Langage; Electrophysiologie |
| ED | Tactile sensitivity; Vision; Experimental study; Hearing; Information processing; Speech; Dyadic relation; Verbal perception; Intermodal perception; Electroencephalography; Human |
| EG | Social interaction; Cognition; Language; Electrophysiology |
| SD | Sensibilidad tactil; Visión; Estudio experimental; Audición; Procesamiento información; Habla; Relación diádica; Percepción verbal; Percepción intermodal; Electroencefalografía; Hombre |
| LO | INIST-11143.354000503266940080 |
| ID | 14-0138017 |
Links to Exploration step
Francis:14-0138017
The document in XML format
<record><TEI><teiHeader><fileDesc><titleStmt><title xml:lang="en" level="a">Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions</title>
<author><name sortKey="Treille, Avril" sort="Treille, Avril" uniqKey="Treille A" first="Avril" last="Treille">Avril Treille</name>
<affiliation><inist:fA14 i1="01"><s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Cordeboeuf, Camille" sort="Cordeboeuf, Camille" uniqKey="Cordeboeuf C" first="Camille" last="Cordeboeuf">Camille Cordeboeuf</name>
<affiliation><inist:fA14 i1="01"><s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Vilain, Coriandre" sort="Vilain, Coriandre" uniqKey="Vilain C" first="Coriandre" last="Vilain">Coriandre Vilain</name>
<affiliation><inist:fA14 i1="01"><s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Sato, Marc" sort="Sato, Marc" uniqKey="Sato M" first="Marc" last="Sato">Marc Sato</name>
<affiliation><inist:fA14 i1="01"><s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</titleStmt>
<publicationStmt><idno type="wicri:source">INIST</idno>
<idno type="inist">14-0138017</idno>
<date when="2014">2014</date>
<idno type="stanalyst">FRANCIS 14-0138017 INIST</idno>
<idno type="RBID">Francis:14-0138017</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000129</idno>
</publicationStmt>
<sourceDesc><biblStruct><analytic><title xml:lang="en" level="a">Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions</title>
<author><name sortKey="Treille, Avril" sort="Treille, Avril" uniqKey="Treille A" first="Avril" last="Treille">Avril Treille</name>
<affiliation><inist:fA14 i1="01"><s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Cordeboeuf, Camille" sort="Cordeboeuf, Camille" uniqKey="Cordeboeuf C" first="Camille" last="Cordeboeuf">Camille Cordeboeuf</name>
<affiliation><inist:fA14 i1="01"><s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Vilain, Coriandre" sort="Vilain, Coriandre" uniqKey="Vilain C" first="Coriandre" last="Vilain">Coriandre Vilain</name>
<affiliation><inist:fA14 i1="01"><s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Sato, Marc" sort="Sato, Marc" uniqKey="Sato M" first="Marc" last="Sato">Marc Sato</name>
<affiliation><inist:fA14 i1="01"><s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</analytic>
<series><title level="j" type="main">Neuropsychologia</title>
<title level="j" type="abbreviated">Neuropsychologia</title>
<idno type="ISSN">0028-3932</idno>
<imprint><date when="2014">2014</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt><title level="j" type="main">Neuropsychologia</title>
<title level="j" type="abbreviated">Neuropsychologia</title>
<idno type="ISSN">0028-3932</idno>
</seriesStmt>
</fileDesc>
<profileDesc><textClass><keywords scheme="KwdEn" xml:lang="en"><term>Dyadic relation</term>
<term>Electroencephalography</term>
<term>Experimental study</term>
<term>Hearing</term>
<term>Human</term>
<term>Information processing</term>
<term>Intermodal perception</term>
<term>Speech</term>
<term>Tactile sensitivity</term>
<term>Verbal perception</term>
<term>Vision</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr"><term>Sensibilité tactile</term>
<term>Vision</term>
<term>Etude expérimentale</term>
<term>Audition</term>
<term>Traitement information</term>
<term>Parole</term>
<term>Relation dyadique</term>
<term>Perception verbale</term>
<term>Perception intermodale</term>
<term>Electroencéphalographie</term>
<term>Homme</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front><div type="abstract" xml:lang="en">Speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker's face. In the present electro-encephalographic study, early cross-modal interactions were investigated by comparing auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in dyadic interactions between a listener and a speaker. In line with previous studies, early auditory evoked responses were attenuated and speeded up during audio-visual compared to auditory speech perception. Crucially, shortened latencies of early auditory evoked potentials were also observed during audio-haptic speech perception. Altogether, these results suggest early bimodal interactions during live face-to-face and hand-to-face speech perception in dyadic interactions.</div>
</front>
</TEI>
<inist><standard h6="B"><pA><fA01 i1="01" i2="1"><s0>0028-3932</s0>
</fA01>
<fA02 i1="01"><s0>NUPSA6</s0>
</fA02>
<fA03 i2="1"><s0>Neuropsychologia</s0>
</fA03>
<fA05><s2>57</s2>
</fA05>
<fA08 i1="01" i2="1" l="ENG"><s1>Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions</s1>
</fA08>
<fA11 i1="01" i2="1"><s1>TREILLE (Avril)</s1>
</fA11>
<fA11 i1="02" i2="1"><s1>CORDEBOEUF (Camille)</s1>
</fA11>
<fA11 i1="03" i2="1"><s1>VILAIN (Coriandre)</s1>
</fA11>
<fA11 i1="04" i2="1"><s1>SATO (Marc)</s1>
</fA11>
<fA14 i1="01"><s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</fA14>
<fA20><s1>71-77</s1>
</fA20>
<fA21><s1>2014</s1>
</fA21>
<fA23 i1="01"><s0>ENG</s0>
</fA23>
<fA43 i1="01"><s1>INIST</s1>
<s2>11143</s2>
<s5>354000503266940080</s5>
</fA43>
<fA44><s0>0000</s0>
<s1>© 2014 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45><s0>1/2 p.</s0>
</fA45>
<fA47 i1="01" i2="1"><s0>14-0138017</s0>
</fA47>
<fA60><s1>P</s1>
</fA60>
<fA61><s0>A</s0>
</fA61>
<fA64 i1="01" i2="1"><s0>Neuropsychologia</s0>
</fA64>
<fA66 i1="01"><s0>GBR</s0>
</fA66>
<fC01 i1="01" l="ENG"><s0>Speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker's face. In the present electro-encephalographic study, early cross-modal interactions were investigated by comparing auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in dyadic interactions between a listener and a speaker. In line with previous studies, early auditory evoked responses were attenuated and speeded up during audio-visual compared to auditory speech perception. Crucially, shortened latencies of early auditory evoked potentials were also observed during audio-haptic speech perception. Altogether, these results suggest early bimodal interactions during live face-to-face and hand-to-face speech perception in dyadic interactions.</s0>
</fC01>
<fC02 i1="01" i2="X"><s0>770B09C</s0>
<s1>II</s1>
</fC02>
<fC03 i1="01" i2="X" l="FRE"><s0>Sensibilité tactile</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG"><s0>Tactile sensitivity</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA"><s0>Sensibilidad tactil</s0>
<s5>01</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE"><s0>Vision</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG"><s0>Vision</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA"><s0>Visión</s0>
<s5>02</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE"><s0>Etude expérimentale</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG"><s0>Experimental study</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA"><s0>Estudio experimental</s0>
<s5>03</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE"><s0>Audition</s0>
<s5>05</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG"><s0>Hearing</s0>
<s5>05</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA"><s0>Audición</s0>
<s5>05</s5>
</fC03>
<fC03 i1="05" i2="X" l="FRE"><s0>Traitement information</s0>
<s5>06</s5>
</fC03>
<fC03 i1="05" i2="X" l="ENG"><s0>Information processing</s0>
<s5>06</s5>
</fC03>
<fC03 i1="05" i2="X" l="SPA"><s0>Procesamiento información</s0>
<s5>06</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE"><s0>Parole</s0>
<s5>07</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG"><s0>Speech</s0>
<s5>07</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA"><s0>Habla</s0>
<s5>07</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE"><s0>Relation dyadique</s0>
<s5>08</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG"><s0>Dyadic relation</s0>
<s5>08</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA"><s0>Relación diádica</s0>
<s5>08</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE"><s0>Perception verbale</s0>
<s5>09</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG"><s0>Verbal perception</s0>
<s5>09</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA"><s0>Percepción verbal</s0>
<s5>09</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE"><s0>Perception intermodale</s0>
<s5>10</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG"><s0>Intermodal perception</s0>
<s5>10</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA"><s0>Percepción intermodal</s0>
<s5>10</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE"><s0>Electroencéphalographie</s0>
<s5>11</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG"><s0>Electroencephalography</s0>
<s5>11</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA"><s0>Electroencefalografía</s0>
<s5>11</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE"><s0>Homme</s0>
<s5>18</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG"><s0>Human</s0>
<s5>18</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA"><s0>Hombre</s0>
<s5>18</s5>
</fC03>
<fC07 i1="01" i2="X" l="FRE"><s0>Interaction sociale</s0>
<s5>37</s5>
</fC07>
<fC07 i1="01" i2="X" l="ENG"><s0>Social interaction</s0>
<s5>37</s5>
</fC07>
<fC07 i1="01" i2="X" l="SPA"><s0>Interacción social</s0>
<s5>37</s5>
</fC07>
<fC07 i1="02" i2="X" l="FRE"><s0>Cognition</s0>
<s5>38</s5>
</fC07>
<fC07 i1="02" i2="X" l="ENG"><s0>Cognition</s0>
<s5>38</s5>
</fC07>
<fC07 i1="02" i2="X" l="SPA"><s0>Cognición</s0>
<s5>38</s5>
</fC07>
<fC07 i1="03" i2="X" l="FRE"><s0>Langage</s0>
<s5>39</s5>
</fC07>
<fC07 i1="03" i2="X" l="ENG"><s0>Language</s0>
<s5>39</s5>
</fC07>
<fC07 i1="03" i2="X" l="SPA"><s0>Lenguaje</s0>
<s5>39</s5>
</fC07>
<fC07 i1="04" i2="X" l="FRE"><s0>Electrophysiologie</s0>
<s5>40</s5>
</fC07>
<fC07 i1="04" i2="X" l="ENG"><s0>Electrophysiology</s0>
<s5>40</s5>
</fC07>
<fC07 i1="04" i2="X" l="SPA"><s0>Electrofisiología</s0>
<s5>40</s5>
</fC07>
<fN21><s1>174</s1>
</fN21>
</pA>
</standard>
<server><NO>FRANCIS 14-0138017 INIST</NO>
<ET>Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions</ET>
<AU>TREILLE (Avril); CORDEBOEUF (Camille); VILAIN (Coriandre); SATO (Marc)</AU>
<AF>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université/Grenoble/France (1 aut., 2 aut., 3 aut., 4 aut.)</AF>
<DT>Publication en série; Niveau analytique</DT>
<SO>Neuropsychologia; ISSN 0028-3932; Coden NUPSA6; Royaume-Uni; Da. 2014; Vol. 57; Pp. 71-77; Bibl. 1/2 p.</SO>
<LA>Anglais</LA>
<EA>Speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker's face. In the present electro-encephalographic study, early cross-modal interactions were investigated by comparing auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in dyadic interactions between a listener and a speaker. In line with previous studies, early auditory evoked responses were attenuated and speeded up during audio-visual compared to auditory speech perception. Crucially, shortened latencies of early auditory evoked potentials were also observed during audio-haptic speech perception. Altogether, these results suggest early bimodal interactions during live face-to-face and hand-to-face speech perception in dyadic interactions.</EA>
<CC>770B09C</CC>
<FD>Sensibilité tactile; Vision; Etude expérimentale; Audition; Traitement information; Parole; Relation dyadique; Perception verbale; Perception intermodale; Electroencéphalographie; Homme</FD>
<FG>Interaction sociale; Cognition; Langage; Electrophysiologie</FG>
<ED>Tactile sensitivity; Vision; Experimental study; Hearing; Information processing; Speech; Dyadic relation; Verbal perception; Intermodal perception; Electroencephalography; Human</ED>
<EG>Social interaction; Cognition; Language; Electrophysiology</EG>
<SD>Sensibilidad tactil; Visión; Estudio experimental; Audición; Procesamiento información; Habla; Relación diádica; Percepción verbal; Percepción intermodal; Electroencefalografía; Hombre</SD>
<LO>INIST-11143.354000503266940080</LO>
<ID>14-0138017</ID>
</server>
</inist>
</record>
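The TEI header above stores the record's title and authors in a regular `titleStmt` structure, so the bibliographic fields can be pulled out programmatically. The following is a minimal sketch using only the Python standard library; the embedded snippet is a reduced copy of the record's `teiHeader` (two of the four authors shown), not the full record, and the extraction paths assume only the `titleStmt`/`title`/`author`/`name` nesting visible above.

```python
# Minimal sketch: extract the title and author names from a TEI-style
# record like the one above, using only the standard library.
import xml.etree.ElementTree as ET

# Reduced copy of the record's teiHeader (two authors kept for brevity).
RECORD = """<record><TEI><teiHeader><fileDesc><titleStmt>
<title xml:lang="en" level="a">Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions</title>
<author><name first="Avril" last="Treille">Avril Treille</name></author>
<author><name first="Marc" last="Sato">Marc Sato</name></author>
</titleStmt></fileDesc></teiHeader></TEI></record>"""

root = ET.fromstring(RECORD)

# The title is the text of titleStmt/title; authors are the text of each
# titleStmt/author/name element.
title = root.findtext(".//titleStmt/title")
authors = [name.text for name in root.findall(".//titleStmt/author/name")]

print(title)
print(authors)
```

With the full record, the same paths would return all four authors; the `xml:lang` attribute is handled natively by ElementTree since the `xml:` prefix is predefined.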
To manipulate this document under Unix (Dilib):
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000129 | SxmlIndent | more
Or
HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Corpus/biblio.hfd -nk 000129 | SxmlIndent | more
To add a link to this page in the Wicri network:
{{Explor lien |wiki= Ticri/CIDE |area= HapticV1 |flux= PascalFrancis |étape= Corpus |type= RBID |clé= Francis:14-0138017 |texte= Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions }}
This area was generated with Dilib version V0.6.23.