Exploration server on haptic devices

Warning: this site is under development!
Warning: this site was generated automatically from raw corpora.
The information has therefore not been validated.

Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions

Internal identifier: 000129 (PascalFrancis/Corpus); previous: 000128; next: 000130


Authors: Avril Treille; Camille Cordeboeuf; Coriandre Vilain; Marc Sato

Source: Neuropsychologia, ISSN 0028-3932, 2014, Vol. 57, pp. 71-77

RBID : Francis:14-0138017

French descriptors

English descriptors

Abstract

Speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker's face. In the present electro-encephalographic study, early cross-modal interactions were investigated by comparing auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in dyadic interactions between a listener and a speaker. In line with previous studies, early auditory evoked responses were attenuated and speeded up during audio-visual compared to auditory speech perception. Crucially, shortened latencies of early auditory evoked potentials were also observed during audio-haptic speech perception. Altogether, these results suggest early bimodal interactions during live face-to-face and hand-to-face speech perception in dyadic interactions.

Record in standard format (ISO 2709)

See the documentation on the Inist Standard format.

pA  
A01 01  1    @0 0028-3932
A02 01      @0 NUPSA6
A03   1    @0 Neuropsychologia
A05       @2 57
A08 01  1  ENG  @1 Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions
A11 01  1    @1 TREILLE (Avril)
A11 02  1    @1 CORDEBOEUF (Camille)
A11 03  1    @1 VILAIN (Coriandre)
A11 04  1    @1 SATO (Marc)
A14 01      @1 GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université @2 Grenoble @3 FRA @Z 1 aut. @Z 2 aut. @Z 3 aut. @Z 4 aut.
A20       @1 71-77
A21       @1 2014
A23 01      @0 ENG
A43 01      @1 INIST @2 11143 @5 354000503266940080
A44       @0 0000 @1 © 2014 INIST-CNRS. All rights reserved.
A45       @0 1/2 p.
A47 01  1    @0 14-0138017
A60       @1 P
A61       @0 A
A64 01  1    @0 Neuropsychologia
A66 01      @0 GBR
C01 01    ENG  @0 Speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker's face. In the present electro-encephalographic study, early cross-modal interactions were investigated by comparing auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in dyadic interactions between a listener and a speaker. In line with previous studies, early auditory evoked responses were attenuated and speeded up during audio-visual compared to auditory speech perception. Crucially, shortened latencies of early auditory evoked potentials were also observed during audio-haptic speech perception. Altogether, these results suggest early bimodal interactions during live face-to-face and hand-to-face speech perception in dyadic interactions.
C02 01  X    @0 770B09C @1 II
C03 01  X  FRE  @0 Sensibilité tactile @5 01
C03 01  X  ENG  @0 Tactile sensitivity @5 01
C03 01  X  SPA  @0 Sensibilidad tactil @5 01
C03 02  X  FRE  @0 Vision @5 02
C03 02  X  ENG  @0 Vision @5 02
C03 02  X  SPA  @0 Visión @5 02
C03 03  X  FRE  @0 Etude expérimentale @5 03
C03 03  X  ENG  @0 Experimental study @5 03
C03 03  X  SPA  @0 Estudio experimental @5 03
C03 04  X  FRE  @0 Audition @5 05
C03 04  X  ENG  @0 Hearing @5 05
C03 04  X  SPA  @0 Audición @5 05
C03 05  X  FRE  @0 Traitement information @5 06
C03 05  X  ENG  @0 Information processing @5 06
C03 05  X  SPA  @0 Procesamiento información @5 06
C03 06  X  FRE  @0 Parole @5 07
C03 06  X  ENG  @0 Speech @5 07
C03 06  X  SPA  @0 Habla @5 07
C03 07  X  FRE  @0 Relation dyadique @5 08
C03 07  X  ENG  @0 Dyadic relation @5 08
C03 07  X  SPA  @0 Relación diádica @5 08
C03 08  X  FRE  @0 Perception verbale @5 09
C03 08  X  ENG  @0 Verbal perception @5 09
C03 08  X  SPA  @0 Percepción verbal @5 09
C03 09  X  FRE  @0 Perception intermodale @5 10
C03 09  X  ENG  @0 Intermodal perception @5 10
C03 09  X  SPA  @0 Percepción intermodal @5 10
C03 10  X  FRE  @0 Electroencéphalographie @5 11
C03 10  X  ENG  @0 Electroencephalography @5 11
C03 10  X  SPA  @0 Electroencefalografía @5 11
C03 11  X  FRE  @0 Homme @5 18
C03 11  X  ENG  @0 Human @5 18
C03 11  X  SPA  @0 Hombre @5 18
C07 01  X  FRE  @0 Interaction sociale @5 37
C07 01  X  ENG  @0 Social interaction @5 37
C07 01  X  SPA  @0 Interacción social @5 37
C07 02  X  FRE  @0 Cognition @5 38
C07 02  X  ENG  @0 Cognition @5 38
C07 02  X  SPA  @0 Cognición @5 38
C07 03  X  FRE  @0 Langage @5 39
C07 03  X  ENG  @0 Language @5 39
C07 03  X  SPA  @0 Lenguaje @5 39
C07 04  X  FRE  @0 Electrophysiologie @5 40
C07 04  X  ENG  @0 Electrophysiology @5 40
C07 04  X  SPA  @0 Electrofisiología @5 40
N21       @1 174
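
The field lines above follow a simple layout: a three-character tag (A01, C03, N21, ...), optional occurrence and indicator columns, an optional language code (ENG/FRE/SPA), then one or more subfields introduced by "@". As a minimal parsing sketch in Python, inferred from this record alone rather than from the full Inist Standard documentation:

import re

# Minimal sketch, inferred from this record only (not the full Inist spec):
# each line reads "TAG [indicators...] @k value [@k value ...]".
# Naive assumption: "@" only ever appears as a subfield marker.
FIELD_RE = re.compile(r"^(?P<tag>[A-Z]\d{2})(?P<head>[^@]*)(?P<body>@.+)$")

def parse_field(line):
    """Split one field line into (tag, indicator list, subfield pairs)."""
    m = FIELD_RE.match(line.strip())
    if m is None:
        return None  # e.g. the bare "pA" marker line
    indicators = m.group("head").split()  # e.g. ['01', 'X', 'FRE']
    # Subfields such as @Z may repeat, so keep an ordered list of pairs.
    subfields = []
    for chunk in m.group("body").split(" @"):
        key, _, value = chunk.lstrip("@").partition(" ")
        subfields.append((key, value.strip()))
    return m.group("tag"), indicators, subfields

print(parse_field("A01 01  1    @0 0028-3932"))
# -> ('A01', ['01', '1'], [('0', '0028-3932')])
print(parse_field("C03 01  X  FRE  @0 Sensibilité tactile @5 01"))
# -> ('C03', ['01', 'X', 'FRE'], [('0', 'Sensibilité tactile'), ('5', '01')])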

Inist format (server)

NO : FRANCIS 14-0138017 INIST
ET : Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions
AU : TREILLE (Avril); CORDEBOEUF (Camille); VILAIN (Coriandre); SATO (Marc)
AF : GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université/Grenoble/France (1 aut., 2 aut., 3 aut., 4 aut.)
DT : Publication en série; Niveau analytique
SO : Neuropsychologia; ISSN 0028-3932; Coden NUPSA6; Royaume-Uni; Da. 2014; Vol. 57; Pp. 71-77; Bibl. 1/2 p.
LA : Anglais
EA : Speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker's face. In the present electro-encephalographic study, early cross-modal interactions were investigated by comparing auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in dyadic interactions between a listener and a speaker. In line with previous studies, early auditory evoked responses were attenuated and speeded up during audio-visual compared to auditory speech perception. Crucially, shortened latencies of early auditory evoked potentials were also observed during audio-haptic speech perception. Altogether, these results suggest early bimodal interactions during live face-to-face and hand-to-face speech perception in dyadic interactions.
CC : 770B09C
FD : Sensibilité tactile; Vision; Etude expérimentale; Audition; Traitement information; Parole; Relation dyadique; Perception verbale; Perception intermodale; Electroencéphalographie; Homme
FG : Interaction sociale; Cognition; Langage; Electrophysiologie
ED : Tactile sensitivity; Vision; Experimental study; Hearing; Information processing; Speech; Dyadic relation; Verbal perception; Intermodal perception; Electroencephalography; Human
EG : Social interaction; Cognition; Language; Electrophysiology
SD : Sensibilidad tactil; Visión; Estudio experimental; Audición; Procesamiento información; Habla; Relación diádica; Percepción verbal; Percepción intermodal; Electroencefalografía; Hombre
LO : INIST-11143.354000503266940080
ID : 14-0138017
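
This server rendering is a flat "KEY : value" listing, one field per line. A minimal Python sketch for loading it, assuming the " : " separator and one-line fields (which holds for this record):

record_text = """NO : FRANCIS 14-0138017 INIST
LA : Anglais
ID : 14-0138017"""

# Split each line on the first " : " occurrence; values keep their own
# internal "; " separators (e.g. the AU and FD fields) as plain strings.
record = dict(line.split(" : ", 1) for line in record_text.splitlines())
print(record["ID"])  # -> 14-0138017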

Links to the exploration step

Francis:14-0138017

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions</title>
<author>
<name sortKey="Treille, Avril" sort="Treille, Avril" uniqKey="Treille A" first="Avril" last="Treille">Avril Treille</name>
<affiliation>
<inist:fA14 i1="01">
<s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Cordeboeuf, Camille" sort="Cordeboeuf, Camille" uniqKey="Cordeboeuf C" first="Camille" last="Cordeboeuf">Camille Cordeboeuf</name>
<affiliation>
<inist:fA14 i1="01">
<s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Vilain, Coriandre" sort="Vilain, Coriandre" uniqKey="Vilain C" first="Coriandre" last="Vilain">Coriandre Vilain</name>
<affiliation>
<inist:fA14 i1="01">
<s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Sato, Marc" sort="Sato, Marc" uniqKey="Sato M" first="Marc" last="Sato">Marc Sato</name>
<affiliation>
<inist:fA14 i1="01">
<s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">14-0138017</idno>
<date when="2014">2014</date>
<idno type="stanalyst">FRANCIS 14-0138017 INIST</idno>
<idno type="RBID">Francis:14-0138017</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000129</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions</title>
<author>
<name sortKey="Treille, Avril" sort="Treille, Avril" uniqKey="Treille A" first="Avril" last="Treille">Avril Treille</name>
<affiliation>
<inist:fA14 i1="01">
<s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Cordeboeuf, Camille" sort="Cordeboeuf, Camille" uniqKey="Cordeboeuf C" first="Camille" last="Cordeboeuf">Camille Cordeboeuf</name>
<affiliation>
<inist:fA14 i1="01">
<s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Vilain, Coriandre" sort="Vilain, Coriandre" uniqKey="Vilain C" first="Coriandre" last="Vilain">Coriandre Vilain</name>
<affiliation>
<inist:fA14 i1="01">
<s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Sato, Marc" sort="Sato, Marc" uniqKey="Sato M" first="Marc" last="Sato">Marc Sato</name>
<affiliation>
<inist:fA14 i1="01">
<s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</analytic>
<series>
<title level="j" type="main">Neuropsychologia</title>
<title level="j" type="abbreviated">Neuropsychologia</title>
<idno type="ISSN">0028-3932</idno>
<imprint>
<date when="2014">2014</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt>
<title level="j" type="main">Neuropsychologia</title>
<title level="j" type="abbreviated">Neuropsychologia</title>
<idno type="ISSN">0028-3932</idno>
</seriesStmt>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Dyadic relation</term>
<term>Electroencephalography</term>
<term>Experimental study</term>
<term>Hearing</term>
<term>Human</term>
<term>Information processing</term>
<term>Intermodal perception</term>
<term>Speech</term>
<term>Tactile sensitivity</term>
<term>Verbal perception</term>
<term>Vision</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Sensibilité tactile</term>
<term>Vision</term>
<term>Etude expérimentale</term>
<term>Audition</term>
<term>Traitement information</term>
<term>Parole</term>
<term>Relation dyadique</term>
<term>Perception verbale</term>
<term>Perception intermodale</term>
<term>Electroencéphalographie</term>
<term>Homme</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">Speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker's face. In the present electro-encephalographic study, early cross-modal interactions were investigated by comparing auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in dyadic interactions between a listener and a speaker. In line with previous studies, early auditory evoked responses were attenuated and speeded up during audio-visual compared to auditory speech perception. Crucially, shortened latencies of early auditory evoked potentials were also observed during audio-haptic speech perception. Altogether, these results suggest early bimodal interactions during live face-to-face and hand-to-face speech perception in dyadic interactions.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA01 i1="01" i2="1">
<s0>0028-3932</s0>
</fA01>
<fA02 i1="01">
<s0>NUPSA6</s0>
</fA02>
<fA03 i2="1">
<s0>Neuropsychologia</s0>
</fA03>
<fA05>
<s2>57</s2>
</fA05>
<fA08 i1="01" i2="1" l="ENG">
<s1>Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions</s1>
</fA08>
<fA11 i1="01" i2="1">
<s1>TREILLE (Avril)</s1>
</fA11>
<fA11 i1="02" i2="1">
<s1>CORDEBOEUF (Camille)</s1>
</fA11>
<fA11 i1="03" i2="1">
<s1>VILAIN (Coriandre)</s1>
</fA11>
<fA11 i1="04" i2="1">
<s1>SATO (Marc)</s1>
</fA11>
<fA14 i1="01">
<s1>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université</s1>
<s2>Grenoble</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</fA14>
<fA20>
<s1>71-77</s1>
</fA20>
<fA21>
<s1>2014</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA43 i1="01">
<s1>INIST</s1>
<s2>11143</s2>
<s5>354000503266940080</s5>
</fA43>
<fA44>
<s0>0000</s0>
<s1>© 2014 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45>
<s0>1/2 p.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>14-0138017</s0>
</fA47>
<fA60>
<s1>P</s1>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA64 i1="01" i2="1">
<s0>Neuropsychologia</s0>
</fA64>
<fA66 i1="01">
<s0>GBR</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>Speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker's face. In the present electro-encephalographic study, early cross-modal interactions were investigated by comparing auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in dyadic interactions between a listener and a speaker. In line with previous studies, early auditory evoked responses were attenuated and speeded up during audio-visual compared to auditory speech perception. Crucially, shortened latencies of early auditory evoked potentials were also observed during audio-haptic speech perception. Altogether, these results suggest early bimodal interactions during live face-to-face and hand-to-face speech perception in dyadic interactions.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>770B09C</s0>
<s1>II</s1>
</fC02>
<fC03 i1="01" i2="X" l="FRE">
<s0>Sensibilité tactile</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG">
<s0>Tactile sensitivity</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA">
<s0>Sensibilidad tactil</s0>
<s5>01</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE">
<s0>Vision</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG">
<s0>Vision</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA">
<s0>Visión</s0>
<s5>02</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE">
<s0>Etude expérimentale</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG">
<s0>Experimental study</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA">
<s0>Estudio experimental</s0>
<s5>03</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE">
<s0>Audition</s0>
<s5>05</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG">
<s0>Hearing</s0>
<s5>05</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA">
<s0>Audición</s0>
<s5>05</s5>
</fC03>
<fC03 i1="05" i2="X" l="FRE">
<s0>Traitement information</s0>
<s5>06</s5>
</fC03>
<fC03 i1="05" i2="X" l="ENG">
<s0>Information processing</s0>
<s5>06</s5>
</fC03>
<fC03 i1="05" i2="X" l="SPA">
<s0>Procesamiento información</s0>
<s5>06</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE">
<s0>Parole</s0>
<s5>07</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG">
<s0>Speech</s0>
<s5>07</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA">
<s0>Habla</s0>
<s5>07</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE">
<s0>Relation dyadique</s0>
<s5>08</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG">
<s0>Dyadic relation</s0>
<s5>08</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA">
<s0>Relación diádica</s0>
<s5>08</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE">
<s0>Perception verbale</s0>
<s5>09</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG">
<s0>Verbal perception</s0>
<s5>09</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA">
<s0>Percepción verbal</s0>
<s5>09</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE">
<s0>Perception intermodale</s0>
<s5>10</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG">
<s0>Intermodal perception</s0>
<s5>10</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA">
<s0>Percepción intermodal</s0>
<s5>10</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE">
<s0>Electroencéphalographie</s0>
<s5>11</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG">
<s0>Electroencephalography</s0>
<s5>11</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA">
<s0>Electroencefalografía</s0>
<s5>11</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE">
<s0>Homme</s0>
<s5>18</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG">
<s0>Human</s0>
<s5>18</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA">
<s0>Hombre</s0>
<s5>18</s5>
</fC03>
<fC07 i1="01" i2="X" l="FRE">
<s0>Interaction sociale</s0>
<s5>37</s5>
</fC07>
<fC07 i1="01" i2="X" l="ENG">
<s0>Social interaction</s0>
<s5>37</s5>
</fC07>
<fC07 i1="01" i2="X" l="SPA">
<s0>Interacción social</s0>
<s5>37</s5>
</fC07>
<fC07 i1="02" i2="X" l="FRE">
<s0>Cognition</s0>
<s5>38</s5>
</fC07>
<fC07 i1="02" i2="X" l="ENG">
<s0>Cognition</s0>
<s5>38</s5>
</fC07>
<fC07 i1="02" i2="X" l="SPA">
<s0>Cognición</s0>
<s5>38</s5>
</fC07>
<fC07 i1="03" i2="X" l="FRE">
<s0>Langage</s0>
<s5>39</s5>
</fC07>
<fC07 i1="03" i2="X" l="ENG">
<s0>Language</s0>
<s5>39</s5>
</fC07>
<fC07 i1="03" i2="X" l="SPA">
<s0>Lenguaje</s0>
<s5>39</s5>
</fC07>
<fC07 i1="04" i2="X" l="FRE">
<s0>Electrophysiologie</s0>
<s5>40</s5>
</fC07>
<fC07 i1="04" i2="X" l="ENG">
<s0>Electrophysiology</s0>
<s5>40</s5>
</fC07>
<fC07 i1="04" i2="X" l="SPA">
<s0>Electrofisiología</s0>
<s5>40</s5>
</fC07>
<fN21>
<s1>174</s1>
</fN21>
</pA>
</standard>
<server>
<NO>FRANCIS 14-0138017 INIST</NO>
<ET>Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions</ET>
<AU>TREILLE (Avril); CORDEBOEUF (Camille); VILAIN (Coriandre); SATO (Marc)</AU>
<AF>GIPSA-LAB, Département Parole and Cognition, CNRS and Grenoble Université/Grenoble/France (1 aut., 2 aut., 3 aut., 4 aut.)</AF>
<DT>Publication en série; Niveau analytique</DT>
<SO>Neuropsychologia; ISSN 0028-3932; Coden NUPSA6; Royaume-Uni; Da. 2014; Vol. 57; Pp. 71-77; Bibl. 1/2 p.</SO>
<LA>Anglais</LA>
<EA>Speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker's face. In the present electro-encephalographic study, early cross-modal interactions were investigated by comparing auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in dyadic interactions between a listener and a speaker. In line with previous studies, early auditory evoked responses were attenuated and speeded up during audio-visual compared to auditory speech perception. Crucially, shortened latencies of early auditory evoked potentials were also observed during audio-haptic speech perception. Altogether, these results suggest early bimodal interactions during live face-to-face and hand-to-face speech perception in dyadic interactions.</EA>
<CC>770B09C</CC>
<FD>Sensibilité tactile; Vision; Etude expérimentale; Audition; Traitement information; Parole; Relation dyadique; Perception verbale; Perception intermodale; Electroencéphalographie; Homme</FD>
<FG>Interaction sociale; Cognition; Langage; Electrophysiologie</FG>
<ED>Tactile sensitivity; Vision; Experimental study; Hearing; Information processing; Speech; Dyadic relation; Verbal perception; Intermodal perception; Electroencephalography; Human</ED>
<EG>Social interaction; Cognition; Language; Electrophysiology</EG>
<SD>Sensibilidad tactil; Visión; Estudio experimental; Audición; Procesamiento información; Habla; Relación diádica; Percepción verbal; Percepción intermodal; Electroencefalografía; Hombre</SD>
<LO>INIST-11143.354000503266940080</LO>
<ID>14-0138017</ID>
</server>
</inist>
</record>
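
The TEI header above can be queried with any XML library, with one caveat: the dump uses the "inist:" prefix without declaring its namespace, which strict parsers reject. A minimal Python sketch follows; the file name record.xml and the namespace URI are placeholders for illustration, not part of the source:

import xml.etree.ElementTree as ET

# Bind the undeclared "inist:" prefix to a placeholder namespace so the
# record parses, then pull the title and author names out of <titleStmt>.
with open("record.xml", encoding="utf-8") as f:
    xml_text = f.read()
xml_text = xml_text.replace(
    "<record>", '<record xmlns:inist="http://example.org/inist">', 1)

root = ET.fromstring(xml_text)
print(root.findtext(".//titleStmt/title"))
# -> Haptic and visual information speed up the neural processing of ...
print([n.text for n in root.findall(".//titleStmt/author/name")])
# -> ['Avril Treille', 'Camille Cordeboeuf', 'Coriandre Vilain', 'Marc Sato']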

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000129 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Corpus/biblio.hfd -nk 000129 | SxmlIndent | more

To link to this page within the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PascalFrancis
   |étape=   Corpus
   |type=    RBID
   |clé=     Francis:14-0138017
   |texte=   Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions
}}

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024.