Exploration server on haptic devices

Warning: this site is under development!
Warning: this site was generated automatically from raw corpora.
The information has therefore not been validated.

Mothers' multimodal information processing is modulated by multimodal interactions with their infants

Internal identifier: 003371 (Ncbi/Merge); previous: 003370; next: 003372


Authors: Yukari Tanaka [Japan]; Hirokata Fukushima [Japan]; Kazuo Okanoya [Japan]; Masako Myowa-Yamakoshi [Japan]

Source:

RBID: PMC:4200416

Abstract

Social learning in infancy is known to be facilitated by multimodal (e.g., visual, tactile, and verbal) cues provided by caregivers. In parallel with infants' development, recent research has revealed that maternal neural activity is altered through interaction with infants, for instance, to be sensitive to infant-directed speech (IDS). The present study investigated the effect of mother-infant multimodal interaction on maternal neural activity. Event-related potentials (ERPs) of mothers were compared with those of non-mothers during perception of tactile-related words primed by tactile cues. Only mothers showed ERP modulation when tactile cues were incongruent with the subsequent words, and only when the words were delivered with IDS prosody. Furthermore, the frequency of mothers' use of those words was correlated with the magnitude of ERP differentiation between congruent and incongruent stimuli presentations. These results suggest that mother-infant daily interactions enhance multimodal integration of the maternal brain in parenting contexts.


Url:
DOI: 10.1038/srep06623
PubMed: 25322936
PubMed Central: 4200416


The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Mothers' multimodal information processing is modulated by multimodal interactions with their infants</title>
<author>
<name sortKey="Tanaka, Yukari" sort="Tanaka, Yukari" uniqKey="Tanaka Y" first="Yukari" last="Tanaka">Yukari Tanaka</name>
<affiliation wicri:level="1">
<nlm:aff id="a1">
<institution>Graduate School of Education, Kyoto University</institution>
, Kyoto,
<country>Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea># see nlm:aff country strict</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="a5">
<institution>Japan Society for the Promotion of Science</institution>
, Tokyo,
<country>Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea># see nlm:aff country strict</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Fukushima, Hirokata" sort="Fukushima, Hirokata" uniqKey="Fukushima H" first="Hirokata" last="Fukushima">Hirokata Fukushima</name>
<affiliation wicri:level="1">
<nlm:aff id="a2">
<institution>Faculty of Sociology, Kansai University</institution>
, Suita, Osaka,
<country>Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea># see nlm:aff country strict</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Okanoya, Kazuo" sort="Okanoya, Kazuo" uniqKey="Okanoya K" first="Kazuo" last="Okanoya">Kazuo Okanoya</name>
<affiliation wicri:level="1">
<nlm:aff id="a3">
<institution>Japan Science and Technology Agency</institution>
, Kawaguchi, Saitama,
<country>Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea># see nlm:aff country strict</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="a4">
<institution>Graduate School of Arts and Science, University of Tokyo</institution>
, Meguro, Tokyo,
<country>Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea># see nlm:aff country strict</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Myowa Yamakoshi, Masako" sort="Myowa Yamakoshi, Masako" uniqKey="Myowa Yamakoshi M" first="Masako" last="Myowa-Yamakoshi">Masako Myowa-Yamakoshi</name>
<affiliation wicri:level="1">
<nlm:aff id="a1">
<institution>Graduate School of Education, Kyoto University</institution>
, Kyoto,
<country>Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea># see nlm:aff country strict</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="a3">
<institution>Japan Science and Technology Agency</institution>
, Kawaguchi, Saitama,
<country>Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea># see nlm:aff country strict</wicri:regionArea>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">25322936</idno>
<idno type="pmc">4200416</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4200416</idno>
<idno type="RBID">PMC:4200416</idno>
<idno type="doi">10.1038/srep06623</idno>
<date when="2014">2014</date>
<idno type="wicri:Area/Pmc/Corpus">000037</idno>
<idno type="wicri:Area/Pmc/Curation">000037</idno>
<idno type="wicri:Area/Pmc/Checkpoint">000B03</idno>
<idno type="wicri:Area/Ncbi/Merge">003371</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Mothers' multimodal information processing is modulated by multimodal interactions with their infants</title>
<author>
<name sortKey="Tanaka, Yukari" sort="Tanaka, Yukari" uniqKey="Tanaka Y" first="Yukari" last="Tanaka">Yukari Tanaka</name>
<affiliation wicri:level="1">
<nlm:aff id="a1">
<institution>Graduate School of Education, Kyoto University</institution>
, Kyoto,
<country>Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea># see nlm:aff country strict</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="a5">
<institution>Japan Society for the Promotion of Science</institution>
, Tokyo,
<country>Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea># see nlm:aff country strict</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Fukushima, Hirokata" sort="Fukushima, Hirokata" uniqKey="Fukushima H" first="Hirokata" last="Fukushima">Hirokata Fukushima</name>
<affiliation wicri:level="1">
<nlm:aff id="a2">
<institution>Faculty of Sociology, Kansai University</institution>
, Suita, Osaka,
<country>Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea># see nlm:aff country strict</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Okanoya, Kazuo" sort="Okanoya, Kazuo" uniqKey="Okanoya K" first="Kazuo" last="Okanoya">Kazuo Okanoya</name>
<affiliation wicri:level="1">
<nlm:aff id="a3">
<institution>Japan Science and Technology Agency</institution>
, Kawaguchi, Saitama,
<country>Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea># see nlm:aff country strict</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="a4">
<institution>Graduate School of Arts and Science, University of Tokyo</institution>
, Meguro, Tokyo,
<country>Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea># see nlm:aff country strict</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Myowa Yamakoshi, Masako" sort="Myowa Yamakoshi, Masako" uniqKey="Myowa Yamakoshi M" first="Masako" last="Myowa-Yamakoshi">Masako Myowa-Yamakoshi</name>
<affiliation wicri:level="1">
<nlm:aff id="a1">
<institution>Graduate School of Education, Kyoto University</institution>
, Kyoto,
<country>Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea># see nlm:aff country strict</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="a3">
<institution>Japan Science and Technology Agency</institution>
, Kawaguchi, Saitama,
<country>Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea># see nlm:aff country strict</wicri:regionArea>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Scientific Reports</title>
<idno type="eISSN">2045-2322</idno>
<imprint>
<date when="2014">2014</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>Social learning in infancy is known to be facilitated by multimodal (e.g., visual, tactile, and verbal) cues provided by caregivers. In parallel with infants' development, recent research has revealed that maternal neural activity is altered through interaction with infants, for instance, to be sensitive to infant-directed speech (IDS). The present study investigated the effect of mother-infant multimodal interaction on maternal neural activity. Event-related potentials (ERPs) of mothers were compared with those of non-mothers during perception of tactile-related words primed by tactile cues. Only mothers showed ERP modulation when tactile cues were incongruent with the subsequent words, and only when the words were delivered with IDS prosody. Furthermore, the frequency of mothers' use of those words was correlated with the magnitude of ERP differentiation between congruent and incongruent stimuli presentations. These results suggest that mother-infant daily interactions enhance multimodal integration of the maternal brain in parenting contexts.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Fernald, A" uniqKey="Fernald A">A. Fernald</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Trainor, L J" uniqKey="Trainor L">L. J. Trainor</name>
</author>
<author>
<name sortKey="Clarke, E D" uniqKey="Clarke E">E. D. Clarke</name>
</author>
<author>
<name sortKey="Huntley, A" uniqKey="Huntley A">A. Huntley</name>
</author>
<author>
<name sortKey="Adams, B A" uniqKey="Adams B">B. A. Adams</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Soderstrom, M" uniqKey="Soderstrom M">M. Soderstrom</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Barker, B A" uniqKey="Barker B">B. A. Barker</name>
</author>
<author>
<name sortKey="Newman, R S" uniqKey="Newman R">R. S. Newman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cooper, R P" uniqKey="Cooper R">R. P. Cooper</name>
</author>
<author>
<name sortKey="Aslin, R N" uniqKey="Aslin R">R. N. Aslin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Werker, J F" uniqKey="Werker J">J. F. Werker</name>
</author>
<author>
<name sortKey="Mcleod, P J" uniqKey="Mcleod P">P. J. McLeod</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Taumoepeau, M" uniqKey="Taumoepeau M">M. Taumoepeau</name>
</author>
<author>
<name sortKey="Ruffman, T" uniqKey="Ruffman T">T. Ruffman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kuhl, P K" uniqKey="Kuhl P">P. K. Kuhl</name>
</author>
<author>
<name sortKey="Rivera Gaxiola, M" uniqKey="Rivera Gaxiola M">M. Rivera-Gaxiola</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Thiessen, E D" uniqKey="Thiessen E">E. D. Thiessen</name>
</author>
<author>
<name sortKey="Hill, E A" uniqKey="Hill E">E. A. Hill</name>
</author>
<author>
<name sortKey="Saffran, J R" uniqKey="Saffran J">J. R. Saffran</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Vallabha, G K" uniqKey="Vallabha G">G. K. Vallabha</name>
</author>
<author>
<name sortKey="Mcclelland, J L" uniqKey="Mcclelland J">J. L. McClelland</name>
</author>
<author>
<name sortKey="Pons, F" uniqKey="Pons F">F. Pons</name>
</author>
<author>
<name sortKey="Werker, J F" uniqKey="Werker J">J. F. Werker</name>
</author>
<author>
<name sortKey="Amano, S" uniqKey="Amano S">S. Amano</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ramirez Esparza, N" uniqKey="Ramirez Esparza N">N. Ramírez-Esparza</name>
</author>
<author>
<name sortKey="Garcia Sierra, A" uniqKey="Garcia Sierra A">A. García-Sierra</name>
</author>
<author>
<name sortKey="Kuhl, P K" uniqKey="Kuhl P">P. K. Kuhl</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sullivan, J W" uniqKey="Sullivan J">J. W. Sullivan</name>
</author>
<author>
<name sortKey="Horowitz, F D" uniqKey="Horowitz F">F. D. Horowitz</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Liebal, K" uniqKey="Liebal K">K. Liebal</name>
</author>
<author>
<name sortKey="Behne, T" uniqKey="Behne T">T. Behne</name>
</author>
<author>
<name sortKey="Carpenter, M" uniqKey="Carpenter M">M. Carpenter</name>
</author>
<author>
<name sortKey="Tomasello, M" uniqKey="Tomasello M">M. Tomasello</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gogate, L J" uniqKey="Gogate L">L. J. Gogate</name>
</author>
<author>
<name sortKey="Bahrick, L E" uniqKey="Bahrick L">L. E. Bahrick</name>
</author>
<author>
<name sortKey="Watson, J D" uniqKey="Watson J">J. D. Watson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gogate, L J" uniqKey="Gogate L">L. J. Gogate</name>
</author>
<author>
<name sortKey="Bolzani, L H" uniqKey="Bolzani L">L. H. Bolzani</name>
</author>
<author>
<name sortKey="Betancourt, E A" uniqKey="Betancourt E">E. A. Betancourt</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gogate, L J" uniqKey="Gogate L">L. J. Gogate</name>
</author>
<author>
<name sortKey="Walker Andrews, A S" uniqKey="Walker Andrews A">A. S. Walker-Andrews</name>
</author>
<author>
<name sortKey="Bahrick, L E" uniqKey="Bahrick L">L. E. Bahrick</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Matsuda, Y" uniqKey="Matsuda Y">Y. Matsuda</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bergelson, E" uniqKey="Bergelson E">E. Bergelson</name>
</author>
<author>
<name sortKey="Swingley, D" uniqKey="Swingley D">D. Swingley</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Werker, J F" uniqKey="Werker J">J. F. Werker</name>
</author>
<author>
<name sortKey="Cohen, L B" uniqKey="Cohen L">L. B. Cohen</name>
</author>
<author>
<name sortKey="Lloyd, V L" uniqKey="Lloyd V">V. L. Lloyd</name>
</author>
<author>
<name sortKey="Casasola, M" uniqKey="Casasola M">M. Casasola</name>
</author>
<author>
<name sortKey="Stager, C L" uniqKey="Stager C">C. L. Stager</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ferber, S G" uniqKey="Ferber S">S. G. Ferber</name>
</author>
<author>
<name sortKey="Feldman, R" uniqKey="Feldman R">R. Feldman</name>
</author>
<author>
<name sortKey="Makhoul, I R" uniqKey="Makhoul I">I. R. Makhoul</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fernald, A" uniqKey="Fernald A">A. Fernald</name>
</author>
<author>
<name sortKey="Morikawa, H" uniqKey="Morikawa H">H. Morikawa</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Yoshida, H" uniqKey="Yoshida H">H. Yoshida</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schneider, T R" uniqKey="Schneider T">T. R. Schneider</name>
</author>
<author>
<name sortKey="Debener, S" uniqKey="Debener S">S. Debener</name>
</author>
<author>
<name sortKey="Oostenveld, R" uniqKey="Oostenveld R">R. Oostenveld</name>
</author>
<author>
<name sortKey="Engel, A K" uniqKey="Engel A">A. K. Engel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schneider, T R" uniqKey="Schneider T">T. R. Schneider</name>
</author>
<author>
<name sortKey="Lorenz, S" uniqKey="Lorenz S">S. Lorenz</name>
</author>
<author>
<name sortKey="Senkowski, D" uniqKey="Senkowski D">D. Senkowski</name>
</author>
<author>
<name sortKey="Engel, A K" uniqKey="Engel A">A. K. Engel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Verkindt, C" uniqKey="Verkindt C">C. Verkindt</name>
</author>
<author>
<name sortKey="Bertrand, O" uniqKey="Bertrand O">O. Bertrand</name>
</author>
<author>
<name sortKey="Thevenet, M" uniqKey="Thevenet M">M. Thevenet</name>
</author>
<author>
<name sortKey="Pernier, J" uniqKey="Pernier J">J. Pernier</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Senkowski, D" uniqKey="Senkowski D">D. Senkowski</name>
</author>
<author>
<name sortKey="Saint Amour, D" uniqKey="Saint Amour D">D. Saint-Amour</name>
</author>
<author>
<name sortKey="Kelly, S P" uniqKey="Kelly S">S. P. Kelly</name>
</author>
<author>
<name sortKey="Foxe, J J" uniqKey="Foxe J">J. J. Foxe</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Liebenthal, E" uniqKey="Liebenthal E">E. Liebenthal</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Parkinson, A L" uniqKey="Parkinson A">A. L. Parkinson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Parsons, C E" uniqKey="Parsons C">C. E. Parsons</name>
</author>
<author>
<name sortKey="Stark, E A" uniqKey="Stark E">E. A. Stark</name>
</author>
<author>
<name sortKey="Young, K S" uniqKey="Young K">K. S. Young</name>
</author>
<author>
<name sortKey="Stein, A" uniqKey="Stein A">A. Stein</name>
</author>
<author>
<name sortKey="Kringelbach, M L" uniqKey="Kringelbach M">M. L. Kringelbach</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kringelbach, M L" uniqKey="Kringelbach M">M. L. Kringelbach</name>
</author>
<author>
<name sortKey="Rolls, E T" uniqKey="Rolls E">E. T. Rolls</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Trembley, K N" uniqKey="Trembley K">K. N. Trembley</name>
</author>
<author>
<name sortKey="Kraus, N" uniqKey="Kraus N">N. Kraus</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kutas, M" uniqKey="Kutas M">M. Kutas</name>
</author>
<author>
<name sortKey="Federmeier, K D" uniqKey="Federmeier K">K. D. Federmeier</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Orgs, G" uniqKey="Orgs G">G. Orgs</name>
</author>
<author>
<name sortKey="Lange, K" uniqKey="Lange K">K. Lange</name>
</author>
<author>
<name sortKey="Dombrowski, J H" uniqKey="Dombrowski J">J. H. Dombrowski</name>
</author>
<author>
<name sortKey="Heil, M" uniqKey="Heil M">M. Heil</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kiefer, M" uniqKey="Kiefer M">M. Kiefer</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">Sci Rep</journal-id>
<journal-id journal-id-type="iso-abbrev">Sci Rep</journal-id>
<journal-title-group>
<journal-title>Scientific Reports</journal-title>
</journal-title-group>
<issn pub-type="epub">2045-2322</issn>
<publisher>
<publisher-name>Nature Publishing Group</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">25322936</article-id>
<article-id pub-id-type="pmc">4200416</article-id>
<article-id pub-id-type="pii">srep06623</article-id>
<article-id pub-id-type="doi">10.1038/srep06623</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Mothers' multimodal information processing is modulated by multimodal interactions with their infants</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Tanaka</surname>
<given-names>Yukari</given-names>
</name>
<xref ref-type="corresp" rid="c1">a</xref>
<xref ref-type="aff" rid="a1">1</xref>
<xref ref-type="aff" rid="a5">5</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Fukushima</surname>
<given-names>Hirokata</given-names>
</name>
<xref ref-type="aff" rid="a2">2</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Okanoya</surname>
<given-names>Kazuo</given-names>
</name>
<xref ref-type="aff" rid="a3">3</xref>
<xref ref-type="aff" rid="a4">4</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Myowa-Yamakoshi</surname>
<given-names>Masako</given-names>
</name>
<xref ref-type="aff" rid="a1">1</xref>
<xref ref-type="aff" rid="a3">3</xref>
</contrib>
<aff id="a1">
<label>1</label>
<institution>Graduate School of Education, Kyoto University</institution>
, Kyoto,
<country>Japan</country>
</aff>
<aff id="a2">
<label>2</label>
<institution>Faculty of Sociology, Kansai University</institution>
, Suita, Osaka,
<country>Japan</country>
</aff>
<aff id="a3">
<label>3</label>
<institution>Japan Science and Technology Agency</institution>
, Kawaguchi, Saitama,
<country>Japan</country>
</aff>
<aff id="a4">
<label>4</label>
<institution>Graduate School of Arts and Science, University of Tokyo</institution>
, Meguro, Tokyo,
<country>Japan</country>
</aff>
<aff id="a5">
<label>5</label>
<institution>Japan Society for the Promotion of Science</institution>
, Tokyo,
<country>Japan</country>
</aff>
</contrib-group>
<author-notes>
<corresp id="c1">
<label>a</label>
<email>tanaka.yukari.62x@st.kyoto-u.ac.jp</email>
</corresp>
</author-notes>
<pub-date pub-type="epub">
<day>17</day>
<month>10</month>
<year>2014</year>
</pub-date>
<pub-date pub-type="collection">
<year>2014</year>
</pub-date>
<volume>4</volume>
<elocation-id>6623</elocation-id>
<history>
<date date-type="received">
<day>12</day>
<month>05</month>
<year>2014</year>
</date>
<date date-type="accepted">
<day>02</day>
<month>09</month>
<year>2014</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright © 2014, Macmillan Publishers Limited. All rights reserved</copyright-statement>
<copyright-year>2014</copyright-year>
<copyright-holder>Macmillan Publishers Limited. All rights reserved</copyright-holder>
<license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by-nc-sa/4.0/">
<pmc-comment>author-paid</pmc-comment>
<license-p>This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. The images or other third party material in this article are included in the article's Creative Commons license, unless indicated otherwise in the credit line; if the material is not included under the Creative Commons license, users will need to obtain permission from the license holder in order to reproduce the material. To view a copy of this license, visit
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by-nc-sa/4.0/">http://creativecommons.org/licenses/by-nc-sa/4.0/</ext-link>
</license-p>
</license>
</permissions>
<abstract>
<p>Social learning in infancy is known to be facilitated by multimodal (e.g., visual, tactile, and verbal) cues provided by caregivers. In parallel with infants' development, recent research has revealed that maternal neural activity is altered through interaction with infants, for instance, to be sensitive to infant-directed speech (IDS). The present study investigated the effect of mother-infant multimodal interaction on maternal neural activity. Event-related potentials (ERPs) of mothers were compared with those of non-mothers during perception of tactile-related words primed by tactile cues. Only mothers showed ERP modulation when tactile cues were incongruent with the subsequent words, and only when the words were delivered with IDS prosody. Furthermore, the frequency of mothers' use of those words was correlated with the magnitude of ERP differentiation between congruent and incongruent stimuli presentations. These results suggest that mother-infant daily interactions enhance multimodal integration of the maternal brain in parenting contexts.</p>
</abstract>
</article-meta>
</front>
<body>
<p>Human caregivers modify their behaviors when interacting with their infants. One typical modification of adults' interaction style is infant-directed speech (IDS), which is characterized by the features of specific prosodic patterns such as higher pitch, greater pitch variations, longer pauses, and a more rhythmic, slower tempo when compared to adult-directed speech (ADS)
<xref ref-type="bibr" rid="b1">1</xref>
<xref ref-type="bibr" rid="b2">2</xref>
<xref ref-type="bibr" rid="b3">3</xref>
. IDS has the function of drawing infants' attention
<xref ref-type="bibr" rid="b4">4</xref>
<xref ref-type="bibr" rid="b5">5</xref>
<xref ref-type="bibr" rid="b6">6</xref>
, promoting emotional interaction
<xref ref-type="bibr" rid="b7">7</xref>
, and facilitating language acquisition in infancy
<xref ref-type="bibr" rid="b8">8</xref>
<xref ref-type="bibr" rid="b9">9</xref>
<xref ref-type="bibr" rid="b10">10</xref>
<xref ref-type="bibr" rid="b11">11</xref>
. Importantly, the behavioral modification of caregivers is multimodal in nature, including visual, auditory, and tactile information
<xref ref-type="bibr" rid="b12">12</xref>
. For example, mothers use pointing gestures to a target object, coupled with IDS
<xref ref-type="bibr" rid="b13">13</xref>
. These multimodal cues are often demonstrated with redundant temporal synchrony among several modalities (i.e., multimodal motherese), which emphasizes the salient scenes in the environment, and facilitates infants' learning
<xref ref-type="bibr" rid="b14">14</xref>
<xref ref-type="bibr" rid="b15">15</xref>
<xref ref-type="bibr" rid="b16">16</xref>
. Multimodal interaction plays the important role of making ambient references and intentions clear, and encouraging smooth interaction between mothers and infants.</p>
<p>Although parents have been shown to modify their behaviors, it is still unclear how such experiences affect the cognitive and neural functions of caregivers. Recent neuroscience research has revealed that parenting experiences alter the brain activity of mothers. For example, mothers with preverbal infants showed enhanced cortical activation in the auditory dorsal pathway of the language areas (Broca's and Wernicke's areas) during the perception of IDS, whereas fathers and non-parents did not show enhanced activity in these areas
<xref ref-type="bibr" rid="b17">17</xref>
. These results suggest that daily experiences of vocalization and hearing speech feedback of IDS enhanced mothers' brain activities in these areas, which are considered to reflect the processing of phonological information. However, the previous research focused only on the phonological aspects of IDS perception, and we do not know whether multimodal information processing is enhanced or modulated by parenting experiences.</p>
<p>The aim of the present study is to reveal whether and how mothers' interactions with infants affect mothers' neural processing of multimodal information in the context of parenting behaviors. Specifically, we focused on auditory and tactile multimodal information processing. Auditory information, especially verbal cues, are quite important for word learning in infancy. Auditory cues enable mothers to convey referential information, which directs infants' attention to specific objects or to specific aspects of the environment
<xref ref-type="bibr" rid="b18">18</xref>
<xref ref-type="bibr" rid="b19">19</xref>
. Furthermore, tactile cues (e.g., hugging, touching, and kissing) are important signals of interaction between mothers and infants
<xref ref-type="bibr" rid="b20">20</xref>
. Tactile cues are often combined with verbal cues. For example, mothers often use baby talk including mimetic words as infant-directed speech
<xref ref-type="bibr" rid="b21">21</xref>
, and provide their toddlers (aged around 2.5 years) with tactile experiences accompanied by tactile-related onomatopoeia words in an IDS manner
<xref ref-type="bibr" rid="b22">22</xref>
(e.g., having infants touch a soft blanket and saying “fuwa-fuwa”—Japanese onomatopoeia referring to something soft—in a high-pitched voice).</p>
<p>In order to investigate the integration process of tactile-auditory information, we applied a multimodal semantic priming paradigm
<xref ref-type="bibr" rid="b23">23</xref>
<xref ref-type="bibr" rid="b24">24</xref>
(
<xref ref-type="fig" rid="f1">Figure 1</xref>
). The paradigm consisted of a tactile priming stimulus (prime) followed by an auditory linguistic stimulus (target). Participants were required to respond by choosing the word identical to the target stimulus. These two kinds of stimuli (tactile prime and auditory target) were semantically either congruent or incongruent. Target stimuli occurred in one of two prosodic conditions: IDS and ADS. We measured event-related potentials (ERPs) in mothers and non-mothers and analyzed the data collected during the presentation of the target stimuli. The ERPs of mothers and non-mothers were compared between the conditions of congruency (congruent or incongruent) and prosody (IDS or ADS).</p>
<p>We predicted that mothers' ERPs would show enhanced sensitivity to the congruency of the stimuli, particularly when they were hearing IDS prosody. This is because mothers are assumed to have more experience than non-mothers with multimodal interaction with infants in an infant-directed speech manner. To further support this hypothesis, we also tested the correlation between mothers' ERP responses and their reported frequency of use of tactile-related words during their daily interactions with their infants.</p>
<sec disp-level="1" sec-type="results">
<title>Results</title>
<sec disp-level="2">
<title>ERP Response of Mothers and Non-mothers</title>
<p>We focused on the results obtained from the middle frontal region (Fz), where the group differences of interest were most evident. ERPs in other regions are described in
<xref ref-type="fig" rid="f2">Figure 2</xref>
,
<xref ref-type="table" rid="t1">Table 1</xref>
, and the
<xref ref-type="supplementary-material" rid="s1">supplementary results</xref>
. ERPs elicited by auditory stimuli showed a negative peak around ~150 ms (N1), a positive peak around ~240 ms (P2), and a negative peak around ~350 ms from the target onset (N400) (See
<xref ref-type="fig" rid="f2">Figure 2</xref>
and
<xref ref-type="table" rid="t1">Table 1</xref>
). Each component was quantified as the mean amplitude of each of the following periods: N1, 120–180 ms; P2, 200–300 ms; and N400, 300–500 ms from target onset. These amplitudes were analyzed by mixed measures analysis of variance (ANOVAs) with group (2: mothers/non-mothers) as the between-participant factor, and prosody (2: ADS/IDS) and congruency (2: congruent/incongruent) as within-participant factors.</p>
<p>For the period of N1, we found a significant interaction between group, prosody, and congruency (
<italic>F</italic>
<sub>1,32</sub>
= 4.46,
<italic>p</italic>
= .04,
<italic>η
<sub>p</sub>
</italic>
<sup>
<italic>2</italic>
</sup>
= .12;
<xref ref-type="fig" rid="f3">Fig. 3A</xref>
and
<xref ref-type="fig" rid="f3">Fig. 3B</xref>
). We then conducted two-way ANOVAs for each group with prosody and congruency as within-participant factors. We found a significant interaction between prosody and congruency for the mothers group (
<italic>F</italic>
<sub>1,16</sub>
= 6.43,
<italic>p</italic>
= .02,
<italic>η
<sub>p</sub>
</italic>
<sup>2</sup>
= .28). Mothers showed significant differences between IDS-congruent and IDS-incongruent peaks (mean amplitude for IDS-congruent = −0.22 μV, IDS-incongruent = 0.34 μV,
<italic>t</italic>
= 2.60,
<italic>p</italic>
= .02), but not between the ADS prosodies (
<italic>t</italic>
= 0.67,
<italic>p</italic>
= .51). Non-mothers did not show significant main effects or interactions (
<italic>F</italic>
s < 1.05,
<italic>ps</italic>
> .32,
<italic>n.s.</italic>
). No interactions or main effects were detected in other regions (
<italic>F</italic>
s < 3.0,
<italic>p</italic>
s > .05, see
<xref ref-type="supplementary-material" rid="s1">Supplementary Figure S1</xref>
).</p>
<p>For the period of P2, we also found significant interactions between group, prosody, and congruency (
<italic>F</italic>
<sub>1,32</sub>
= 4.65,
<italic>p</italic>
= .04,
<italic>η
<sub>p</sub>
</italic>
<sup>
<italic>2</italic>
</sup>
= .13,
<xref ref-type="fig" rid="f3">Fig. 3A</xref>
and
<xref ref-type="fig" rid="f3">Fig. 3C</xref>
). Again, two-way ANOVAs for each group revealed a significant interaction between prosody and congruency for only the mothers group (
<italic>F</italic>
<sub>1,16</sub>
= 9.40,
<italic>p</italic>
< .01,
<italic>η
<sub>p</sub>
</italic>
<sup>2</sup>
= .37). Post-hoc analysis revealed that mothers showed a larger amplitude in the IDS-incongruent than in the IDS-congruent condition (IDS-congruent = 0.71 μV, IDS-incongruent = 1.25 μV,
<italic>t</italic>
= 2.13,
<italic>p</italic>
< .01). We also found that only mothers showed a significantly larger amplitude in the IDS-incongruent than in the ADS-incongruent condition (ADS-incongruent = 0.47 μV, IDS-incongruent = 1.25 μV,
<italic>t</italic>
= 3.57,
<italic>p</italic>
< .01). Non-mothers showed no significant main effects or interaction (
<italic>ps</italic>
> .05,
<italic>n.s.</italic>
). Aside from the group-related modulations, a main effect of congruency was observed in the P2 period in the left frontal region (F3), with greater positivity in the incongruent condition than in the congruent condition (
<xref ref-type="fig" rid="f2">Fig. 2</xref>
and
<xref ref-type="table" rid="t1">Table 1</xref>
).</p>
<p>For the period of N400, we did not find a three-way interaction between group, prosody, and congruency (
<italic>F</italic>
<sub>1,32</sub>
= 1.89,
<italic>p</italic>
= .18). The interaction between prosody and congruency in the middle central region (
<italic>F</italic>
<sub>1,32</sub>
= 2.80,
<italic>p</italic>
= .10,
<italic>η
<sub>p</sub>
</italic>
<sup>
<italic>2</italic>
</sup>
= .09) was not significant. We also found a main effect of prosody on mean amplitude in the right frontal and central regions, with greater negative mean amplitude in IDS prosody than ADS prosody (
<xref ref-type="fig" rid="f2">Fig. 2</xref>
and
<xref ref-type="table" rid="t1">Table 1</xref>
).</p>
</sec>
<sec disp-level="2">
<title>Relationship Between the ERPs of Mothers and the Frequency of Use of Tactile-related Words</title>
<p>The frequency with which mothers used the target words with their infants in daily interactions was calculated from a parent questionnaire. We conducted correlation analysis between the frequency of mothers' target word use and the effect of audio-tactile congruency on ERPs. The effect of audio-tactile congruency was defined as the ERP differentiation between congruent and incongruent conditions in both ADS and IDS prosodies; mean amplitude in the ADS-incongruent condition minus that in the ADS-congruent condition for ADS prosody, and mean amplitude in the IDS-incongruent condition minus that in the IDS-congruent condition for IDS prosody. We found a significant positive correlation between the frequency of target word usage and the differential amplitude of the P2 component for the IDS condition, but not for the ADS condition (IDS,
<italic>r</italic>
<sub>s</sub>
= .54,
<italic>p</italic>
= .03; ADS,
<italic>r</italic>
<sub>s</sub>
= −.36,
<italic>p</italic>
= .15;
<xref ref-type="fig" rid="f4">Fig. 4</xref>
). We also found the same pattern of correlation for N400, again only for the IDS prosody (IDS:
<italic>r</italic>
<sub>s</sub>
= .60,
<italic>p</italic>
= .01, ADS:
<italic>r</italic>
<sub>s</sub>
= −.13,
<italic>p</italic>
= .61). The early component (N1) showed no significant correlation for either prosody. We found no correlations between the frequency of target word usage and ERP responses in any other regions (see
<xref ref-type="supplementary-material" rid="s1">Supplementary Table S2</xref>
).</p>
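The correlation analysis above (the reported statistics are Spearman's r_s) can be sketched as below. The example data are invented for demonstration only; the real inputs would be each mother's questionnaire total and her incongruent-minus-congruent mean amplitude.

```python
import numpy as np
from scipy.stats import spearmanr

def congruency_correlation(word_use_scores, erp_diff_uv):
    """Spearman rank correlation between mothers' word-use scores and
    their ERP congruency effects (incongruent minus congruent, in uV).
    Returns (rho, p)."""
    rho, p = spearmanr(word_use_scores, erp_diff_uv)
    return float(rho), float(p)

# Hypothetical illustration: 5 mothers, monotonically related measures.
scores = [3, 6, 9, 12, 15]          # questionnaire totals (3 items, 1-5 each)
diffs = [0.1, 0.2, 0.4, 0.5, 0.9]   # P2 amplitude differences in uV
rho, p = congruency_correlation(scores, diffs)
```

A rank-based measure is a reasonable choice here because the questionnaire totals are ordinal, not interval-scaled.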
</sec>
</sec>
<sec disp-level="1" sec-type="discussion">
<title>Discussion</title>
<p>Using ERPs, we investigated whether and how maternal multimodal interactions with infants affect mothers' neural processing of audio-tactile integration. Only the group of mothers showed differences in ERP amplitudes between the IDS-congruent and IDS-incongruent conditions in the N1 latency range. This sensitivity to the mismatch between the tactile and verbal stimuli was observed when the tactile-related words were presented with IDS prosody, but not with ADS prosody. In contrast, the ERPs of the non-mother group showed no such sensitivity to the incongruity between the verbal and tactile stimuli with IDS prosody. The auditory N1 component is considered to be associated with the processing of sensory information such as the frequency and intensity of a stimulus
<xref ref-type="bibr" rid="b25">25</xref>
. The present finding of N1 modulation after the semantic mismatch of the audio-tactile stimuli is likely to reflect multisensory integration reported with similar latency in a recent ERP study using naturalistic stimuli
<xref ref-type="bibr" rid="b26">26</xref>
.</p>
<p>Furthermore, we found mother-specific ERP responses in the middle P2 latency range; again, only the group of mothers showed differential ERP amplitudes between congruent and incongruent stimuli, and only with IDS prosody, whereas the non-mothers group showed no significant differences between stimuli. The P2 component is assumed to reflect the processing of phonological categorization and to be related to the neural representations of multimodal categorization
<xref ref-type="bibr" rid="b27">27</xref>
<xref ref-type="bibr" rid="b28">28</xref>
. These results suggest that mothers' ERP modulation in the IDS condition (i.e., differential ERP amplitudes according to the congruency of word and tactile stimuli) reflects the discrimination of multimodal categories across tactile and verbal cues.</p>
<p>There are several possible reasons why different ERP modulations between mothers and non-mothers were observed at early-to-middle latencies (N1 and P2). One possibility is that mothers are more skilled at detecting the incongruity of multimodal events than non-mothers. However, both groups showed congruency effects in the left frontal (F3) region, regardless of prosody (
<xref ref-type="table" rid="t1">Table 1</xref>
). In other words, the audio-tactile integration process itself was not different between the two groups. Furthermore, mother-specific ERP responses emerged in the IDS prosody condition, not with ADS prosody. These results provide an interesting insight into the functional mechanism of human parenting behavior. Recent studies have suggested that parenting behavior is related to the activation of orbitofrontal regions
<xref ref-type="bibr" rid="b29">29</xref>
, which are considered to have the function of evaluating social rewards and decision-making
<xref ref-type="bibr" rid="b30">30</xref>
. During mother-infant interactions, mothers are required to monitor infants' state and condition, and to respond to and cope with infants' signals quickly. In the present study, participants had to detect and discriminate multimodal congruency. It is possible that verbal cues with IDS prosody motivated mothers to respond selectively to the IDS stimuli, and to evaluate the congruency between tactile and verbal cues, resulting in group differences in early-to-middle latency.</p>
<p>Our data support the hypothesis that one factor influencing mother-specific ERP modulation might be mothers' experience of speaking tactile-related words to their infants. Correlation analysis revealed that, in the P2 and N400 components elicited from mothers, the difference in amplitudes between the IDS-incongruent and IDS-congruent conditions was positively correlated with the frequency of mothers' use of tactile-related words in daily interactions with their toddlers. Again, this effect was observed only in the IDS condition, not in the ADS condition, nor in other regions. In this sense, mother-specific ERP responses can be regarded as an ‘experience-dependent effect.' One adult study showed that P2 amplitude is enhanced by speech training with syllables
<xref ref-type="bibr" rid="b31">31</xref>
. The training effect might facilitate mothers' response to IDS stimuli, especially in the incongruent condition, resulting in a larger differential ERP response to IDS prosody.</p>
<p>The middle-to-late components showed a relatively clear correlation with mothers' subjective reports, compared to the single early-latency component, presumably because middle-to-late components generally reflect higher-order conscious processing. In particular, the processing of semantic and category-related information is represented in the N400 amplitude, which is an important neurophysiological index of semantic memory organization and conceptual learning
<xref ref-type="bibr" rid="b32">32</xref>
<xref ref-type="bibr" rid="b33">33</xref>
<xref ref-type="bibr" rid="b34">34</xref>
. Mothers who often use tactile-related words may have greater access to the semantic meanings of those words, and they showed larger differential ERPs between IDS-congruent and IDS-incongruent conditions at late latencies. How neural activity at each time scale, and at each level of multimodal information processing, relates to parenting behavior is an interesting question that deserves further investigation. It is also important to examine in more detail how experience affects neurophysiological responses at different latencies in other groups, such as childcare workers, grandparents, or fathers.</p>
<p>In sum, we found that mothers showed larger ERP responses to IDS-incongruent relative to IDS-congruent stimuli at middle frontal electrodes, whereas non-mothers did not show differential responses. The multimodal congruency effects specific to IDS were related to the frequency of use of tactile-related words in daily interactions with infants. The results suggest that mother-infant multimodal interaction in daily life enhances mothers' selective neural responses to multimodal information within the parenting context, which might facilitate social cognitive development in infancy.</p>
</sec>
<sec disp-level="1" sec-type="methods">
<title>Methods</title>
<sec disp-level="2">
<title>Participants</title>
<p>Seventeen mothers (mean age = 32.56 ± 3.76 years, range 25–41 years) parenting toddlers (8 boys, mean age = 20.8 ± 1.65 months, range 19–23 months), and seventeen non-mothers (all female, mean age = 22.4 ± 2.74 years, range 20–31 years) participated in the study. All participants were neurologically typical, right-handed Japanese speakers, and all were paid for their participation. All gave informed consent according to the procedures approved by the Ethics Committee of Web for the Integrated Studies of the Human Mind, Japan (WISH, Japan). Some participants came to the laboratory with their infants; during the experiment, these infants were allowed to explore the room and to play freely in a separate space with an assistant experimenter. Data from an additional eight mothers and four non-mothers were excluded from analysis due to muscle artifacts (one mother), extensive eye movement (three mothers), technical problems (three mothers and three non-mothers), infants' crying (one mother), and inattention to the task (one non-mother).</p>
</sec>
<sec disp-level="2">
<title>Stimuli</title>
<p>The following three textures were selected as tactile stimuli: fake fur, sandpaper, and leather. Each tactile stimulus (length: 3 cm; width: 2 cm) was attached to a flat surface. The tactile stimuli were placed in a custom-made box (length: 23 cm; width: 31 cm; height: 27 cm) so that participants could not see them. Participants were instructed to put their right hand into the box through an opening in the front (length: 9 cm; width: 11 cm). Their right index finger was fixed with a band to restrict body movement (
<xref ref-type="supplementary-material" rid="s1">Supplementary Figure S2</xref>
). The second experimenter sat by the box and presented tactile stimuli manually through an opening at the back of the box (length: 20 cm, width: 27 cm).</p>
<p>As auditory stimuli, we prepared the following three tactile-related Japanese onomatopoeias: /fuwa-fuwa/ (something soft), /tsuru-tsuru/ (something smooth), and /zara-zara/ (something rough and hard). These words were selected from a pilot questionnaire given to mothers, which asked how frequently they used tactile-related words with their infants. The final stimuli were three words of high frequency in both general and infant-directed use. The stimuli were recordings of two mothers speaking each word in two prosodic conditions: (i) IDS prosody, in the presence of their toddlers (aged two years), and (ii) ADS prosody, directed at an adult (an experimenter, aged 25 years). The auditory stimuli thus comprised 12 items in total (3 words × 2 prosodic conditions × 2 mothers). Words were recorded at a 22.05 kHz sampling rate (16-bit, monaural) using a digital recorder in a soundproof chamber.</p>
<p>The auditory stimuli were analyzed for the following parameters: average fundamental frequency (F0), pitch maximum (F-Max), frequency range (F-range), and duration. Pitch and duration analyses of the recordings were conducted using Adobe Audition. Statistical analyses were then conducted using the Wilcoxon signed-rank test. The analyses, shown in the
<xref ref-type="supplementary-material" rid="s1">supplementary information</xref>
(
<xref ref-type="supplementary-material" rid="s1">Supplementary Table S3</xref>
), indicated that the F0 and F-Max of the IDS stimuli were significantly higher than those of the ADS stimuli. The F-range of the IDS stimuli was marginally wider than that of the ADS stimuli. Duration did not differ between IDS and ADS stimuli because each IDS sample was cut into short single words for use in the ERP paradigm. To ensure that the IDS stimuli sounded like ‘infant-directed speech,' a separate group of non-parents rated how child-directed each auditory stimulus sounded on a 7-point scale (1 = not at all childish/adult-directed; 7 = very childish). IDS stimuli were rated as more childish than ADS stimuli (
<italic>t</italic>
= 3.38,
<italic>p</italic>
= .01). The intensity of the auditory stimuli was equalized across stimuli by matching the root-mean-square power of all sound files. The stimuli were presented to participants at approximately 62.5 dB sound pressure level (SPL).</p>
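The root-mean-square equalization described above can be sketched as follows. This is a minimal illustration, not the actual processing pipeline; the target level is an arbitrary value chosen for the example, not taken from the paper.

```python
import numpy as np

def equalize_rms(waveforms, target_rms=0.1):
    """Scale each waveform so that all share the same RMS power.
    `waveforms` is a list of 1-D sample arrays; returns scaled copies."""
    out = []
    for w in waveforms:
        w = np.asarray(w, dtype=float)
        rms = np.sqrt(np.mean(w ** 2))
        out.append(w * (target_rms / rms))
    return out
```

After this step, perceived loudness differences between IDS and ADS recordings due to recording level, rather than prosody, are minimized.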
</sec>
<sec disp-level="2">
<title>Procedure</title>
<p>Each trial consisted of a tactile stimulus (prime) followed by an auditory target stimulus (target). The target stimuli were tactile-related words spoken in an IDS (50%) or ADS (50%) manner, which were either semantically congruent (50%) or semantically incongruent (50%) with the priming stimuli. Thus, there were four experimental conditions: ADS-congruent (25%), ADS-incongruent (25%), IDS-congruent (25%), and IDS-incongruent (25%). The second experimenter rubbed participants' right index finger with the priming stimulus within 1000 ms of the presentation of a fixation point on the screen. Following a delay interval ranging between 500 and 600 ms, the auditory target was presented for 650 to 850 ms. After target presentation, two words were shown on the screen: the target word that had just been presented auditorily, and a distractor word semantically unrelated to the auditory stimulus. Participants were instructed to indicate, as quickly and accurately as possible, which of the two words they had just heard (
<xref ref-type="fig" rid="f1">Fig. 1</xref>
). To indicate their decision, participants had to press a button with their left middle or left index finger. The purpose of this task was to ensure that participants actively attended to the target stimuli.</p>
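The trial structure described above can be summarized as a simple timeline. The durations come from the text; drawing the jittered intervals from a uniform distribution is an assumption about the paradigm, made here only for illustration.

```python
import random

def build_trial():
    """One trial of the priming paradigm: (phase label, duration in ms).
    `None` means the phase lasts until the participant's button press."""
    return [
        ("fixation + tactile prime", 1000),
        ("delay", random.randint(500, 600)),        # jittered interval
        ("auditory target", random.randint(650, 850)),
        ("two-word response screen", None),          # until response
    ]
```

Jittering the prime-target interval is a common way to prevent anticipatory, time-locked responses from contaminating the target-locked ERP.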
<p>Each experimental session consisted of four blocks of 96 trials each. Trial order was randomized within each block, with each auditory stimulus presented equally often in combination with congruent and incongruent tactile stimuli. To ensure that they understood the procedure, participants performed a practice session of six trials before the experimental sessions. The practice session followed the same procedure as the experimental trials except for the absence of priming. All visual stimuli, including the fixation point and the response prompt in each trial, as well as the task instructions, were presented on a 22-in CRT monitor (RDT223BK, MITSUBISHI). E-prime Software (Psychology Software Tools, Inc., Pittsburgh, PA) was used to present all visual and auditory stimuli and to record participants' responses.</p>
</sec>
<sec disp-level="2">
<title>Frequency of Target Word Usage</title>
<p>After the ERP experiment, mothers were again presented with the auditory stimuli (tactile-related words) used in the experiment and asked to indicate how frequently they used those words in daily life with their children. For each auditory stimulus, mothers answered the question ‘How often do you use this word with your baby in everyday life?' on a scale from 1 (never) to 5 (very often). The total score across the three words presented in the experiment was calculated for each mother.</p>
</sec>
<sec disp-level="2">
<title>EEG Data Acquisition and Processing</title>
<p>EEG data were recorded with a 64-channel Geodesic Sensor Net at a sampling rate of 250 Hz with a 0.1–100 Hz band-pass filter, and analyzed using Net Station software (EGI, Eugene, OR). Impedances were measured prior to and following EEG recording; before recording, impedances were below 50 kΩ. All recordings were initially referenced to the vertex and later re-referenced to the average of all channels. In off-line analysis, EEG data were digitally filtered with a 0.3–30 Hz band-pass filter. The data were segmented into 1000 ms epochs time-locked to the onset of the auditory stimulus (target), with a 100 ms pre-stimulus baseline period. Artifacts were screened automatically as follows: segments containing eye blinks (80 μV threshold within 20 ms in the frontal region) or eye-movement artifacts (55 μV threshold), and channels with amplitudes exceeding ±80 μV, were excluded from averaging. Segments including more than ten bad channels were also excluded from averaging. Additionally, EEG records were edited for motor artifacts such as body movement based on visual inspection. Average amplitudes were computed separately for each condition (ADS-congruent, ADS-incongruent, IDS-congruent, IDS-incongruent) for each group.</p>
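The channel-count rejection rule described above can be sketched as follows. The amplitude threshold (±80 μV) and the ten-bad-channel criterion come from the text; the (epochs × channels × samples) array layout is an assumption made for the example, and the eye-blink/eye-movement detectors are omitted.

```python
import numpy as np

def screen_epochs(epochs_uv, amp_thresh=80.0, max_bad_channels=10):
    """Return a boolean mask of epochs to keep for averaging.
    `epochs_uv`: array of shape (n_epochs, n_channels, n_samples) in uV.
    A channel is "bad" within an epoch if any sample exceeds the
    amplitude threshold; an epoch is rejected when more than
    `max_bad_channels` channels are bad."""
    epochs_uv = np.asarray(epochs_uv, dtype=float)
    bad = np.abs(epochs_uv).max(axis=2) > amp_thresh  # (n_epochs, n_channels)
    return bad.sum(axis=1) <= max_bad_channels
```

Kept epochs would then be averaged per condition and group, as in the last sentence above.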
</sec>
<sec disp-level="2">
<title>Statistical Analyses</title>
<p>We computed the amplitude across electrodes in the frontal to central regions according to previous research
<xref ref-type="bibr" rid="b23">23</xref>
(for the electrode sites analyzed in this study, see
<xref ref-type="supplementary-material" rid="s1">Supplementary Figure S1</xref>
). Mean amplitude for each condition at each time point was calculated. As a preliminary analysis, we conducted ANOVAs with prosody (2: ADS/IDS) and congruency (2: congruent/incongruent) as within-subjects factors at each time point. To avoid detecting spurious differences among conditions, we required p-values < 0.05 over at least 7 consecutive time points (28 ms) to indicate a significant effect. We set the following three periods for the analysis: N1 (120–180 ms after stimulus onset), P2 (200–300 ms), and N400 (300–500 ms). Mean amplitude in each period was computed for each condition. These values were analyzed by mixed-measures ANOVAs with prosody (2: ADS/IDS) and congruency (2: congruent/incongruent) as within-subjects factors, and group (2: mothers/non-mothers) as a between-subjects factor.</p>
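The run-length criterion above (p < .05 sustained over 7 consecutive time points, i.e., 28 ms at 250 Hz) can be sketched as a simple scan over the per-time-point p-values. This is an illustrative reconstruction of the rule as stated, not the authors' code.

```python
import numpy as np

def has_significant_run(p_values, alpha=0.05, min_run=7):
    """True if `p_values` (one per time point) contains at least
    `min_run` consecutive values below `alpha`."""
    run = 0
    for p in np.asarray(p_values, dtype=float):
        run = run + 1 if p < alpha else 0
        if run >= min_run:
            return True
    return False
```

Requiring a sustained run rather than any single significant time point controls the false-positive rate of the point-by-point preliminary ANOVAs.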
</sec>
</sec>
<sec disp-level="1">
<title>Author Contributions</title>
<p>Y.T., K.O. and M.M.-Y. designed the research. Y.T. and H.F. conducted experiments and analyzed the data. Y.T., H.F. and M.M.-Y. wrote the main manuscript text. Y.T., H.F., K.O. and M.M.-Y. reviewed and discussed the main manuscript, and approved the final manuscript.</p>
</sec>
<sec sec-type="supplementary-material" id="s1">
<title>Supplementary Material</title>
<supplementary-material id="d33e23" content-type="local-data">
<caption>
<title>Supplementary Information</title>
<p>Supplementary Information</p>
</caption>
<media xlink:href="srep06623-s1.pdf"></media>
</supplementary-material>
</sec>
</body>
<back>
<ack>
<p>This study was supported by funding from the Japan Science and Technology Agency, Exploratory Research for Advanced Technology, Okanoya Emotional Information Project, and Grants-in-Aid for Scientific Research from the Japan Society for the Promotion of Science and the Ministry of Education Culture, Sports, Science and Technology (24119005, 24300103 to M.M.-Y.; 13J05878 to Y.T.). We would like to thank all the children and parents who participated in the research. We also thank N. Naoi, and Y. Fuchino for technical support, and H. Hirai, S. Mizugaki, M. Yamamoto, and Y. Nishimura, for their assistance in the experiment.</p>
</ack>
<ref-list>
<ref id="b1">
<mixed-citation publication-type="journal">
<name>
<surname>Fernald</surname>
<given-names>A.</given-names>
</name>
<article-title>Prosody in speech to children: prelinguistic and linguistic functions</article-title>
.
<source>Ann. Child Dev.</source>
<volume>8</volume>
,
<fpage>43</fpage>
<lpage>80</lpage>
(
<year>1991</year>
).</mixed-citation>
</ref>
<ref id="b2">
<mixed-citation publication-type="journal">
<name>
<surname>Trainor</surname>
<given-names>L. J.</given-names>
</name>
,
<name>
<surname>Clarke</surname>
<given-names>E. D.</given-names>
</name>
,
<name>
<surname>Huntley</surname>
<given-names>A.</given-names>
</name>
&
<name>
<surname>Adams</surname>
<given-names>B. A.</given-names>
</name>
<article-title>The acoustic basis of preferences for infant-directed singing</article-title>
.
<source>Inf. Behav. Dev.</source>
<volume>20</volume>
,
<fpage>383</fpage>
<lpage>396</lpage>
(
<year>1997</year>
).</mixed-citation>
</ref>
<ref id="b3">
<mixed-citation publication-type="journal">
<name>
<surname>Soderstrom</surname>
<given-names>M.</given-names>
</name>
<article-title>Beyond babytalk: Re-evaluating the nature and content of speech input to preverbal infants</article-title>
.
<source>Dev. Rev.</source>
<volume>27</volume>
,
<fpage>501</fpage>
<lpage>532</lpage>
(
<year>2007</year>
).</mixed-citation>
</ref>
<ref id="b4">
<mixed-citation publication-type="journal">
<name>
<surname>Barker</surname>
<given-names>B. A.</given-names>
</name>
&
<name>
<surname>Newman</surname>
<given-names>R. S.</given-names>
</name>
<article-title>Listen to your mother! The role of talker familiarity in infant streaming</article-title>
.
<source>Cognition</source>
<volume>94</volume>
,
<fpage>45</fpage>
<lpage>53</lpage>
(
<year>2004</year>
).</mixed-citation>
</ref>
<ref id="b5">
<mixed-citation publication-type="journal">
<name>
<surname>Cooper</surname>
<given-names>R. P.</given-names>
</name>
&
<name>
<surname>Aslin</surname>
<given-names>R. N.</given-names>
</name>
<article-title>Preference for infant-directed speech in the first month after birth</article-title>
.
<source>Child Dev.</source>
<volume>61</volume>
,
<fpage>1584</fpage>
<lpage>1594</lpage>
(
<year>1990</year>
).
<pub-id pub-id-type="pmid">2245748</pub-id>
</mixed-citation>
</ref>
<ref id="b6">
<mixed-citation publication-type="journal">
<name>
<surname>Werker</surname>
<given-names>J. F.</given-names>
</name>
&
<name>
<surname>McLeod</surname>
<given-names>P. J.</given-names>
</name>
<article-title>Infant preference for both male and female infant-directed talk: a developmental study of attentional and affective responsiveness</article-title>
.
<source>Can. J. Psychol.</source>
<volume>43</volume>
,
<fpage>230</fpage>
<lpage>246</lpage>
(
<year>1989</year>
).
<pub-id pub-id-type="pmid">2486497</pub-id>
</mixed-citation>
</ref>
<ref id="b7">
<mixed-citation publication-type="journal">
<name>
<surname>Taumoepeau</surname>
<given-names>M.</given-names>
</name>
&
<name>
<surname>Ruffman</surname>
<given-names>T.</given-names>
</name>
<article-title>Stepping stones to others' minds: maternal talk relates to child mental state language and emotion understanding at 15, 24, and 33 months</article-title>
.
<source>Child Dev.</source>
<volume>79</volume>
,
<fpage>284</fpage>
<lpage>302</lpage>
(
<year>2008</year>
).
<pub-id pub-id-type="pmid">18366424</pub-id>
</mixed-citation>
</ref>
<ref id="b8">
<mixed-citation publication-type="journal">
<name>
<surname>Kuhl</surname>
<given-names>P. K.</given-names>
</name>
&
<name>
<surname>Rivera-Gaxiola</surname>
<given-names>M.</given-names>
</name>
<article-title>Neural substrates of language acquisition</article-title>
.
<source>Annu. Rev. Neurosci.</source>
<volume>31</volume>
,
<fpage>511</fpage>
<lpage>534</lpage>
(
<year>2008</year>
).
<pub-id pub-id-type="pmid">18558865</pub-id>
</mixed-citation>
</ref>
<ref id="b9">
<mixed-citation publication-type="journal">
<name>
<surname>Thiessen</surname>
<given-names>E. D.</given-names>
</name>
,
<name>
<surname>Hill</surname>
<given-names>E. A.</given-names>
</name>
&
<name>
<surname>Saffran</surname>
<given-names>J. R.</given-names>
</name>
<article-title>Infant-directed speech facilitates word segmentation</article-title>
.
<source>Infancy</source>
<volume>7</volume>
,
<fpage>53</fpage>
<lpage>71</lpage>
(
<year>2005</year>
).</mixed-citation>
</ref>
<ref id="b10">
<mixed-citation publication-type="journal">
<name>
<surname>Vallabha</surname>
<given-names>G. K.</given-names>
</name>
,
<name>
<surname>McClelland</surname>
<given-names>J. L.</given-names>
</name>
,
<name>
<surname>Pons</surname>
<given-names>F.</given-names>
</name>
,
<name>
<surname>Werker</surname>
<given-names>J. F.</given-names>
</name>
&
<name>
<surname>Amano</surname>
<given-names>S.</given-names>
</name>
<article-title>Unsupervised learning of vowel categories from infant-directed speech</article-title>
.
<source>Proc. Natl. Acad. Sci. U. S. A.</source>
<volume>104</volume>
,
<fpage>13273</fpage>
<lpage>13278</lpage>
(
<year>2007</year>
).
<pub-id pub-id-type="pmid">17664424</pub-id>
</mixed-citation>
</ref>
<ref id="b11">
<mixed-citation publication-type="journal">
<name>
<surname>Ramírez-Esparza</surname>
<given-names>N.</given-names>
</name>
,
<name>
<surname>García-Sierra</surname>
<given-names>A.</given-names>
</name>
&
<name>
<surname>Kuhl</surname>
<given-names>P. K.</given-names>
</name>
<article-title>Look who's talking: speech style and social context in language input to infants are linked to concurrent and future speech development</article-title>
.
<source>Dev. Sci.</source>
<volume>17</volume>
,
<fpage>1</fpage>
<lpage>12</lpage>
(
<year>2014</year>
).
<pub-id pub-id-type="pmid">24102702</pub-id>
</mixed-citation>
</ref>
<ref id="b12">
<mixed-citation publication-type="journal">
<name>
<surname>Sullivan</surname>
<given-names>J. W.</given-names>
</name>
&
<name>
<surname>Horowitz</surname>
<given-names>F. D.</given-names>
</name>
<article-title>Infant intermodal perception and maternal multimodal stimulation: Implications for language development</article-title>
.
<source>Advances in Infancy Research</source>
<volume>2</volume>
,
<fpage>183</fpage>
<lpage>239</lpage>
(
<year>1983</year>
).</mixed-citation>
</ref>
<ref id="b13">
<mixed-citation publication-type="journal">
<name>
<surname>Liebal</surname>
<given-names>K.</given-names>
</name>
,
<name>
<surname>Behne</surname>
<given-names>T.</given-names>
</name>
,
<name>
<surname>Carpenter</surname>
<given-names>M.</given-names>
</name>
&
<name>
<surname>Tomasello</surname>
<given-names>M.</given-names>
</name>
<article-title>Infants use shared experience to interpret pointing gestures</article-title>
.
<source>Dev. Sci.</source>
<volume>12</volume>
,
<fpage>264</fpage>
<lpage>271</lpage>
(
<year>2009</year>
).
<pub-id pub-id-type="pmid">19143799</pub-id>
</mixed-citation>
</ref>
<ref id="b14">
<mixed-citation publication-type="journal">
<name>
<surname>Gogate</surname>
<given-names>L. J.</given-names>
</name>
,
<name>
<surname>Bahrick</surname>
<given-names>L. E.</given-names>
</name>
&
<name>
<surname>Watson</surname>
<given-names>J. D.</given-names>
</name>
<article-title>A Study of Multimodal Motherese: The Role of Temporal Synchrony between Verbal Labels and Gestures</article-title>
.
<source>Child Dev.</source>
<volume>71</volume>
,
<fpage>878</fpage>
<lpage>894</lpage>
(
<year>2000</year>
).
<pub-id pub-id-type="pmid">11016554</pub-id>
</mixed-citation>
</ref>
<ref id="b15">
<mixed-citation publication-type="journal">
<name>
<surname>Gogate</surname>
<given-names>L. J.</given-names>
</name>
,
<name>
<surname>Bolzani</surname>
<given-names>L. H.</given-names>
</name>
&
<name>
<surname>Betancourt</surname>
<given-names>E. A.</given-names>
</name>
<article-title>Attention to maternal multimodal naming by 6-to 8-month-old infants and learning of word-object relations</article-title>
.
<source>Infancy</source>
<volume>9</volume>
,
<fpage>259</fpage>
<lpage>288</lpage>
(
<year>2006</year>
).</mixed-citation>
</ref>
<ref id="b16">
<mixed-citation publication-type="journal">
<name>
<surname>Gogate</surname>
<given-names>L. J.</given-names>
</name>
,
<name>
<surname>Walker-Andrews</surname>
<given-names>A. S.</given-names>
</name>
&
<name>
<surname>Bahrick</surname>
<given-names>L. E.</given-names>
</name>
<article-title>The intersensory origins of word-comprehension: An ecological-dynamic systems view</article-title>
.
<source>Dev. Sci.</source>
<volume>4</volume>
,
<fpage>1</fpage>
<lpage>18</lpage>
(
<year>2001</year>
).</mixed-citation>
</ref>
<ref id="b17">
<mixed-citation publication-type="journal">
<name>
<surname>Matsuda</surname>
<given-names>Y.</given-names>
</name>
<italic>et al.</italic>
<article-title>Processing of infant-directed speech by adults</article-title>
.
<source>NeuroImage</source>
<volume>54</volume>
,
<fpage>611</fpage>
<lpage>621</lpage>
(
<year>2011</year>
).
<pub-id pub-id-type="pmid">20691794</pub-id>
</mixed-citation>
</ref>
<ref id="b18">
<mixed-citation publication-type="journal">
<name>
<surname>Bergelson</surname>
<given-names>E.</given-names>
</name>
&
<name>
<surname>Swingley</surname>
<given-names>D.</given-names>
</name>
<article-title>At 6–9 months, human infants know the meanings of many common nouns</article-title>
.
<source>Proc. Natl. Acad. Sci.</source>
<volume>109</volume>
,
<fpage>3253</fpage>
<lpage>3258</lpage>
(
<year>2012</year>
).
<pub-id pub-id-type="pmid">22331874</pub-id>
</mixed-citation>
</ref>
<ref id="b19">
<mixed-citation publication-type="journal">
<name>
<surname>Werker</surname>
<given-names>J. F.</given-names>
</name>
,
<name>
<surname>Cohen</surname>
<given-names>L. B.</given-names>
</name>
,
<name>
<surname>Lloyd</surname>
<given-names>V. L.</given-names>
</name>
,
<name>
<surname>Casasola</surname>
<given-names>M.</given-names>
</name>
&
<name>
<surname>Stager</surname>
<given-names>C. L.</given-names>
</name>
<article-title>Acquisition of word–object associations by 14-month-old infants</article-title>
.
<source>Dev. Psychol.</source>
<volume>34</volume>
,
<fpage>1289</fpage>
<lpage>1309</lpage>
(
<year>1998</year>
).
<pub-id pub-id-type="pmid">9823513</pub-id>
</mixed-citation>
</ref>
<ref id="b20">
<mixed-citation publication-type="journal">
<name>
<surname>Ferber</surname>
<given-names>S. G.</given-names>
</name>
,
<name>
<surname>Feldman</surname>
<given-names>R.</given-names>
</name>
&
<name>
<surname>Makhoul</surname>
<given-names>I. R.</given-names>
</name>
<article-title>The development of maternal touch across the first year of life</article-title>
.
<source>Early Hum. Dev.</source>
<volume>84</volume>
,
<fpage>363</fpage>
<lpage>370</lpage>
(
<year>2008</year>
).
<pub-id pub-id-type="pmid">17988808</pub-id>
</mixed-citation>
</ref>
<ref id="b21">
<mixed-citation publication-type="journal">
<name>
<surname>Fernald</surname>
<given-names>A.</given-names>
</name>
&
<name>
<surname>Morikawa</surname>
<given-names>H.</given-names>
</name>
<article-title>Common themes and cultural variations in Japanese and American mothers' speech to infants</article-title>
.
<source>Child Dev.</source>
<volume>64</volume>
,
<fpage>637</fpage>
<lpage>656</lpage>
(
<year>1993</year>
).
<pub-id pub-id-type="pmid">8339686</pub-id>
</mixed-citation>
</ref>
<ref id="b22">
<mixed-citation publication-type="journal">
<name>
<surname>Yoshida</surname>
<given-names>H.</given-names>
</name>
<article-title>A cross-linguistic study of sound symbolism in children's verb learning</article-title>
.
<source>J. Cogn. Dev.</source>
<volume>13</volume>
,
<fpage>232</fpage>
<lpage>265</lpage>
(
<year>2012</year>
).</mixed-citation>
</ref>
<ref id="b23">
<mixed-citation publication-type="journal">
<name>
<surname>Schneider</surname>
<given-names>T. R.</given-names>
</name>
,
<name>
<surname>Debener</surname>
<given-names>S.</given-names>
</name>
,
<name>
<surname>Oostenveld</surname>
<given-names>R.</given-names>
</name>
&
<name>
<surname>Engel</surname>
<given-names>A. K.</given-names>
</name>
<article-title>Enhanced EEG gamma-band activity reflects multisensory semantic matching in visual-to-auditory object priming</article-title>
.
<source>NeuroImage</source>
<volume>42</volume>
,
<fpage>1244</fpage>
<lpage>1254</lpage>
(
<year>2008</year>
).
<pub-id pub-id-type="pmid">18617422</pub-id>
</mixed-citation>
</ref>
<ref id="b24">
<mixed-citation publication-type="journal">
<name>
<surname>Schneider</surname>
<given-names>T. R.</given-names>
</name>
,
<name>
<surname>Lorenz</surname>
<given-names>S.</given-names>
</name>
,
<name>
<surname>Senkowski</surname>
<given-names>D.</given-names>
</name>
&
<name>
<surname>Engel</surname>
<given-names>A. K.</given-names>
</name>
<article-title>Gamma-band activity as a signature for cross-modal priming of auditory object recognition by active haptic exploration</article-title>
.
<source>J. Neurosci.</source>
<volume>31</volume>
,
<fpage>2502</fpage>
<lpage>2510</lpage>
(
<year>2011</year>
).
<pub-id pub-id-type="pmid">21325518</pub-id>
</mixed-citation>
</ref>
<ref id="b25">
<mixed-citation publication-type="journal">
<name>
<surname>Verkindt</surname>
<given-names>C.</given-names>
</name>
,
<name>
<surname>Bertrand</surname>
<given-names>O.</given-names>
</name>
,
<name>
<surname>Thevenet</surname>
<given-names>M.</given-names>
</name>
&
<name>
<surname>Pernier</surname>
<given-names>J.</given-names>
</name>
<article-title>Two auditory components in the 130–230 ms range disclosed by their stimulus frequency dependence</article-title>
.
<source>NeuroReport</source>
<volume>5</volume>
,
<fpage>1189</fpage>
<lpage>1192</lpage>
(
<year>1994</year>
).
<pub-id pub-id-type="pmid">7919162</pub-id>
</mixed-citation>
</ref>
<ref id="b26">
<mixed-citation publication-type="journal">
<name>
<surname>Senkowski</surname>
<given-names>D.</given-names>
</name>
,
<name>
<surname>Saint-Amour</surname>
<given-names>D.</given-names>
</name>
,
<name>
<surname>Kelly</surname>
<given-names>S. P.</given-names>
</name>
&
<name>
<surname>Foxe</surname>
<given-names>J. J.</given-names>
</name>
<article-title>Multisensory processing of naturalistic objects in motion: A high-density electrical mapping and source estimation study</article-title>
.
<source>NeuroImage</source>
<volume>36</volume>
,
<fpage>877</fpage>
<lpage>888</lpage>
(
<year>2007</year>
).
<pub-id pub-id-type="pmid">17481922</pub-id>
</mixed-citation>
</ref>
<ref id="b27">
<mixed-citation publication-type="journal">
<name>
<surname>Liebenthal</surname>
<given-names>E.</given-names>
</name>
<italic>et al.</italic>
<article-title>Specialization along the left superior temporal sulcus for auditory categorization</article-title>
.
<source>Cereb. Cortex.</source>
<volume>20</volume>
,
<fpage>2958</fpage>
<lpage>2970</lpage>
(
<year>2010</year>
).
<pub-id pub-id-type="pmid">20382643</pub-id>
</mixed-citation>
</ref>
<ref id="b28">
<mixed-citation publication-type="journal">
<name>
<surname>Parkinson</surname>
<given-names>A. L.</given-names>
</name>
<italic>et al.</italic>
<article-title>Understanding the neural mechanisms involved in sensory control of voice production</article-title>
.
<source>NeuroImage</source>
<volume>61</volume>
,
<fpage>314</fpage>
<lpage>322</lpage>
(
<year>2012</year>
).
<pub-id pub-id-type="pmid">22406500</pub-id>
</mixed-citation>
</ref>
<ref id="b29">
<mixed-citation publication-type="journal">
<name>
<surname>Parsons</surname>
<given-names>C. E.</given-names>
</name>
,
<name>
<surname>Stark</surname>
<given-names>E. A.</given-names>
</name>
,
<name>
<surname>Young</surname>
<given-names>K. S.</given-names>
</name>
,
<name>
<surname>Stein</surname>
<given-names>A.</given-names>
</name>
&
<name>
<surname>Kringelbach</surname>
<given-names>M. L.</given-names>
</name>
<article-title>Understanding the human parental brain: A critical role of the orbitofrontal cortex</article-title>
.
<source>Soc. Neurosci.</source>
<volume>8</volume>
,
<fpage>525</fpage>
<lpage>543</lpage>
(
<year>2013</year>
).
<pub-id pub-id-type="pmid">24171901</pub-id>
</mixed-citation>
</ref>
<ref id="b30">
<mixed-citation publication-type="journal">
<name>
<surname>Kringelbach</surname>
<given-names>M. L.</given-names>
</name>
&
<name>
<surname>Rolls</surname>
<given-names>E. T.</given-names>
</name>
<article-title>The functional neuroanatomy of the human orbitofrontal cortex: evidence from neuroimaging and neuropsychology</article-title>
.
<source>Prog. Neurobiol.</source>
<volume>72</volume>
,
<fpage>341</fpage>
<lpage>372</lpage>
(
<year>2004</year>
).
<pub-id pub-id-type="pmid">15157726</pub-id>
</mixed-citation>
</ref>
<ref id="b31">
<mixed-citation publication-type="journal">
<name>
<surname>Tremblay</surname>
<given-names>K. L.</given-names>
</name>
&
<name>
<surname>Kraus</surname>
<given-names>N.</given-names>
</name>
<article-title>Auditory training induces asymmetrical changes in cortical neural activity</article-title>
.
<source>J. Speech Lang. Hear. Res.</source>
<volume>45</volume>
,
<fpage>564</fpage>
<lpage>572</lpage>
(
<year>2002</year>
).
<pub-id pub-id-type="pmid">12069008</pub-id>
</mixed-citation>
</ref>
<ref id="b32">
<mixed-citation publication-type="journal">
<name>
<surname>Kutas</surname>
<given-names>M.</given-names>
</name>
&
<name>
<surname>Federmeier</surname>
<given-names>K. D.</given-names>
</name>
<article-title>Thirty years and counting: finding meaning in the N400 component of the event-related brain potential (ERP)</article-title>
.
<source>Annu. Rev. Psychol.</source>
<volume>62</volume>
,
<fpage>621</fpage>
<lpage>647</lpage>
(
<year>2011</year>
).
<pub-id pub-id-type="pmid">20809790</pub-id>
</mixed-citation>
</ref>
<ref id="b33">
<mixed-citation publication-type="journal">
<name>
<surname>Orgs</surname>
<given-names>G.</given-names>
</name>
,
<name>
<surname>Lange</surname>
<given-names>K.</given-names>
</name>
,
<name>
<surname>Dombrowski</surname>
<given-names>J. H.</given-names>
</name>
&
<name>
<surname>Heil</surname>
<given-names>M.</given-names>
</name>
<article-title>N400-effects to task-irrelevant environmental sounds: Further evidence for obligatory conceptual processing</article-title>
.
<source>Neurosci. Lett.</source>
<volume>436</volume>
,
<fpage>133</fpage>
<lpage>137</lpage>
(
<year>2008</year>
).
<pub-id pub-id-type="pmid">18378076</pub-id>
</mixed-citation>
</ref>
<ref id="b34">
<mixed-citation publication-type="journal">
<name>
<surname>Kiefer</surname>
<given-names>M.</given-names>
</name>
<article-title>Repetition-priming modulates category-related effects on event-related potentials: further evidence for multiple cortical semantic systems</article-title>
.
<source>J. Cogn. Neurosci.</source>
<volume>17</volume>
,
<fpage>199</fpage>
<lpage>211</lpage>
(
<year>2005</year>
).</mixed-citation>
</ref>
</ref-list>
</back>
<floats-group>
<fig id="f1">
<label>Figure 1</label>
<caption>
<title>Experimental procedure of the present study.</title>
<p>Participants saw an on-screen fixation point. A tactile stimulus (prime) was then presented, followed by a spoken word (target). Participants were required to identify the word they had heard by pressing a key. The inter-stimulus interval between the prime and the target was 500 to 600 ms, and that between the target and the identification task was 150 to 400 ms. EEG data recorded during the target stimulus presentation were analyzed.</p>
</caption>
<graphic xlink:href="srep06623-f1"></graphic>
</fig>
<fig id="f2">
<label>Figure 2</label>
<caption>
<title>Grand-averaged ERP waveforms of all participants.</title>
<p>Dashed lines designate the congruent condition and solid lines the incongruent condition. Line color indicates prosody (blue for ADS, red for IDS). The peak and time course of each component (N1, P2, and N400) are marked on the waveforms. ERPs in other regions are shown in the
<xref ref-type="supplementary-material" rid="s1">supplementary results (S1)</xref>
.</p>
</caption>
<graphic xlink:href="srep06623-f2"></graphic>
</fig>
<fig id="f3">
<label>Figure 3</label>
<caption>
<title>Grand-averaged ERP waveforms and mean ERP amplitudes of mothers and non-mothers.</title>
<p>Grand-averaged ERP waveforms in the Fz region (A) are presented for mothers and non-mothers. Mean ERP amplitudes are presented for the N1 (B) and P2 (C) periods. Error bars show standard errors. *
<italic>p</italic>
&lt; .05, **
<italic>p</italic>
&lt; .01.</p>
</caption>
<graphic xlink:href="srep06623-f3"></graphic>
</fig>
<fig id="f4">
<label>Figure 4</label>
<caption>
<title>Correlation between mothers' ERP responses and their usage score for tactile-related words.</title>
<p>The graph shows mothers' data. The X-axis shows the frequency of use of tactile-related words and the Y-axis shows differential ERP responses. The graphs represent correlations for N1, P2, and N400. The left side shows differential ERPs with IDS prosody (IDS-incongruent – IDS-congruent) [μV] and the right side shows those with ADS prosody (ADS-incongruent – ADS-congruent) [μV].</p>
</caption>
<graphic xlink:href="srep06623-f4"></graphic>
</fig>
<table-wrap position="float" id="t1">
<label>Table 1</label>
<caption>
<title>Main Effects of Congruency and Prosody in Fronto-central Regions</title>
</caption>
<table frame="hsides" rules="groups" border="1">
<colgroup>
<col align="left"></col>
<col align="center"></col>
<col align="center"></col>
<col align="center"></col>
<col align="center"></col>
<col align="center"></col>
<col align="center"></col>
<col align="center"></col>
</colgroup>
<thead valign="bottom">
<tr>
<th align="justify" valign="top" charoff="50">Effect</th>
<th align="center" valign="top" charoff="50">Region</th>
<th align="center" valign="top" charoff="50">Component</th>
<th align="center" valign="top" charoff="50">Peak Latency (msec)</th>
<th align="center" valign="top" charoff="50">Comparison</th>
<th align="center" valign="top" charoff="50">
<italic>F</italic>
</th>
<th align="center" valign="top" charoff="50">
<italic>p</italic>
</th>
<th align="center" valign="top" charoff="50">
<italic>η
<sup>2</sup>
</italic>
</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td align="center" valign="top" charoff="50">Congruency</td>
<td align="center" valign="top" charoff="50">F3</td>
<td align="center" valign="top" charoff="50">P2</td>
<td align="char" valign="top" char="." charoff="50">247.18</td>
<td align="center" valign="top" charoff="50">Positive Incon > Con</td>
<td align="char" valign="top" char="." charoff="50">4.90</td>
<td align="char" valign="top" char="." charoff="50">.03</td>
<td align="center" valign="top" charoff="50">.13</td>
</tr>
<tr>
<td align="left" valign="top" charoff="50"> </td>
<td align="center" valign="top" charoff="50">C3</td>
<td align="center" valign="top" charoff="50">P2</td>
<td align="char" valign="top" char="." charoff="50">242.00</td>
<td align="center" valign="top" charoff="50">Positive Incon > Con</td>
<td align="char" valign="top" char="." charoff="50">6.13</td>
<td align="char" valign="top" char="." charoff="50">.02</td>
<td align="center" valign="top" charoff="50">.16</td>
</tr>
<tr>
<td align="center" valign="top" charoff="50">Prosody</td>
<td align="center" valign="top" charoff="50">F4</td>
<td align="char" valign="top" char="." charoff="50">N400</td>
<td align="char" valign="top" char="." charoff="50">343.62</td>
<td align="center" valign="top" charoff="50">Negative IDS > ADS</td>
<td align="char" valign="top" char="." charoff="50">4.83</td>
<td align="char" valign="top" char="." charoff="50">.04</td>
<td align="center" valign="top" charoff="50">.13</td>
</tr>
<tr>
<td align="left" valign="top" charoff="50"> </td>
<td align="center" valign="top" charoff="50">C3</td>
<td align="char" valign="top" char="." charoff="50">N400</td>
<td align="char" valign="top" char="." charoff="50">356.15</td>
<td align="center" valign="top" charoff="50">Negative IDS > ADS</td>
<td align="char" valign="top" char="." charoff="50">9.86</td>
<td align="char" valign="top" char="." charoff="50">.04</td>
<td align="center" valign="top" charoff="50">.24</td>
</tr>
<tr>
<td align="left" valign="top" charoff="50"> </td>
<td align="center" valign="top" charoff="50">Cz</td>
<td align="char" valign="top" char="." charoff="50">N400</td>
<td align="char" valign="top" char="." charoff="50">359.24</td>
<td align="center" valign="top" charoff="50">Negative IDS > ADS</td>
<td align="char" valign="top" char="." charoff="50">10.36</td>
<td align="char" valign="top" char="." charoff="50">.003</td>
<td align="center" valign="top" charoff="50">.25</td>
</tr>
<tr>
<td align="left" valign="top" charoff="50"> </td>
<td align="center" valign="top" charoff="50">C4</td>
<td align="char" valign="top" char="." charoff="50">N400</td>
<td align="char" valign="top" char="." charoff="50">358.06</td>
<td align="center" valign="top" charoff="50">Negative IDS > ADS</td>
<td align="char" valign="top" char="." charoff="50">15.32</td>
<td align="char" valign="top" char="." charoff="50">.0001</td>
<td align="center" valign="top" charoff="50">.32</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn id="t1-fn1">
<p>
<italic>Note</italic>
. We calculated the mean amplitude of each component (N1, P2, and N400) in the fronto-central regions and analyzed it with group, prosody, and congruency as factors for each region. In the F3 and C3 regions, we found a main effect of congruency on the P2 component, and in the frontal and central regions listed above, a main effect of prosody on the N400 component.</p>
</fn>
</table-wrap-foot>
</table-wrap>
</floats-group>
</pmc>
<affiliations>
<list>
<country>
<li>Japon</li>
</country>
</list>
<tree>
<country name="Japon">
<noRegion>
<name sortKey="Tanaka, Yukari" sort="Tanaka, Yukari" uniqKey="Tanaka Y" first="Yukari" last="Tanaka">Yukari Tanaka</name>
</noRegion>
<name sortKey="Fukushima, Hirokata" sort="Fukushima, Hirokata" uniqKey="Fukushima H" first="Hirokata" last="Fukushima">Hirokata Fukushima</name>
<name sortKey="Myowa Yamakoshi, Masako" sort="Myowa Yamakoshi, Masako" uniqKey="Myowa Yamakoshi M" first="Masako" last="Myowa-Yamakoshi">Masako Myowa-Yamakoshi</name>
<name sortKey="Okanoya, Kazuo" sort="Okanoya, Kazuo" uniqKey="Okanoya K" first="Kazuo" last="Okanoya">Kazuo Okanoya</name>
</country>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Ncbi/Merge
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 003371 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd -nk 003371 | SxmlIndent | more
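Alternatively, the JATS-style XML of the record can be inspected directly with standard tooling, without Dilib. A minimal Python sketch; the `citation_summary` helper and the inline sample (which mirrors ref "b17" above) are illustrative assumptions, not part of the Dilib toolchain:

```python
import xml.etree.ElementTree as ET

# Sample <ref> element in the JATS style used by this record (mirrors ref "b17").
SAMPLE = """<ref id="b17">
  <mixed-citation publication-type="journal">
    <name><surname>Matsuda</surname><given-names>Y.</given-names></name>
    <article-title>Processing of infant-directed speech by adults</article-title>
    <source>NeuroImage</source>
    <volume>54</volume>
    <fpage>611</fpage><lpage>621</lpage>
    <year>2011</year>
    <pub-id pub-id-type="pmid">20691794</pub-id>
  </mixed-citation>
</ref>"""

def citation_summary(ref_xml: str) -> str:
    """Return a compact 'Surname, Initials Source Volume, fpage-lpage (year)' string."""
    ref = ET.fromstring(ref_xml)
    # findtext with a ".//" path returns the text of the first matching descendant.
    get = lambda tag: ref.findtext(f".//{tag}", default="?")
    return (f"{get('surname')}, {get('given-names')} "
            f"{get('source')} {get('volume')}, "
            f"{get('fpage')}-{get('lpage')} ({get('year')})")

print(citation_summary(SAMPLE))
# -> Matsuda, Y. NeuroImage 54, 611-621 (2011)
```

The same helper can be applied to any `<ref>` element extracted from the record, e.g. after selecting it with the HfdSelect pipeline above.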

To add a link to this page within the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Ncbi
   |étape=   Merge
   |type=    RBID
   |clé=     PMC:4200416
   |texte=   Mothers' multimodal information processing is modulated by multimodal interactions with their infants
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/RBID.i   -Sk "pubmed:25322936" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024