Opera exploration server

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Precursors of Dancing and Singing to Music in Three- to Four-Months-Old Infants

Internal identifier: 000105 (Pmc/Checkpoint); previous: 000104; next: 000106


Authors: Shinya Fujii [Canada, Japan]; Hama Watanabe [Japan]; Hiroki Oohashi [Japan]; Masaya Hirashima [Japan]; Daichi Nozaki [Japan]; Gentaro Taga [Japan]

Source:

RBID : PMC:4023986

Abstract

Dancing and singing to music involve auditory-motor coordination and have been essential to our human culture since ancient times. Although scholars have been trying to understand the evolutionary and developmental origin of music, early human developmental manifestations of auditory-motor interactions in music have not been fully investigated. Here we report limb movements and vocalizations in three- to four-months-old infants while they listened to music and were in silence. In the group analysis, we found no significant increase in the amount of movement or in the relative power spectrum density around the musical tempo in the music condition compared to the silent condition. Intriguingly, however, there were two infants who demonstrated striking increases in the rhythmic movements via kicking or arm-waving around the musical tempo during listening to music. Monte-Carlo statistics with phase-randomized surrogate data revealed that the limb movements of these individuals were significantly synchronized to the musical beat. Moreover, we found a clear increase in the formant variability of vocalizations in the group during music perception. These results suggest that infants at this age are already primed with their bodies to interact with music via limb movements and vocalizations.


URL:
DOI: 10.1371/journal.pone.0097680
PubMed: 24837135
PubMed Central: 4023986



The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Precursors of Dancing and Singing to Music in Three- to Four-Months-Old Infants</title>
<author>
<name sortKey="Fujii, Shinya" sort="Fujii, Shinya" uniqKey="Fujii S" first="Shinya" last="Fujii">Shinya Fujii</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<addr-line>The Heart and Stroke Foundation Canadian Partnership for Stroke Recovery, Sunnybrook Research Institute, Toronto, Ontario, Canada</addr-line>
</nlm:aff>
<country xml:lang="fr">Canada</country>
<wicri:regionArea>The Heart and Stroke Foundation Canadian Partnership for Stroke Recovery, Sunnybrook Research Institute, Toronto, Ontario</wicri:regionArea>
<wicri:noRegion>Ontario</wicri:noRegion>
</affiliation>
<affiliation wicri:level="3">
<nlm:aff id="aff3">
<addr-line>Research Fellow of the Japan Society for the Promotion of Science, Chiyoda-ku, Tokyo, Japan</addr-line>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea>Research Fellow of the Japan Society for the Promotion of Science, Chiyoda-ku, Tokyo</wicri:regionArea>
<placeName>
<settlement type="city">Tokyo</settlement>
</placeName>
</affiliation>
</author>
<author>
<name sortKey="Watanabe, Hama" sort="Watanabe, Hama" uniqKey="Watanabe H" first="Hama" last="Watanabe">Hama Watanabe</name>
<affiliation wicri:level="3">
<nlm:aff id="aff2">
<addr-line>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo, Japan</addr-line>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo</wicri:regionArea>
<placeName>
<settlement type="city">Tokyo</settlement>
</placeName>
</affiliation>
</author>
<author>
<name sortKey="Oohashi, Hiroki" sort="Oohashi, Hiroki" uniqKey="Oohashi H" first="Hiroki" last="Oohashi">Hiroki Oohashi</name>
<affiliation wicri:level="3">
<nlm:aff id="aff2">
<addr-line>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo, Japan</addr-line>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo</wicri:regionArea>
<placeName>
<settlement type="city">Tokyo</settlement>
</placeName>
</affiliation>
<affiliation wicri:level="3">
<nlm:aff id="aff3">
<addr-line>Research Fellow of the Japan Society for the Promotion of Science, Chiyoda-ku, Tokyo, Japan</addr-line>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea>Research Fellow of the Japan Society for the Promotion of Science, Chiyoda-ku, Tokyo</wicri:regionArea>
<placeName>
<settlement type="city">Tokyo</settlement>
</placeName>
</affiliation>
</author>
<author>
<name sortKey="Hirashima, Masaya" sort="Hirashima, Masaya" uniqKey="Hirashima M" first="Masaya" last="Hirashima">Masaya Hirashima</name>
<affiliation wicri:level="3">
<nlm:aff id="aff2">
<addr-line>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo, Japan</addr-line>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo</wicri:regionArea>
<placeName>
<settlement type="city">Tokyo</settlement>
</placeName>
</affiliation>
</author>
<author>
<name sortKey="Nozaki, Daichi" sort="Nozaki, Daichi" uniqKey="Nozaki D" first="Daichi" last="Nozaki">Daichi Nozaki</name>
<affiliation wicri:level="3">
<nlm:aff id="aff2">
<addr-line>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo, Japan</addr-line>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo</wicri:regionArea>
<placeName>
<settlement type="city">Tokyo</settlement>
</placeName>
</affiliation>
</author>
<author>
<name sortKey="Taga, Gentaro" sort="Taga, Gentaro" uniqKey="Taga G" first="Gentaro" last="Taga">Gentaro Taga</name>
<affiliation wicri:level="3">
<nlm:aff id="aff2">
<addr-line>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo, Japan</addr-line>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo</wicri:regionArea>
<placeName>
<settlement type="city">Tokyo</settlement>
</placeName>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">24837135</idno>
<idno type="pmc">4023986</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4023986</idno>
<idno type="RBID">PMC:4023986</idno>
<idno type="doi">10.1371/journal.pone.0097680</idno>
<date when="2014">2014</date>
<idno type="wicri:Area/Pmc/Corpus">000E69</idno>
<idno type="wicri:Area/Pmc/Curation">000E69</idno>
<idno type="wicri:Area/Pmc/Checkpoint">000105</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Precursors of Dancing and Singing to Music in Three- to Four-Months-Old Infants</title>
<author>
<name sortKey="Fujii, Shinya" sort="Fujii, Shinya" uniqKey="Fujii S" first="Shinya" last="Fujii">Shinya Fujii</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<addr-line>The Heart and Stroke Foundation Canadian Partnership for Stroke Recovery, Sunnybrook Research Institute, Toronto, Ontario, Canada</addr-line>
</nlm:aff>
<country xml:lang="fr">Canada</country>
<wicri:regionArea>The Heart and Stroke Foundation Canadian Partnership for Stroke Recovery, Sunnybrook Research Institute, Toronto, Ontario</wicri:regionArea>
<wicri:noRegion>Ontario</wicri:noRegion>
</affiliation>
<affiliation wicri:level="3">
<nlm:aff id="aff3">
<addr-line>Research Fellow of the Japan Society for the Promotion of Science, Chiyoda-ku, Tokyo, Japan</addr-line>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea>Research Fellow of the Japan Society for the Promotion of Science, Chiyoda-ku, Tokyo</wicri:regionArea>
<placeName>
<settlement type="city">Tokyo</settlement>
</placeName>
</affiliation>
</author>
<author>
<name sortKey="Watanabe, Hama" sort="Watanabe, Hama" uniqKey="Watanabe H" first="Hama" last="Watanabe">Hama Watanabe</name>
<affiliation wicri:level="3">
<nlm:aff id="aff2">
<addr-line>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo, Japan</addr-line>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo</wicri:regionArea>
<placeName>
<settlement type="city">Tokyo</settlement>
</placeName>
</affiliation>
</author>
<author>
<name sortKey="Oohashi, Hiroki" sort="Oohashi, Hiroki" uniqKey="Oohashi H" first="Hiroki" last="Oohashi">Hiroki Oohashi</name>
<affiliation wicri:level="3">
<nlm:aff id="aff2">
<addr-line>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo, Japan</addr-line>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo</wicri:regionArea>
<placeName>
<settlement type="city">Tokyo</settlement>
</placeName>
</affiliation>
<affiliation wicri:level="3">
<nlm:aff id="aff3">
<addr-line>Research Fellow of the Japan Society for the Promotion of Science, Chiyoda-ku, Tokyo, Japan</addr-line>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea>Research Fellow of the Japan Society for the Promotion of Science, Chiyoda-ku, Tokyo</wicri:regionArea>
<placeName>
<settlement type="city">Tokyo</settlement>
</placeName>
</affiliation>
</author>
<author>
<name sortKey="Hirashima, Masaya" sort="Hirashima, Masaya" uniqKey="Hirashima M" first="Masaya" last="Hirashima">Masaya Hirashima</name>
<affiliation wicri:level="3">
<nlm:aff id="aff2">
<addr-line>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo, Japan</addr-line>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo</wicri:regionArea>
<placeName>
<settlement type="city">Tokyo</settlement>
</placeName>
</affiliation>
</author>
<author>
<name sortKey="Nozaki, Daichi" sort="Nozaki, Daichi" uniqKey="Nozaki D" first="Daichi" last="Nozaki">Daichi Nozaki</name>
<affiliation wicri:level="3">
<nlm:aff id="aff2">
<addr-line>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo, Japan</addr-line>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo</wicri:regionArea>
<placeName>
<settlement type="city">Tokyo</settlement>
</placeName>
</affiliation>
</author>
<author>
<name sortKey="Taga, Gentaro" sort="Taga, Gentaro" uniqKey="Taga G" first="Gentaro" last="Taga">Gentaro Taga</name>
<affiliation wicri:level="3">
<nlm:aff id="aff2">
<addr-line>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo, Japan</addr-line>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo</wicri:regionArea>
<placeName>
<settlement type="city">Tokyo</settlement>
</placeName>
</affiliation>
</author>
</analytic>
<series>
<title level="j">PLoS ONE</title>
<idno type="e-ISSN">1932-6203</idno>
<imprint>
<date when="2014">2014</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>Dancing and singing to music involve auditory-motor coordination and have been essential to our human culture since ancient times. Although scholars have been trying to understand the evolutionary and developmental origin of music, early human developmental manifestations of auditory-motor interactions in music have not been fully investigated. Here we report limb movements and vocalizations in three- to four-months-old infants while they listened to music and were in silence. In the group analysis, we found no significant increase in the amount of movement or in the relative power spectrum density around the musical tempo in the music condition compared to the silent condition. Intriguingly, however, there were two infants who demonstrated striking increases in the rhythmic movements via kicking or arm-waving around the musical tempo during listening to music. Monte-Carlo statistics with phase-randomized surrogate data revealed that the limb movements of these individuals were significantly synchronized to the musical beat. Moreover, we found a clear increase in the formant variability of vocalizations in the group during music perception. These results suggest that infants at this age are already primed with their bodies to interact with music via limb movements and vocalizations.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Conard, Nj" uniqKey="Conard N">NJ Conard</name>
</author>
<author>
<name sortKey="Malina, M" uniqKey="Malina M">M Malina</name>
</author>
<author>
<name sortKey="Munzel, Sc" uniqKey="Munzel S">SC Munzel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fitch, Wt" uniqKey="Fitch W">WT Fitch</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wan, Cy" uniqKey="Wan C">CY Wan</name>
</author>
<author>
<name sortKey="Schlaug, G" uniqKey="Schlaug G">G Schlaug</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Zatorre, Rj" uniqKey="Zatorre R">RJ Zatorre</name>
</author>
<author>
<name sortKey="Chen, Jl" uniqKey="Chen J">JL Chen</name>
</author>
<author>
<name sortKey="Penhune, Vb" uniqKey="Penhune V">VB Penhune</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hasegawa, A" uniqKey="Hasegawa A">A Hasegawa</name>
</author>
<author>
<name sortKey="Okanoya, K" uniqKey="Okanoya K">K Okanoya</name>
</author>
<author>
<name sortKey="Hasegawa, T" uniqKey="Hasegawa T">T Hasegawa</name>
</author>
<author>
<name sortKey="Seki, Y" uniqKey="Seki Y">Y Seki</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Patel, Ad" uniqKey="Patel A">AD Patel</name>
</author>
<author>
<name sortKey="Iversen, Jr" uniqKey="Iversen J">JR Iversen</name>
</author>
<author>
<name sortKey="Bregman, Mr" uniqKey="Bregman M">MR Bregman</name>
</author>
<author>
<name sortKey="Schulz, I" uniqKey="Schulz I">I Schulz</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schachner, A" uniqKey="Schachner A">A Schachner</name>
</author>
<author>
<name sortKey="Brady, Tf" uniqKey="Brady T">TF Brady</name>
</author>
<author>
<name sortKey="Pepperberg, Im" uniqKey="Pepperberg I">IM Pepperberg</name>
</author>
<author>
<name sortKey="Hauser, Md" uniqKey="Hauser M">MD Hauser</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kirschner, S" uniqKey="Kirschner S">S Kirschner</name>
</author>
<author>
<name sortKey="Tomasello, M" uniqKey="Tomasello M">M Tomasello</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Zentner, M" uniqKey="Zentner M">M Zentner</name>
</author>
<author>
<name sortKey="Eerola, T" uniqKey="Eerola T">T Eerola</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Stefanics, G" uniqKey="Stefanics G">G Stefanics</name>
</author>
<author>
<name sortKey="Haden, Gp" uniqKey="Haden G">GP Haden</name>
</author>
<author>
<name sortKey="Sziller, I" uniqKey="Sziller I">I Sziller</name>
</author>
<author>
<name sortKey="Balazs, L" uniqKey="Balazs L">L Balazs</name>
</author>
<author>
<name sortKey="Beke, A" uniqKey="Beke A">A Beke</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Perani, D" uniqKey="Perani D">D Perani</name>
</author>
<author>
<name sortKey="Saccuman, Mc" uniqKey="Saccuman M">MC Saccuman</name>
</author>
<author>
<name sortKey="Scifo, P" uniqKey="Scifo P">P Scifo</name>
</author>
<author>
<name sortKey="Spada, D" uniqKey="Spada D">D Spada</name>
</author>
<author>
<name sortKey="Andreolli, G" uniqKey="Andreolli G">G Andreolli</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Winkler, I" uniqKey="Winkler I">I Winkler</name>
</author>
<author>
<name sortKey="Haden, Gp" uniqKey="Haden G">GP Haden</name>
</author>
<author>
<name sortKey="Ladinig, O" uniqKey="Ladinig O">O Ladinig</name>
</author>
<author>
<name sortKey="Sziller, I" uniqKey="Sziller I">I Sziller</name>
</author>
<author>
<name sortKey="Honing, H" uniqKey="Honing H">H Honing</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Trainor, L" uniqKey="Trainor L">L Trainor</name>
</author>
<author>
<name sortKey="Heinmiller, B" uniqKey="Heinmiller B">B Heinmiller</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Trehub, Se" uniqKey="Trehub S">SE Trehub</name>
</author>
<author>
<name sortKey="Thorpe, La" uniqKey="Thorpe L">LA Thorpe</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hannon, Ee" uniqKey="Hannon E">EE Hannon</name>
</author>
<author>
<name sortKey="Johnson, Sp" uniqKey="Johnson S">SP Johnson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Phillips Silver, J" uniqKey="Phillips Silver J">J Phillips-Silver</name>
</author>
<author>
<name sortKey="Trainor, Lj" uniqKey="Trainor L">LJ Trainor</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Condon, Ws" uniqKey="Condon W">WS Condon</name>
</author>
<author>
<name sortKey="Sander, Lw" uniqKey="Sander L">LW Sander</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Trehub, Se" uniqKey="Trehub S">SE Trehub</name>
</author>
<author>
<name sortKey="Trainor, Lj" uniqKey="Trainor L">LJ Trainor</name>
</author>
<author>
<name sortKey="Unyk, Am" uniqKey="Unyk A">AM Unyk</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kotilahti, K" uniqKey="Kotilahti K">K Kotilahti</name>
</author>
<author>
<name sortKey="Nissila, I" uniqKey="Nissila I">I Nissila</name>
</author>
<author>
<name sortKey="Nasi, T" uniqKey="Nasi T">T Nasi</name>
</author>
<author>
<name sortKey="Lipiainen, L" uniqKey="Lipiainen L">L Lipiainen</name>
</author>
<author>
<name sortKey="Noponen, T" uniqKey="Noponen T">T Noponen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hadders Algra, M" uniqKey="Hadders Algra M">M Hadders-Algra</name>
</author>
<author>
<name sortKey="Prechtl, Hf" uniqKey="Prechtl H">HF Prechtl</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Prechtl, Hf" uniqKey="Prechtl H">HF Prechtl</name>
</author>
<author>
<name sortKey="Hopkins, B" uniqKey="Hopkins B">B Hopkins</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Watanabe, H" uniqKey="Watanabe H">H Watanabe</name>
</author>
<author>
<name sortKey="Homae, F" uniqKey="Homae F">F Homae</name>
</author>
<author>
<name sortKey="Taga, G" uniqKey="Taga G">G Taga</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Watanabe, H" uniqKey="Watanabe H">H Watanabe</name>
</author>
<author>
<name sortKey="Taga, G" uniqKey="Taga G">G Taga</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Chen, Jl" uniqKey="Chen J">JL Chen</name>
</author>
<author>
<name sortKey="Zatorre, Rj" uniqKey="Zatorre R">RJ Zatorre</name>
</author>
<author>
<name sortKey="Penhune, Vb" uniqKey="Penhune V">VB Penhune</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fujioka, T" uniqKey="Fujioka T">T Fujioka</name>
</author>
<author>
<name sortKey="Zendel, Br" uniqKey="Zendel B">BR Zendel</name>
</author>
<author>
<name sortKey="Ross, B" uniqKey="Ross B">B Ross</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Grahn, Ja" uniqKey="Grahn J">JA Grahn</name>
</author>
<author>
<name sortKey="Rowe, Jb" uniqKey="Rowe J">JB Rowe</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Nozaradan, S" uniqKey="Nozaradan S">S Nozaradan</name>
</author>
<author>
<name sortKey="Peretz, I" uniqKey="Peretz I">I Peretz</name>
</author>
<author>
<name sortKey="Missal, M" uniqKey="Missal M">M Missal</name>
</author>
<author>
<name sortKey="Mouraux, A" uniqKey="Mouraux A">A Mouraux</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kung, Sj" uniqKey="Kung S">SJ Kung</name>
</author>
<author>
<name sortKey="Chen, Jl" uniqKey="Chen J">JL Chen</name>
</author>
<author>
<name sortKey="Zatorre, Rj" uniqKey="Zatorre R">RJ Zatorre</name>
</author>
<author>
<name sortKey="Penhune, Vb" uniqKey="Penhune V">VB Penhune</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Phillips Silver, J" uniqKey="Phillips Silver J">J Phillips-Silver</name>
</author>
<author>
<name sortKey="Aktipis, Ca" uniqKey="Aktipis C">CA Aktipis</name>
</author>
<author>
<name sortKey="Bryant, Ga" uniqKey="Bryant G">GA Bryant</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Taga, G" uniqKey="Taga G">G Taga</name>
</author>
<author>
<name sortKey="Yamaguchi, Y" uniqKey="Yamaguchi Y">Y Yamaguchi</name>
</author>
<author>
<name sortKey="Shimizu, H" uniqKey="Shimizu H">H Shimizu</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hattori, Y" uniqKey="Hattori Y">Y Hattori</name>
</author>
<author>
<name sortKey="Tomonaga, M" uniqKey="Tomonaga M">M Tomonaga</name>
</author>
<author>
<name sortKey="Matsuzawa, T" uniqKey="Matsuzawa T">T Matsuzawa</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Georgopoulos, Ap" uniqKey="Georgopoulos A">AP Georgopoulos</name>
</author>
<author>
<name sortKey="Grillner, S" uniqKey="Grillner S">S Grillner</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Grillner, S" uniqKey="Grillner S">S Grillner</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Michel, Gf" uniqKey="Michel G">GF Michel</name>
</author>
<author>
<name sortKey="Harkins, Da" uniqKey="Harkins D">DA Harkins</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sun, T" uniqKey="Sun T">T Sun</name>
</author>
<author>
<name sortKey="Walsh, Ca" uniqKey="Walsh C">CA Walsh</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Masataka, N" uniqKey="Masataka N">N Masataka</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fitch, Wt" uniqKey="Fitch W">WT Fitch</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kato, M" uniqKey="Kato M">M Kato</name>
</author>
<author>
<name sortKey="Watanabe, H" uniqKey="Watanabe H">H Watanabe</name>
</author>
<author>
<name sortKey="Taga, G" uniqKey="Taga G">G Taga</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kuhl, Pk" uniqKey="Kuhl P">PK Kuhl</name>
</author>
<author>
<name sortKey="Meltzoff, An" uniqKey="Meltzoff A">AN Meltzoff</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fujii, S" uniqKey="Fujii S">S Fujii</name>
</author>
<author>
<name sortKey="Schlaug, G" uniqKey="Schlaug G">G Schlaug</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Tass, P" uniqKey="Tass P">P Tass</name>
</author>
<author>
<name sortKey="Rosenblum, Mg" uniqKey="Rosenblum M">MG Rosenblum</name>
</author>
<author>
<name sortKey="Weule, J" uniqKey="Weule J">J Weule</name>
</author>
<author>
<name sortKey="Kurths, J" uniqKey="Kurths J">J Kurths</name>
</author>
<author>
<name sortKey="Pikovsky, P" uniqKey="Pikovsky P">P Pikovsky</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Prichard, D" uniqKey="Prichard D">D Prichard</name>
</author>
<author>
<name sortKey="Theiler, J" uniqKey="Theiler J">J Theiler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Stefanics, G" uniqKey="Stefanics G">G Stefanics</name>
</author>
<author>
<name sortKey="Hangya, B" uniqKey="Hangya B">B Hangya</name>
</author>
<author>
<name sortKey="Hernadi, I" uniqKey="Hernadi I">I Hernadi</name>
</author>
<author>
<name sortKey="Winkler, I" uniqKey="Winkler I">I Winkler</name>
</author>
<author>
<name sortKey="Lakatos, P" uniqKey="Lakatos P">P Lakatos</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Patel, Ad" uniqKey="Patel A">AD Patel</name>
</author>
<author>
<name sortKey="Iversen, Jr" uniqKey="Iversen J">JR Iversen</name>
</author>
<author>
<name sortKey="Bregman, Mr" uniqKey="Bregman M">MR Bregman</name>
</author>
<author>
<name sortKey="Schulz, I" uniqKey="Schulz I">I Schulz</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Dominici, N" uniqKey="Dominici N">N Dominici</name>
</author>
<author>
<name sortKey="Ivanenko, Yp" uniqKey="Ivanenko Y">YP Ivanenko</name>
</author>
<author>
<name sortKey="Cappellini, G" uniqKey="Cappellini G">G Cappellini</name>
</author>
<author>
<name sortKey="D Vella, A" uniqKey="D Vella A">A d’Avella</name>
</author>
<author>
<name sortKey="Mondi, V" uniqKey="Mondi V">V Mondi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Stupacher, J" uniqKey="Stupacher J">J Stupacher</name>
</author>
<author>
<name sortKey="Hove, Mj" uniqKey="Hove M">MJ Hove</name>
</author>
<author>
<name sortKey="Novembre, G" uniqKey="Novembre G">G Novembre</name>
</author>
<author>
<name sortKey="Schutz Bosbach, S" uniqKey="Schutz Bosbach S">S Schutz-Bosbach</name>
</author>
<author>
<name sortKey="Keller, Pe" uniqKey="Keller P">PE Keller</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Tierney, A" uniqKey="Tierney A">A Tierney</name>
</author>
<author>
<name sortKey="Kraus, N" uniqKey="Kraus N">N Kraus</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Masataka, N" uniqKey="Masataka N">N Masataka</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Patel, Ad" uniqKey="Patel A">AD Patel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Meltzoff, An" uniqKey="Meltzoff A">AN Meltzoff</name>
</author>
<author>
<name sortKey="Moore, Mk" uniqKey="Moore M">MK Moore</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ellis, D" uniqKey="Ellis D">D Ellis</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mase, M" uniqKey="Mase M">M Mase</name>
</author>
<author>
<name sortKey="Faes, L" uniqKey="Faes L">L Faes</name>
</author>
<author>
<name sortKey="Antolini, R" uniqKey="Antolini R">R Antolini</name>
</author>
<author>
<name sortKey="Scaglione, M" uniqKey="Scaglione M">M Scaglione</name>
</author>
<author>
<name sortKey="Ravelli, F" uniqKey="Ravelli F">F Ravelli</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kawahara, H" uniqKey="Kawahara H">H Kawahara</name>
</author>
<author>
<name sortKey="Masuda Katsuse, I" uniqKey="Masuda Katsuse I">I Masuda-Katsuse</name>
</author>
<author>
<name sortKey="De Cheveigne, A" uniqKey="De Cheveigne A">A de Cheveigne</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">PLoS One</journal-id>
<journal-id journal-id-type="iso-abbrev">PLoS ONE</journal-id>
<journal-id journal-id-type="publisher-id">plos</journal-id>
<journal-id journal-id-type="pmc">plosone</journal-id>
<journal-title-group>
<journal-title>PLoS ONE</journal-title>
</journal-title-group>
<issn pub-type="epub">1932-6203</issn>
<publisher>
<publisher-name>Public Library of Science</publisher-name>
<publisher-loc>San Francisco, USA</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">24837135</article-id>
<article-id pub-id-type="pmc">4023986</article-id>
<article-id pub-id-type="publisher-id">PONE-D-13-41436</article-id>
<article-id pub-id-type="doi">10.1371/journal.pone.0097680</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Research Article</subject>
</subj-group>
<subj-group subj-group-type="Discipline-v2">
<subject>Biology and Life Sciences</subject>
<subj-group>
<subject>Anatomy</subject>
<subj-group>
<subject>Nervous System</subject>
<subj-group>
<subject>Motor System</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group>
<subject>Neuroscience</subject>
<subj-group>
<subject>Cognitive Science</subject>
<subj-group>
<subject>Cognitive Psychology</subject>
</subj-group>
</subj-group>
<subj-group>
<subject>Behavioral Neuroscience</subject>
<subject>Cognitive Neuroscience</subject>
<subject>Sensory Perception</subject>
<subject>Sensory Systems</subject>
</subj-group>
</subj-group>
<subj-group>
<subject>Psychology</subject>
<subj-group>
<subject>Behavior</subject>
<subject>Experimental Psychology</subject>
</subj-group>
</subj-group>
<subj-group>
<subject>Systems Biology</subject>
</subj-group>
</subj-group>
<subj-group subj-group-type="Discipline-v2">
<subject>Computer and Information Sciences</subject>
<subj-group>
<subject>Systems Science</subject>
<subj-group>
<subject>Complex Systems</subject>
<subject>Nonlinear Dynamics</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group subj-group-type="Discipline-v2">
<subject>Physical Sciences</subject>
<subj-group>
<subject>Mathematics</subject>
<subj-group>
<subject>Applied Mathematics</subject>
</subj-group>
</subj-group>
<subj-group>
<subject>Physics</subject>
<subj-group>
<subject>Interdisciplinary Physics</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group subj-group-type="Discipline-v2">
<subject>Social Sciences</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Precursors of Dancing and Singing to Music in Three- to Four-Months-Old Infants</article-title>
<alt-title alt-title-type="running-head">Precursors of Dancing and Singing in Infants</alt-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Fujii</surname>
<given-names>Shinya</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="aff3">
<sup>3</sup>
</xref>
<xref ref-type="corresp" rid="cor1">
<sup>*</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Watanabe</surname>
<given-names>Hama</given-names>
</name>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Oohashi</surname>
<given-names>Hiroki</given-names>
</name>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
<xref ref-type="aff" rid="aff3">
<sup>3</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Hirashima</surname>
<given-names>Masaya</given-names>
</name>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Nozaki</surname>
<given-names>Daichi</given-names>
</name>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Taga</surname>
<given-names>Gentaro</given-names>
</name>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
</contrib>
</contrib-group>
<aff id="aff1">
<label>1</label>
<addr-line>The Heart and Stroke Foundation Canadian Partnership for Stroke Recovery, Sunnybrook Research Institute, Toronto, Ontario, Canada</addr-line>
</aff>
<aff id="aff2">
<label>2</label>
<addr-line>Graduate School of Education, The University of Tokyo, Bunkyo-ku, Tokyo, Japan</addr-line>
</aff>
<aff id="aff3">
<label>3</label>
<addr-line>Research Fellow of the Japan Society for the Promotion of Science, Chiyoda-ku, Tokyo, Japan</addr-line>
</aff>
<contrib-group>
<contrib contrib-type="editor">
<name>
<surname>Kotz</surname>
<given-names>Sonja</given-names>
</name>
<role>Editor</role>
<xref ref-type="aff" rid="edit1"></xref>
</contrib>
</contrib-group>
<aff id="edit1">
<addr-line>Max Planck Institute for Human Cognitive and Brain Sciences, Germany</addr-line>
</aff>
<author-notes>
<corresp id="cor1">* E-mail:
<email>sfujii@sri.utoronto.ca</email>
</corresp>
<fn fn-type="conflict">
<p>
<bold>Competing Interests: </bold>
The authors have declared that no competing interests exist.</p>
</fn>
<fn fn-type="con">
<p>Conceived and designed the experiments: SF HW MH DN GT. Performed the experiments: SF HW. Analyzed the data: SF HO. Contributed reagents/materials/analysis tools: SF HW HO GT. Wrote the paper: SF HW HO MH DN GT.</p>
</fn>
</author-notes>
<pub-date pub-type="collection">
<year>2014</year>
</pub-date>
<pub-date pub-type="epub">
<day>16</day>
<month>5</month>
<year>2014</year>
</pub-date>
<pub-date pub-type="ecorrected">
<day>25</day>
<month>6</month>
<year>2014</year>
</pub-date>
<volume>9</volume>
<issue>5</issue>
<elocation-id>e97680</elocation-id>
<history>
<date date-type="received">
<day>9</day>
<month>10</month>
<year>2013</year>
</date>
<date date-type="accepted">
<day>22</day>
<month>4</month>
<year>2014</year>
</date>
</history>
<permissions>
<copyright-year>2014</copyright-year>
<copyright-holder>Fujii et al</copyright-holder>
<license>
<license-p>This is an open-access article distributed under the terms of the
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution License</ext-link>
, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.</license-p>
</license>
</permissions>
<abstract>
<p>Dancing and singing to music involve auditory-motor coordination and have been essential to our human culture since ancient times. Although scholars have been trying to understand the evolutionary and developmental origin of music, early human developmental manifestations of auditory-motor interactions in music have not been fully investigated. Here we report limb movements and vocalizations in three- to four-months-old infants while they listened to music and were in silence. In the group analysis, we found no significant increase in the amount of movement or in the relative power spectrum density around the musical tempo in the music condition compared to the silent condition. Intriguingly, however, there were two infants who demonstrated striking increases in the rhythmic movements via kicking or arm-waving around the musical tempo during listening to music. Monte-Carlo statistics with phase-randomized surrogate data revealed that the limb movements of these individuals were significantly synchronized to the musical beat. Moreover, we found a clear increase in the formant variability of vocalizations in the group during music perception. These results suggest that infants at this age are already primed with their bodies to interact with music via limb movements and vocalizations.</p>
</abstract>
<funding-group>
<funding-statement>The study was supported by a Grant for the Fellows of the Japan Society for the Promotion of Science (No. 22-7777) awarded to S.F., a Grant-in-Aid for Scientific Research (No. 23700682) awarded to H.W., and a Grant-in-Aid for Scientific Research (No. 20670001 and 24119002) awarded to G.T. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.</funding-statement>
</funding-group>
<counts>
<page-count count="12"></page-count>
</counts>
</article-meta>
</front>
<body>
<sec id="s1">
<title>Introduction</title>
<p>Humans have been universally making music by engaging in dancing and singing for 35,000 years
<xref rid="pone.0097680-Conard1" ref-type="bibr">[1]</xref>
,
<xref rid="pone.0097680-Fitch1" ref-type="bibr">[2]</xref>
,
<xref rid="pone.0097680-Mithen1" ref-type="bibr">[3]</xref>
. The uniqueness of our musicality has been “ranked among the most mysterious with which humans are endowed” as Darwin mentioned in 1871
<xref rid="pone.0097680-Darwin1" ref-type="bibr">[4]</xref>
. The emerging field of music and neuroscience has shown that interactions between the auditory and motor systems are key to understanding how the brain perceives and produces music, such as during dancing and singing
<xref rid="pone.0097680-Wan1" ref-type="bibr">[5]</xref>
,
<xref rid="pone.0097680-Zatorre1" ref-type="bibr">[6]</xref>
. Animal studies (e.g., the “dancing cockatoo”) also stress the importance of a tight auditory-motor link as a prerequisite for vocal learning and musical synchronization capabilities
<xref rid="pone.0097680-Hasegawa1" ref-type="bibr">[7]</xref>
,
<xref rid="pone.0097680-Patel1" ref-type="bibr">[8]</xref>
,
<xref rid="pone.0097680-Schachner1" ref-type="bibr">[9]</xref>
. Nonetheless, early human developmental manifestations of auditory-motor interactions in music have not been fully investigated
<xref rid="pone.0097680-Kirschner1" ref-type="bibr">[10]</xref>
,
<xref rid="pone.0097680-Zentner1" ref-type="bibr">[11]</xref>
. An important question on the topic of the developmental origins of music is whether infants show precursors of dancing and singing to music. Evidence of such precursors may suggest that our brains prime our bodies to interact with music through limb movements and vocalizations.</p>
<p>A handful of studies have investigated the developmental manifestations of music perception in humans: neonates show cortical responses to pitch interval
<xref rid="pone.0097680-Stefanics1" ref-type="bibr">[12]</xref>
, tonal key
<xref rid="pone.0097680-Perani1" ref-type="bibr">[13]</xref>
, and musical beat
<xref rid="pone.0097680-Winkler1" ref-type="bibr">[14]</xref>
, and six- to nine-months-old infants can discriminate musical consonance
<xref rid="pone.0097680-Trainor1" ref-type="bibr">[15]</xref>
, rhythm
<xref rid="pone.0097680-Trehub1" ref-type="bibr">[16]</xref>
, and meter
<xref rid="pone.0097680-Hannon1" ref-type="bibr">[17]</xref>
,
<xref rid="pone.0097680-PhillipsSilver1" ref-type="bibr">[18]</xref>
. These findings suggest that precursors of music perception have already emerged at the early stages of human development. On the other hand, the ability to synchronize body movements with music is assumed to develop later. For example, Zentner and Eerola
<xref rid="pone.0097680-Zentner1" ref-type="bibr">[11]</xref>
investigated limb movements of 5- to 24-months-old infants during music perception but did not find phase synchronization of the infants’ limb movements with the musical beat. They noted that “synchronization, which is characterized by perfectly overlapping music and body-movement phases, requires a degree of motor control that may not be achieved until preschool age” (second paragraph of their discussion)
<xref rid="pone.0097680-Zentner1" ref-type="bibr">[11]</xref>
. Patel likewise noted that “young infants do
<italic>not</italic>
synchronize their movements to a musical beat…. the ability to synchronize with a beat does not appear to emerge till around age four” (page 405)
<xref rid="pone.0097680-Patel2" ref-type="bibr">[19]</xref>
. Even at the age of 2.5 to 4.5 years, the synchronization ability of children seems modest and requires prompting by an experimenter
<xref rid="pone.0097680-Kirschner1" ref-type="bibr">[10]</xref>
. Based on these studies, one could postulate that the ability to synchronize body movements with music is primarily an acquired behavior.</p>
<p>However, Condon and Sander
<xref rid="pone.0097680-Condon1" ref-type="bibr">[20]</xref>
showed that human neonates were able to synchronize their body movements with adult speech: they performed a frame-by-frame analysis of videotaped infant motion and showed that the configurations of body (e.g., head, elbow, shoulder, hip, and foot) movements coincided with the articulatory segments of the adult’s speech (e.g., phonemes of words)
<xref rid="pone.0097680-Condon1" ref-type="bibr">[20]</xref>
. Although Condon and Sander
<xref rid="pone.0097680-Condon1" ref-type="bibr">[20]</xref>
investigated the synchronization of body movements with speech sounds rather than with music, their study suggests that the nervous system of human infants is already primed with their bodies to interact with external auditory information as early as the first day of life. From the neonate’s perspective, speech and music would be similar in the sense that both consist of patterns of semantically meaningless sounds
<xref rid="pone.0097680-Trehub2" ref-type="bibr">[21]</xref>
. Considering the similarity between speech and music for pre-linguistic infants, there is still a possibility that infants show synchronization of body movements not only with speech but also with music. On the other hand, the synchronization reported by Condon and Sander
<xref rid="pone.0097680-Condon1" ref-type="bibr">[20]</xref>
might be specific to speech if music is processed differently in the infant’s nervous system. In fact, neonates as a group show increased hemodynamic responses in their left hemispheres to speech but not to music
<xref rid="pone.0097680-Kotilahti1" ref-type="bibr">[22]</xref>
. Nevertheless, more developmental studies of music are needed to clarify whether infants show movement-to-music synchronization.</p>
<p>We considered that there were at least five issues that needed to be tackled in the developmental study of music. First, as far as we know, no study has investigated movement-to-music synchronization in infants younger than five months old. Although Zentner and Eerola
<xref rid="pone.0097680-Zentner1" ref-type="bibr">[11]</xref>
pointed out infants’ immature motor control, infants younger than five months already express rich and spontaneous limb movements, termed
<italic>general movements</italic>
<xref rid="pone.0097680-HaddersAlgra1" ref-type="bibr">[23]</xref>
,
<xref rid="pone.0097680-Prechtl1" ref-type="bibr">[24]</xref>
,
<xref rid="pone.0097680-Watanabe1" ref-type="bibr">[25]</xref>
,
<xref rid="pone.0097680-Watanabe2" ref-type="bibr">[26]</xref>
. A previous study showed that general movements of infants at three months of age were modified by audio-visual inputs, possibly through the basal ganglia and cerebral cortex
<xref rid="pone.0097680-Watanabe1" ref-type="bibr">[25]</xref>
, which are brain areas considered to play a central role in processing the musical beat
<xref rid="pone.0097680-Chen1" ref-type="bibr">[27]</xref>
,
<xref rid="pone.0097680-Fujioka1" ref-type="bibr">[28]</xref>
,
<xref rid="pone.0097680-Grahn1" ref-type="bibr">[29]</xref>
,
<xref rid="pone.0097680-Nozaradan1" ref-type="bibr">[30]</xref>
,
<xref rid="pone.0097680-Kung1" ref-type="bibr">[31]</xref>
. Thus, if human musicality arises spontaneously through entrainment mechanisms between our bodies and the environment
<xref rid="pone.0097680-PhillipsSilver2" ref-type="bibr">[32]</xref>
,
<xref rid="pone.0097680-Taga1" ref-type="bibr">[33]</xref>
, limb movements synchronized to music may be observed even in infants younger than five months old.</p>
<p>Second, analysis at the individual level, and not only at the group level, provides significant insight into infants’ movement-to-music synchronization because of the large individual differences. For instance, in the previous study by Condon and Sander
<xref rid="pone.0097680-Condon1" ref-type="bibr">[20]</xref>
, movement-to-speech synchronization was demonstrated based on observations from 3 neonates (babies A, C, and E in their paper). Kirschner et al.
<xref rid="pone.0097680-Kirschner1" ref-type="bibr">[10]</xref>
showed that only 1 out of 12 children at 2.5 years of age was able to synchronize tapping movements with a rhythmic drum sound without the presence of an adult social partner (see the 600-ms inter-stimulus interval, acoustic condition in their paper). Animal studies have also used individual-level analysis: the study of the dancing cockatoo, which showed significant synchronization of head-bobbing movements with a musical beat, was a case report
<xref rid="pone.0097680-Patel1" ref-type="bibr">[8]</xref>
. A recent study on chimpanzees showed that only 1 out of 3 individuals showed significant tapping synchronization with a rhythmic auditory stimulus after training
<xref rid="pone.0097680-Hattori1" ref-type="bibr">[34]</xref>
. Thus, it is important to perform individual analysis and to investigate how many infants in a population can synchronize their movements to a musical beat.</p>
<p>Third, the movement responses to music may differ across the four limbs (i.e., left arm, right arm, left leg, and right leg). A previous study on three-months-old infants showed that there was a difference in movement patterns between the arms and the legs when a mobile toy was provided
<xref rid="pone.0097680-Watanabe2" ref-type="bibr">[26]</xref>
. It was suggested that the arm-leg difference could be attributed to different neural-control processes: spontaneous limb movements of arms and legs in infants are thought to result mainly from rhythmic neural oscillations in the spinal cord created by central pattern generators (CPGs), but the control of arm movements is dominated relatively more by the cerebral cortex than that of leg movements
<xref rid="pone.0097680-Watanabe2" ref-type="bibr">[26]</xref>
,
<xref rid="pone.0097680-Georgopoulos1" ref-type="bibr">[35]</xref>
,
<xref rid="pone.0097680-Grillner1" ref-type="bibr">[36]</xref>
. Therefore, depending on how music affects the infant’s nervous system, different movement patterns may be observed between the arms and the legs. Asymmetry between the limb movements may also be observed, given that infants already show a preference in hand use
<xref rid="pone.0097680-Michel1" ref-type="bibr">[37]</xref>
,
<xref rid="pone.0097680-Sun1" ref-type="bibr">[38]</xref>
. We therefore need to investigate the movements of all four limbs in response to music in infants.</p>
<p>Fourth, infants may respond to music not only through their limb movements but also through their vocalizations. Infants younger than five months already express rich and spontaneous vowel-like monosyllabic vocalizations called
<italic>coos</italic>
<xref rid="pone.0097680-Prechtl1" ref-type="bibr">[24]</xref>
,
<xref rid="pone.0097680-Masataka1" ref-type="bibr">[39]</xref>
. The source/filter theory of vocal production states that the fundamental frequency (F
<sub>0</sub>
) mainly reflects the oscillation of the vocal cord at the larynx, while the formant frequencies (F
<sub>1</sub>
and F
<sub>2</sub>
) reflect the length and shape of the vocal tract, which are rapidly modified during utterances by movements of the articulators (e.g., tongue, lips, and soft palate)
<xref rid="pone.0097680-Fitch2" ref-type="bibr">[40]</xref>
. The analysis of fundamental and formant frequencies in infants allows us to infer the oral movements in response to music.</p>
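To make the source/filter quantities concrete, F1 and F2 can be estimated from a voiced frame by fitting an all-pole (LPC) model of the vocal tract and reading off its resonance frequencies. A minimal Python sketch, not the authors' pipeline; the sample rate, LPC order, and the synthetic test frame are illustrative assumptions:

import numpy as np
import librosa

def estimate_formants(frame, sr, order=12):
    # Fit an all-pole vocal-tract model and convert its complex
    # roots to resonance frequencies in Hz (candidates for F1, F2).
    a = librosa.lpc(frame, order=order)
    roots = np.roots(a)
    roots = roots[np.imag(roots) > 0]           # one root per conjugate pair
    freqs = np.sort(np.angle(roots) * sr / (2 * np.pi))
    return freqs[freqs > 90][:2]                # discard near-DC roots; ~F1, F2

# Synthetic vowel-like frame with resonances near 700 and 1200 Hz.
sr = 16000
t = np.arange(0, 0.04, 1 / sr)
frame = np.sin(2 * np.pi * 700 * t) + 0.7 * np.sin(2 * np.pi * 1200 * t)
print(estimate_formants(frame, sr))

Tracking such estimates across an utterance yields a dispersion of F1/F2 values, a measure in the spirit of the formant variability reported in the abstract.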
<p>Fifth, previous studies showed that infants’ limb movements and vocalizations change over the course of development
<xref rid="pone.0097680-Kato1" ref-type="bibr">[41]</xref>
,
<xref rid="pone.0097680-Kuhl1" ref-type="bibr">[42]</xref>
. Kato et al.
<xref rid="pone.0097680-Kato1" ref-type="bibr">[41]</xref>
recently investigated the motions of infants aged 90 to 126 days and showed an effect of age on the changeability of limb-movement patterns when a mobile toy was provided. Kuhl and Meltzoff
<xref rid="pone.0097680-Kuhl1" ref-type="bibr">[42]</xref>
performed an acoustic analysis of formant frequencies in infants aged 12 to 20 weeks and showed that vowel categories became more separated in the F
<sub>1</sub>
and F
<sub>2</sub>
coordinate space in the course of development
<xref rid="pone.0097680-Kuhl1" ref-type="bibr">[42]</xref>
. It is important to investigate the relationship between age in days and limb movements/vocalizations.</p>
<p>We designed this study considering the above five issues: 1) We examined movement-to-music synchronization in three- to four-months-old infants, 2) performed both group and individual analyses 3) on the left-arm, right-arm, left-leg, and right-leg movements, 4) conducted acoustic analysis on the infants’ voice samples, and 5) investigated the relationship between age in days and the limb movements/vocalizations. The aim of this study was to test whether three- to four-months-old infants show synchronized limb movements and/or altered vocalizations in response to music.</p>
</sec>
<sec id="s2">
<title>Results</title>
<p>We analyzed data from 30 infants aged 106–125 days who showed no fussing, crying, or rolling over during the data recording (
<xref ref-type="sec" rid="s5">Methods</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s015">Tables S1</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s016">S2</xref>
for detail). The infants lay on their back on a baby mattress (
<xref ref-type="fig" rid="pone-0097680-g001">Figures 1A</xref>
,
<xref ref-type="supplementary-material" rid="pone.0097680.s001">S1A</xref>
, and
<xref ref-type="supplementary-material" rid="pone.0097680.s002">S2</xref>
). In the silent condition, there was no auditory stimulus (
<xref ref-type="supplementary-material" rid="pone.0097680.s022">Videos S1</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s023">S2</xref>
). In the music condition, one of two pop songs was played: (1) “Everybody” by the Backstreet Boys, the same auditory stimulus used in the dancing cockatoo study
<xref rid="pone.0097680-Patel1" ref-type="bibr">[8]</xref>
(
<xref ref-type="supplementary-material" rid="pone.0097680.s024">Videos S3</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s025">S4</xref>
, and/or (2) “Go Trippy” by WANICO feat. Jake Smith, which was used to investigate the infants’ behavior while high-tempo disco music was playing (
<xref ref-type="supplementary-material" rid="pone.0097680.s026">Video S5</xref>
). The tempo of “Everybody” was 108.7 beats per minute (BPM), corresponding to 1.8 Hz, and that of “Go Trippy” was 130.0 BPM, corresponding to 2.2 Hz. These two pop songs were used because we considered that their dance beats and jolly styles might be effective in attracting infants’ interest and eliciting synchronization behaviors, as shown in the dancing cockatoo study
<xref rid="pone.0097680-Patel1" ref-type="bibr">[8]</xref>
. Limb movements and vocalizations of the infants in the supine position were recorded by a 3D motion capture system and the microphone of a digital video camera (
<xref ref-type="supplementary-material" rid="pone.0097680.s002">Figure S2</xref>
). Both experimenters and parents were out of the infant’s sight during the recording to prevent any social interaction.</p>
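For reference, tempo in beats per minute converts to beat frequency in hertz by dividing by 60, which is where the 1.8 Hz and 2.2 Hz figures come from; the ±10% band around each beat frequency is the window used for the spectral measures below. A trivial Python check:

for song, bpm in [("Everybody", 108.7), ("Go Trippy", 130.0)]:
    f_beat = bpm / 60.0                    # 1.81 Hz and 2.17 Hz
    lo, hi = 0.9 * f_beat, 1.1 * f_beat    # +/-10% analysis band
    print(f"{song}: {f_beat:.2f} Hz, band {lo:.2f}-{hi:.2f} Hz")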
<fig id="pone-0097680-g001" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0097680.g001</object-id>
<label>Figure 1</label>
<caption>
<title>Spontaneous limb movements of infants when they listen to “Everybody” by The Backstreet Boys (music condition,
<xref ref-type="supplementary-material" rid="pone.0097680.s024">Video S3</xref>
) and those without any auditory stimulus (silent condition,
<xref ref-type="supplementary-material" rid="pone.0097680.s022">Video S1</xref>
).</title>
<p>(
<bold>A</bold>
) Typical limb trajectories during the music condition in an infant (ID1) in X, Y, and Z coordinates. (
<bold>B</bold>
) Mean square sum of right leg velocities and (
<bold>C</bold>
) relative proportion of the power spectrum density (PSD) around the musical tempo for right leg movements along the Y coordinate axis in ID1 (red), other infants (grey), and the group mean except for ID1 with standard deviation (SD) (black). (
<bold>D</bold>
) The right-foot position along the Y coordinate axis in ID1. He kicked more rhythmically during the music condition (red) than the silent condition (blue). (
<bold>E</bold>
) Power spectrogram of the right foot position along the Y coordinate axis in ID1. Relatively high PSD can be seen around the musical tempo (dashed line) in the music condition. (
<bold>F</bold>
) Mean synchronization index across
<italic>moving sections</italic>
(
<xref ref-type="sec" rid="s5">Methods</xref>
for detail) in the music (red) and silent (blue) conditions. Error bars indicate standard errors (SE) across the moving sections.*
<italic>p</italic>
<0.01.</p>
</caption>
<graphic xlink:href="pone.0097680.g001"></graphic>
</fig>
<sec id="s2a">
<title>Amount of Limb Movement</title>
<p>We first quantified the mean square sum of velocities of each limb as a measure of the amount of movement (
<xref ref-type="sec" rid="s5">Methods</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s003">Figure S3</xref>
for detail). A four (limb; right arm, left arm, right leg, and left leg) by two (playing music; silent vs. music) by two (song; “Everybody” vs. “Go Trippy”) factorial analysis of variance (ANOVA) yielded no significant interaction among the effects (limb×playing music×song,
<italic>F
<sub>3, 132</sub>
</italic>
 = 0.03,
<italic>p = </italic>
0.99,
<italic>η</italic>
<sup>2</sup>
 = 0.001; limb×playing music,
<italic>F
<sub>3, 132</sub>
</italic>
 = 1.24,
<italic>p = </italic>
0.30,
<italic>η</italic>
<sup>2</sup>
 = 0.03; limb×song,
<italic>F
<sub>3, 132</sub>
</italic>
 = 0.25,
<italic>p = </italic>
0.86,
<italic>η</italic>
<sup>2</sup>
 = 0.006; playing music×song,
<italic>F
<sub>1, 44</sub>
</italic>
 = 1.01,
<italic>p = </italic>
0.32,
<italic>η</italic>
<sup>2</sup>
 = 0.02). Neither the main effect of limb nor that of song was significant (limb,
<italic>F
<sub>3, 132</sub>
</italic>
 = 0.18,
<italic>p</italic>
 = 0.91,
<italic>η</italic>
<sup>2</sup>
 = 0.004; song,
<italic>F
<sub>1, 44</sub>
</italic>
 = 0.43,
<italic>p</italic>
 = 0.51,
<italic>η</italic>
<sup>2</sup>
 = 0.01), showing that there was no difference in the amount of movement across the limbs nor between the songs. On the other hand, there was a significant main effect of playing music (
<italic>F
<sub>1, 44</sub>
</italic>
 = 8.55,
<italic>p</italic>
<0.01,
<italic>η</italic>
<sup>2</sup>
 = 0.16). That is, the amount of movement decreased when infants heard music ([1.34±0.12]×10
<sup>4</sup>
[mm/sec]
<sup>2</sup>
; mean ± standard error) compared to the silent condition ([2.03±0.21]×10
<sup>4</sup>
[mm/sec]
<sup>2</sup>
; see also black lines in
<xref ref-type="fig" rid="pone-0097680-g001">Figures 1B</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s001">S1B</xref>
). There was no significant correlation between age in days and the mean square sum of velocities in any of the limbs (
<xref ref-type="supplementary-material" rid="pone.0097680.s017">Tables S3</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s018">S4</xref>
).</p>
</sec>
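The movement-amount measure above can be reproduced in outline from the motion-capture trajectories: differentiate each limb's 3D position and average the squared speed over time. A minimal numpy sketch, assuming positions in mm at a fixed sampling rate (the 100 Hz rate and the synthetic trajectory are illustrative, not taken from the paper):

import numpy as np

def mean_square_velocity(pos_mm, fs):
    # pos_mm: (n_samples, 3) limb-marker positions in mm (X, Y, Z).
    # Returns the time average of vx^2 + vy^2 + vz^2 in (mm/sec)^2.
    vel = np.gradient(pos_mm, 1.0 / fs, axis=0)
    return float(np.mean(np.sum(vel ** 2, axis=1)))

# Synthetic example: a 0.5 Hz oscillation recorded for 60 s at 100 Hz.
fs = 100.0
t = np.arange(0, 60, 1 / fs)
pos = np.stack([10 * np.sin(2 * np.pi * 0.5 * t)] * 3, axis=1)
print(mean_square_velocity(pos, fs))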
<sec id="s2b">
<title>Frequency of Limb Movement</title>
<p>To see the frequency range of infants’ limb movements, we performed a power spectrum analysis (
<xref ref-type="sec" rid="s5">Methods</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s003">Figure S3</xref>
). We found that, on average, over 90% of the power spectrum density (PSD) was within the 0–1 Hz frequency range (
<xref ref-type="supplementary-material" rid="pone.0097680.s019">Tables S5</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s020">S6</xref>
). Overall, the infants’ limb movements were slower than the musical tempi, and rhythmic movements with frequencies around the musical tempi were rarely observed. When we calculated the relative proportion of the PSD around the musical tempo (a frequency range of BPM ±10%), the 4 (limb)×2 (playing music)×2 (song) ANOVA yielded no significant interaction among the effects (limb×playing music×song,
<italic>F
<sub>3, 132</sub>
</italic>
 = 0.12,
<italic>p = </italic>
0.95,
<italic>η</italic>
<sup>2</sup>
 = 0.003; limb×playing music,
<italic>F
<sub>3, 132</sub>
</italic>
 = 1.52,
<italic>p = </italic>
0.21,
<italic>η</italic>
<sup>2</sup>
 = 0.03; limb×song,
<italic>F
<sub>3, 132</sub>
</italic>
 = 1.57,
<italic>p = </italic>
0.20,
<italic>η</italic>
<sup>2</sup>
 = 0.03; playing music×song,
<italic>F
<sub>1, 44</sub>
</italic>
 = 0.81,
<italic>p = </italic>
0.37,
<italic>η</italic>
<sup>2</sup>
 = 0.02). Neither the main effect of limb nor that of song was significant (limb,
<italic>F
<sub>3, 132</sub>
</italic>
 = 0.66,
<italic>p</italic>
 = 0.58,
<italic>η</italic>
<sup>2</sup>
 = 0.01; song,
<italic>F
<sub>1, 44</sub>
</italic>
 = 2.12,
<italic>p</italic>
 = 0.15,
<italic>η</italic>
<sup>2</sup>
 = 0.05), showing that there was no difference in the PSD around the musical tempo either across the limbs or between the songs. On the other hand, there was a significant main effect of playing music (
<italic>F
<sub>1, 44</sub>
</italic>
 = 13.61,
<italic>p</italic>
<0.001,
<italic>η</italic>
<sup>2</sup>
 = 0.24): The relative proportion of PSD around the musical tempo was significantly smaller in the music condition (0.75±0.09%, mean ± standard error) than in the silent condition (1.12±0.12%). That is, limb movement frequencies became slower when the infants listened to music (see black lines in
<xref ref-type="fig" rid="pone-0097680-g001">Figures 1C</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s001">S1C</xref>
). There was no significant correlation between age in days and the relative proportion of PSD around the musical tempo (
<xref ref-type="supplementary-material" rid="pone.0097680.s017">Tables S3</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s018">S4</xref>
). Thus, as a group, the amount of limb movement decreased and the movement frequencies became slower in the music condition compared to the silent condition.</p>
</sec>
<sec id="s2c">
<title>Synchronization of Limb Movements to the Musical Beat</title>
<p>Because the infants moved in an intermittent fashion, prior to the analysis of movement-to-music synchronization we identified periods during which an infant moved continuously for over three seconds and designated each such period as a
<italic>moving section</italic>
(e.g.,
<xref ref-type="fig" rid="pone-0097680-g001">Figures 1DE</xref>
,
<xref ref-type="supplementary-material" rid="pone.0097680.s001">S1DE</xref>
, and
<xref ref-type="supplementary-material" rid="pone.0097680.s003">S3</xref>
). In sum, we detected 51 moving sections (27 and 24 in the music and silent conditions, respectively) from 11 infants (
<xref ref-type="supplementary-material" rid="pone.0097680.s021">Table S7</xref>
). For each of the moving sections in the music condition, we investigated the relative phase (
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e001.jpg"></inline-graphic>
</inline-formula>
) between the infant’s limb motion and the musical beat (
<xref ref-type="sec" rid="s5">Methods</xref>
and
<xref ref-type="fig" rid="pone-0097680-g002">Figures 2A–E</xref>
). A typical example of a circular histogram of
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e002.jpg"></inline-graphic>
</inline-formula>
in a moving section is shown in
<xref ref-type="fig" rid="pone-0097680-g002">Figures 2F</xref>
. The properties of the relative-phase distribution were quantified by a synchronization index that ranges from 0, when the spread of
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e003.jpg"></inline-graphic>
</inline-formula>
is maximal (i.e., perfect non-synchronization), to 1, when a
<italic>δ</italic>
-function-like probability distribution (i.e., perfect synchronization) is found
<xref rid="pone.0097680-Fujii1" ref-type="bibr">[43]</xref>
,
<xref rid="pone.0097680-Tass1" ref-type="bibr">[44]</xref>
.</p>
<fig id="pone-0097680-g002" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0097680.g002</object-id>
<label>Figure 2</label>
<caption>
<title>Significant synchronization in right leg movements of ID1 during the music condition “Everybody” (108.7 BPM) (
<xref ref-type="supplementary-material" rid="pone.0097680.s024">Video S3</xref>
).</title>
<p>(
<bold>A</bold>
) Sound wave of the auditory stimulus (yellow) with the detected beat onsets (red vertical lines). (
<bold>B</bold>
) Observed (left) and phase-randomized (right) position data
<italic>s</italic>
<sub>pos</sub>
(
<italic>t</italic>
) along the Y coordinate axis when the infant moved continuously over a period of three seconds (defined as a
<italic>moving section</italic>
). (
<bold>C</bold>
) Instantaneous phase of the musical beat
<italic>φ</italic>
<sub>music</sub>
(
<italic>t</italic>
) calculated from the detected beat onsets. (
<bold>D</bold>
) Instantaneous phase of the motion
<italic>φ</italic>
<sub>motion</sub>
(
<italic>t</italic>
). (
<bold>E</bold>
) Relative phase
<italic>φ</italic>
<sub>rel</sub>
(
<italic>t</italic>
) between motion and the musical beat. (
<bold>F</bold>
) Circular histograms of
<italic>φ</italic>
<sub>rel</sub>
(
<italic>t</italic>
). (
<bold>G</bold>
) Monte-Carlo statistics showed that the observed synchronization index (magenta line) was above the 95% confidence interval of the surrogate synchronization indexes (blue lines) calculated from the 10,000 phase-randomized position data: The observed movement was significantly synchronized to the musical beat.</p>
</caption>
<graphic xlink:href="pone.0097680.g002"></graphic>
</fig>
<p>To test whether the degree of synchronization in the music condition was significant, we also calculated the synchronization index for each of the moving sections in the silent condition. This was done by adding a “virtual” musical beat, extracted from the auditory stimulus in the music condition, to the limb motion from the silent condition. That is, although no music was played in the silent condition, we artificially calculated the relative phase between the infant’s limb motion and the virtual musical beat; the synchronization index in the silent condition therefore provides a chance-level baseline. If there were no tendency toward synchronization in the music condition, similar degrees of synchronization should be observed in the two conditions. However, we found a significantly higher degree of synchronization in the music condition than in the silent condition (
<italic>p</italic>
<0.01, Mann-Whitney U test,
<xref ref-type="fig" rid="pone-0097680-g001">Figure 1F</xref>
).</p>
</sec>
<sec id="s2d">
<title>Individual Analysis of Limb Movements</title>
<p>The analysis above revealed a significant degree of synchronization in the music condition. Intriguingly, we found that 15 out of 27 moving sections in the music condition (i.e., 56% of the total) were from a single infant (ID1, 122 days of age) (
<xref ref-type="supplementary-material" rid="pone.0097680.s021">Table S7</xref>
). We also found that ID1 demonstrated a significant increase in the amount of movement of the right leg when listening to “Everybody” (see red line in
<xref ref-type="fig" rid="pone-0097680-g001">Figure 1B</xref>
). Moreover, a substantial increase in the relative proportion of the PSD around the musical tempo ( = 1.8±0.2 Hz range) was found in movements of his right leg (see red line in
<xref ref-type="fig" rid="pone-0097680-g001">Figure 1C</xref>
). The relative proportion of PSD around the musical tempo was 22.69%, far higher than in the other infants (
<xref ref-type="fig" rid="pone-0097680-g001">Figure 1C</xref>
). These values from ID1 were identified as significant outliers among the group (Grubbs test, movement amount,
<italic>G</italic>
 = 4.53,
<italic>p</italic>
<0.01; PSD,
<italic>G</italic>
 = 4.84,
<italic>p</italic>
<0.01). That is, ID1 kicked with his right leg intensely and rhythmically when the music was played (
<xref ref-type="fig" rid="pone-0097680-g001">Figure 1DE</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s022">Videos S1</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s024">S3</xref>
). We also found an infant (ID25, 113 days of age) who showed prominent rhythmic movements in the left arm when listening to “Everybody” and “Go Trippy” (Grubbs test, movement amount,
<italic>G</italic>
 = 4.13,
<italic>p</italic>
<0.01; PSD,
<italic>G</italic>
 = 4.50,
<italic>p</italic>
<0.01;
<xref ref-type="supplementary-material" rid="pone.0097680.s001">Figure S1</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s023">Videos S2</xref>
,
<xref ref-type="supplementary-material" rid="pone.0097680.s025">S4</xref>
, and
<xref ref-type="supplementary-material" rid="pone.0097680.s026">S5</xref>
). We found that 5 of the 27 moving sections in the music condition were from ID25 (
<xref ref-type="supplementary-material" rid="pone.0097680.s021">Table S7</xref>
). In sum, 20 out of 27 moving sections in the music condition (74% of the total) were from ID1 and ID25 (
<xref ref-type="supplementary-material" rid="pone.0097680.s021">Table S7</xref>
), showing that the higher degree of synchronization in the music condition resulted mostly from these two individuals.</p>
<p>We next tested whether the phases of limb movements in ID1 and ID25 were significantly synchronized with those of the musical beat. To do this, the observed degree of phase synchronization was statistically tested by comparison with the degrees calculated from 10,000 <italic>phase-randomized</italic> surrogate data sets for each moving section (Monte-Carlo statistics
<xref rid="pone.0097680-Prichard1" ref-type="bibr">[45]</xref>
,
<xref rid="pone.0097680-Stefanics2" ref-type="bibr">[46]</xref>
, see right panels in
<xref ref-type="fig" rid="pone-0097680-g002">Figures 2</xref>
and
<xref ref-type="sec" rid="s5">Methods</xref>
for detail). The statistics revealed that the observed synchronization index of the right leg in ID1 was significantly above the confidence interval (
<italic>p</italic>
<0.05,
<xref ref-type="fig" rid="pone-0097680-g002">Figure 2G</xref>
). As a further investigation, we also tested whether ID1 could synchronize to a rhythmic sound at a different tempo and without any vocal sound. That is, we examined the kicking movements of ID1 in response to a drum pattern (100.0 BPM = 1.7 Hz,
<xref ref-type="supplementary-material" rid="pone.0097680.s027">Video S6</xref>
). Note that this drum pattern was played only for ID1. We again found significant synchronization of ID1’s kicking movements to the musical beat at this tempo (
<italic>p</italic>
<0.05, Monte-Carlo statistics,
<xref ref-type="supplementary-material" rid="pone.0097680.s004">Figure S4</xref>
). Thus, ID1 showed significant phase synchronization to the musical beat for two different types of musical stimuli with different tempi.</p>
<p>Monte-Carlo statistics for ID25 revealed that the observed synchronization index of the left hand was significantly above the confidence interval when listening to “Everybody” (
<italic>p</italic>
<0.05,
<xref ref-type="supplementary-material" rid="pone.0097680.s005">Figure S5</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s025">Video S4</xref>
). In contrast, no significant synchronization was found in the moving sections of ID25 when listening to “Go Trippy” (
<xref ref-type="supplementary-material" rid="pone.0097680.s006">Figure S6</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s026">Video S5</xref>
). The periodicity of the left hand of ID25 was relatively slow compared to the musical tempo of “Go Trippy” (130.0 BPM). Thus, significant phase synchronization was observed in the kicking movements of ID1 during “Everybody” and a drum pattern, and in the arm-waving movements of ID25 during “Everybody”. We also found significant synchronization of the right-leg movements of ID20 during “Everybody” (
<xref ref-type="supplementary-material" rid="pone.0097680.s007">Figure S7</xref>
). However, ID20 did not show a significant increase in the amount of movement during the music condition compared to the silent condition (one of the gray lines in
<xref ref-type="fig" rid="pone-0097680-g001">Figure 1B</xref>
), and the rhythmic movement of ID20 was not as clear as those of ID1 and ID25 (compare
<xref ref-type="supplementary-material" rid="pone.0097680.s007">Figure S7</xref>
to
<xref ref-type="fig" rid="pone-0097680-g002">Figures 2</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s004">S4</xref>
-
<xref ref-type="supplementary-material" rid="pone.0097680.s006">S6</xref>
).</p>
</sec>
<sec id="s2e">
<title>Vocalizations</title>
<p>To test whether the infants produced altered vocalizations in response to the music, we first assessed the mean duration of vocalizations per minute as a measure of the amount of vocalizations made (
<xref ref-type="supplementary-material" rid="pone.0097680.s008">Figure S8</xref>
). We found no significant difference between the silent and music conditions for this measure (
<xref ref-type="fig" rid="pone-0097680-g003">Figures 3A</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s009">S9A</xref>
). There was no significant correlation between the mean duration of vocalization and age in days (
<xref ref-type="supplementary-material" rid="pone.0097680.s018">Tables S4</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s019">S5</xref>
).</p>
<fig id="pone-0097680-g003" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0097680.g003</object-id>
<label>Figure 3</label>
<caption>
<title>Spontaneous vocalizations of infants during the music condition “Go Trippy” by WANICO feat. Jake Smith (red) and in the silent condition where no auditory stimulus was present (blue).</title>
<p>Error bars indicate standard errors (SE) among the participants. (
<bold>A</bold>
) No significant difference was found in mean duration of vocalization per minute between the silent and music conditions (Wilcoxon signed-rank test,
<italic>Z</italic>
 = 1.62,
<italic>p</italic>
 = 0.11). (
<bold>B</bold>
) Typical time series of fundamental (F
<sub>0</sub>
, black lines) and formant frequencies (F
<sub>1</sub>
and F
<sub>2</sub>
, cyan and magenta lines, respectively) within utterances. (
<bold>C, D</bold>
) Mean F
<sub>0</sub>
and F
<sub>1</sub>
was significantly higher in the music condition than in the silent condition (
<italic>Z</italic>
 = 2.39, *
<italic>p</italic>
<0.05;
<italic>Z</italic>
 = 2.06, *
<italic>p</italic>
<0.05, respectively). (
<bold>E, F</bold>
) There were no significant differences in mean F
<sub>2</sub>
and SD of F
<sub>0</sub>
(
<italic>Z</italic>
 = 1.92,
<italic>p</italic>
 = 0.06;
<italic>Z</italic>
 = 1.16,
<italic>p</italic>
 = 0.25, respectively). (
<bold>G, H</bold>
) SD of F
<sub>1</sub>
and F
<sub>2</sub>
were significantly higher in the music condition than in the silent condition (
<italic>Z</italic>
 = 3.43, **
<italic>p</italic>
<0.001;
<italic>Z</italic>
 = 3.48, **
<italic>p</italic>
<0.001, respectively).</p>
</caption>
<graphic xlink:href="pone.0097680.g003"></graphic>
</fig>
<p>When we assessed the mean and standard deviation (SD) of the fundamental (F
<sub>0</sub>
) and formant frequencies (F
<sub>1</sub>
and F
<sub>2</sub>
) within the infant’s utterances (
<xref ref-type="fig" rid="pone-0097680-g003">Figures 3B</xref>
,
<xref ref-type="supplementary-material" rid="pone.0097680.s008">S8</xref>
, and
<xref ref-type="supplementary-material" rid="pone.0097680.s009">S9B</xref>
), no significant difference was found between the silent and music conditions in the mean F
<sub>2</sub>
(
<xref ref-type="fig" rid="pone-0097680-g003">Figures 3E</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s009">S9E</xref>
) and SD of F
<sub>0</sub>
(
<xref ref-type="fig" rid="pone-0097680-g003">Figures 3F</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s009">S9F</xref>
). In contrast, we found significant increases in the SDs of F
<sub>1</sub>
and F
<sub>2</sub>
in the music compared to the silent condition (
<italic>p</italic>
<0.05,
<xref ref-type="fig" rid="pone-0097680-g003">Figures 3GH</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s009">S9GH</xref>
). Significant increases in the mean F
<sub>0</sub>
and F
<sub>1</sub>
were also found when infants listened to “Go Trippy” compared to the silent condition (
<xref ref-type="fig" rid="pone-0097680-g003">Figure 3CD</xref>
). However, the increases in the mean F
<sub>0</sub>
and F
<sub>1</sub>
were not observed when listening to “Everybody” compared to the silent condition (
<xref ref-type="supplementary-material" rid="pone.0097680.s009">Figure S9CD</xref>
). There was no significant correlation between the spectral measures of vocalizations and age in days (
<xref ref-type="supplementary-material" rid="pone.0097680.s017">Tables S3</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s018">S4</xref>
).</p>
</sec>
</sec>
<sec id="s3">
<title>Discussion</title>
<sec id="s3a">
<title>Movement-to-music Synchronization</title>
<p>As far as we know, this study is the first to investigate movement-to-music synchronization in three- to four-months-old infants. While a previous study on 5- to 24-months-old infants found no evidence for movement-to-music synchronization
<xref rid="pone.0097680-Zentner1" ref-type="bibr">[11]</xref>
, we found significant phase synchronization of limb movements to the musical beat. We suggest that this discrepancy is primarily due to differences in analysis. That is, the previous study
<xref rid="pone.0097680-Zentner1" ref-type="bibr">[11]</xref>
performed only a group-level analysis and did not perform a detailed analysis of phase synchronization in individuals, whereas this study did. In fact, our group analysis showed significant decreases in movement amount and in the spectral power around the musical tempo in the music condition compared to the silent condition. These results show that, at the group level, music did not facilitate the infants’ spontaneous limb movements. Rather, most infants’ movements became less active while they listened to music. Thus, a scholar looking only at the group-level results might conclude that infants do not move their limbs actively in response to music and therefore do not synchronize to a musical beat.</p>
<p>However, our individual analysis revealed infants who significantly increased their amount of movement and the spectral power around the musical tempo. Monte-Carlo statistics showed that there were periods in which the phases of these individuals’ limb movements were significantly synchronized with those of the musical beat. Our results thus show large individual differences in the infants’ limb movements during music playback. It is worth mentioning that, in the previous study on 5- to 24-months-old infants
<xref rid="pone.0097680-Zentner1" ref-type="bibr">[11]</xref>
, there were also individuals who moved their arms and legs rhythmically for over three seconds in response to music (see their figure and supplementary videos). Although those authors did not analyze the phases of limb movements in individuals, significant movement-to-music synchronization might have been observed had the same analysis as in this study been performed (i.e., calculation of relative phases, the synchronization index, and Monte-Carlo statistics). Taken together, we suggest that movement-to-music synchronization is rare in infants and is observed at an individual level.</p>
<p>The patterns of synchronization in the individuals in this study were comparable to the case study of a dancing cockatoo
<xref rid="pone.0097680-Patel1" ref-type="bibr">[8]</xref>
,
<xref rid="pone.0097680-Patel3" ref-type="bibr">[47]</xref>
. Patel et al.
<xref rid="pone.0097680-Patel3" ref-type="bibr">[47]</xref>
described the cockatoo’s behavior as “sporadic synchronization”, meaning that there were only limited periods of genuine synchronization. They also stated that the degree of phase synchronization in the cockatoo was not at the level that human adults show when music is played
<xref rid="pone.0097680-Patel3" ref-type="bibr">[47]</xref>
. The movement-to-music synchronization in the infants may also be regarded as sporadic synchronization because they did not always synchronize to the musical beat. In this regard, their synchronization is not at the level of human adults and may be interpreted as a precursor that develops further later in life.</p>
</sec>
<sec id="s3b">
<title>Difference Across the Four Limbs</title>
<p>We expected that the movement responses to music in infants would differ across the four limbs. However, in the group analysis we found no significant difference across the four limbs in our movement measures. As a group, the amount of movement decreased across all limbs in the music condition compared to the silent condition. This result is consistent with a previous study that showed a reduced amount of movement in all four limbs of three-months-old infants when they attended to an audiovisual stimulus such as a sound-making mobile toy
<xref rid="pone.0097680-Watanabe1" ref-type="bibr">[25]</xref>
. These findings suggest that external inputs tap into the perceptual-attentional system and inhibit the activity of all four limbs in most infants
<xref rid="pone.0097680-Watanabe1" ref-type="bibr">[25]</xref>
.</p>
<p>Contrary to the results of the group analysis, ID1 moved his right leg, and ID25 her left arm, more intensely and rhythmically than the other limbs during the music condition (
<xref ref-type="supplementary-material" rid="pone.0097680.s010">Figure S10A</xref>
B). The movement of ID1 was leg-based while that of ID25 was arm-based. In addition, the rhythmic movement of ID1 was more prominent than that of ID25. To account for these movement patterns, we consider the role of rhythmic neural oscillations in the CPG and their entrainment mechanism (i.e., the process of spontaneous mode locking of coupled oscillators)
<xref rid="pone.0097680-Taga1" ref-type="bibr">[33]</xref>
,
<xref rid="pone.0097680-Grillner1" ref-type="bibr">[36]</xref>
.</p>
<p>Although little is known about the neural mechanisms underlying movement generation in human infants, spontaneous limb movements are thought to be produced mainly by the subcortical system composed of the brainstem and spinal cord, including the CPGs, whose activities interact with the higher-order cortical system
<xref rid="pone.0097680-Watanabe1" ref-type="bibr">[25]</xref>
,
<xref rid="pone.0097680-Watanabe2" ref-type="bibr">[26]</xref>
,
<xref rid="pone.0097680-Georgopoulos1" ref-type="bibr">[35]</xref>
,
<xref rid="pone.0097680-Dominici1" ref-type="bibr">[48]</xref>
. Leg movements are considered to be generated primarily by the subcortical system itself, while the control of arm movements involves a greater contribution from the cortical system
<xref rid="pone.0097680-Watanabe1" ref-type="bibr">[25]</xref>
,
<xref rid="pone.0097680-Watanabe2" ref-type="bibr">[26]</xref>
,
<xref rid="pone.0097680-Georgopoulos1" ref-type="bibr">[35]</xref>
,
<xref rid="pone.0097680-Dominici1" ref-type="bibr">[48]</xref>
. If so, the remarkable increases in the amount of limb movement in ID1 and ID25 could be interpreted as a music-elicited enhancement of CPG activity in the subcortical system, with the degree of interference from the cortical system differing between the two: the movement of ID1 might be more dominated by CPG activity and less subject to cortical interference, and vice versa for ID25. Recent music and neuroscience studies have shown that beat perception and synchronization are related to neural activities not only in the auditory and motor cortices but also in subcortical areas including the brainstem
<xref rid="pone.0097680-Chen1" ref-type="bibr">[27]</xref>
,
<xref rid="pone.0097680-Fujioka1" ref-type="bibr">[28]</xref>
,
<xref rid="pone.0097680-Grahn1" ref-type="bibr">[29]</xref>
,
<xref rid="pone.0097680-Nozaradan1" ref-type="bibr">[30]</xref>
,
<xref rid="pone.0097680-Kung1" ref-type="bibr">[31]</xref>
,
<xref rid="pone.0097680-Stupacher1" ref-type="bibr">[49]</xref>
,
<xref rid="pone.0097680-Tierney1" ref-type="bibr">[50]</xref>
. The movement-to-music synchronization in ID1 and ID25 might be caused by entrainment between the enhanced CPG activity and other music-elicited rhythmic neural activities in cortical and subcortical networks, although the pattern of neural entrainment might differ depending on the development of each individual’s nervous system. The synchronization of ID1 might be interpreted as CPG-based neural entrainment and that of ID25 as cortically based.</p>
</sec>
<sec id="s3c">
<title>Altered Vocalizations</title>
<p>A clear change in vocal quality (i.e., an increase in the formant variability) was found in the infants as a group when music was present. Since the formant frequencies reflect movements of the vocal tract
<xref rid="pone.0097680-Fitch2" ref-type="bibr">[40]</xref>
, the result suggests that music makes vocal-tract movements more variable in infants. This result is comparable with a previous finding where three- to four-months-old infants changed their vocalizations and showed proto-conversational abilities in response to their mother’s speaking
<xref rid="pone.0097680-Masataka1" ref-type="bibr">[39]</xref>
. Our result suggests that music could serve as a communicative signal for pre-verbal infants, as speech sounds do
<xref rid="pone.0097680-Masataka2" ref-type="bibr">[51]</xref>
. These findings might be attributed to a shared neural mechanism that processes music and speech in the brain
<xref rid="pone.0097680-Patel2" ref-type="bibr">[19]</xref>
,
<xref rid="pone.0097680-Trehub2" ref-type="bibr">[21]</xref>
,
<xref rid="pone.0097680-Kotilahti1" ref-type="bibr">[22]</xref>
,
<xref rid="pone.0097680-Patel4" ref-type="bibr">[52]</xref>
.</p>
<p>We found significant increases in the mean F
<sub>0</sub>
and F
<sub>1</sub>
when the infants listened to “Go Trippy” but not when they listened to “Everybody”. The increased mean F
<sub>0</sub>
indicates a higher pitch of vocalization, and the increased mean F
<sub>1</sub>
indicates a different vocal-tract shape during the music condition compared with the silent condition. In this study, the tempo of “Go Trippy” (130.0 BPM) was faster than that of “Everybody” (108.7 BPM), and only the former included a female voice. This might be why the increased mean F
<sub>0</sub>
and F
<sub>1</sub>
were found only during “Go Trippy” and not during “Everybody”.</p>
<p>Our results suggest that music does not facilitate spontaneous limb movements in most infants but instead modulates their vocalizations. As discussed above, music might tap into the perceptual-attentional system in the cortex to inhibit limb movements while facilitating neural activities for vocal production, leading to the changes in the fundamental and formant frequencies. The auditory-motor network underlying the altered vocalizations may develop further to achieve more refined vocalizations with music. From this viewpoint, the altered vocalizations of the infants may be interpreted as a precursor of singing.</p>
</sec>
<sec id="s3d">
<title>Effect of Age</title>
<p>In this study, there was no significant correlation between age in days and the behavioral measures, showing that the effect of age was not clear in our group analysis. This is not consistent with previous studies that showed an effect of age on limb movements and vocalizations in infants
<xref rid="pone.0097680-Kato1" ref-type="bibr">[41]</xref>
,
<xref rid="pone.0097680-Kuhl1" ref-type="bibr">[42]</xref>
. One reason for this inconsistency could be the age range, which was narrower in this study (106–125 days of age) than in the previous studies (90–126 days of age in the study by Kato et al.
<xref rid="pone.0097680-Kato1" ref-type="bibr">[41]</xref>
and 12–20 weeks of age in the study by Kuhl and Meltzoff
<xref rid="pone.0097680-Kuhl1" ref-type="bibr">[42]</xref>
).</p>
<p>For further individual analysis, we compared the ages in days of ID1 and ID25 with those of the other infants (
<xref ref-type="supplementary-material" rid="pone.0097680.s010">Figure S10C</xref>
). Although ID1 (122 days of age) was older than the other infants on average (113.5±3.9 days of age, mean ± standard deviation), this was not the case for ID25 (113 days of age). Moreover, neither the oldest infant (ID2, 125 days of age) nor the youngest (ID5, 106 days of age) showed any significant rhythmic movements (
<xref ref-type="supplementary-material" rid="pone.0097680.s015">Tables S1</xref>
,
<xref ref-type="supplementary-material" rid="pone.0097680.s016">S2</xref>
, and
<xref ref-type="supplementary-material" rid="pone.0097680.s021">S7</xref>
). It is therefore difficult to explain the individual differences in this study in terms of the infant’s age.</p>
</sec>
<sec id="s3e">
<title>Limitations of the Study, Open Questions, and Future Work</title>
<p>A limitation of this study is that we cannot draw a definitive conclusion about whether the group-level effects can be attributed specifically to music. Since we compared the infants’ behaviors during music playback with those in silence, one may argue that the group-level effects reflect general responses to external stimuli rather than responses specific to music. In addition, because both “Everybody” and “Go Trippy” included vocal tracks and were not instrumental music, we could not rule out the possibility that the human voice elicited the group effects. It would be interesting for future studies to investigate whether other acoustic and non-acoustic stimuli (e.g., speech sounds, instrumental music, colorful silent videos, and pictures of interesting objects) elicit the same group effects.</p>
<p>Another limitation of this study is that the 95% confidence interval criterion in the Monte-Carlo statistics might be too lenient to demonstrate significant movement-to-music synchronization. Because we tested 51 moving sections in total, one may argue that the significant synchronizations in the individuals could be type 1 errors. However, if the synchronization in the music condition had occurred purely by chance, similar degrees of synchronization should have been observed in the silent and music conditions, yet this was not the case (
<xref ref-type="fig" rid="pone-0097680-g001">Figure 1F</xref>
). We therefore suggest that a type 1 error is unlikely, although we cannot completely rule it out.</p>
<p>One interesting question for future developmental studies on music is whether infants younger than three months show synchronized limb movements and/or altered vocalizations in response to music. Previous studies suggest that the nervous systems of infants younger than three months are more subcortically based
<xref rid="pone.0097680-Watanabe1" ref-type="bibr">[25]</xref>
,
<xref rid="pone.0097680-Watanabe2" ref-type="bibr">[26]</xref>
,
<xref rid="pone.0097680-Dominici1" ref-type="bibr">[48]</xref>
. If CPG activity is the key to movement-to-music synchronization in infants, more prominent precursors of dancing might be observed in infants younger than three months.</p>
</sec>
</sec>
<sec id="s4">
<title>Conclusion</title>
<p>We found striking increases in the amount of rhythmic limb movement and significant phase synchronization to the musical beat in individual infants, but, as a group, there was no facilitation of spontaneous limb movements in the music condition compared to the silent condition. On the other hand, we found a clear increase in the formant variability of vocalizations in the group during music perception. The results suggest that our brains are already primed with our bodies to interact with music at three to four months of age via limb movements and vocalizations. These findings are comparable to those from previous studies showing early manifestations of body-environment or cross-modal interactions in infants: imitation of adults’ facial and manual gestures
<xref rid="pone.0097680-Meltzoff1" ref-type="bibr">[53]</xref>
, and synchronization of body movements and alteration of vocalizations with adult speech
<xref rid="pone.0097680-Condon1" ref-type="bibr">[20]</xref>
,
<xref rid="pone.0097680-Masataka2" ref-type="bibr">[51]</xref>
. In line with the notion that these infant behaviors are developmental precursors of unique human abilities such as higher-order communication and/or socialization, our results may be interpreted as the precursors of dancing and singing.</p>
</sec>
<sec sec-type="methods" id="s5">
<title>Methods</title>
<sec id="s5a">
<title>Data Acquisition</title>
<sec id="s5a1">
<title>Participants</title>
<p>A total of 107 healthy infants aged three to four months were recruited via the local Basic Resident Register. Ethical approval for this study was obtained from the ethical committee of The Graduate School of Education, University of Tokyo, and written informed consent was obtained from the parents of all infants prior to the experiments. We obtained written permission from the parents of the infants who appear in the figures and videos regarding the use of these materials for publication.</p>
</sec>
<sec id="s5a2">
<title>Stimulus</title>
<p>We used two pop songs as auditory stimuli in the music condition: “Everybody,” by the Backstreet Boys, duration = 290 s, tempo = 108.7 beats per minute (BPM) = 1.8 Hz (see
<xref ref-type="supplementary-material" rid="pone.0097680.s024">Videos S3</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s025">S4</xref>
); and “Go Trippy” by WANICO feat. Jake Smith (Right Bank Music Inc. Los Angeles, CA), duration = 243 s, tempo = 130.0 BPM = 2.2 Hz (see
<xref ref-type="supplementary-material" rid="pone.0097680.s026">Video S5</xref>
). The BPM of each song was estimated from the sound wave file using the Matlab script “tempo2.m” developed by Ellis
<xref rid="pone.0097680-Ellis1" ref-type="bibr">[54]</xref>
,
<xref rid="pone.0097680-Music1" ref-type="bibr">[55]</xref>
. No auditory stimulus was provided in the silent condition.</p>
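<p>For readers who wish to reproduce this step outside Matlab, the following is a minimal Python sketch using librosa’s tempo estimator as a stand-in for Ellis’s “tempo2.m”; the file name is hypothetical and the two estimators are not guaranteed to agree exactly.</p>
<preformat>
# Hedged stand-in for the Matlab script "tempo2.m" (Ellis); assumes librosa
# is installed. The estimate may differ slightly from the authors' values.
import librosa

y, sr = librosa.load("everybody.wav")       # hypothetical audio file
tempo = librosa.beat.tempo(y=y, sr=sr)[0]   # global tempo estimate in BPM
print(f"Estimated tempo: {tempo:.1f} BPM")  # expected near 108.7 BPM
</preformat>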
</sec>
<sec id="s5a3">
<title>Setup</title>
<p>Each infant was positioned on his/her back on a baby mattress (70 cm×120 cm,
<xref ref-type="supplementary-material" rid="pone.0097680.s002">Figure S2</xref>
). Four spherical reflective markers with a diameter of 2 cm and a weight of approximately 5 g were attached to the wrists and ankles of each infant. In the music condition, either “Everybody” or “Go Trippy” was played through two loudspeakers placed at a distance of 120 cm from the head position of the infant at a sound pressure level of 70 dB. The duration of data recording ranged from 60 to 393 s depending on the infant’s state (
<xref ref-type="supplementary-material" rid="pone.0097680.s015">Tables S1</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s016">S2</xref>
). Both experimenters and parents were out of the infant’s sight during the recording to prevent social interaction from taking place. Movements of the infants’ limbs in three-dimensional (3D) space were recorded using a 3D motion capture system (Motion Analysis Co., Santa Rosa, California). Six CCD monochrome shuttered cameras (motion sampling rate = 60 Hz; Hawk digital camera) with electronically shuttered infrared LED synchronized strobe lighting were placed around the baby mattress. A digital video camera (SONY DCR-PC300K) was also used to monitor the infant’s state, and sound data were extracted from this camera to analyze the infant’s voice (audio sampling rate = 36,000 Hz).</p>
</sec>
<sec id="s5a4">
<title>Data set</title>
<p>We analyzed data from 30 full-term infants (18 male and 12 female) aged 106 to 125 days who underwent both the silent and music conditions (
<xref ref-type="supplementary-material" rid="pone.0097680.s015">Tables S1</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s016">S2</xref>
). Within this group, 7 infants heard only “Everybody”, 4 heard only “Go Trippy”, and 19 heard both songs; in other words, 26 infants heard “Everybody” and 23 heard “Go Trippy.” Data from an additional 77 infants were collected but excluded from the analysis: 49 infants could not complete either the silent or the music condition because of fussing or crying (
<italic>n</italic>
 = 41), infants rolling over (
<italic>n</italic>
 = 4), or system errors (
<italic>n</italic>
 = 4). Another 28 infants could not go through any condition because of fussing or crying (
<italic>n</italic>
 = 26) or rolling over (
<italic>n</italic>
 = 2). A large number of infants fussed or cried in this study (
<italic>n</italic>
 = 67 in total) because both experimenters and parents were out of the infant’s sight during the recording to prevent social interaction and thus allow investigation of spontaneous limb movements and vocalizations. For infant ID1, an additional auditory stimulus (a drum pattern, duration = 71 s, tempo = 100.0 BPM) was provided for further investigation (
<xref ref-type="supplementary-material" rid="pone.0097680.s027">Video S6</xref>
).</p>
</sec>
</sec>
<sec id="s5b">
<title>Analysis of Limb Movement</title>
<sec id="s5b1">
<title>Amount of limb movement</title>
<p>The position data for each limb along each coordinate axis were smoothed by applying a bidirectional fourth-order Butterworth low-pass filter with a cutoff frequency of 10 Hz. The filtered data are shown in
<xref ref-type="supplementary-material" rid="pone.0097680.s003">Figure S3A</xref>
). We obtained the velocity data for each limb along each of the X-, Y-, and Z-coordinate axes [
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e004.jpg"></inline-graphic>
</inline-formula>
,
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e005.jpg"></inline-graphic>
</inline-formula>
, and
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e006.jpg"></inline-graphic>
</inline-formula>
] by differentiating the smoothed position data. The square sum of velocities
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e007.jpg"></inline-graphic>
</inline-formula>
was then calculated for each limb as;
<disp-formula id="pone.0097680.e008">
<graphic xlink:href="pone.0097680.e008.jpg" position="anchor" orientation="portrait"></graphic>
<label>(1)</label>
</disp-formula>
</p>
<p>An example of the calculated square sum of velocity
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e009.jpg"></inline-graphic>
</inline-formula>
is shown in
<xref ref-type="supplementary-material" rid="pone.0097680.s003">Figure S3B</xref>
. To quantitatively describe the movement amount of each limb in the silent and music conditions, we used the mean square sum of velocity;
<disp-formula id="pone.0097680.e010">
<graphic xlink:href="pone.0097680.e010.jpg" position="anchor" orientation="portrait"></graphic>
<label>(2)</label>
</disp-formula>
where
<italic>N</italic>
is the number of recorded time points for each infant.</p>
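<p>A minimal Python sketch of Eqs. (1)–(2), assuming the marker positions of one limb are stored as an (N, 3) array sampled at 60 Hz; this is an illustration, not the authors’ original Matlab code.</p>
<preformat>
# Sketch of the movement-amount measure (Eqs. 1-2).
import numpy as np
from scipy.signal import butter, filtfilt

FS = 60.0                                   # motion sampling rate (Hz)
B, A = butter(4, 10.0 / (FS / 2))           # fourth-order Butterworth, 10 Hz cutoff

def mean_square_velocity(pos):
    """pos: (N, 3) array of X, Y, Z positions in mm."""
    smoothed = filtfilt(B, A, pos, axis=0)  # bidirectional (zero-phase) filtering
    vel = np.gradient(smoothed, 1.0 / FS, axis=0)  # velocity along each axis
    v2 = np.sum(vel ** 2, axis=1)           # square sum of velocities, Eq. (1)
    return v2.mean()                        # mean over all time points, Eq. (2)
</preformat>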
</sec>
<sec id="s5b2">
<title>Frequency of limb movement</title>
<p>We submitted the smoothed position data
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e011.jpg"></inline-graphic>
</inline-formula>
multiplied by a Hanning window, for each limb along each coordinate axis, to a Fourier transform to investigate the frequency components of the infant’s motion;
<disp-formula id="pone.0097680.e012">
<graphic xlink:href="pone.0097680.e012.jpg" position="anchor" orientation="portrait"></graphic>
<label>(3)</label>
</disp-formula>
where
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e013.jpg"></inline-graphic>
</inline-formula>
is the amplitude and
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e014.jpg"></inline-graphic>
</inline-formula>
is the phase. Examples of the Fourier transforms are shown in
<xref ref-type="supplementary-material" rid="pone.0097680.s003">Figure S3C</xref>
. We calculated proportions of the PSD within 0.05–1.00, 1.00–2.00, and 2.00–3.00 Hz frequency ranges relative to the total PSD above 0.05 Hz. We also calculated a proportion of the PSD ±10% of the musical tempo relative to the total PSD above 0.05 Hz: This index becomes higher when the infant’s limb motion includes relatively more frequency components that are closer to the musical tempo.</p>
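<p>A sketch of the tempo-band PSD index under the same assumptions as above (one limb, one axis, 60 Hz); the windowing and normalization follow the description in this subsection.</p>
<preformat>
# Proportion of PSD within +/-10% of the musical tempo, relative to the
# total PSD above 0.05 Hz.
import numpy as np

def psd_fraction_near_tempo(pos1d, fs=60.0, bpm=108.7):
    x = (pos1d - pos1d.mean()) * np.hanning(len(pos1d))  # Hanning window, Eq. (3)
    psd = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)
    beat_hz = bpm / 60.0
    total = psd[f > 0.05].sum()
    band = psd[(f >= 0.9 * beat_hz) & (f <= 1.1 * beat_hz)].sum()
    return band / total
</preformat>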
</sec>
<sec id="s5b3">
<title>Detection of beat onsets</title>
<p>We determined the beat onsets of the auditory stimuli using the Matlab script “beat2.m” developed by Ellis
<xref rid="pone.0097680-Ellis1" ref-type="bibr">[54]</xref>
,
<xref rid="pone.0097680-Music1" ref-type="bibr">[55]</xref>
. To check the timing of the detected beat onsets, we superimposed a woodblock sound on the musical stimuli at each detected onset. One author, who had 15 years of drumming experience, listened carefully to the superimposed tracks and felt that the overall timing of the onsets was slightly earlier than expected. The beat onsets detected by the script were therefore shifted 30 ms later to make them perceptually reasonable.</p>
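<p>The click-track check can be sketched as follows, with a synthetic 20 ms click as a hypothetical stand-in for the woodblock sample; the 30 ms delay matches the shift described above.</p>
<preformat>
# Superimpose a click at each detected beat onset, shifted 30 ms later.
import numpy as np

def add_clicks(audio, sr, onsets_sec, shift=0.030):
    out = audio.copy()
    t = np.arange(int(0.02 * sr)) / sr                    # 20 ms click
    click = 0.5 * np.sin(2 * np.pi * 2000 * t) * np.hanning(len(t))
    for onset in np.asarray(onsets_sec) + shift:          # 30 ms delay
        i = int(onset * sr)
        if i + len(click) <= len(out):
            out[i:i + len(click)] += click                # mix click into track
    return out
</preformat>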
</sec>
<sec id="s5b4">
<title>Relative phase</title>
<p>We calculated the instantaneous phase of the musical beat
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e015.jpg"></inline-graphic>
</inline-formula>
as a linear increase from −180 to 180 degrees between the beat onsets (e.g.,
<xref ref-type="fig" rid="pone-0097680-g002">Figures 2C</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s004">S4</xref>
-
<xref ref-type="supplementary-material" rid="pone.0097680.s007">S7C</xref>
). We calculated the instantaneous phase of the infant’s motion
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e016.jpg"></inline-graphic>
</inline-formula>
from the time series of limb-position data
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e017.jpg"></inline-graphic>
</inline-formula>
as;
<disp-formula id="pone.0097680.e018">
<graphic xlink:href="pone.0097680.e018.jpg" position="anchor" orientation="portrait"></graphic>
<label>(4)</label>
</disp-formula>
where the function
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e019.jpg"></inline-graphic>
</inline-formula>
is the Hilbert transform of the position data and
<italic>A</italic>
(
<italic>t</italic>
) is the instantaneous amplitude (e.g.,
<xref ref-type="fig" rid="pone-0097680-g002">Figures 2BD</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s004">S4</xref>
-
<xref ref-type="supplementary-material" rid="pone.0097680.s007">S7BD</xref>
). We calculated the relative phase
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e020.jpg"></inline-graphic>
</inline-formula>
between the infant’s motion and the musical beat as;</p>
<p>
<disp-formula id="pone.0097680.e021">
<graphic xlink:href="pone.0097680.e021.jpg" position="anchor" orientation="portrait"></graphic>
<label>(5)</label>
</disp-formula>
Examples of the calculated relative phases are shown in
<xref ref-type="fig" rid="pone-0097680-g002">Figures 2E</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s004">S4</xref>
-
<xref ref-type="supplementary-material" rid="pone.0097680.s007">S7E</xref>
. Note that the Hilbert transform and calculation of relative phases were performed for the entire set of recorded time series before detecting the moving sections.</p>
</sec>
<sec id="s5b5">
<title>Moving section</title>
<p>To perform phase-synchronization analysis between the musical beat and the infant’s rhythmic motion, we first determined the movement onsets and offsets to find continuous movements because the infants moved in an intermittent fashion (e.g.,
<xref ref-type="fig" rid="pone-0097680-g001">Figures 1D</xref>
,
<xref ref-type="supplementary-material" rid="pone.0097680.s001">S1D</xref>
, and
<xref ref-type="supplementary-material" rid="pone.0097680.s003">S3</xref>
). The onset was defined as the time point at which the 10-point moving-averaged square sum of velocity exceeded 10% of its maximum value, while the offset was defined as the time point at which the moving-averaged signal fell back below this threshold (
<xref ref-type="supplementary-material" rid="pone.0097680.s003">Figure S3E</xref>
). We then detected a period of time in which the duration from the onset to offset was longer than three seconds, and designated it as a moving section. Detailed descriptions of the detected moving sections are summarized in
<xref ref-type="supplementary-material" rid="pone.0097680.s021">Table S7</xref>
. We selected an axis in which the square sum of velocity was largest among the three (X, Y, and Z) coordinates. In other words, we found an axis along which the infant moved most intensely (e.g.,
<xref ref-type="supplementary-material" rid="pone.0097680.s003">Figure S3F</xref>
). The position data in the moving section along with this selected axis was used to calculate the synchronization index. We did not integrate information from the three axes but selected one for the synchronization analysis because the rhythmic movements, whose frequency was close to the musical tempo, were clearly observed in the selected axis (e.g.,
<xref ref-type="supplementary-material" rid="pone.0097680.s003">Figure S3DE</xref>
).</p>
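<p>A sketch of the moving-section detection, assuming v2 is the square sum of velocities from Eq. (1) sampled at 60 Hz.</p>
<preformat>
# Detect continuous-movement sections longer than three seconds.
import numpy as np

def moving_sections(v2, fs=60.0, min_dur=3.0):
    smooth = np.convolve(v2, np.ones(10) / 10, mode="same")  # 10-point moving average
    above = smooth > 0.1 * smooth.max()                      # 10% of maximum as threshold
    padded = np.concatenate(([False], above, [False]))
    edges = np.flatnonzero(np.diff(padded.astype(int)))
    onsets, offsets = edges[::2], edges[1::2]                # rising / falling edges
    keep = (offsets - onsets) / fs >= min_dur                # keep sections over 3 s
    return list(zip(onsets[keep], offsets[keep]))            # (start, stop) sample indices
</preformat>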
</sec>
<sec id="s5b6">
<title>Synchronization index</title>
<p>To quantitatively describe the properties of the relative phase distribution within the moving section, we introduced a measure of Shannon entropy (SE)
<xref rid="pone.0097680-Shannon1" ref-type="bibr">[56]</xref>
, which is defined as the average value of logarithms of the probability density function;
<disp-formula id="pone.0097680.e022">
<graphic xlink:href="pone.0097680.e022.jpg" position="anchor" orientation="portrait"></graphic>
<label>(6)</label>
</disp-formula>
</p>
<p>
<italic>M</italic>
is the number of bins with non-zero probability and
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e023.jpg"></inline-graphic>
</inline-formula>
is the probability of the
<italic>i</italic>
-th bin. To relate the dispersion of relative phase with the strength of synchronization, a synchronization index (SI) was defined as;
<disp-formula id="pone.0097680.e024">
<graphic xlink:href="pone.0097680.e024.jpg" position="anchor" orientation="portrait"></graphic>
<label>(7)</label>
</disp-formula>
where
<italic>N</italic>
is the total number of bins in the circular histogram
<xref rid="pone.0097680-Tass1" ref-type="bibr">[44]</xref>
,
<xref rid="pone.0097680-Mase1" ref-type="bibr">[57]</xref>
. We used a bin size of 10 degrees to calculate the SI. The synchronization index ranges from 0, when the spread of the relative phase is maximal (i.e., the phases are distributed uniformly across the bins), to 1, when a
<italic>δ</italic>
-function-like probability distribution is found (i.e., all phases lie in a single bin). Thus, the larger the synchronization index, the more strongly the phase of an infant’s motion is locked to that of the musical beat within the moving section.</p>
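<p>A sketch of Eqs. (6)–(7) with the 10-degree bins used in the main analysis; phi_rel is the relative phase in radians from the previous sketch.</p>
<preformat>
# Shannon-entropy synchronization index (Eqs. 6-7).
import numpy as np

def sync_index(phi_rel, bin_deg=10):
    n_bins = 360 // bin_deg
    counts, _ = np.histogram(phi_rel, bins=n_bins, range=(-np.pi, np.pi))
    p = counts[counts > 0] / counts.sum()   # probabilities of non-empty bins
    entropy = -np.sum(p * np.log(p))        # Shannon entropy, Eq. (6)
    s_max = np.log(n_bins)                  # entropy of the uniform distribution
    return (s_max - entropy) / s_max        # SI in [0, 1], Eq. (7)
</preformat>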
</sec>
<sec id="s5b7">
<title>Surrogate data analysis</title>
<p>To statistically test the observed degree of phase synchronization between the infant’s motion and musical beat, we performed a
<italic>phase-randomized</italic>
surrogate data analysis
<xref rid="pone.0097680-Prichard1" ref-type="bibr">[45]</xref>
. A phase-randomized Fourier transform of the position data
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e025.jpg"></inline-graphic>
</inline-formula>
is made by rotating the phase
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e026.jpg"></inline-graphic>
</inline-formula>
at each frequency
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e027.jpg"></inline-graphic>
</inline-formula>
by an independent random variable
<italic>φ</italic>
which is chosen uniformly in the range from 0 to 2
<italic>π</italic>
;
<disp-formula id="pone.0097680.e028">
<graphic xlink:href="pone.0097680.e028.jpg" position="anchor" orientation="portrait"></graphic>
<label>(8)</label>
</disp-formula>
</p>
<p>The phase-randomized surrogate time series of position data
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e029.jpg"></inline-graphic>
</inline-formula>
is given by the inverse Fourier transform of
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e030.jpg"></inline-graphic>
</inline-formula>
;
<disp-formula id="pone.0097680.e031">
<graphic xlink:href="pone.0097680.e031.jpg" position="anchor" orientation="portrait"></graphic>
<label>(9)</label>
</disp-formula>
</p>
<p>Typical examples of the calculated surrogate data are shown in the right panels of
<xref ref-type="fig" rid="pone-0097680-g002">Figures 2</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s004">S4</xref>
-
<xref ref-type="supplementary-material" rid="pone.0097680.s007">S7</xref>
. Note that
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e032.jpg"></inline-graphic>
</inline-formula>
has the same power spectrum as the original position data
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e033.jpg"></inline-graphic>
</inline-formula>
, making it more suitable for testing phase synchronization than
<italic>time-scrambled</italic>
surrogate data
<xref rid="pone.0097680-Patel3" ref-type="bibr">[47]</xref>
.</p>
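<p>A sketch of Eqs. (8)–(9): the Fourier phases are rotated by uniform random angles while the amplitudes, and hence the power spectrum, are preserved.</p>
<preformat>
# Phase-randomized surrogate of a real-valued position signal (Eqs. 8-9).
import numpy as np

def phase_randomize(pos1d, rng=None):
    rng = rng or np.random.default_rng()
    spec = np.fft.rfft(pos1d)
    phases = rng.uniform(0, 2 * np.pi, len(spec))
    phases[0] = 0.0                              # keep the DC term real
    surrogate_spec = np.abs(spec) * np.exp(1j * phases)
    return np.fft.irfft(surrogate_spec, n=len(pos1d))  # same power spectrum
</preformat>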
</sec>
<sec id="s5b8">
<title>Monte-Carlo statistics</title>
<p>Ten thousand phase-randomized surrogate data sets were generated from the observed position data of each moving section. Thus, we obtained one observed synchronization index and 10,000 surrogate synchronization indices for each moving section. We then performed Monte-Carlo statistics in which we tested whether the observed synchronization index was above the 95% confidence interval of the surrogate synchronization indices (e.g.,
<xref ref-type="fig" rid="pone-0097680-g002">Figures 2G</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s004">S4</xref>
-
<xref ref-type="supplementary-material" rid="pone.0097680.s007">S7G</xref>
).</p>
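<p>Combining the sketches above, the Monte-Carlo test can be written as follows; sync_index, relative_phase, and phase_randomize are the illustrative helpers defined earlier, and the one-sided 95th percentile stands in for the upper bound of the confidence interval.</p>
<preformat>
# Monte-Carlo test of the observed synchronization index against
# 10,000 phase-randomized surrogates (uses the helper sketches above).
import numpy as np

def monte_carlo_test(pos1d, beat_onsets, fs=60.0, n_surrogates=10_000):
    observed = sync_index(relative_phase(pos1d, beat_onsets, fs))
    surrogate_si = np.array([
        sync_index(relative_phase(phase_randomize(pos1d), beat_onsets, fs))
        for _ in range(n_surrogates)
    ])
    threshold = np.percentile(surrogate_si, 95)  # upper bound of surrogate distribution
    return observed, threshold, observed > threshold
</preformat>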
</sec>
<sec id="s5b9">
<title>Virtual musical beat</title>
<p>The moving sections were detected not only in the music condition but also in the silent condition. We calculated the synchronization indices between the limb movements in the moving sections of the silent condition and an artificially aligned “virtual” musical beat extracted from the auditory stimuli in the music condition. The synchronization index in the silent condition thus provides a chance-level baseline. The synchronization indices calculated from the silent-condition data were also submitted to Monte-Carlo statistics with 10,000 phase-randomized surrogate data sets. We confirmed that no significant synchronization was found in the Monte-Carlo statistics on the moving sections of the silent condition (
<xref ref-type="supplementary-material" rid="pone.0097680.s015">Tables S1</xref>
and
<xref ref-type="supplementary-material" rid="pone.0097680.s016">S2</xref>
).</p>
</sec>
<sec id="s5b10">
<title>Robustness of synchronization index</title>
<p>The synchronization index based on Shannon entropy (Eq. 6) depends on the number of bins defined by the bin size. We therefore tested the effect of bin size by changing the size from 5 to 20 degrees in steps of 5 degrees. We also calculated the circular variance of the relative phases (the length of the resultant vector in the circular plot of relative phases) as another measure of synchronization consistency
<xref rid="pone.0097680-Fujii1" ref-type="bibr">[43]</xref>
. We confirmed that the mean synchronization indices during the music condition were significantly higher than those in the silent condition regardless of the indices (
<xref ref-type="supplementary-material" rid="pone.0097680.s011">Figure S11</xref>
). We also confirmed that ID1 and ID25 showed significant phase synchronization on Monte-Carlo statistics regardless of the indices (
<xref ref-type="supplementary-material" rid="pone.0097680.s012">Figures S12</xref>
-
<xref ref-type="supplementary-material" rid="pone.0097680.s014">S14</xref>
). On the other hand, the results of the Monte-Carlo statistics for ID20 were not consistent across the indices: significant synchronization was found only for the Shannon-entropy measure with bin sizes of 10 and 20 degrees, but not with bin sizes of 5 or 15 degrees, nor for the circular-variance measure.</p>
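<p>The resultant-vector measure used in this robustness check can be sketched in a single expression; unlike the entropy-based index, it needs no binning.</p>
<preformat>
# Length of the resultant vector of the relative phases (bin-free measure
# of synchronization consistency; 1 = perfect locking, 0 = uniform spread).
import numpy as np

def resultant_length(phi_rel):
    return np.abs(np.mean(np.exp(1j * phi_rel)))
</preformat>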
</sec>
</sec>
<sec id="s5c">
<title>Analysis of Vocalization</title>
<sec id="s5c1">
<title>Spectrum subtraction</title>
<p>The recorded audio data in the music condition included not only the infant’s voice but also the sound of the auditory stimulus (
<xref ref-type="supplementary-material" rid="pone.0097680.s008">Figure S8A</xref>
). That is, the infant’s voice in the music condition was contaminated by the song played in the background. We therefore performed a spectrum subtraction: the spectrum of the auditory stimulus was subtracted from the recorded audio files to remove the musical stimulus and thus isolate the infant’s vocalization (
<xref ref-type="supplementary-material" rid="pone.0097680.s008">Figure S8B</xref>
). The spectrum subtraction was not performed for the recorded audio data in the silent condition since there was no sound from the auditory stimulus in the background.</p>
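<p>A hedged sketch of magnitude spectral subtraction, assuming the stimulus and the recording are time-aligned, equal-length arrays at the same sampling rate; the authors’ exact subtraction procedure may differ.</p>
<preformat>
# Subtract the stimulus magnitude spectrum from the recording to isolate
# the infant's voice (illustrative magnitude spectral subtraction).
import numpy as np
from scipy.signal import stft, istft

def spectral_subtract(recording, stimulus, fs=36_000):
    # recording and stimulus are assumed time-aligned and equally long.
    _, _, R = stft(recording, fs=fs)
    _, _, S = stft(stimulus, fs=fs)
    mag = np.maximum(np.abs(R) - np.abs(S), 0.0)   # floor negative magnitudes at 0
    _, cleaned = istft(mag * np.exp(1j * np.angle(R)), fs=fs)
    return cleaned
</preformat>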
</sec>
<sec id="s5c2">
<title>Voice activity detection</title>
<p>Root mean square (RMS) was calculated from the pre-processed audio signal as a measure of effective sound pressure with the time window of 0.1 s ( = 3,600 data points) and with a time step of 0.01 s ( = 360 data points) (
<xref ref-type="supplementary-material" rid="pone.0097680.s008">Figure S8C</xref>
). Voice activity detection (VAD) was performed as;
<disp-formula id="pone.0097680.e034">
<graphic xlink:href="pone.0097680.e034.jpg" position="anchor" orientation="portrait"></graphic>
<label>(10)</label>
</disp-formula>
where
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e035.jpg"></inline-graphic>
</inline-formula>
is the RMS audio signal,
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e036.jpg"></inline-graphic>
</inline-formula>
( = 50 dB) is the threshold,
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e037.jpg"></inline-graphic>
</inline-formula>
equals to 0.1 sec (10 data points in the RMS signal),
<inline-formula>
<inline-graphic xlink:href="pone.0097680.e038.jpg"></inline-graphic>
</inline-formula>
is the
<italic>i</italic>
-th time point, and the detected areas were evaluated as 1. All detected areas were verified by careful listening. The total duration of the detected areas was divided by 60 s to quantify the mean duration of vocalizations per minute.</p>
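<p>A sketch of Eq. (10) with the window, step, and threshold quoted above; converting RMS amplitude to dB requires a calibrated input, which is assumed here.</p>
<preformat>
# RMS-based voice activity detection (Eq. 10): 0.1 s windows, 0.01 s steps.
import numpy as np

def detect_voice(audio, fs=36_000, win=0.1, step=0.01, thresh_db=50.0):
    w, s = int(win * fs), int(step * fs)
    rms = np.array([np.sqrt(np.mean(audio[i:i + w] ** 2))
                    for i in range(0, len(audio) - w, s)])
    # Assumes audio is calibrated in pascals (dB re 20 micro-Pa).
    level_db = 20 * np.log10(rms / 20e-6 + 1e-12)
    return level_db > thresh_db     # True for steps where voice is detected
</preformat>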
</sec>
<sec id="s5c3">
<title>Fundamental and formant frequencies</title>
<p>The fundamental frequency (F
<sub>0</sub>
) was extracted for each detected voice using STRAIGHT (Speech Transformation and Representation using Adaptive Interpolation of weighted spectrum), a method of instantaneous-frequency-based F
<sub>0</sub>
extraction
<xref rid="pone.0097680-Kawahara1" ref-type="bibr">[58]</xref>
,
<xref rid="pone.0097680-STRAIGHT1" ref-type="bibr">[59]</xref>
. Formant frequencies (F
<sub>1</sub>
and F
<sub>2</sub>
) were calculated based on a 14
<sup>th</sup>
-order Linear Predictive Coding (LPC) algorithm using Praat
<xref rid="pone.0097680-Boersma1" ref-type="bibr">[60]</xref>
. The mean and standard deviation (SD) within an utterance were calculated for each of the detected areas, and these mean and SD values were then averaged across the detected areas for each infant.</p>
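<p>As a rough illustration of the LPC step, the following Python sketch estimates formants with the autocorrelation method; it is a simple stand-in for Praat’s 14th-order LPC and omits Praat’s pre-emphasis and bandwidth criteria.</p>
<preformat>
# Formant estimation from one voiced frame via 14th-order LPC.
import numpy as np
from scipy.linalg import solve_toeplitz

def formants(frame, fs, order=14):
    x = frame * np.hamming(len(frame))
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order]
    a = solve_toeplitz(r[:order], r[1:order + 1])   # Yule-Walker LPC coefficients
    roots = np.roots(np.concatenate(([1.0], -a)))   # roots of the LPC polynomial A(z)
    roots = roots[np.imag(roots) > 0]               # one root per conjugate pair
    freqs = np.sort(np.angle(roots) * fs / (2 * np.pi))
    return freqs[freqs > 90]                        # candidate F1, F2, ...
</preformat>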
</sec>
</sec>
</sec>
<sec sec-type="supplementary-material" id="s6">
<title>Supporting Information</title>
<supplementary-material content-type="local-data" id="pone.0097680.s001">
<label>Figure S1</label>
<caption>
<p>
<bold>Spontaneous limb movements of infants when they listen to “Go Trippy” by WANICO feat. Jake Smith (music condition, see also Video S5) and those without any auditory stimulus (silent condition, see also Video S2).</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s001.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s002">
<label>Figure S2</label>
<caption>
<p>
<bold>Experiment setup.</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s002.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s003">
<label>Figure S3</label>
<caption>
<p>
<bold>Schematic overview of our pipeline for analysis of limb movements.</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s003.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s004">
<label>Figure S4</label>
<caption>
<p>
<bold>Significant synchronization in the right leg movements of ID1 while a drumming pattern (100.0 BPM) was played (see also Video S6).</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s004.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s005">
<label>Figure S5</label>
<caption>
<p>
<bold>Significant synchronization in left arm movements of ID25 during the music condition “Everybody” (108.7 BPM) (see also Video S4).</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s005.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s006">
<label>Figure S6</label>
<caption>
<p>
<bold>Non-significant phase wandering pattern in left hand movements of ID25 during the music condition “Go Trippy” (130.0 BPM) (see also Video S5).</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s006.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s007">
<label>Figure S7</label>
<caption>
<p>
<bold>Significant synchronization in right leg movements of ID20 during the music condition “Everybody” (108.7 BPM).</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s007.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s008">
<label>Figure S8</label>
<caption>
<p>
<bold>Schematic overview of our pipeline for analysis of vocalizations.</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s008.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s009">
<label>Figure S9</label>
<caption>
<p>
<bold>Spontaneous vocalizations of infants during the music condition “Everybody” by Backstreet Boys and during the silent condition.</bold> Error bars indicate standard error (SE) across participants.
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s009.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s010">
<label>Figure S10</label>
<caption>
<p>
<bold>Further analyses for ID1 and ID25.</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s010.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s011">
<label>Figure S11</label>
<caption>
<p>
<bold>Mean synchronization indices across the moving sections in silent (blue bars) and music (red bars) conditions.</bold>
Error bars indicate standard errors (SE) across the moving sections (N = 27 in music condition, N = 24 in silent condition).</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s011.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s012">
<label>Figure S12</label>
<caption>
<p>
<bold>Monte-Carlo statistics for ID1 showed significant synchronization in his right leg movements during the music condition “Everybody” (108.7 BPM, Video S3), regardless of the synchronization index used.</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s012.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s013">
<label>Figure S13</label>
<caption>
<p>
<bold>Monte-Carlo statistics for ID1 showed significant synchronization in the right leg movements while a drumming pattern was played (100.0 BPM, Video S6), regardless of the synchronization index used.</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s013.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s014">
<label>Figure S14</label>
<caption>
<p>
<bold>Monte-Carlo statistics for ID25 showed significant synchronization in the left arm movements during the music condition “Everybody” (108.7 BPM, Video S4), regardless of the synchronization index used.</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s014.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s015">
<label>Table S1</label>
<caption>
<p>
<bold>Infant profiles and the number of movements synchronized to the musical beat during the music condition “Everybody” by Backstreet Boys and during the silent condition.</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s015.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s016">
<label>Table S2</label>
<caption>
<p>
<bold>Infant profiles and the number of movements synchronized to the musical beat during the music condition “Go Trippy” by WANICO feat. Jake Smith and during the silent condition.</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s016.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s017">
<label>Table S3</label>
<caption>
<p>
<bold>Correlation between age in days and the behavioral measures during the music condition “Everybody” by Backstreet Boys and during the silent condition.</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s017.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s018">
<label>Table S4</label>
<caption>
<p>
<bold>Correlation between age in days and the behavioral measures during the music condition “Go Trippy” by WANICO feat. Jake Smith and during the silent condition.</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s018.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s019">
<label>Table S5</label>
<caption>
<p>
<bold>Proportions of power spectrum density within 0.05–1, 1–2, and 2–3 Hz frequency ranges relative to the total power during the music condition “Everybody” by Backstreet Boys and the silent condition.</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s019.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s020">
<label>Table S6</label>
<caption>
<p>
<bold>Proportions of power spectrum density within 0.05–1, 1–2, and 2–3 Hz frequency ranges relative to the total power during the music condition “Go Trippy” by WANICO feat. Jake Smith and the silent condition.</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s020.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s021">
<label>Table S7</label>
<caption>
<p>
<bold>Detailed description of the 51 detected moving sections.</bold>
</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0097680.s021.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s022">
<label>Video S1</label>
<caption>
<p>
<bold>An excerpt from the recording of the silent condition in ID1.</bold>
</p>
<p>(MOV)</p>
</caption>
<media xlink:href="pone.0097680.s022.mov">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s023">
<label>Video S2</label>
<caption>
<p>
<bold>An excerpt from the recording of the silent condition in ID25.</bold>
</p>
<p>(MOV)</p>
</caption>
<media xlink:href="pone.0097680.s023.mov">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s024">
<label>Video S3</label>
<caption>
<p>
<bold>An excerpt from the recording of the music condition in ID1.</bold>
“Everybody” by Backstreet Boys (108.7 BPM) was played as an auditory stimulus.</p>
<p>(MP4)</p>
</caption>
<media xlink:href="pone.0097680.s024.mp4">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s025">
<label>Video S4</label>
<caption>
<p>
<bold>An excerpt from the recording of the music condition in ID25.</bold>
“Everybody” by Backstreet Boys (108.7 BPM) was played as an auditory stimulus.</p>
<p>(MP4)</p>
</caption>
<media xlink:href="pone.0097680.s025.mp4">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s026">
<label>Video S5</label>
<caption>
<p>
<bold>An excerpt from the recording of the music condition in ID25.</bold>
“Go Trippy” by WANICO feat. Jake Smith (130.0 BPM) was played as an auditory stimulus.</p>
<p>(MP4)</p>
</caption>
<media xlink:href="pone.0097680.s026.mp4">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0097680.s027">
<label>Video S6</label>
<caption>
<p>
<bold>An excerpt from the recording of the music condition in ID1.</bold>
A drumming pattern (100.0 BPM) was played as an auditory stimulus.</p>
<p>(MOV)</p>
</caption>
<media xlink:href="pone.0097680.s027.mov">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
</sec>
</body>
<back>
<ref-list>
<title>References</title>
<ref id="pone.0097680-Conard1">
<label>1</label>
<mixed-citation publication-type="journal">
<name>
<surname>Conard</surname>
<given-names>NJ</given-names>
</name>
,
<name>
<surname>Malina</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Munzel</surname>
<given-names>SC</given-names>
</name>
(
<year>2009</year>
)
<article-title>New flutes document the earliest musical tradition in southwestern Germany</article-title>
.
<source>Nature</source>
<volume>460</volume>
:
<fpage>737</fpage>
<lpage>740</lpage>
<pub-id pub-id-type="pmid">19553935</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Fitch1">
<label>2</label>
<mixed-citation publication-type="journal">
<name>
<surname>Fitch</surname>
<given-names>WT</given-names>
</name>
(
<year>2006</year>
)
<article-title>The biology and evolution of music: a comparative perspective</article-title>
.
<source>Cognition</source>
<volume>100</volume>
:
<fpage>173</fpage>
<lpage>215</lpage>
<pub-id pub-id-type="pmid">16412411</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Mithen1">
<label>3</label>
<mixed-citation publication-type="other">Mithen S (2005) The singing neanderthals: the origins of music, language, mind, and body: Weidenfeld & Nicolson.</mixed-citation>
</ref>
<ref id="pone.0097680-Darwin1">
<label>4</label>
<mixed-citation publication-type="other">Darwin C (1871) The Descent of Man, and Selection in Relation to Sex. London: John Murray.</mixed-citation>
</ref>
<ref id="pone.0097680-Wan1">
<label>5</label>
<mixed-citation publication-type="journal">
<name>
<surname>Wan</surname>
<given-names>CY</given-names>
</name>
,
<name>
<surname>Schlaug</surname>
<given-names>G</given-names>
</name>
(
<year>2010</year>
)
<article-title>Music making as a tool for promoting brain plasticity across the life span</article-title>
.
<source>Neuroscientist</source>
<volume>16</volume>
:
<fpage>566</fpage>
<lpage>577</lpage>
<pub-id pub-id-type="pmid">20889966</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Zatorre1">
<label>6</label>
<mixed-citation publication-type="journal">
<name>
<surname>Zatorre</surname>
<given-names>RJ</given-names>
</name>
,
<name>
<surname>Chen</surname>
<given-names>JL</given-names>
</name>
,
<name>
<surname>Penhune</surname>
<given-names>VB</given-names>
</name>
(
<year>2007</year>
)
<article-title>When the brain plays music: auditory-motor interactions in music perception and production</article-title>
.
<source>Nat Rev Neurosci</source>
<volume>8</volume>
:
<fpage>547</fpage>
<lpage>558</lpage>
<pub-id pub-id-type="pmid">17585307</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Hasegawa1">
<label>7</label>
<mixed-citation publication-type="journal">
<name>
<surname>Hasegawa</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Okanoya</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Hasegawa</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Seki</surname>
<given-names>Y</given-names>
</name>
(
<year>2011</year>
)
<article-title>Rhythmic synchronization tapping to an audio-visual metronome in budgerigars</article-title>
.
<source>Sci Rep</source>
<volume>1</volume>
:
<fpage>120</fpage>
<pub-id pub-id-type="pmid">22355637</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Patel1">
<label>8</label>
<mixed-citation publication-type="journal">
<name>
<surname>Patel</surname>
<given-names>AD</given-names>
</name>
,
<name>
<surname>Iversen</surname>
<given-names>JR</given-names>
</name>
,
<name>
<surname>Bregman</surname>
<given-names>MR</given-names>
</name>
,
<name>
<surname>Schulz</surname>
<given-names>I</given-names>
</name>
(
<year>2009</year>
)
<article-title>Experimental evidence for synchronization to a musical beat in a nonhuman animal</article-title>
.
<source>Curr Biol</source>
<volume>19</volume>
:
<fpage>827</fpage>
<lpage>830</lpage>
<pub-id pub-id-type="pmid">19409790</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Schachner1">
<label>9</label>
<mixed-citation publication-type="journal">
<name>
<surname>Schachner</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Brady</surname>
<given-names>TF</given-names>
</name>
,
<name>
<surname>Pepperberg</surname>
<given-names>IM</given-names>
</name>
,
<name>
<surname>Hauser</surname>
<given-names>MD</given-names>
</name>
(
<year>2009</year>
)
<article-title>Spontaneous motor entrainment to music in multiple vocal mimicking species</article-title>
.
<source>Curr Biol</source>
<volume>19</volume>
:
<fpage>831</fpage>
<lpage>836</lpage>
<pub-id pub-id-type="pmid">19409786</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Kirschner1">
<label>10</label>
<mixed-citation publication-type="journal">
<name>
<surname>Kirschner</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Tomasello</surname>
<given-names>M</given-names>
</name>
(
<year>2009</year>
)
<article-title>Joint drumming: social context facilitates synchronization in preschool children</article-title>
.
<source>J Exp Child Psychol</source>
<volume>102</volume>
:
<fpage>299</fpage>
<lpage>314</lpage>
<pub-id pub-id-type="pmid">18789454</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Zentner1">
<label>11</label>
<mixed-citation publication-type="journal">
<name>
<surname>Zentner</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Eerola</surname>
<given-names>T</given-names>
</name>
(
<year>2010</year>
)
<article-title>Rhythmic engagement with music in infancy</article-title>
.
<source>Proc Natl Acad Sci U S A</source>
<volume>107</volume>
:
<fpage>5768</fpage>
<lpage>5773</lpage>
<pub-id pub-id-type="pmid">20231438</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Stefanics1">
<label>12</label>
<mixed-citation publication-type="journal">
<name>
<surname>Stefanics</surname>
<given-names>G</given-names>
</name>
,
<name>
<surname>Haden</surname>
<given-names>GP</given-names>
</name>
,
<name>
<surname>Sziller</surname>
<given-names>I</given-names>
</name>
,
<name>
<surname>Balazs</surname>
<given-names>L</given-names>
</name>
,
<name>
<surname>Beke</surname>
<given-names>A</given-names>
</name>
,
<etal>et al</etal>
(
<year>2009</year>
)
<article-title>Newborn infants process pitch intervals</article-title>
.
<source>Clin Neurophysiol</source>
<volume>120</volume>
:
<fpage>304</fpage>
<lpage>308</lpage>
<pub-id pub-id-type="pmid">19131275</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Perani1">
<label>13</label>
<mixed-citation publication-type="journal">
<name>
<surname>Perani</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Saccuman</surname>
<given-names>MC</given-names>
</name>
,
<name>
<surname>Scifo</surname>
<given-names>P</given-names>
</name>
,
<name>
<surname>Spada</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Andreolli</surname>
<given-names>G</given-names>
</name>
,
<etal>et al</etal>
(
<year>2010</year>
)
<article-title>Functional specializations for music processing in the human newborn brain</article-title>
.
<source>Proc Natl Acad Sci U S A</source>
<volume>107</volume>
:
<fpage>4758</fpage>
<lpage>4763</lpage>
<pub-id pub-id-type="pmid">20176953</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Winkler1">
<label>14</label>
<mixed-citation publication-type="journal">
<name>
<surname>Winkler</surname>
<given-names>I</given-names>
</name>
,
<name>
<surname>Haden</surname>
<given-names>GP</given-names>
</name>
,
<name>
<surname>Ladinig</surname>
<given-names>O</given-names>
</name>
,
<name>
<surname>Sziller</surname>
<given-names>I</given-names>
</name>
,
<name>
<surname>Honing</surname>
<given-names>H</given-names>
</name>
(
<year>2009</year>
)
<article-title>Newborn infants detect the beat in music</article-title>
.
<source>Proc Natl Acad Sci U S A</source>
<volume>106</volume>
:
<fpage>2468</fpage>
<lpage>2471</lpage>
<pub-id pub-id-type="pmid">19171894</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Trainor1">
<label>15</label>
<mixed-citation publication-type="journal">
<name>
<surname>Trainor</surname>
<given-names>L</given-names>
</name>
,
<name>
<surname>Heinmiller</surname>
<given-names>B</given-names>
</name>
(
<year>1998</year>
)
<article-title>The development of evaluative responses to music: Infants prefer to listen to consonance over dissonance</article-title>
.
<source>Infant Behav Dev</source>
<volume>21</volume>
:
<fpage>77</fpage>
<lpage>88</lpage>
</mixed-citation>
</ref>
<ref id="pone.0097680-Trehub1">
<label>16</label>
<mixed-citation publication-type="journal">
<name>
<surname>Trehub</surname>
<given-names>SE</given-names>
</name>
,
<name>
<surname>Thorpe</surname>
<given-names>LA</given-names>
</name>
(
<year>1989</year>
)
<article-title>Infants’ perception of rhythm: categorization of auditory sequences by temporal structure</article-title>
.
<source>Can J Psychol</source>
<volume>43</volume>
:
<fpage>217</fpage>
<lpage>229</lpage>
<pub-id pub-id-type="pmid">2486496</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Hannon1">
<label>17</label>
<mixed-citation publication-type="journal">
<name>
<surname>Hannon</surname>
<given-names>EE</given-names>
</name>
,
<name>
<surname>Johnson</surname>
<given-names>SP</given-names>
</name>
(
<year>2005</year>
)
<article-title>Infants use meter to categorize rhythms and melodies: implications for musical structure learning</article-title>
.
<source>Cogn Psychol</source>
<volume>50</volume>
:
<fpage>354</fpage>
<lpage>377</lpage>
<pub-id pub-id-type="pmid">15893524</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-PhillipsSilver1">
<label>18</label>
<mixed-citation publication-type="journal">
<name>
<surname>Phillips-Silver</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Trainor</surname>
<given-names>LJ</given-names>
</name>
(
<year>2005</year>
)
<article-title>Feeling the beat: movement influences infant rhythm perception</article-title>
.
<source>Science</source>
<volume>308</volume>
:
<fpage>1430</fpage>
<pub-id pub-id-type="pmid">15933193</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Patel2">
<label>19</label>
<mixed-citation publication-type="other">Patel AD (2008) Music, Language, and the Brain. New York: Oxford University Press.</mixed-citation>
</ref>
<ref id="pone.0097680-Condon1">
<label>20</label>
<mixed-citation publication-type="journal">
<name>
<surname>Condon</surname>
<given-names>WS</given-names>
</name>
,
<name>
<surname>Sander</surname>
<given-names>LW</given-names>
</name>
(
<year>1974</year>
)
<article-title>Neonate movement is synchronized with adult speech: interactional participation and language acquisition</article-title>
.
<source>Science</source>
<volume>183</volume>
:
<fpage>99</fpage>
<lpage>101</lpage>
<pub-id pub-id-type="pmid">4808791</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Trehub2">
<label>21</label>
<mixed-citation publication-type="journal">
<name>
<surname>Trehub</surname>
<given-names>SE</given-names>
</name>
,
<name>
<surname>Trainor</surname>
<given-names>LJ</given-names>
</name>
,
<name>
<surname>Unyk</surname>
<given-names>AM</given-names>
</name>
(
<year>1993</year>
)
<article-title>Music and speech processing in the first year of life</article-title>
.
<source>Adv Child Dev Behav</source>
<volume>24</volume>
:
<fpage>1</fpage>
<lpage>35</lpage>
<pub-id pub-id-type="pmid">8447246</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Kotilahti1">
<label>22</label>
<mixed-citation publication-type="journal">
<name>
<surname>Kotilahti</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Nissila</surname>
<given-names>I</given-names>
</name>
,
<name>
<surname>Nasi</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Lipiainen</surname>
<given-names>L</given-names>
</name>
,
<name>
<surname>Noponen</surname>
<given-names>T</given-names>
</name>
,
<etal>et al</etal>
(
<year>2010</year>
)
<article-title>Hemodynamic responses to speech and music in newborn infants</article-title>
.
<source>Hum Brain Mapp</source>
<volume>31</volume>
:
<fpage>595</fpage>
<lpage>603</lpage>
<pub-id pub-id-type="pmid">19790172</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-HaddersAlgra1">
<label>23</label>
<mixed-citation publication-type="journal">
<name>
<surname>Hadders-Algra</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Prechtl</surname>
<given-names>HF</given-names>
</name>
(
<year>1992</year>
)
<article-title>Developmental course of general movements in early infancy. I. Descriptive analysis of change in form</article-title>
.
<source>Early Hum Dev</source>
<volume>28</volume>
:
<fpage>201</fpage>
<lpage>213</lpage>
<pub-id pub-id-type="pmid">1592005</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Prechtl1">
<label>24</label>
<mixed-citation publication-type="journal">
<name>
<surname>Prechtl</surname>
<given-names>HF</given-names>
</name>
,
<name>
<surname>Hopkins</surname>
<given-names>B</given-names>
</name>
(
<year>1986</year>
)
<article-title>Developmental transformations of spontaneous movements in early infancy</article-title>
.
<source>Early Hum Dev</source>
<volume>14</volume>
:
<fpage>233</fpage>
<lpage>238</lpage>
<pub-id pub-id-type="pmid">3803269</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Watanabe1">
<label>25</label>
<mixed-citation publication-type="journal">
<name>
<surname>Watanabe</surname>
<given-names>H</given-names>
</name>
,
<name>
<surname>Homae</surname>
<given-names>F</given-names>
</name>
,
<name>
<surname>Taga</surname>
<given-names>G</given-names>
</name>
(
<year>2011</year>
)
<article-title>Developmental emergence of self-referential and inhibition mechanisms of body movements underlying felicitous behaviors</article-title>
.
<source>J Exp Psychol Hum Percept Perform</source>
<volume>37</volume>
:
<fpage>1157</fpage>
<lpage>1173</lpage>
<pub-id pub-id-type="pmid">21500942</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Watanabe2">
<label>26</label>
<mixed-citation publication-type="journal">
<name>
<surname>Watanabe</surname>
<given-names>H</given-names>
</name>
,
<name>
<surname>Taga</surname>
<given-names>G</given-names>
</name>
(
<year>2009</year>
)
<article-title>Flexibility in infant actions during arm- and leg-based learning in a mobile paradigm</article-title>
.
<source>Infant Behav Dev</source>
<volume>32</volume>
:
<fpage>79</fpage>
<lpage>90</lpage>
<pub-id pub-id-type="pmid">19081637</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Chen1">
<label>27</label>
<mixed-citation publication-type="journal">
<name>
<surname>Chen</surname>
<given-names>JL</given-names>
</name>
,
<name>
<surname>Zatorre</surname>
<given-names>RJ</given-names>
</name>
,
<name>
<surname>Penhune</surname>
<given-names>VB</given-names>
</name>
(
<year>2006</year>
)
<article-title>Interactions between auditory and dorsal premotor cortex during synchronization to musical rhythms</article-title>
.
<source>Neuroimage</source>
<volume>32</volume>
:
<fpage>1771</fpage>
<lpage>1781</lpage>
<pub-id pub-id-type="pmid">16777432</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Fujioka1">
<label>28</label>
<mixed-citation publication-type="journal">
<name>
<surname>Fujioka</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Zendel</surname>
<given-names>BR</given-names>
</name>
,
<name>
<surname>Ross</surname>
<given-names>B</given-names>
</name>
(
<year>2010</year>
)
<article-title>Endogenous neuromagnetic activity for mental hierarchy of timing</article-title>
.
<source>J Neurosci</source>
<volume>30</volume>
:
<fpage>3458</fpage>
<lpage>3466</lpage>
<pub-id pub-id-type="pmid">20203205</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Grahn1">
<label>29</label>
<mixed-citation publication-type="journal">
<name>
<surname>Grahn</surname>
<given-names>JA</given-names>
</name>
,
<name>
<surname>Rowe</surname>
<given-names>JB</given-names>
</name>
(
<year>2009</year>
)
<article-title>Feeling the beat: premotor and striatal interactions in musicians and nonmusicians during beat perception</article-title>
.
<source>J Neurosci</source>
<volume>29</volume>
:
<fpage>7540</fpage>
<lpage>7548</lpage>
<pub-id pub-id-type="pmid">19515922</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Nozaradan1">
<label>30</label>
<mixed-citation publication-type="journal">
<name>
<surname>Nozaradan</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Peretz</surname>
<given-names>I</given-names>
</name>
,
<name>
<surname>Missal</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Mouraux</surname>
<given-names>A</given-names>
</name>
(
<year>2011</year>
)
<article-title>Tagging the neuronal entrainment to beat and meter</article-title>
.
<source>J Neurosci</source>
<volume>31</volume>
:
<fpage>10234</fpage>
<lpage>10240</lpage>
<pub-id pub-id-type="pmid">21753000</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Kung1">
<label>31</label>
<mixed-citation publication-type="journal">
<name>
<surname>Kung</surname>
<given-names>SJ</given-names>
</name>
,
<name>
<surname>Chen</surname>
<given-names>JL</given-names>
</name>
,
<name>
<surname>Zatorre</surname>
<given-names>RJ</given-names>
</name>
,
<name>
<surname>Penhune</surname>
<given-names>VB</given-names>
</name>
(
<year>2013</year>
)
<article-title>Interacting cortical and basal ganglia networks underlying finding and tapping to the musical beat</article-title>
.
<source>J Cogn Neurosci</source>
<volume>25</volume>
:
<fpage>401</fpage>
<lpage>420</lpage>
<pub-id pub-id-type="pmid">23163420</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-PhillipsSilver2">
<label>32</label>
<mixed-citation publication-type="journal">
<name>
<surname>Phillips-Silver</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Aktipis</surname>
<given-names>CA</given-names>
</name>
,
<name>
<surname>Bryant</surname>
<given-names>GA</given-names>
</name>
(
<year>2010</year>
)
<article-title>The ecology of entrainment: Foundations of coordinated rhythmic movement</article-title>
.
<source>Music Percept</source>
<volume>28</volume>
:
<fpage>3</fpage>
<lpage>14</lpage>
<pub-id pub-id-type="pmid">21776183</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Taga1">
<label>33</label>
<mixed-citation publication-type="journal">
<name>
<surname>Taga</surname>
<given-names>G</given-names>
</name>
,
<name>
<surname>Yamaguchi</surname>
<given-names>Y</given-names>
</name>
,
<name>
<surname>Shimizu</surname>
<given-names>H</given-names>
</name>
(
<year>1991</year>
)
<article-title>Self-organized control of bipedal locomotion by neural oscillators in unpredictable environment</article-title>
.
<source>Biol Cybern</source>
<volume>65</volume>
:
<fpage>147</fpage>
<lpage>159</lpage>
<pub-id pub-id-type="pmid">1912008</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Hattori1">
<label>34</label>
<mixed-citation publication-type="journal">
<name>
<surname>Hattori</surname>
<given-names>Y</given-names>
</name>
,
<name>
<surname>Tomonaga</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Matsuzawa</surname>
<given-names>T</given-names>
</name>
(
<year>2013</year>
)
<article-title>Spontaneous synchronized tapping to an auditory rhythm in a chimpanzee</article-title>
.
<source>Sci Rep</source>
<volume>3</volume>
:
<fpage>1566</fpage>
<pub-id pub-id-type="pmid">23535698</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Georgopoulos1">
<label>35</label>
<mixed-citation publication-type="journal">
<name>
<surname>Georgopoulos</surname>
<given-names>AP</given-names>
</name>
,
<name>
<surname>Grillner</surname>
<given-names>S</given-names>
</name>
(
<year>1989</year>
)
<article-title>Visuomotor coordination in reaching and locomotion</article-title>
.
<source>Science</source>
<volume>245</volume>
:
<fpage>1209</fpage>
<lpage>1210</lpage>
<pub-id pub-id-type="pmid">2675307</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Grillner1">
<label>36</label>
<mixed-citation publication-type="journal">
<name>
<surname>Grillner</surname>
<given-names>S</given-names>
</name>
(
<year>1985</year>
)
<article-title>Neurobiological bases of rhythmic motor acts in vertebrates</article-title>
.
<source>Science</source>
<volume>228</volume>
:
<fpage>143</fpage>
<lpage>149</lpage>
<pub-id pub-id-type="pmid">3975635</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Michel1">
<label>37</label>
<mixed-citation publication-type="journal">
<name>
<surname>Michel</surname>
<given-names>GF</given-names>
</name>
,
<name>
<surname>Harkins</surname>
<given-names>DA</given-names>
</name>
(
<year>1986</year>
)
<article-title>Postural and lateral asymmetries in the ontogeny of handedness during infancy</article-title>
.
<source>Dev Psychobiol</source>
<volume>19</volume>
:
<fpage>247</fpage>
<lpage>258</lpage>
<pub-id pub-id-type="pmid">3709979</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Sun1">
<label>38</label>
<mixed-citation publication-type="journal">
<name>
<surname>Sun</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Walsh</surname>
<given-names>CA</given-names>
</name>
(
<year>2006</year>
)
<article-title>Molecular approaches to brain asymmetry and handedness</article-title>
.
<source>Nat Rev Neurosci</source>
<volume>7</volume>
:
<fpage>655</fpage>
<lpage>662</lpage>
<pub-id pub-id-type="pmid">16858393</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Masataka1">
<label>39</label>
<mixed-citation publication-type="journal">
<name>
<surname>Masataka</surname>
<given-names>N</given-names>
</name>
(
<year>2007</year>
)
<article-title>Music, evolution and language</article-title>
.
<source>Dev Sci</source>
<volume>10</volume>
:
<fpage>35</fpage>
<lpage>39</lpage>
<pub-id pub-id-type="pmid">17181697</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Fitch2">
<label>40</label>
<mixed-citation publication-type="journal">
<name>
<surname>Fitch</surname>
<given-names>WT</given-names>
</name>
(
<year>2000</year>
)
<article-title>The evolution of speech: a comparative review</article-title>
.
<source>Trends Cogn Sci</source>
<volume>4</volume>
:
<fpage>258</fpage>
<lpage>267</lpage>
<pub-id pub-id-type="pmid">10859570</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Kato1">
<label>41</label>
<mixed-citation publication-type="journal">
<name>
<surname>Kato</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Watanabe</surname>
<given-names>H</given-names>
</name>
,
<name>
<surname>Taga</surname>
<given-names>G</given-names>
</name>
(
<year>2013</year>
)
<article-title>Diversity and changeability of infant movements in a novel environment</article-title>
.
<source>Journal of Motor Learning and Development</source>
<volume>1</volume>
:
<fpage>79</fpage>
<lpage>88</lpage>
</mixed-citation>
</ref>
<ref id="pone.0097680-Kuhl1">
<label>42</label>
<mixed-citation publication-type="journal">
<name>
<surname>Kuhl</surname>
<given-names>PK</given-names>
</name>
,
<name>
<surname>Meltzoff</surname>
<given-names>AN</given-names>
</name>
(
<year>1996</year>
)
<article-title>Infant vocalizations in response to speech: vocal imitation and developmental change</article-title>
.
<source>J Acoust Soc Am</source>
<volume>100</volume>
:
<fpage>2425</fpage>
<lpage>2438</lpage>
<pub-id pub-id-type="pmid">8865648</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Fujii1">
<label>43</label>
<mixed-citation publication-type="journal">
<name>
<surname>Fujii</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Schlaug</surname>
<given-names>G</given-names>
</name>
(
<year>2013</year>
)
<article-title>The Harvard Beat Assessment Test (H-BAT): a battery for assessing beat perception and production and their dissociation</article-title>
.
<source>Front Hum Neurosci</source>
<volume>7</volume>
:
<fpage>771</fpage>
<pub-id pub-id-type="pmid">24324421</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Tass1">
<label>44</label>
<mixed-citation publication-type="journal">
<name>
<surname>Tass</surname>
<given-names>P</given-names>
</name>
,
<name>
<surname>Rosenblum</surname>
<given-names>MG</given-names>
</name>
,
<name>
<surname>Weule</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Kurths</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Pikovsky</surname>
<given-names>A</given-names>
</name>
,
<etal>et al</etal>
(
<year>1998</year>
)
<article-title>Detection of n:m phase locking from noisy data: application to magnetoencephalography</article-title>
.
<source>Phys Rev Lett</source>
<volume>81</volume>
:
<fpage>3291</fpage>
<lpage>3294</lpage>
</mixed-citation>
</ref>
<ref id="pone.0097680-Prichard1">
<label>45</label>
<mixed-citation publication-type="journal">
<name>
<surname>Prichard</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Theiler</surname>
<given-names>J</given-names>
</name>
(
<year>1994</year>
)
<article-title>Generating surrogate data for time series with several simultaneously measured variables</article-title>
.
<source>Phys Rev Lett</source>
<volume>73</volume>
:
<fpage>951</fpage>
<lpage>954</lpage>
<pub-id pub-id-type="pmid">10057582</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Stefanics2">
<label>46</label>
<mixed-citation publication-type="journal">
<name>
<surname>Stefanics</surname>
<given-names>G</given-names>
</name>
,
<name>
<surname>Hangya</surname>
<given-names>B</given-names>
</name>
,
<name>
<surname>Hernadi</surname>
<given-names>I</given-names>
</name>
,
<name>
<surname>Winkler</surname>
<given-names>I</given-names>
</name>
,
<name>
<surname>Lakatos</surname>
<given-names>P</given-names>
</name>
,
<etal>et al</etal>
(
<year>2010</year>
)
<article-title>Phase entrainment of human delta oscillations can mediate the effects of expectation on reaction speed</article-title>
.
<source>J Neurosci</source>
<volume>30</volume>
:
<fpage>13578</fpage>
<lpage>13585</lpage>
<pub-id pub-id-type="pmid">20943899</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Patel3">
<label>47</label>
<mixed-citation publication-type="journal">
<name>
<surname>Patel</surname>
<given-names>AD</given-names>
</name>
,
<name>
<surname>Iversen</surname>
<given-names>JR</given-names>
</name>
,
<name>
<surname>Bregman</surname>
<given-names>MR</given-names>
</name>
,
<name>
<surname>Schulz</surname>
<given-names>I</given-names>
</name>
(
<year>2009</year>
)
<article-title>Studying synchronization to a musical beat in nonhuman animals</article-title>
.
<source>Ann N Y Acad Sci</source>
<volume>1169</volume>
:
<fpage>459</fpage>
<lpage>469</lpage>
<pub-id pub-id-type="pmid">19673824</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Dominici1">
<label>48</label>
<mixed-citation publication-type="journal">
<name>
<surname>Dominici</surname>
<given-names>N</given-names>
</name>
,
<name>
<surname>Ivanenko</surname>
<given-names>YP</given-names>
</name>
,
<name>
<surname>Cappellini</surname>
<given-names>G</given-names>
</name>
,
<name>
<surname>d’Avella</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Mondi</surname>
<given-names>V</given-names>
</name>
,
<etal>et al</etal>
(
<year>2011</year>
)
<article-title>Locomotor primitives in newborn babies and their development</article-title>
.
<source>Science</source>
<volume>334</volume>
:
<fpage>997</fpage>
<lpage>999</lpage>
<pub-id pub-id-type="pmid">22096202</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Stupacher1">
<label>49</label>
<mixed-citation publication-type="journal">
<name>
<surname>Stupacher</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Hove</surname>
<given-names>MJ</given-names>
</name>
,
<name>
<surname>Novembre</surname>
<given-names>G</given-names>
</name>
,
<name>
<surname>Schutz-Bosbach</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Keller</surname>
<given-names>PE</given-names>
</name>
(
<year>2013</year>
)
<article-title>Musical groove modulates motor cortex excitability: a TMS investigation</article-title>
.
<source>Brain Cogn</source>
<volume>82</volume>
:
<fpage>127</fpage>
<lpage>136</lpage>
<pub-id pub-id-type="pmid">23660433</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Tierney1">
<label>50</label>
<mixed-citation publication-type="journal">
<name>
<surname>Tierney</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Kraus</surname>
<given-names>N</given-names>
</name>
(
<year>2013</year>
)
<article-title>The ability to move to a beat is linked to the consistency of neural responses to sound</article-title>
.
<source>J Neurosci</source>
<volume>33</volume>
:
<fpage>14981</fpage>
<lpage>14988</lpage>
<pub-id pub-id-type="pmid">24048827</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Masataka2">
<label>51</label>
<mixed-citation publication-type="journal">
<name>
<surname>Masataka</surname>
<given-names>N</given-names>
</name>
(
<year>2009</year>
)
<article-title>The origins of language and the evolution of music: A comparative perspective</article-title>
.
<source>Phys Life Rev</source>
<volume>6</volume>
:
<fpage>11</fpage>
<lpage>22</lpage>
<pub-id pub-id-type="pmid">22537940</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Patel4">
<label>52</label>
<mixed-citation publication-type="journal">
<name>
<surname>Patel</surname>
<given-names>AD</given-names>
</name>
(
<year>2011</year>
)
<article-title>Why would Musical Training Benefit the Neural Encoding of Speech? The OPERA Hypothesis</article-title>
.
<source>Front Psychol</source>
<volume>2</volume>
:
<fpage>142</fpage>
<pub-id pub-id-type="pmid">21747773</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Meltzoff1">
<label>53</label>
<mixed-citation publication-type="journal">
<name>
<surname>Meltzoff</surname>
<given-names>AN</given-names>
</name>
,
<name>
<surname>Moore</surname>
<given-names>MK</given-names>
</name>
(
<year>1977</year>
)
<article-title>Imitation of facial and manual gestures by human neonates</article-title>
.
<source>Science</source>
<volume>198</volume>
:
<fpage>75</fpage>
<lpage>78</lpage>
<pub-id pub-id-type="pmid">17741897</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Ellis1">
<label>54</label>
<mixed-citation publication-type="journal">
<name>
<surname>Ellis</surname>
<given-names>D</given-names>
</name>
(
<year>2007</year>
)
<article-title>Beat tracking by dynamic programming</article-title>
.
<source>J New Music Research</source>
<volume>36</volume>
:
<fpage>51</fpage>
<lpage>60</lpage>
</mixed-citation>
</ref>
<ref id="pone.0097680-Music1">
<label>55</label>
<mixed-citation publication-type="other">Music Audio Tempo Estimation and Beat Tracking. Available:
<ext-link ext-link-type="uri" xlink:href="http://labrosa.ee.columbia.edu/projects/beattrack/">http://labrosa.ee.columbia.edu/projects/beattrack/</ext-link>
Accessed 2014 April 25.</mixed-citation>
</ref>
<ref id="pone.0097680-Shannon1">
<label>56</label>
<mixed-citation publication-type="other">Shannon CE (1948) A mathematical theory of communication. Bell System Tech J: 379–423.</mixed-citation>
</ref>
<ref id="pone.0097680-Mase1">
<label>57</label>
<mixed-citation publication-type="journal">
<name>
<surname>Mase</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Faes</surname>
<given-names>L</given-names>
</name>
,
<name>
<surname>Antolini</surname>
<given-names>R</given-names>
</name>
,
<name>
<surname>Scaglione</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Ravelli</surname>
<given-names>F</given-names>
</name>
(
<year>2005</year>
)
<article-title>Quantification of synchronization during atrial fibrillation by Shannon entropy: validation in patients and computer model of atrial arrhythmias</article-title>
.
<source>Physiol Meas</source>
<volume>26</volume>
:
<fpage>911</fpage>
<lpage>923</lpage>
<pub-id pub-id-type="pmid">16311441</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0097680-Kawahara1">
<label>58</label>
<mixed-citation publication-type="journal">
<name>
<surname>Kawahara</surname>
<given-names>H</given-names>
</name>
,
<name>
<surname>Masuda-Katsuse</surname>
<given-names>I</given-names>
</name>
,
<name>
<surname>de Cheveigne</surname>
<given-names>A</given-names>
</name>
(
<year>1999</year>
)
<article-title>Restructuring speech representations using a pitch-adaptive time-frequency smoothing and an instantaneous-frequency-based F0 extraction: Possible role of a repetitive structure in sounds</article-title>
.
<source>Speech Communication</source>
<volume>27</volume>
:
<fpage>187</fpage>
<lpage>207</lpage>
</mixed-citation>
</ref>
<ref id="pone.0097680-STRAIGHT1">
<label>59</label>
<mixed-citation publication-type="other">STRAIGHT, a speech analysis, modification and synthesis system. Available:
<ext-link ext-link-type="uri" xlink:href="http://www.wakayama-u.ac.jp/~kawahara/STRAIGHTadv/index_e.html">http://www.wakayama-u.ac.jp/~kawahara/STRAIGHTadv/index_e.html</ext-link>
Accessed 2014 April 25.</mixed-citation>
</ref>
<ref id="pone.0097680-Boersma1">
<label>60</label>
<mixed-citation publication-type="other">Boersma P, Weenink D (2011) Praat: doing phonetics by computer [Computer program]. Ver. 5.2.35. Available:
<ext-link ext-link-type="uri" xlink:href="http://www.praat.org/">http://www.praat.org/</ext-link>
Accessed 2014 April 25.</mixed-citation>
</ref>
</ref-list>
</back>
</pmc>
<affiliations>
<list>
<country>
<li>Canada</li>
<li>Japon</li>
</country>
<settlement>
<li>Tokyo</li>
</settlement>
</list>
<tree>
<country name="Canada">
<noRegion>
<name sortKey="Fujii, Shinya" sort="Fujii, Shinya" uniqKey="Fujii S" first="Shinya" last="Fujii">Shinya Fujii</name>
</noRegion>
</country>
<country name="Japon">
<noRegion>
<name sortKey="Fujii, Shinya" sort="Fujii, Shinya" uniqKey="Fujii S" first="Shinya" last="Fujii">Shinya Fujii</name>
</noRegion>
<name sortKey="Hirashima, Masaya" sort="Hirashima, Masaya" uniqKey="Hirashima M" first="Masaya" last="Hirashima">Masaya Hirashima</name>
<name sortKey="Nozaki, Daichi" sort="Nozaki, Daichi" uniqKey="Nozaki D" first="Daichi" last="Nozaki">Daichi Nozaki</name>
<name sortKey="Oohashi, Hiroki" sort="Oohashi, Hiroki" uniqKey="Oohashi H" first="Hiroki" last="Oohashi">Hiroki Oohashi</name>
<name sortKey="Oohashi, Hiroki" sort="Oohashi, Hiroki" uniqKey="Oohashi H" first="Hiroki" last="Oohashi">Hiroki Oohashi</name>
<name sortKey="Taga, Gentaro" sort="Taga, Gentaro" uniqKey="Taga G" first="Gentaro" last="Taga">Gentaro Taga</name>
<name sortKey="Watanabe, Hama" sort="Watanabe, Hama" uniqKey="Watanabe H" first="Hama" last="Watanabe">Hama Watanabe</name>
</country>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Wicri/Musique/explor/OperaV1/Data/Pmc/Checkpoint
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000105 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Checkpoint/biblio.hfd -nk 000105 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Wicri/Musique
   |area=    OperaV1
   |flux=    Pmc
   |étape=   Checkpoint
   |type=    RBID
   |clé=     PMC:4023986
   |texte=   Precursors of Dancing and Singing to Music in Three- to Four-Months-Old Infants
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Checkpoint/RBID.i   -Sk "pubmed:24837135" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Checkpoint/biblio.hfd   \
       | NlmPubMed2Wicri -a OperaV1 

Wicri

This area was generated with Dilib version V0.6.21.
Data generation: Thu Apr 14 14:59:05 2016. Site generation: Thu Jan 4 23:09:23 2024