Exploration server on music in Saarland

Warning: this site is under development!
Warning: this site is generated by computational means from raw corpora.
The information has therefore not been validated.

Internal identifier: 000142 (Pmc/Corpus); previous: 000141; next: 000143



The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Influences of Rhythm- and Timbre-Related Musical Features on Characteristics of Music-Induced Movement</title>
<author>
<name sortKey="Burger, Birgitta" sort="Burger, Birgitta" uniqKey="Burger B" first="Birgitta" last="Burger">Birgitta Burger</name>
<affiliation>
<nlm:aff id="aff1">
<institution>Department of Music, Finnish Centre of Excellence in Interdisciplinary Music Research, University of Jyväskylä</institution>
<country>Jyväskylä, Finland</country>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Thompson, Marc R" sort="Thompson, Marc R" uniqKey="Thompson M" first="Marc R." last="Thompson">Marc R. Thompson</name>
<affiliation>
<nlm:aff id="aff1">
<institution>Department of Music, Finnish Centre of Excellence in Interdisciplinary Music Research, University of Jyväskylä</institution>
<country>Jyväskylä, Finland</country>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Luck, Geoff" sort="Luck, Geoff" uniqKey="Luck G" first="Geoff" last="Luck">Geoff Luck</name>
<affiliation>
<nlm:aff id="aff1">
<institution>Department of Music, Finnish Centre of Excellence in Interdisciplinary Music Research, University of Jyväskylä</institution>
<country>Jyväskylä, Finland</country>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Saarikallio, Suvi" sort="Saarikallio, Suvi" uniqKey="Saarikallio S" first="Suvi" last="Saarikallio">Suvi Saarikallio</name>
<affiliation>
<nlm:aff id="aff1">
<institution>Department of Music, Finnish Centre of Excellence in Interdisciplinary Music Research, University of Jyväskylä</institution>
<country>Jyväskylä, Finland</country>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Toiviainen, Petri" sort="Toiviainen, Petri" uniqKey="Toiviainen P" first="Petri" last="Toiviainen">Petri Toiviainen</name>
<affiliation>
<nlm:aff id="aff1">
<institution>Department of Music, Finnish Centre of Excellence in Interdisciplinary Music Research, University of Jyväskylä</institution>
<country>Jyväskylä, Finland</country>
</nlm:aff>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">23641220</idno>
<idno type="pmc">3624091</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3624091</idno>
<idno type="RBID">PMC:3624091</idno>
<idno type="doi">10.3389/fpsyg.2013.00183</idno>
<date when="2013">2013</date>
<idno type="wicri:Area/Pmc/Corpus">000142</idno>
<idno type="wicri:explorRef" wicri:stream="Pmc" wicri:step="Corpus" wicri:corpus="PMC">000142</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Influences of Rhythm- and Timbre-Related Musical Features on Characteristics of Music-Induced Movement</title>
<author>
<name sortKey="Burger, Birgitta" sort="Burger, Birgitta" uniqKey="Burger B" first="Birgitta" last="Burger">Birgitta Burger</name>
<affiliation>
<nlm:aff id="aff1">
<institution>Department of Music, Finnish Centre of Excellence in Interdisciplinary Music Research, University of Jyväskylä</institution>
<country>Jyväskylä, Finland</country>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Thompson, Marc R" sort="Thompson, Marc R" uniqKey="Thompson M" first="Marc R." last="Thompson">Marc R. Thompson</name>
<affiliation>
<nlm:aff id="aff1">
<institution>Department of Music, Finnish Centre of Excellence in Interdisciplinary Music Research, University of Jyväskylä</institution>
<country>Jyväskylä, Finland</country>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Luck, Geoff" sort="Luck, Geoff" uniqKey="Luck G" first="Geoff" last="Luck">Geoff Luck</name>
<affiliation>
<nlm:aff id="aff1">
<institution>Department of Music, Finnish Centre of Excellence in Interdisciplinary Music Research, University of Jyväskylä</institution>
<country>Jyväskylä, Finland</country>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Saarikallio, Suvi" sort="Saarikallio, Suvi" uniqKey="Saarikallio S" first="Suvi" last="Saarikallio">Suvi Saarikallio</name>
<affiliation>
<nlm:aff id="aff1">
<institution>Department of Music, Finnish Centre of Excellence in Interdisciplinary Music Research, University of Jyväskylä</institution>
<country>Jyväskylä, Finland</country>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Toiviainen, Petri" sort="Toiviainen, Petri" uniqKey="Toiviainen P" first="Petri" last="Toiviainen">Petri Toiviainen</name>
<affiliation>
<nlm:aff id="aff1">
<institution>Department of Music, Finnish Centre of Excellence in Interdisciplinary Music Research, University of Jyväskylä</institution>
<country>Jyväskylä, Finland</country>
</nlm:aff>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Frontiers in Psychology</title>
<idno type="eISSN">1664-1078</idno>
<imprint>
<date when="2013">2013</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>Music makes us move. Several factors can affect the characteristics of such movements, including individual factors or musical features. For this study, we investigated the effect of rhythm- and timbre-related musical features as well as tempo on movement characteristics. Sixty participants were presented with 30 musical stimuli representing different styles of popular music, and instructed to move along with the music. Optical motion capture was used to record participants’ movements. Subsequently, eight movement features and four rhythm- and timbre-related musical features were computationally extracted from the data, while the tempo was assessed in a perceptual experiment. A subsequent correlational analysis revealed that, for instance, clear pulses seemed to be embodied with the whole body, i.e., by using various movement types of different body parts, whereas spectral flux and percussiveness were found to be more distinctly related to certain body parts, such as head and hand movement. A series of ANOVAs with the stimuli being divided into three groups of five stimuli each based on the tempo revealed no significant differences between the groups, suggesting that the tempo of our stimulus set failed to have an effect on the movement features. In general, the results can be linked to the framework of embodied music cognition, as they show that body movements are used to reflect, imitate, and predict musical characteristics.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Alluri, V" uniqKey="Alluri V">V. Alluri</name>
</author>
<author>
<name sortKey="Toiviainen, P" uniqKey="Toiviainen P">P. Toiviainen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Arom, S" uniqKey="Arom S">S. Arom</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bengtsson, S L" uniqKey="Bengtsson S">S. L. Bengtsson</name>
</author>
<author>
<name sortKey="Ullen, F" uniqKey="Ullen F">F. Ullén</name>
</author>
<author>
<name sortKey="Ehrsson, H H" uniqKey="Ehrsson H">H. H. Ehrsson</name>
</author>
<author>
<name sortKey="Hashimoto, T" uniqKey="Hashimoto T">T. Hashimoto</name>
</author>
<author>
<name sortKey="Kito, T" uniqKey="Kito T">T. Kito</name>
</author>
<author>
<name sortKey="Naito, E" uniqKey="Naito E">E. Naito</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Brown, S" uniqKey="Brown S">S. Brown</name>
</author>
<author>
<name sortKey="Merker, B" uniqKey="Merker B">B. Merker</name>
</author>
<author>
<name sortKey="Wallin, N L" uniqKey="Wallin N">N. L. Wallin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Camurri, A" uniqKey="Camurri A">A. Camurri</name>
</author>
<author>
<name sortKey="Lagerlof, I" uniqKey="Lagerlof I">I. Lagerlöf</name>
</author>
<author>
<name sortKey="Volpe, G" uniqKey="Volpe G">G. Volpe</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Camurri, A" uniqKey="Camurri A">A. Camurri</name>
</author>
<author>
<name sortKey="Mazzarino, B" uniqKey="Mazzarino B">B. Mazzarino</name>
</author>
<author>
<name sortKey="Ricchetti, M" uniqKey="Ricchetti M">M. Ricchetti</name>
</author>
<author>
<name sortKey="Timmers, R" uniqKey="Timmers R">R. Timmers</name>
</author>
<author>
<name sortKey="Volpe, G" uniqKey="Volpe G">G. Volpe</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Chen, J L" uniqKey="Chen J">J. L. Chen</name>
</author>
<author>
<name sortKey="Penhune, V B" uniqKey="Penhune V">V. B. Penhune</name>
</author>
<author>
<name sortKey="Zatorre, R J" uniqKey="Zatorre R">R. J. Zatorre</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Collyer, C E" uniqKey="Collyer C">C. E. Collyer</name>
</author>
<author>
<name sortKey="Broadbent, H A" uniqKey="Broadbent H">H. A. Broadbent</name>
</author>
<author>
<name sortKey="Church, R M" uniqKey="Church R">R. M. Church</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cross, I" uniqKey="Cross I">I. Cross</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Desmond, J C" uniqKey="Desmond J">J. C. Desmond</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Eerola, T" uniqKey="Eerola T">T. Eerola</name>
</author>
<author>
<name sortKey="Luck, G" uniqKey="Luck G">G. Luck</name>
</author>
<author>
<name sortKey="Toiviainen, P" uniqKey="Toiviainen P">P. Toiviainen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fraisse, P" uniqKey="Fraisse P">P. Fraisse</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="God Y, R I" uniqKey="God Y R">R. I. Godøy</name>
</author>
<author>
<name sortKey="Haga, E" uniqKey="Haga E">E. Haga</name>
</author>
<author>
<name sortKey="Jensenius, A R" uniqKey="Jensenius A">A. R. Jensenius</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Grahn, J A" uniqKey="Grahn J">J. A. Grahn</name>
</author>
<author>
<name sortKey="Brett, M" uniqKey="Brett M">M. Brett</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Grahn, J A" uniqKey="Grahn J">J. A. Grahn</name>
</author>
<author>
<name sortKey="Rowe, J B" uniqKey="Rowe J">J. B. Rowe</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hadar, U" uniqKey="Hadar U">U. Hadar</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Janata, P" uniqKey="Janata P">P. Janata</name>
</author>
<author>
<name sortKey="Tomic, S T" uniqKey="Tomic S">S. T. Tomic</name>
</author>
<author>
<name sortKey="Haberman, J M" uniqKey="Haberman J">J. M. Haberman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jensenius, A R" uniqKey="Jensenius A">A. R. Jensenius</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Keller, P" uniqKey="Keller P">P. Keller</name>
</author>
<author>
<name sortKey="Rieger, M" uniqKey="Rieger M">M. Rieger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lakoff, G" uniqKey="Lakoff G">G. Lakoff</name>
</author>
<author>
<name sortKey="Johnson, M" uniqKey="Johnson M">M. Johnson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lakoff, G" uniqKey="Lakoff G">G. Lakoff</name>
</author>
<author>
<name sortKey="Johnson, M" uniqKey="Johnson M">M. Johnson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lartillot, O" uniqKey="Lartillot O">O. Lartillot</name>
</author>
<author>
<name sortKey="Eerola, T" uniqKey="Eerola T">T. Eerola</name>
</author>
<author>
<name sortKey="Toiviainen, P" uniqKey="Toiviainen P">P. Toiviainen</name>
</author>
<author>
<name sortKey="Fornari, J" uniqKey="Fornari J">J. Fornari</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lartillot, O" uniqKey="Lartillot O">O. Lartillot</name>
</author>
<author>
<name sortKey="Toiviainen, P" uniqKey="Toiviainen P">P. Toiviainen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Leman, M" uniqKey="Leman M">M. Leman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Leman, M" uniqKey="Leman M">M. Leman</name>
</author>
<author>
<name sortKey="God Y, R I" uniqKey="God Y R">R. I. Godøy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Leman, M" uniqKey="Leman M">M. Leman</name>
</author>
<author>
<name sortKey="Naveda, L" uniqKey="Naveda L">L. Naveda</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lesaffre, M" uniqKey="Lesaffre M">M. Lesaffre</name>
</author>
<author>
<name sortKey="De Voogdt, L" uniqKey="De Voogdt L">L. De Voogdt</name>
</author>
<author>
<name sortKey="Leman, M" uniqKey="Leman M">M. Leman</name>
</author>
<author>
<name sortKey="De Baets, B" uniqKey="De Baets B">B. De Baets</name>
</author>
<author>
<name sortKey="De Meyer, H" uniqKey="De Meyer H">H. De Meyer</name>
</author>
<author>
<name sortKey="Martens, J P" uniqKey="Martens J">J.-P. Martens</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Luck, G" uniqKey="Luck G">G. Luck</name>
</author>
<author>
<name sortKey="Saarikallio, S" uniqKey="Saarikallio S">S. Saarikallio</name>
</author>
<author>
<name sortKey="Burger, B" uniqKey="Burger B">B. Burger</name>
</author>
<author>
<name sortKey="Thompson, M R" uniqKey="Thompson M">M. R. Thompson</name>
</author>
<author>
<name sortKey="Toiviainen, P" uniqKey="Toiviainen P">P. Toiviainen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Macdougall, H G" uniqKey="Macdougall H">H. G. MacDougall</name>
</author>
<author>
<name sortKey="Moore, S T" uniqKey="Moore S">S. T. Moore</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Madison, G" uniqKey="Madison G">G. Madison</name>
</author>
<author>
<name sortKey="Gouyon, F" uniqKey="Gouyon F">F. Gouyon</name>
</author>
<author>
<name sortKey="Ullen, F" uniqKey="Ullen F">F. Ullén</name>
</author>
<author>
<name sortKey="Hornstrom, K" uniqKey="Hornstrom K">K. Hörnström</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Moelants, D" uniqKey="Moelants D">D. Moelants</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Murray, M P" uniqKey="Murray M">M. P. Murray</name>
</author>
<author>
<name sortKey="Drought, A B" uniqKey="Drought A">A. B. Drought</name>
</author>
<author>
<name sortKey="Kory, R C" uniqKey="Kory R">R. C. Kory</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Naveda, L" uniqKey="Naveda L">L. Naveda</name>
</author>
<author>
<name sortKey="Leman, M" uniqKey="Leman M">M. Leman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Nettl, B" uniqKey="Nettl B">B. Nettl</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pampalk, E" uniqKey="Pampalk E">E. Pampalk</name>
</author>
<author>
<name sortKey="Rauber, A" uniqKey="Rauber A">A. Rauber</name>
</author>
<author>
<name sortKey="Merkl, D" uniqKey="Merkl D">D. Merkl</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Parncutt, R" uniqKey="Parncutt R">R. Parncutt</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Phillips Silver, J" uniqKey="Phillips Silver J">J. Phillips-Silver</name>
</author>
<author>
<name sortKey="Trainor, L J" uniqKey="Trainor L">L. J. Trainor</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Repp, B H" uniqKey="Repp B">B. H. Repp</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Savitzky, A" uniqKey="Savitzky A">A. Savitzky</name>
</author>
<author>
<name sortKey="Golay, M J E" uniqKey="Golay M">M. J. E. Golay</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Shrout, P E" uniqKey="Shrout P">P. E. Shrout</name>
</author>
<author>
<name sortKey="Fleiss, J L" uniqKey="Fleiss J">J. L. Fleiss</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Stevens, C J" uniqKey="Stevens C">C. J. Stevens</name>
</author>
<author>
<name sortKey="Schubert, E" uniqKey="Schubert E">E. Schubert</name>
</author>
<author>
<name sortKey="Wang, S" uniqKey="Wang S">S. Wang</name>
</author>
<author>
<name sortKey="Kroos, C" uniqKey="Kroos C">C. Kroos</name>
</author>
<author>
<name sortKey="Halovic, S" uniqKey="Halovic S">S. Halovic</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Styns, F" uniqKey="Styns F">F. Styns</name>
</author>
<author>
<name sortKey="Van Noorden, L" uniqKey="Van Noorden L">L. van Noorden</name>
</author>
<author>
<name sortKey="Moelants, D" uniqKey="Moelants D">D. Moelants</name>
</author>
<author>
<name sortKey="Leman, M" uniqKey="Leman M">M. Leman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Toiviainen, P" uniqKey="Toiviainen P">P. Toiviainen</name>
</author>
<author>
<name sortKey="Burger, B" uniqKey="Burger B">B. Burger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Toiviainen, P" uniqKey="Toiviainen P">P. Toiviainen</name>
</author>
<author>
<name sortKey="Luck, G" uniqKey="Luck G">G. Luck</name>
</author>
<author>
<name sortKey="Thompson, M" uniqKey="Thompson M">M. Thompson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Trainor, L J" uniqKey="Trainor L">L. J. Trainor</name>
</author>
<author>
<name sortKey="Gao, X" uniqKey="Gao X">X. Gao</name>
</author>
<author>
<name sortKey="Lei, J J" uniqKey="Lei J">J.-J. Lei</name>
</author>
<author>
<name sortKey="Lehtovaara, K" uniqKey="Lehtovaara K">K. Lehtovaara</name>
</author>
<author>
<name sortKey="Harris, L R" uniqKey="Harris L">L. R. Harris</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Van Dyck, E" uniqKey="Van Dyck E">E. Van Dyck</name>
</author>
<author>
<name sortKey="Moelants, D" uniqKey="Moelants D">D. Moelants</name>
</author>
<author>
<name sortKey="Demey, M" uniqKey="Demey M">M. Demey</name>
</author>
<author>
<name sortKey="Coussement, P" uniqKey="Coussement P">P. Coussement</name>
</author>
<author>
<name sortKey="Deweppe, A" uniqKey="Deweppe A">A. Deweppe</name>
</author>
<author>
<name sortKey="Leman, M" uniqKey="Leman M">M. Leman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Varela, F J" uniqKey="Varela F">F. J. Varela</name>
</author>
<author>
<name sortKey="Thompson, E" uniqKey="Thompson E">E. Thompson</name>
</author>
<author>
<name sortKey="Rosch, E" uniqKey="Rosch E">E. Rosch</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Zentner, M" uniqKey="Zentner M">M. Zentner</name>
</author>
<author>
<name sortKey="Eerola, T" uniqKey="Eerola T">T. Eerola</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">Front Psychol</journal-id>
<journal-id journal-id-type="iso-abbrev">Front Psychol</journal-id>
<journal-id journal-id-type="publisher-id">Front. Psychol.</journal-id>
<journal-title-group>
<journal-title>Frontiers in Psychology</journal-title>
</journal-title-group>
<issn pub-type="epub">1664-1078</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">23641220</article-id>
<article-id pub-id-type="pmc">3624091</article-id>
<article-id pub-id-type="doi">10.3389/fpsyg.2013.00183</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychology</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Influences of Rhythm- and Timbre-Related Musical Features on Characteristics of Music-Induced Movement</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Burger</surname>
<given-names>Birgitta</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="author-notes" rid="fn001">*</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Thompson</surname>
<given-names>Marc R.</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Luck</surname>
<given-names>Geoff</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Saarikallio</surname>
<given-names>Suvi</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Toiviainen</surname>
<given-names>Petri</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
</contrib>
</contrib-group>
<aff id="aff1">
<sup>1</sup>
<institution>Department of Music, Finnish Centre of Excellence in Interdisciplinary Music Research, University of Jyväskylä</institution>
<country>Jyväskylä, Finland</country>
</aff>
<author-notes>
<fn fn-type="edited-by">
<p>Edited by: Chris Muller, Ghent University, Belgium</p>
</fn>
<fn fn-type="edited-by">
<p>Reviewed by: Henning Scheich, Leibniz Institute for Neurobiology Magdeburg, Germany; Psyche Loui, Harvard Medical School, USA</p>
</fn>
<corresp id="fn001">*Correspondence: Birgitta Burger, Department of Music, Finnish Centre of Excellence in Interdisciplinary Music Research, University of Jyväskylä, P.O. Box 35, FI-40014 Jyväskylä, Finland. e-mail:
<email xlink:type="simple">birgitta.burger@jyu.fi</email>
</corresp>
<fn fn-type="other" id="fn002">
<p>This article was submitted to Frontiers in Auditory Cognitive Neuroscience, a specialty of Frontiers in Psychology.</p>
</fn>
</author-notes>
<pub-date pub-type="epub">
<day>12</day>
<month>4</month>
<year>2013</year>
</pub-date>
<pub-date pub-type="collection">
<year>2013</year>
</pub-date>
<volume>4</volume>
<elocation-id>183</elocation-id>
<history>
<date date-type="received">
<day>12</day>
<month>11</month>
<year>2012</year>
</date>
<date date-type="accepted">
<day>26</day>
<month>3</month>
<year>2013</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright © 2013 Burger, Thompson, Luck, Saarikallio and Toiviainen.</copyright-statement>
<copyright-year>2013</copyright-year>
<license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by/3.0/">
<license-p>This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in other forums, provided the original authors and source are credited and subject to any copyright notices concerning any third-party graphics etc.</license-p>
</license>
</permissions>
<abstract>
<p>Music makes us move. Several factors can affect the characteristics of such movements, including individual factors or musical features. For this study, we investigated the effect of rhythm- and timbre-related musical features as well as tempo on movement characteristics. Sixty participants were presented with 30 musical stimuli representing different styles of popular music, and instructed to move along with the music. Optical motion capture was used to record participants’ movements. Subsequently, eight movement features and four rhythm- and timbre-related musical features were computationally extracted from the data, while the tempo was assessed in a perceptual experiment. A subsequent correlational analysis revealed that, for instance, clear pulses seemed to be embodied with the whole body, i.e., by using various movement types of different body parts, whereas spectral flux and percussiveness were found to be more distinctly related to certain body parts, such as head and hand movement. A series of ANOVAs with the stimuli being divided into three groups of five stimuli each based on the tempo revealed no significant differences between the groups, suggesting that the tempo of our stimulus set failed to have an effect on the movement features. In general, the results can be linked to the framework of embodied music cognition, as they show that body movements are used to reflect, imitate, and predict musical characteristics.</p>
</abstract>
<kwd-group>
<kwd>music-induced movement</kwd>
<kwd>dance</kwd>
<kwd>motion capture</kwd>
<kwd>musical feature extraction</kwd>
<kwd>pulse clarity</kwd>
<kwd>spectral flux</kwd>
</kwd-group>
<counts>
<fig-count count="3"></fig-count>
<table-count count="3"></table-count>
<equation-count count="0"></equation-count>
<ref-count count="48"></ref-count>
<page-count count="10"></page-count>
<word-count count="7868"></word-count>
</counts>
</article-meta>
</front>
<body>
<sec>
<title>Introduction</title>
<p>Music makes us move. While listening to music, we often move our bodies in a spontaneous fashion. Keller and Rieger (
<xref ref-type="bibr" rid="B19">2009</xref>
), for example, stated that simply listening to music can induce movement, and in a self-report study conducted by Lesaffre et al. (
<xref ref-type="bibr" rid="B27">2008</xref>
), most participants reported moving when listening to music. Janata et al. (
<xref ref-type="bibr" rid="B17">2011</xref>
) reported a study in which they asked participants to tap to the music and found that participants moved not only the finger/hand but also other body parts, such as the feet and head. Additionally, the tapping condition (isochronous versus free tapping) influenced the amount of movement: the more “natural” the tapping condition, the more movement was exhibited.</p>
<p>In general, people tend to move to music in an organized way by, for example, mimicking instrumentalists’ gestures, or rhythmically synchronizing with the pulse of the music by tapping their foot, nodding their head, or moving their whole body in various manners (Godøy et al.,
<xref ref-type="bibr" rid="B13">2006</xref>
; Leman and Godøy,
<xref ref-type="bibr" rid="B25">2010</xref>
). Moreover, Leman (
<xref ref-type="bibr" rid="B24">2007</xref>
, p. 96) suggests, “Spontaneous movements [to music] may be closely related to predictions of local bursts of energy in the musical audio stream, in particular to the beat and the rhythm patterns.” Such utilization of the body is the core concept of embodied cognition, which claims that the body is involved in or even required for cognitive processes (e.g., Lakoff and Johnson,
<xref ref-type="bibr" rid="B20">1980</xref>
,
<xref ref-type="bibr" rid="B21">1999</xref>
, or Varela et al.,
<xref ref-type="bibr" rid="B47">1991</xref>
). Human cognition is thus highly influenced by the interaction between mind/brain, sensorimotor capabilities, body, and environment. Following this, we can approach music (or musical involvement) by linking our perception of it to our body movement (Leman,
<xref ref-type="bibr" rid="B24">2007</xref>
). One could postulate that our bodily movements reflect, imitate, help to parse, or support understanding the structure and content of music. Leman suggests that corporeal articulations could be influenced by three (co-existing) components or concepts: “Synchronization,” “Embodied Attuning,” and “Empathy,” which differ in the degree of musical involvement and in the kind of action-perception couplings employed. “Synchronization” forms the fundamental component, as synchronizing to a beat is easy and spontaneous. The beat serves as the basic musical element, from which more complex structures emerge. Leman furthermore suggests the term “inductive resonance,” referring to the use of movements for active control, imitation, and prediction of beat-related features in the music (the opposite of passively tapping to a beat) as the first step in engaging with the music. The second component, “Embodied Attuning,” concerns the linkage of body movement to musical features more complex than the basic beat, such as melody, harmony, rhythm, tonality, or timbre. Following this idea, movement could be used to reflect, imitate, and navigate within the musical structure in order to understand it. Finally, “Empathy” is seen as the component that links musical features to expressivity and emotions. In other words, the listener feels and identifies with the emotions expressed in the music and imitates and reflects them by using body movement.</p>
<p>It has been argued that music and dance have evolved together in most cultures (Arom,
<xref ref-type="bibr" rid="B2">1991</xref>
; Cross,
<xref ref-type="bibr" rid="B9">2001</xref>
) and are crucial elements of most social and collective human behavior (Brown et al.,
<xref ref-type="bibr" rid="B4">2000</xref>
). Furthermore, most cultures have developed coordinated dance movements to rhythmically predictable music (Nettl,
<xref ref-type="bibr" rid="B34">2000</xref>
). There is neurobiological evidence for a connection between rhythmic components of music and movement (e.g., Grahn and Brett,
<xref ref-type="bibr" rid="B14">2007</xref>
; Bengtsson et al.,
<xref ref-type="bibr" rid="B3">2009</xref>
; Chen et al.,
<xref ref-type="bibr" rid="B7">2009</xref>
; Grahn and Rowe,
<xref ref-type="bibr" rid="B15">2009</xref>
), which has led to the assumption that humans prefer music that facilitates synchronization and respond to it with movement (Madison et al.,
<xref ref-type="bibr" rid="B30">2011</xref>
). Phillips-Silver and Trainor (
<xref ref-type="bibr" rid="B37">2008</xref>
) showed in their study that head movements in particular bias the metrical encoding of rhythm and meter perception.</p>
<p>The growing availability of quantitative methods for recording and analyzing body movement has offered new insights and perspectives for studying such movements. A number of studies have investigated (professional) dance movements using quantitative methods. Camurri et al. (
<xref ref-type="bibr" rid="B5">2003</xref>
,
<xref ref-type="bibr" rid="B6">2004</xref>
), for instance, developed a video analysis tool to recognize and classify expressive and emotional gestures in professional dance performances. Jensenius (
<xref ref-type="bibr" rid="B18">2006</xref>
) developed the technique of “motiongrams” for visualizing and analyzing movement and gestures. Stevens et al. (
<xref ref-type="bibr" rid="B41">2009</xref>
) studied movements of professional dancers regarding time keeping with and without music using optical motion capture recordings. Optical motion capture was also employed by Naveda and Leman (
<xref ref-type="bibr" rid="B33">2010</xref>
) and Leman and Naveda (
<xref ref-type="bibr" rid="B26">2010</xref>
) who investigated movement in samba and Charleston dancing, focusing on spatiotemporal representations of dance gestures as movement trajectories.</p>
<p>Besides professional dance, several studies have been dedicated to more general tasks and behaviors involving music-induced movement, such as the movements of infants or lay dancers. Zentner and Eerola (
<xref ref-type="bibr" rid="B48">2010</xref>
) investigated infants’ ability to bodily synchronize with musical stimuli, finding that infants showed more rhythmic movement to music and metrical stimuli than to speech, suggesting a predisposition for rhythmic movement to music and other metrically regular sounds. Eerola et al. (
<xref ref-type="bibr" rid="B11">2006</xref>
) studied toddlers’ corporeal synchronization to music, finding three main periodic movement types that were at times synchronized with the pulse of the music. Toiviainen et al. (
<xref ref-type="bibr" rid="B44">2010</xref>
) investigated how music-induced movement exhibited pulsations on different metrical levels, and showed that eigenmovements of different body parts were synchronized with different metric levels of the stimulus. Luck et al. (
<xref ref-type="bibr" rid="B28">2010</xref>
) studied the influence of individual factors such as personality traits and preference on the music-induced dance movements of laypersons, finding several relationships between personality traits and movement characteristics. Van Dyck et al. (
<xref ref-type="bibr" rid="B46">2010</xref>
) found that an increased presence of the bass drum tends to increase listeners’ spontaneous movements. However, systematic investigations targeting the relationships between musical features, particularly rhythm-related features, and human movement characteristics have not been conducted. Such an investigation would reveal additional information as to how music shapes movement. Additionally, finding dependencies between musical structure and body movements that are consistent between individuals would support the notion of embodied music cognition (Leman,
<xref ref-type="bibr" rid="B24">2007</xref>
).</p>
<p>Rhythmic music is based on beats, which can be physically characterized as distinct energy bursts in time. If such beats occur as regular and repetitive temporal patterns, they give rise to a percept of pulse. Beat and pulse structures can be regarded as the basic metrical structure in music from which more complex temporal structures, such as rhythm, emerge. This is typically achieved by subdividing the basic metrical structure into smaller and larger units of varying lengths, constructing a metrically interlocked grid with events on different temporal levels (Parncutt,
<xref ref-type="bibr" rid="B36">1994</xref>
). These rhythmic structures can vary, for example, in the degree of pulse clarity. Pulse clarity estimates, on a large time scale, how clearly the underlying pulsation in music is perceivable and can therefore be regarded as a measure of the underlying periodicity of the music (Lartillot et al.,
<xref ref-type="bibr" rid="B22">2008</xref>
). Another aspect of rhythmic structure is covered by spectro-temporal features, such as the sub-band spectral flux, which has been found to be among the most important features contributing to polyphonic timbre perception (Alluri and Toiviainen,
<xref ref-type="bibr" rid="B1">2010</xref>
). Spectral flux measures spectral change, which, when taken separately for different sub-bands, is related to certain elements of the rhythm (i.e., rhythmic elements created by instruments within the frequency range of the respective sub-band). It could be that sub-band flux is a crucial feature not only in a (passive) listening situation, but also in an (active) movement situation. Furthermore, other timbral characteristics, such as percussiveness, could have an influence on movement responses to music. For instance, high amounts of percussive elements in music could result in fast movements, reflecting the way such sounds are often produced. Following these notions, it could be assumed that variations in such musical features not only increase or decrease the amount of movement, but also change the kinds and properties of the movements. In line with the embodied music cognition approach (Leman,
<xref ref-type="bibr" rid="B24">2007</xref>
), the movements could reflect and imitate the rhythmical and timbral structure of the music.</p>
<p>Besides features such as pulse clarity, tempo is an important factor contributing to the perception of rhythm (Fraisse,
<xref ref-type="bibr" rid="B12">1982</xref>
). Tempo, the speed at which beats are repeated and thus the underlying periodicity of the music, is usually measured in beats per minute (bpm). A large body of research has been conducted on listeners’ abilities to perceive different tempi, synchronize to them, and reproduce them in tapping tasks (for a review see Repp,
<xref ref-type="bibr" rid="B38">2005</xref>
). Free tapping tasks have found that a majority of participants tapped with an inter-tap interval close to 600 ms, though individual rates differed considerably (Fraisse,
<xref ref-type="bibr" rid="B12">1982</xref>
). Synchronizing to steady, periodic beat stimuli is possible at a wide range of tempi; however, it is most regular and accurate for intervals around 400–500 ms (Collyer et al.,
<xref ref-type="bibr" rid="B8">1992</xref>
), or 400–800 ms (Fraisse,
<xref ref-type="bibr" rid="B12">1982</xref>
), while at slower and faster tempi the time between taps becomes more variable. Moelants (
<xref ref-type="bibr" rid="B31">2002</xref>
) suggested 120 bpm as the preferred tempo, that is, the tempo at which tempo perception is considered optimal and appears most natural. It is interesting to note that the literature often draws links between spontaneous/preferred tempo and repeated motor activities, such as walking, for which the spontaneous duration of steps is around 500–550 ms (Murray et al.,
<xref ref-type="bibr" rid="B32">1964</xref>
; Fraisse,
<xref ref-type="bibr" rid="B12">1982</xref>
; MacDougall and Moore,
<xref ref-type="bibr" rid="B29">2005</xref>
). Walking has been suggested as “a fundamental element of human motor activity” (Fraisse,
<xref ref-type="bibr" rid="B12">1982</xref>
, p. 152) and could therefore serve as an origin of preferred tempo perception. Following these considerations, it could furthermore be assumed that music with tempi around 110–120 bpm stimulates movement more than music with other tempi.</p>
<p>In order to investigate relationships between rhythmic and timbral aspects of music and the movements that such music induces, we conducted a motion capture experiment, in which participants were asked to move to music that differed in tempo and in rhythm- and timbre-related features, such as pulse clarity, percussiveness, and spectral flux of low and high frequency ranges. We were interested in two main aspects: first, whether movement and musical characteristics are somehow related, and more particularly, which body parts and movement types reflect different musical characteristics. Second, whether the tempo of music has an influence on the movement. Following the notion of embodied (music) cognition, we assumed the movement to reflect aspects of the music. First, we expected movements to indicate the beat structure, in particular that a clear beat would be embodied by increased speed of movements. Second, we anticipated that hand and arm movements, having the most freedom when moving/dancing, would play an important role in reflecting musical characteristics. Finally, we assumed movement features to resemble the preferred tempo insofar as tempi around 120 bpm might encourage participants to move differently than slower or faster tempi.</p>
</sec>
<sec sec-type="materials|methods">
<title>Materials and Methods</title>
<sec>
<title>Participants</title>
<p>A total of 64 participants took part in the study. Four participants were excluded from further analysis due to incomplete data. Thus, 60 participants remained for subsequent analyses (43 female, 17 male, average age: 24, SD of age: 3.3). Six participants had received formal music education, while four participants had received formal dance tuition. Participation was rewarded with a movie ticket. All participants gave their informed consent prior to their inclusion in the study, and they were free to discontinue the experiment at any point. In accordance with the guidelines of the university ethics board, no separate ethical permission was required for this study.</p>
</sec>
<sec>
<title>Stimuli</title>
<p>Participants were presented with 30 randomly ordered musical stimuli representing the following popular music genres: Techno, Pop, Rock, Latin, Funk, and Jazz. An overview of the stimuli can be found in the Appendix. All stimuli were 30 s long, non-vocal, and in 4/4 time, but differed in their rhythmic complexity, pulse clarity, mode, and tempo. The stimulus length was chosen to keep the experiment sufficiently short while having stimuli that were long enough to induce movement.</p>
</sec>
<sec>
<title>Apparatus</title>
<p>Participants’ movements were recorded using an eight-camera optical motion capture system (Qualisys ProReflex), tracking, at a frame rate of 120 Hz, the three-dimensional positions of 28 reflective markers attached to each participant. The locations of the markers can be seen in Figures
<xref ref-type="fig" rid="F1">1</xref>
A,B. The locations of the markers were as follows (L, left; R, right; F, front; B, back): 1, LF head; 2, RF head; 3, LB head; 4, RB head; 5, L shoulder; 6, R shoulder; 7, sternum; 8, spine (T5); 9, LF hip; 10, RF hip; 11, LB hip; 12, RB hip; 13, L elbow; 14, R elbow; 15, L wrist/radius; 16, L wrist/ulna; 17, R wrist/radius; 18, R wrist/ulna; 19, L middle finger; 20, R middle finger; 21, L knee; 22, R knee; 23, L ankle; 24, R ankle; 25, L heel; 26, R heel; 27, L big toe; 28, R big toe. The musical stimuli were played back via a pair of Genelec 8030A loudspeakers using a Max/MSP patch running on an Apple computer. The room sound was recorded with two overhead microphones positioned at a height of approximately 2.5 m. This microphone input, the direct audio signal of the playback, and the synchronization pulse transmitted by the Qualisys cameras when recording, were recorded using ProTools software in order to synchronize the motion capture data with the musical stimulus afterward. Additionally, four Sony video cameras were used to record the sessions for reference purposes.</p>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption>
<p>
<bold>Marker and joint locations</bold>
.
<bold>(A)</bold>
Anterior and posterior view of the marker placement on the participants’ bodies;
<bold>(B)</bold>
Anterior view of the marker locations as stick figure illustration;
<bold>(C)</bold>
Anterior view of the locations of the secondary markers/joints used in the analysis.</p>
</caption>
<graphic xlink:href="fpsyg-04-00183-g001"></graphic>
</fig>
</sec>
<sec>
<title>Procedure</title>
<p>Participants were recorded individually and were asked to move to the presented stimuli in a way that felt natural. Additionally, they were encouraged to dance if they wanted to, but were requested to remain in the center of the capture space indicated by a 115 × 200 cm carpet.</p>
</sec>
<sec>
<title>Movement feature extraction</title>
<p>In order to extract various kinematic features, the MATLAB Motion Capture (MoCap) Toolbox (Toiviainen and Burger,
<xref ref-type="bibr" rid="B43">2011</xref>
) was used to first trim the data to the duration of each stimulus and, following this, derive a set of 20 secondary markers – subsequently referred to as joints – from the original 28 markers. The locations of these 20 joints are depicted in Figure
<xref ref-type="fig" rid="F1">1</xref>
C. The locations of joints C, D, E, G, H, I, M, N, P, Q, R, and T are identical to the locations of one of the original markers, while the locations of the remaining joints were obtained by averaging the locations of two or more markers; joint A, midpoint of the four hip markers (called root in the subsequent analysis); B, midpoint of markers 9 and 11 (left hip); F, midpoint of markers 10 and 12 (right hip); J, midpoint of breastbone, spine, and the hip markers (midtorso); K, midpoint of shoulder markers (manubrium); L, midpoint of the four head markers (head); O, midpoint of the two left wrist markers (left wrist); S, midpoint of the two right wrist markers (right wrist). Thereafter, eight movement variables were extracted from the joint location data to cover different movement characteristics, body parts, and movement types.</p>
<list list-type="simple">
<list-item>
<p>– Speed of Center of Mass, Head, both Hands, and both Feet: for calculating the speed of the movement, we applied numerical differentiation based on the Savitzky and Golay (
<xref ref-type="bibr" rid="B39">1964</xref>
) smoothing FIR filter with a window length of seven samples and a polynomial order of two. These values were found to provide an optimal combination of precision and smoothness in the time derivatives. For the Speed of Head, Hands, and Feet, the data were transformed into a local coordinate system, in which joint A was located at the origin, and segment BF had zero azimuth (a code sketch follows at the end of this subsection).</p>
</list-item>
<list-item>
<p>– Hand Distance: this feature relates to the distance between both hands.</p>
</list-item>
<list-item>
<p>– Amount of Movement: this feature is based on the traveled distance of all markers and gives a measure of the total amount of movement (data were also transformed into the local coordinate system).</p>
</list-item>
<list-item>
<p>– Hip Wiggle: this feature is defined as the mean absolute angular velocity of the hips around the anteroposterior axis.</p>
</list-item>
<list-item>
<p>– Shoulder Wiggle: this feature is defined as the mean absolute angular velocity of the shoulder around the anteroposterior axis.</p>
</list-item>
</list>
<p>Subsequently, the instantaneous values of each variable were averaged for each stimulus presentation. This yielded a total of eight statistical movement features for each of the 60 participants and 30 stimuli.</p>
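To make the speed computation described above concrete, here is a minimal Python sketch. It is not the MoCap Toolbox itself: the function name, array layout, and use of SciPy's savgol_filter are illustrative assumptions, while the window length, polynomial order, and frame rate come from the text.

import numpy as np
from scipy.signal import savgol_filter

FPS = 120  # Qualisys ProReflex frame rate used in the study

def joint_speed(positions):
    """positions: (n_frames, 3) array of x, y, z coordinates of one joint.
    Returns the instantaneous speed per frame (position units per second)."""
    # First time derivative via the Savitzky-Golay smoothing FIR filter,
    # window length 7 samples, polynomial order 2, as described above.
    velocity = savgol_filter(positions, window_length=7, polyorder=2,
                             deriv=1, delta=1.0 / FPS, axis=0)
    # Speed is the Euclidean norm of the 3-D velocity vector.
    return np.linalg.norm(velocity, axis=1)

# For Speed of Head, Hands, and Feet, the positions would first be expressed
# in the body-centered frame (joint A at the origin, hip segment BF at zero
# azimuth); that transformation is omitted here. The per-stimulus feature is
# then the mean over the stimulus, e.g., joint_speed(head_xyz).mean().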
</sec>
<sec>
<title>Musical feature extraction</title>
<p>To investigate the effect of rhythm- and timbre-related features on music-induced movement, we performed computational feature extraction analysis of the stimuli used in the experiment. To this end, four musical features were extracted from the stimuli using the MATLAB MIRToolbox (version 1.4) (Lartillot and Toiviainen,
<xref ref-type="bibr" rid="B23">2007</xref>
).</p>
<list list-type="simple">
<list-item>
<p>– Pulse Clarity: this feature indicates the strength of rhythmic periodicities and pulses in the signal, estimated by the relative Shannon entropy of the fluctuation spectrum (Pampalk et al.,
<xref ref-type="bibr" rid="B35">2002</xref>
). In this context, entropy can be understood as a measure of the degree of peakiness of the spectrum. Music with an easily perceived beat has a distinct and regular fluctuation spectrum, which has low entropy. Thus, high pulse clarity is associated with low fluctuation entropy. To illustrate this measure of Pulse Clarity, the fluctuation spectra for high and low Pulse Clarity are shown in Figure
<xref ref-type="fig" rid="F2">2</xref>
.</p>
</list-item>
<list-item>
<p>– Sub-Band Flux: this feature indicates to what extent the spectrum changes over time. For the calculation, the stimulus is divided into 10 frequency bands, each band containing one octave in the range of 0–22050 Hz. The Sub-Band Flux is then calculated for each of these ten bands separately by calculating the Euclidean distances of the spectra for each pair of consecutive frames of the signal (Alluri and Toiviainen,
<xref ref-type="bibr" rid="B1">2010</xref>
), using a frame length of 25 ms and an overlap of 50% between successive frames and then averaging the resulting time-series of flux values. For the current analysis, we used two frequency bands: band no. 2 (50–100 Hz) and band no. 9 (6400–12800 Hz). High flux in the low frequency bands is produced by instruments such as kick drum and bass guitar, whereas high flux in the high frequency bands is produced by instruments such as cymbals or hihats. Two spectrograms of sub-band no. 2 are displayed in Figure
<xref ref-type="fig" rid="F3">3</xref>
to show the difference between high and low amounts of Sub-Band Flux (see the code sketch after this list).</p>
</list-item>
<list-item>
<p>– Percussiveness: for this feature, the slopes of the amplitude envelopes at note onsets are estimated and then averaged across all onsets in the signal. The steeper the slope, the more percussive and accentuated the sound.</p>
</list-item>
</list>
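To make the Sub-Band Flux and Pulse Clarity descriptions above concrete, here is a minimal Python sketch. It is not MIRToolbox: the function names and the use of SciPy's STFT are illustrative assumptions, with the frame length, overlap, band edges, and entropy-based definition taken from the text.

import numpy as np
from scipy.signal import stft

def sub_band_flux(audio, sr, f_lo, f_hi):
    """Mean spectral flux of one octave band: Euclidean distance between
    the magnitude spectra of consecutive 25 ms frames with 50% overlap."""
    nperseg = int(0.025 * sr)                              # 25 ms frame
    freqs, _, spec = stft(audio, fs=sr, nperseg=nperseg,
                          noverlap=nperseg // 2)           # 50% overlap
    band = np.abs(spec)[(freqs >= f_lo) & (freqs < f_hi), :]
    flux = np.linalg.norm(np.diff(band, axis=1), axis=0)   # frame-to-frame change
    return float(flux.mean())                              # average over time

# Bands used in the study: no. 2 (50-100 Hz) and no. 9 (6400-12800 Hz), e.g.:
# low_flux = sub_band_flux(y, sr, 50, 100); high_flux = sub_band_flux(y, sr, 6400, 12800)

def relative_entropy(fluctuation_spectrum):
    """Relative Shannon entropy of a (precomputed) fluctuation spectrum;
    a peaky, regular spectrum yields low entropy, i.e., high Pulse Clarity."""
    p = fluctuation_spectrum / fluctuation_spectrum.sum()  # normalize to a distribution
    p = p[p > 0]
    return float(-(p * np.log(p)).sum() / np.log(len(fluctuation_spectrum)))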
<fig id="F2" position="float">
<label>Figure 2</label>
<caption>
<p>
<bold>Fluctuation spectra of two stimuli used in the study</bold>
.
<bold>(A)</bold>
Peaks at a regular distance of 0.28 Hz, with the highest peak at 4.56 Hz and other clear peaks at 2.29, 6.85, and 9.13 Hz, suggesting clear pulses and periodicity (stimulus 1, see
<xref ref-type="app" rid="A1">Appendix</xref>
).
<bold>(B)</bold>
Markedly lower magnitude values, a less periodic pattern of peaks, and more noise, suggesting low pulse clarity (stimulus 21, see
<xref ref-type="app" rid="A1">Appendix</xref>
).</p>
</caption>
<graphic xlink:href="fpsyg-04-00183-g002"></graphic>
</fig>
<fig id="F3" position="float">
<label>Figure 3</label>
<caption>
<p>
<bold>Spectrograms (sec. 10–20) of sub-band no. 2 (50–100 Hz) of two stimuli used in the study</bold>
.
<bold>(A)</bold>
High amount of temporal change (red represents high energy at the respective time and frequency, whereas blue represents low energy; see color bar) resulting in high value for Sub-Band Flux (stimulus 26, see
<xref ref-type="app" rid="A1">Appendix</xref>
).
<bold>(B)</bold>
Low amount of temporal change resulting in low Sub-Band Flux (stimulus 5, see
<xref ref-type="app" rid="A1">Appendix</xref>
).</p>
</caption>
<graphic xlink:href="fpsyg-04-00183-g003"></graphic>
</fig>
<p>Additionally, the Tempo of the stimuli was assessed in a separate tapping experiment, in which ten participants tapped to the 30 stimuli. The average tempo was estimated by taking the median value of all intertap intervals.</p>
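As a minimal illustration of this estimate (the function name is hypothetical), converting the median inter-tap interval to beats per minute amounts to:

import numpy as np

def tempo_bpm(intertap_intervals):
    """intertap_intervals: all inter-tap intervals in seconds, computed per
    tapper and then pooled across the ten tappers."""
    return 60.0 / float(np.median(intertap_intervals))  # median interval -> bpm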
</sec>
</sec>
<sec>
<title>Results</title>
<p>In order to perform further analysis, we first checked for consistency between participants by calculating Intraclass Correlations (cf. Shrout and Fleiss,
<xref ref-type="bibr" rid="B40">1979</xref>
) for each movement feature separately. Intraclass correlations operate on data that are structured in groups (as opposed to standard correlation procedures that assume data to be structured as paired observations) and indicate how strongly units of the same group (in this case the values of one movement feature for all participants and songs) resemble each other. The values are presented in Table
<xref ref-type="table" rid="T1">1</xref>
.</p>
<table-wrap id="T1" position="float">
<label>Table 1</label>
<caption>
<p>
<bold>Results of the intraclass correlations for each of the movement features</bold>
.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" rowspan="1" colspan="1">Movement feature</th>
<th align="left" rowspan="1" colspan="1">
<italic>r</italic>
</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" rowspan="1" colspan="1">Speed of center of mass</td>
<td align="left" rowspan="1" colspan="1">0.90
<xref ref-type="table-fn" rid="tfn1">***</xref>
</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Speed of head</td>
<td align="left" rowspan="1" colspan="1">0.89
<xref ref-type="table-fn" rid="tfn1">***</xref>
</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Speed of hands</td>
<td align="left" rowspan="1" colspan="1">0.91
<xref ref-type="table-fn" rid="tfn1">***</xref>
</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Speed of feet</td>
<td align="left" rowspan="1" colspan="1">0.90
<xref ref-type="table-fn" rid="tfn1">***</xref>
</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Hand distance</td>
<td align="left" rowspan="1" colspan="1">0.62
<xref ref-type="table-fn" rid="tfn1">***</xref>
</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Amount of movement</td>
<td align="left" rowspan="1" colspan="1">0.94
<xref ref-type="table-fn" rid="tfn1">***</xref>
</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Hip wiggle</td>
<td align="left" rowspan="1" colspan="1">0.95
<xref ref-type="table-fn" rid="tfn1">***</xref>
</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Shoulder wiggle</td>
<td align="left" rowspan="1" colspan="1">0.88
<xref ref-type="table-fn" rid="tfn1">***</xref>
</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn id="tfn1">
<p>
<italic>***
<italic>p</italic>
 < 0.001</italic>
.</p>
</fn>
</table-wrap-foot>
</table-wrap>
<p>All intraclass correlations resulted in highly significant coefficients. Therefore, we concluded that participants’ movements for each song were similar enough to justify averaging the movement features across participants, yielding one value for each stimulus presentation.</p>
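The text cites Shrout and Fleiss (1979) but does not state which ICC form was used; purely as an illustration, the single-measure two-way random form ICC(2,1), applied to a stimuli-by-participants matrix of one movement feature, could be sketched in Python as follows.

import numpy as np

def icc_2_1(X):
    """X: (n_stimuli, k_participants) matrix of one movement feature.
    Returns the two-way random, single-measure ICC(2,1)."""
    n, k = X.shape
    grand = X.mean()
    ss_rows = k * ((X.mean(axis=1) - grand) ** 2).sum()    # between stimuli
    ss_cols = n * ((X.mean(axis=0) - grand) ** 2).sum()    # between participants
    ss_err = ((X - grand) ** 2).sum() - ss_rows - ss_cols  # residual
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)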
<p>Subsequently, we correlated the movement features with Pulse Clarity, Low and High Frequency Spectral Flux, and Percussiveness. Due to the relatively large number of correlations (32), we applied Bonferroni correction to avoid false positives arising from the multiple comparisons. The results are displayed in Table
<xref ref-type="table" rid="T2">2</xref>
.</p>
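For concreteness, a minimal sketch of the Bonferroni rule as applied here: each of the 32 tests is evaluated at alpha/32, or equivalently each p-value is multiplied by 32 and capped at 1.

def bonferroni_adjust(p, n_tests=32):
    # Bonferroni-adjusted p-value for one of the 32 correlation tests
    return min(1.0, p * n_tests)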
<table-wrap id="T2" position="float">
<label>Table 2</label>
<caption>
<p>
<bold>Results of the correlation between movement and musical features</bold>
.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" rowspan="1" colspan="1"></th>
<th align="left" rowspan="1" colspan="1">Pulse clarity</th>
<th align="left" rowspan="1" colspan="1">Spectral flux SB 2</th>
<th align="left" rowspan="1" colspan="1">Spectral flux SB 9</th>
<th align="left" rowspan="1" colspan="1">Percussiveness</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" rowspan="1" colspan="1">Speed of center of mass</td>
<td align="left" rowspan="1" colspan="1">
<bold>0.67**</bold>
</td>
<td align="left" rowspan="1" colspan="1">0.51</td>
<td align="left" rowspan="1" colspan="1">0.52</td>
<td align="left" rowspan="1" colspan="1">
<bold>0.56*</bold>
</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Speed of head</td>
<td align="left" rowspan="1" colspan="1">0.52</td>
<td align="left" rowspan="1" colspan="1">
<bold>0.73***</bold>
</td>
<td align="left" rowspan="1" colspan="1">
<bold>0.64**</bold>
</td>
<td align="left" rowspan="1" colspan="1">
<bold>0.67**</bold>
</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Speed of hands</td>
<td align="left" rowspan="1" colspan="1">0.54</td>
<td align="left" rowspan="1" colspan="1">0.46</td>
<td align="left" rowspan="1" colspan="1">
<bold>0.65**</bold>
</td>
<td align="left" rowspan="1" colspan="1">
<bold>0.57*</bold>
</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Speed of feet</td>
<td align="left" rowspan="1" colspan="1">
<bold>0.55*</bold>
</td>
<td align="left" rowspan="1" colspan="1">0.31</td>
<td align="left" rowspan="1" colspan="1">0.46</td>
<td align="left" rowspan="1" colspan="1">0.45</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Hand distance</td>
<td align="left" rowspan="1" colspan="1">0.34</td>
<td align="left" rowspan="1" colspan="1">0.29</td>
<td align="left" rowspan="1" colspan="1">
<bold>0.57*</bold>
</td>
<td align="left" rowspan="1" colspan="1">
<bold>0.55*</bold>
</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Amount of movement</td>
<td align="left" rowspan="1" colspan="1">
<bold>0.62**</bold>
</td>
<td align="left" rowspan="1" colspan="1">0.44</td>
<td align="left" rowspan="1" colspan="1">
<bold>0.57*</bold>
</td>
<td align="left" rowspan="1" colspan="1">
<bold>0.56*</bold>
</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Hip wiggle</td>
<td align="left" rowspan="1" colspan="1">
<bold>0.58*</bold>
</td>
<td align="left" rowspan="1" colspan="1">0.25</td>
<td align="left" rowspan="1" colspan="1">0.47</td>
<td align="left" rowspan="1" colspan="1">0.46</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Shoulder wiggle</td>
<td align="left" rowspan="1" colspan="1">
<bold>0.67**</bold>
</td>
<td align="left" rowspan="1" colspan="1">0.39</td>
<td align="left" rowspan="1" colspan="1">0.48</td>
<td align="left" rowspan="1" colspan="1">
<bold>0.61*</bold>
</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p>
<italic>*
<italic>p</italic>
 < 0.05, **
<italic>p</italic>
 < 0.01, ***
<italic>p</italic>
 < 0.001</italic>
.</p>
</table-wrap-foot>
</table-wrap>
<p>As can be seen, Pulse Clarity shows significant positive correlations with Speed of Center of Mass [
<italic>r</italic>
(30) = 0.67,
<italic>p</italic>
 < 0.01], Speed of Feet [
<italic>r</italic>
(30) = 0.55,
<italic>p</italic>
 < 0.05], Amount of Movement [
<italic>r</italic>
(30) = 0.62,
<italic>p</italic>
 < 0.01], Hip Wiggle [
<italic>r</italic>
(30) = 0.58,
<italic>p</italic>
 < 0.05], and Shoulder Wiggle [
<italic>r</italic>
(30) = 0.67,
<italic>p</italic>
 < 0.01]. Thus, for music with a clear pulse, participants tended to move the center of the body and the feet faster, wiggled their hips and shoulders more, and exhibited a greater overall amount of whole-body movement. For an illustration of these features, an animation of the movements performed by one participant for the stimulus with the highest value for Pulse Clarity (stimulus 1, see
<xref ref-type="app" rid="A1">Appendix</xref>
) can be found in the file “PulseClarity.mov” in the Supplementary Material. The selected participant exhibited high values for all the movement features shown to be characteristic of high Pulse Clarity.</p>
<p>Spectral Flux of Sub-band no. 2 exhibited a high positive correlation with Speed of Head [
<italic>r</italic>
(30) = 0.73,
<italic>p</italic>
 < 0.001], suggesting that stimuli with strong spectral flux in the low frequencies are related to fast head movements. For illustration, an animation of movements performed by one participant for the stimulus with the highest value for Spectral Flux of Sub-band no. 2 (stimulus 26, see
<xref ref-type="app" rid="A1">Appendix</xref>
) can be found in the file “SubBandFluxNo2.mov” in the Supplementary Material.</p>
<p>Spectral Flux of Sub-band no. 9 showed significant positive correlations with Speed of Head [
<italic>r</italic>
(30) = 0.64,
<italic>p</italic>
 < 0.01], Speed of Hands [
<italic>r</italic>
(30) = 0.65,
<italic>p</italic>
 < 0.01], Hand Distance [
<italic>r</italic>
(30) = 0.57,
<italic>p</italic>
 < 0.05], and Amount of Movement [
<italic>r</italic>
(30) = 0.57,
<italic>p</italic>
 < 0.05], indicating that, for stimuli with strong spectral flux in the high frequencies, participants moved their head and hands faster, had a larger distance between the hands, and used an increased amount of movement. To illustrate these features, an animation of movements performed by one participant for the stimulus with the highest value for Spectral Flux of Sub-band no. 9 (stimulus 18, see
<xref ref-type="app" rid="A1">Appendix</xref>
) can be found in the file “SubBandFluxNo9.mov” in the Supplementary Material.</p>
<p>Percussiveness exhibited positive correlations with Speed of Center of Mass [
<italic>r</italic>
(30) = 0.56,
<italic>p</italic>
 < 0.05], Speed of Head [
<italic>r</italic>
(30) = 0.67,
<italic>p</italic>
 < 0.01], Speed of Hands [
<italic>r</italic>
(30) = 0.57,
<italic>p</italic>
 < 0.05], Hand Distance [
<italic>r</italic>
(30) = 0.55,
<italic>p</italic>
 < 0.05], Amount of Movement [
<italic>r</italic>
(30) = 0.56,
<italic>p</italic>
 < 0.05], and Shoulder Wiggle [
<italic>r</italic>
(30) = 0.61,
<italic>p</italic>
 < 0.01]. Thus, for stimuli containing many percussive elements, participants moved their center of mass, head, and hands faster, kept a larger distance between their hands, used an increased amount of movement, and wiggled their shoulders more. For an illustration of these features, an animation of the movements performed by one participant for the stimulus with the highest value for Percussiveness (stimulus 3, see
<xref ref-type="app" rid="A1">Appendix</xref>
) can be found in the file “Percussiveness.mov” in the Supplementary Material.</p>
<p>As we assumed a non-linear relationship between Tempo and the movement features, we excluded Tempo from the correlational analysis. Instead, we rank-ordered the stimuli by the tempi obtained from the tapping experiment, selected the five slowest, the five middle, and the five fastest stimuli, and performed a series of ANOVAs on these three groups. The tempi of the stimuli in these groups ranged from 73 to 87 bpm, from 113 to 125 bpm, and from 138 to 154 bpm, respectively. The results of the ANOVAs are shown in Table
<xref ref-type="table" rid="T3">3</xref>
.</p>
<table-wrap id="T3" position="float">
<label>Table 3</label>
<caption>
<p>
<bold>Results from the series of ANOVAs, testing for differences in the movement features between the three tempo groups</bold>
.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" rowspan="1" colspan="1"></th>
<th align="left" rowspan="1" colspan="1">
<italic>F</italic>
(2, 897)</th>
<th align="left" rowspan="1" colspan="1">
<italic>p</italic>
</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" rowspan="1" colspan="1">Speed of center of mass</td>
<td align="left" rowspan="1" colspan="1">0.36</td>
<td align="left" rowspan="1" colspan="1">0.70</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Speed of head</td>
<td align="left" rowspan="1" colspan="1">0.55</td>
<td align="left" rowspan="1" colspan="1">0.58</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Speed of hands</td>
<td align="left" rowspan="1" colspan="1">0.58</td>
<td align="left" rowspan="1" colspan="1">0.56</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Speed of feet</td>
<td align="left" rowspan="1" colspan="1">0.40</td>
<td align="left" rowspan="1" colspan="1">0.67</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Hand distance</td>
<td align="left" rowspan="1" colspan="1">0.06</td>
<td align="left" rowspan="1" colspan="1">0.94</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Amount of movement</td>
<td align="left" rowspan="1" colspan="1">1.35</td>
<td align="left" rowspan="1" colspan="1">0.26</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Hip wiggle</td>
<td align="left" rowspan="1" colspan="1">1.64</td>
<td align="left" rowspan="1" colspan="1">0.19</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Shoulder wiggle</td>
<td align="left" rowspan="1" colspan="1">0.42</td>
<td align="left" rowspan="1" colspan="1">0.66</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>As none of the ANOVAs revealed significant differences between the groups, tempo had no significant effect on the movement features within our set of stimuli.</p>
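<p>The grouping and testing procedure can be sketched as follows. This is a minimal Python sketch with placeholder data; it uses one value per stimulus for brevity, whereas the reported ANOVAs pooled observations across participants (hence the degrees of freedom of 2 and 897):</p>
<preformat>
# Minimal sketch: rank stimuli by tapped tempo, form the five slowest,
# five middle, and five fastest groups, and run a one-way ANOVA on one
# movement feature.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
tempi = rng.uniform(70, 160, size=30)  # tapped tempi in bpm (placeholder)
feature = rng.standard_normal(30)      # one movement feature per stimulus (placeholder)

order = np.argsort(tempi)              # stimulus indices, slowest to fastest
slow, mid, fast = order[:5], order[12:17], order[-5:]

f_val, p_val = stats.f_oneway(feature[slow], feature[mid], feature[fast])
print(f"F = {f_val:.2f}, p = {p_val:.2f}")
</preformat>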
</sec>
<sec sec-type="discussion">
<title>Discussion</title>
<p>We investigated music-induced movement, focusing on the relationship between rhythm- and timbre-related musical features, such as Pulse Clarity and Spectral Flux, and movement characteristics of different body parts.</p>
<p>Pulse Clarity seemed to be embodied through various movement types across different body parts, as reflected in Speed of Center of Mass and Feet, Amount of Movement, and Hip and Shoulder Wiggle. Participants used a greater amount and variety of whole-body movement when the music contained an easily and clearly perceivable pulse. Pulse Clarity might, therefore, influence body movement on a rather global, whole-body level.</p>
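<p>To give a concrete intuition for the feature, the following minimal Python sketch computes a deliberately crude Pulse Clarity proxy – the height of the strongest periodicity peak in the autocorrelation of an onset envelope. This is only a simplified stand-in, not the multi-feature MIRtoolbox model of Lartillot et al. (2008) used in this study:</p>
<preformat>
# Crude pulse clarity proxy: strength of the main periodicity in an
# onset envelope, via normalized autocorrelation (illustrative only).
import numpy as np

def pulse_clarity(onset_env, env_rate, lo_bpm=40.0, hi_bpm=200.0):
    env = onset_env - onset_env.mean()
    ac = np.correlate(env, env, mode="full")[env.size - 1:]
    ac = ac / ac[0]                        # normalize by lag-0 energy
    lo = int(env_rate * 60.0 / hi_bpm)     # shortest plausible beat period
    hi = int(env_rate * 60.0 / lo_bpm)     # longest plausible beat period
    return float(ac[lo:hi].max())          # clear pulse -> value near 1

env_rate = 100.0                           # onset envelope sampling rate (Hz)
t = np.arange(0.0, 10.0, 1.0 / env_rate)
onset_env = np.maximum(np.sin(2 * np.pi * 2.0 * t), 0.0)  # 120 bpm pulse train
print(pulse_clarity(onset_env, env_rate))
</preformat>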
<p>Low frequency (50–100 Hz) Spectral Flux was positively correlated with Speed of Head. We observed that a high amount of low frequency Spectral Flux was associated with the presence of kick drum and bass guitar, creating strong low frequency rhythmic elements that are usually related to the beat period. A reason for the head being involved could be that, owing to its biomechanical properties, the head might be most prone to move at the beat level (cf. nodding to the beat or “head banging” in certain musical genres). It is interesting to note that we found only one significant correlation here, so it could be concluded that spectral changes in the low frequency range influence body movement on a rather local, specific level.</p>
<p>Spectral Flux in the high frequencies (6400–12800 Hz) was found to relate to Speed of Head and Hands, Hand Distance, and Amount of Movement. We observed that high frequency Spectral Flux was mainly driven by the presence of hi-hat and cymbal sounds, which create the high frequency rhythmical figures and hence the “liveliness” of the rhythm. This could explain the dominance of hand-related movement characteristics, as the hands have the greatest freedom of movement of any body part and can therefore best reflect fast rhythmic structures.</p>
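<p>To make the sub-band feature concrete, a minimal Python sketch is given below; the STFT settings and band edges are assumptions for illustration, not the exact parameters of the original extraction:</p>
<preformat>
# Spectral flux restricted to one frequency sub-band: frame-to-frame
# change of the magnitude spectrum within a band (illustrative only).
import numpy as np
from scipy import signal

def subband_flux(audio, sr, f_lo, f_hi):
    freqs, _, spec = signal.stft(audio, fs=sr, nperseg=2048)
    mag = np.abs(spec)
    band = (freqs >= f_lo) & (freqs < f_hi)          # bins inside the sub-band
    diff = np.diff(mag[band, :], axis=1)             # change between frames
    return float(np.sqrt((diff ** 2).sum(axis=0)).mean())

sr = 44100
audio = np.random.default_rng(2).standard_normal(sr * 5)  # placeholder signal
print(subband_flux(audio, sr, 50.0, 100.0))      # cf. sub-band no. 2
print(subband_flux(audio, sr, 6400.0, 12800.0))  # cf. sub-band no. 9
</preformat>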
<p>Percussiveness correlated with Speed of Center of Mass, Head, and Hands, Hand Distance, Amount of Movement, and Shoulder Wiggle, suggesting that percussive elements in the music were related to movement of the upper body, especially of the hands as the most flexible body parts. Together with the findings for high frequency Spectral Flux, these results could indicate not only that timbral features affect movement responses to music, but more specifically that the embodiment of these features is largely related to upper body and hand movement. Furthermore, high amounts of percussive elements seemed to be embodied through fast movement, supporting the assumption that the movement reflects the way such sounds are often produced.</p>
<p>The Tempo of the music showed no relationship with the movement features, so our hypothesis that music around 120 bpm would stimulate movement differently than faster or slower music could not be confirmed. However, this might be due to our selection of stimuli: the variety of musical styles covered by the stimuli might have masked the influence of tempo. This issue thus remains an open question that requires further investigation.</p>
<p>The results can be linked to the framework of embodied music cognition and could provide support for Leman’s (
<xref ref-type="bibr" rid="B24">2007</xref>
) concepts of Synchronization, Inductive Resonance, and Embodied Attuning. The concepts of Synchronization and Inductive Resonance can be related to our findings regarding Pulse Clarity: clearly perceivable, strong beats might have a resonating effect on the overall body/torso movement and the amount of movement, reflecting and imitating the clear beat structure and making participants synchronize to the pulse. With less clear pulses, people might be less affected by this resonating effect and thus less encouraged to move. Concerning the correlations with Pulse Clarity, it is interesting to note that Speed of Feet correlated significantly with this feature, but not with any other musical feature. A possible explanation for this connection could be that foot movement is usually related to walking, with steps at a regular pace. In a musical context, the pulse of the music is related to a regular pace, so it could be argued that a clear and regular pulse was embodied by more extensive and synchronized foot movement, that is, stepping to the pulse (see Styns et al. (
<xref ref-type="bibr" rid="B42">2007</xref>
) for further connections between walking and beat patterns). However, our movement features do not reveal information about actual synchronization to the pulse, so further analysis is required to justify this interpretation. The torso movement (Center of Mass Speed, Hip Wiggle, Shoulder Wiggle) could also be part of the stepping-/walking-type movement of the legs/feet, which would explain the significant correlations with Pulse Clarity. Furthermore, the feet are required to provide stability and an upright position, so they cannot move as freely as, for instance, the hands. Thus, a connection of these body parts to the fundamental components of musical engagement, such as synchronization and resonance, is plausible, whereas the upper body parts (e.g., hands and head) could be expected to relate more to the timbral and rhythmical structures of the music (cf. Embodied Attuning, see next paragraph).</p>
<p>Moreover, the results could serve as an example of the concept of Embodied Attuning – movement-based navigation through the musical/rhythmic structures of the stimuli captured by Sub-Band Flux and Percussiveness. It could be suggested that participants attune to strong spectral changes in the low and high frequency ranges and to percussive elements mainly with head and hand movements, as these rhythm-related musical characteristics might be reflected and imitated best by the upper extremities of the body (hand, head, and shoulder movement). The hands in particular (together with the arms and shoulders) could be used to navigate through the music and to better parse and understand the rhythmic/musical structure.</p>
<p>Besides the hands having the greatest freedom of movement, as mentioned previously, the relation between hand/arm-related movement and several musical features might also stem from knowledge and mental imagery of playing a musical instrument (Leman,
<xref ref-type="bibr" rid="B24">2007</xref>
). One could postulate that participants drew on their own experience of playing an instrument and converted this knowledge into spontaneous hand and arm movement; in other words, such movements could have served as instrumental gestures to reflect and imitate certain musical features (Leman,
<xref ref-type="bibr" rid="B24">2007</xref>
). Both high frequency Sub-band Flux and Percussiveness were found to be related to hand movements. Such movement characteristics could reflect the way these sound qualities are produced. For instance, participants could have imagined playing the drums while moving to the music (cf. “Air Instruments,” Godøy et al.,
<xref ref-type="bibr" rid="B13">2006</xref>
). A follow-up study including an appropriate background questionnaire (covering use and experience of musical instruments) should be conducted to investigate this relationship in more detail.</p>
<p>Furthermore, the link between head movement and rhythm-related musical features might be based on the tendency to use head movements to spontaneously follow the rhythm of the music. This could be seen as another example of the concept of Embodied Attuning (Leman,
<xref ref-type="bibr" rid="B24">2007</xref>
). An additional interpretation of such head movements could be related to the results by Phillips-Silver and Trainor (
<xref ref-type="bibr" rid="B37">2008</xref>
), who found head movement to bias meter perception, as well as by Trainor et al. (
<xref ref-type="bibr" rid="B45">2009</xref>
), who found that the vestibular system plays a primal role in rhythm perception: movements of the head could therefore support rhythm perception and understanding.</p>
<p>Movements are used not only in musical contexts, but also in speech. Hand and head gestures are widely utilized to accompany speech and convey additional information. Hadar (
<xref ref-type="bibr" rid="B16">1989</xref>
) noted that such gestures are used, for example, to clarify or emphasize messages. Consequently, our participants could have used their hand and head movements in a similar fashion, i.e., to emphasize and elaborate the musical (e.g., rhythmic) structure or certain musical events.</p>
<p>We aimed at designing an ecologically valid study, as far as this is possible with an optical motion capture system in a laboratory setting. To this end, we chose real music stimuli (pre-existing pop songs), accepting the downside that they were less controlled, very diverse, and more difficult to analyze, as computational analysis of complex musical stimuli is not yet as well developed as that of simpler, i.e., monophonic, stimuli. Furthermore, the stimuli might contain relevant musical features that we failed to extract. However, this approach made it possible to present the participants with music that they were potentially familiar with and that is played in dance clubs. One could assume that this kind of music would make them move more, and in a more natural fashion, than more artificial stimuli.</p>
<p>The movement characteristics chosen in this study cover only a small part of the possible movement characteristics and types. There are certainly stereotypical genre- and style-dependent movements that are culturally developed rather than intrinsic to the music; examples would be head banging in rock music or swaying the hips to Latin music. To gain more insight into such movement patterns, gesture-based computational analysis of the movement data could be performed in the future.</p>
<p>As Desmond (
<xref ref-type="bibr" rid="B10">1993</xref>
) noted, dance is related to cultural identity. Since the musical styles used in this study can all be characterized as popular music in the cultural region in which the data collection took place, different movement characteristics might be found among participants who are less familiar with these musical styles than ours were. Comparative studies would give insights into cultural differences and commonalities in music-induced movement characteristics.</p>
<p>The present study revealed relationships between rhythm- and timbre-related musical features and movement characteristics. In the future, we will investigate the periodicity and synchronization of music-induced movement, as well as further relationships between movement characteristics and musical features, such as tonality features, as these play an important role in the perception of musical emotions. Additionally, the results obtained here regarding tempo call for further investigation: a new set of stimuli could be created that controls for tempo and associated styles/genres, to investigate the relationship between tempo, musical style, and the resulting movement features.</p>
</sec>
<sec>
<title>Conflict of Interest Statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec sec-type="supplementary-material">
<title>Supplementary Material</title>
<p>The Supplementary Material for this article can be found online at
<uri xlink:type="simple" xlink:href="http://www.frontiersin.org/Auditory_Cognitive_Neuroscience/10.3389/fpsyg.2013.00183/abstract">http://www.frontiersin.org/Auditory_Cognitive_Neuroscience/10.3389/fpsyg.2013.00183/abstract</uri>
</p>
<supplementary-material content-type="local-data" id="SM1">
<media xlink:href="40445_Burger_Movie1.MOV" mimetype="video" mime-subtype="quicktime">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="SM2">
<media xlink:href="40445_Burger_Movie2.MOV" mimetype="video" mime-subtype="quicktime">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="SM3">
<media xlink:href="40445_Burger_Movie3.MOV" mimetype="video" mime-subtype="quicktime">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="SM4">
<media xlink:href="40445_Burger_Movie4.MOV" mimetype="video" mime-subtype="quicktime">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
</sec>
</body>
<back>
<ack>
<p>We thank Mikko Leimu for help and technical assistance during the motion capture data collection. This study was funded by the Academy of Finland (projects 118616, 125710, 136358).</p>
</ack>
<app-group>
<app id="A1">
<title>Appendix</title>
<sec>
<title>Song list</title>
<list list-type="order">
<list-item>
<p>Alice Deejay: Better Off Alone (Who Needs Guitars Anyway?) 2:40–2:54 (loop).</p>
</list-item>
<list-item>
<p>Andre Visior: Speed Up 1:15–1:45.</p>
</list-item>
<list-item>
<p>Antibalas: Who Is This America Dem Speak of Today? (Who Is This America?) 1:00–1:29.</p>
</list-item>
<list-item>
<p>Arturo Sandoval: A Mis Abuelos (Danzon) 1:53–2:21.</p>
</list-item>
<list-item>
<p>Baden Powell: Deixa (Personalidade) 1:11–1:41.</p>
</list-item>
<list-item>
<p>Brad Mehldau: Wave/Mother Nature’s Son (Largo) 0:00–0:29.</p>
</list-item>
<list-item>
<p>Clifford Brown & Max Roach: The Blues Walk (Verve Jazz Masters, Vol. 44: Clifford Brown & Max Roach) 2:01–2:31.</p>
</list-item>
<list-item>
<p>Conjunto Imagen: Medley-Esencia de Guaguanco/Sonero (Ayer, Hoy y Manana) 2:18–2:48.</p>
</list-item>
<list-item>
<p>Dave Hillyard & The Rocksteady 7: Hillyard Street (Playtime) 0:15–0:45.</p>
</list-item>
<list-item>
<p>Dave Weckl: Mercy, Mercy, Mercy (Burning for Buddy) 0:10–0:40.</p>
</list-item>
<list-item>
<p>Dave Weckl: Tower of Inspiration (Master Plan) 0:00–0:30.</p>
</list-item>
<list-item>
<p>DJ Shadow: Napalm Brain/Scatter Brain (Endtroducing …) 3:29–3:58.</p>
</list-item>
<list-item>
<p>Gangster Politics: Gangster Politics (Guns & Chicks) 1:00–1:29.</p>
</list-item>
<list-item>
<p>Gigi D’Agostino: Blablabla (L’Amour Toujours) 0:00–0:30.</p>
</list-item>
<list-item>
<p>Herbie Hancock: Watermelon Man (Cantaloupe Island) 0:00–0:30.</p>
</list-item>
<list-item>
<p>Horace Silver: The Natives Are Restless 0:00–0:29.</p>
</list-item>
<list-item>
<p>In Flames: Scream (Come Clarity) 0:00–0:30.</p>
</list-item>
<list-item>
<p>Jean Roch: Can You Feel It (Club Sounds Vol. 35) 0:33–1:01.</p>
</list-item>
<list-item>
<p>Johanna Kurkela: Hetki hiljaa (Hetki hiljaa) 3:22–3:52.</p>
</list-item>
<list-item>
<p>Juana Molina: Tres Cosas (Tres Cosas) 0:00–0:30.</p>
</list-item>
<list-item>
<p>Kings of Leon: Closer (Only by the Night) 3:17–3:47.</p>
</list-item>
<list-item>
<p>Lenny Kravitz: Live (5) 3:02–3:30.</p>
</list-item>
<list-item>
<p>Martha & The Vandellas: Heat Wave (Heat Wave) 1:40–2:10.</p>
</list-item>
<list-item>
<p>Maynard Ferguson: Fireshaker (Live From San Francisco) 0:00–0:28.</p>
</list-item>
<list-item>
<p>MIA: 20 Dollar (Kala) 0:17–0:45.</p>
</list-item>
<list-item>
<p>Nick Beat: Techno Disco 2:26–2:56.</p>
</list-item>
<list-item>
<p>Panjabi MC: Mundian To Bach Ke (Legalized) 0:47–1:06 (loop).</p>
</list-item>
<list-item>
<p>Patrick Watson: Beijing (Wooden Arms) 2:30–2:59.</p>
</list-item>
<list-item>
<p>The Rippingtons: Weekend in Monaco (Weekend in Monaco) 1:13–1:42.</p>
</list-item>
<list-item>
<p>Yuri Buenaventura: Salsa (Salsa Movie Soundtrack) 2:17–2:45.</p>
</list-item>
</list>
</sec>
</app>
</app-group>
<ref-list>
<title>References</title>
<ref id="B1">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Alluri</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Toiviainen</surname>
<given-names>P.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>Exploring perceptual and acoustical correlates of polyphonic timbre</article-title>
.
<source>Music Percept.</source>
<volume>27</volume>
,
<fpage>223</fpage>
<lpage>242</lpage>
<pub-id pub-id-type="doi">10.1525/mp.2010.27.3.223</pub-id>
</mixed-citation>
</ref>
<ref id="B2">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Arom</surname>
<given-names>S.</given-names>
</name>
</person-group>
(
<year>1991</year>
).
<source>African Polyphony and Polyrhythm: Musical Structure and Methodology</source>
.
<publisher-loc>Cambridge</publisher-loc>
:
<publisher-name>Cambridge University Press</publisher-name>
</mixed-citation>
</ref>
<ref id="B3">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bengtsson</surname>
<given-names>S. L.</given-names>
</name>
<name>
<surname>Ullén</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Ehrsson</surname>
<given-names>H. H.</given-names>
</name>
<name>
<surname>Hashimoto</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Kito</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Naito</surname>
<given-names>E.</given-names>
</name>
<etal></etal>
</person-group>
(
<year>2009</year>
).
<article-title>Listening to rhythms activates motor and premotor cortices</article-title>
.
<source>Cortex</source>
<volume>45</volume>
,
<fpage>62</fpage>
<lpage>71</lpage>
<pub-id pub-id-type="doi">10.1016/j.cortex.2008.07.002</pub-id>
<pub-id pub-id-type="pmid">19041965</pub-id>
</mixed-citation>
</ref>
<ref id="B4">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Brown</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Merker</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Wallin</surname>
<given-names>N. L.</given-names>
</name>
</person-group>
(
<year>2000</year>
).
<article-title>“An introduction to evolutionary musicology,”</article-title>
in
<source>The Origins of Music</source>
, eds
<person-group person-group-type="editor">
<name>
<surname>Wallin</surname>
<given-names>N. L.</given-names>
</name>
<name>
<surname>Merker</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Brown</surname>
<given-names>S.</given-names>
</name>
</person-group>
(
<publisher-loc>Cambridge, MA</publisher-loc>
:
<publisher-name>MIT Press</publisher-name>
),
<fpage>3</fpage>
<lpage>24</lpage>
</mixed-citation>
</ref>
<ref id="B5">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Camurri</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Lagerlöf</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Volpe</surname>
<given-names>G.</given-names>
</name>
</person-group>
(
<year>2003</year>
).
<article-title>Recognizing emotion from dance movement: comparison of spectator recognition and automated techniques</article-title>
.
<source>Int. J. Hum. Comput. Stud.</source>
<volume>59</volume>
,
<fpage>213</fpage>
<lpage>225</lpage>
<pub-id pub-id-type="doi">10.1016/S1071-5819(03)00050-8</pub-id>
</mixed-citation>
</ref>
<ref id="B6">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Camurri</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Mazzarino</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Ricchetti</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Timmers</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Volpe</surname>
<given-names>G.</given-names>
</name>
</person-group>
(
<year>2004</year>
).
<article-title>“Multimodal analysis of expressive gesture in music and dance performances,”</article-title>
in
<source>Gesture-Based Communication in Human-Computer Interaction. Lecture Notes in Computer Science, 2915</source>
, eds
<person-group person-group-type="editor">
<name>
<surname>Camurri</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Volpe</surname>
<given-names>G.</given-names>
</name>
</person-group>
(
<publisher-loc>Berlin</publisher-loc>
:
<publisher-name>Springer</publisher-name>
),
<fpage>20</fpage>
<lpage>39</lpage>
</mixed-citation>
</ref>
<ref id="B7">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Chen</surname>
<given-names>J. L.</given-names>
</name>
<name>
<surname>Penhune</surname>
<given-names>V. B.</given-names>
</name>
<name>
<surname>Zatorre</surname>
<given-names>R. J.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>The role of auditory and premotor cortex in sensorimotor transformations</article-title>
.
<source>Ann. N. Y. Acad. Sci.</source>
<volume>1169</volume>
,
<fpage>15</fpage>
<lpage>34</lpage>
<pub-id pub-id-type="doi">10.1111/j.1749-6632.2009.04556.x</pub-id>
<pub-id pub-id-type="pmid">19673752</pub-id>
</mixed-citation>
</ref>
<ref id="B8">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Collyer</surname>
<given-names>C. E.</given-names>
</name>
<name>
<surname>Broadbent</surname>
<given-names>H. A.</given-names>
</name>
<name>
<surname>Church</surname>
<given-names>R. M.</given-names>
</name>
</person-group>
(
<year>1992</year>
).
<article-title>Categorical time production: evidence for discrete timing in motor control</article-title>
.
<source>Percept. Psychophys.</source>
<volume>51</volume>
,
<fpage>134</fpage>
<lpage>144</lpage>
<pub-id pub-id-type="doi">10.3758/BF03212238</pub-id>
<pub-id pub-id-type="pmid">1549432</pub-id>
</mixed-citation>
</ref>
<ref id="B9">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cross</surname>
<given-names>I.</given-names>
</name>
</person-group>
(
<year>2001</year>
).
<article-title>Music, cognition, culture, and evolution</article-title>
.
<source>Ann. N. Y. Acad. Sci.</source>
<volume>930</volume>
,
<fpage>28</fpage>
<lpage>42</lpage>
<pub-id pub-id-type="doi">10.1111/j.1749-6632.2001.tb05723.x</pub-id>
<pub-id pub-id-type="pmid">11458835</pub-id>
</mixed-citation>
</ref>
<ref id="B10">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Desmond</surname>
<given-names>J. C.</given-names>
</name>
</person-group>
(
<year>1993</year>
).
<article-title>Embodying difference: issues in dance and cultural studies</article-title>
.
<source>Cult. Crit.</source>
<volume>26</volume>
,
<fpage>33</fpage>
<lpage>63</lpage>
<pub-id pub-id-type="doi">10.2307/1354455</pub-id>
</mixed-citation>
</ref>
<ref id="B11">
<mixed-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Eerola</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Luck</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Toiviainen</surname>
<given-names>P.</given-names>
</name>
</person-group>
(
<year>2006</year>
).
<article-title>“An investigation of pre-schoolers’ corporeal synchronization with music,”</article-title>
in
<conf-name>Proceedings of the 9th International Conference on Music Perception and Cognition</conf-name>
, eds
<person-group person-group-type="editor">
<name>
<surname>Baroni</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Addessi</surname>
<given-names>A. R.</given-names>
</name>
<name>
<surname>Caterina</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Costa</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<conf-loc>Bologna</conf-loc>
:
<conf-sponsor>University of Bologna</conf-sponsor>
),
<fpage>472</fpage>
<lpage>476</lpage>
</mixed-citation>
</ref>
<ref id="B12">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Fraisse</surname>
<given-names>P.</given-names>
</name>
</person-group>
(
<year>1982</year>
).
<article-title>“Rhythm and tempo,”</article-title>
in
<source>The Psychology of Music</source>
, ed.
<person-group person-group-type="editor">
<name>
<surname>Deutsch</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<publisher-loc>New York, NY</publisher-loc>
:
<publisher-name>Academic Press</publisher-name>
),
<fpage>149</fpage>
<lpage>180</lpage>
</mixed-citation>
</ref>
<ref id="B13">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Godøy</surname>
<given-names>R. I.</given-names>
</name>
<name>
<surname>Haga</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Jensenius</surname>
<given-names>A. R.</given-names>
</name>
</person-group>
(
<year>2006</year>
).
<article-title>“Playing ‘Air instruments’: mimicry of sound-producing gestures by novices and experts,”</article-title>
in
<source>Gesture in Human-Computer Interaction and Simulation, Lecture Notes in Computer Science, 3881</source>
, eds
<person-group person-group-type="editor">
<name>
<surname>Gibet</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Courty</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Kamp</surname>
<given-names>J.-F.</given-names>
</name>
</person-group>
(
<publisher-loc>Berlin</publisher-loc>
:
<publisher-name>Springer</publisher-name>
),
<fpage>256</fpage>
<lpage>267</lpage>
</mixed-citation>
</ref>
<ref id="B14">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Grahn</surname>
<given-names>J. A.</given-names>
</name>
<name>
<surname>Brett</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>2007</year>
).
<article-title>Rhythm and beat perception in motor areas of the brain</article-title>
.
<source>J. Cogn. Neurosci.</source>
<volume>19</volume>
,
<fpage>893</fpage>
<lpage>906</lpage>
<pub-id pub-id-type="doi">10.1162/jocn.2007.19.5.893</pub-id>
<pub-id pub-id-type="pmid">17488212</pub-id>
</mixed-citation>
</ref>
<ref id="B15">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Grahn</surname>
<given-names>J. A.</given-names>
</name>
<name>
<surname>Rowe</surname>
<given-names>J. B.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>Feeling the beat: premotor and striatal interactions in musicians and nonmusicians during beat perception</article-title>
.
<source>J. Neurosci.</source>
<volume>29</volume>
,
<fpage>7540</fpage>
<lpage>7548</lpage>
<pub-id pub-id-type="doi">10.1523/JNEUROSCI.2018-08.2009</pub-id>
<pub-id pub-id-type="pmid">19515922</pub-id>
</mixed-citation>
</ref>
<ref id="B16">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hadar</surname>
<given-names>U.</given-names>
</name>
</person-group>
(
<year>1989</year>
).
<article-title>Two types of gesture and their role in speech production</article-title>
.
<source>J. Lang. Soc. Psychol.</source>
<volume>8</volume>
,
<fpage>221</fpage>
<lpage>228</lpage>
<pub-id pub-id-type="doi">10.1177/0261927X8983004</pub-id>
</mixed-citation>
</ref>
<ref id="B17">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Janata</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Tomic</surname>
<given-names>S. T.</given-names>
</name>
<name>
<surname>Haberman</surname>
<given-names>J. M.</given-names>
</name>
</person-group>
(
<year>2011</year>
).
<article-title>Sensorimotor coupling in music and the psychology of the groove</article-title>
.
<source>J. Exp. Psychol. Gen.</source>
<volume>141</volume>
,
<fpage>54</fpage>
<lpage>75</lpage>
<pub-id pub-id-type="doi">10.1037/a0024208</pub-id>
<pub-id pub-id-type="pmid">21767048</pub-id>
</mixed-citation>
</ref>
<ref id="B18">
<mixed-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Jensenius</surname>
<given-names>A. R.</given-names>
</name>
</person-group>
(
<year>2006</year>
).
<article-title>“Using motiongrams in the study of musical gestures,”</article-title>
in
<conf-name>Proceedings of the International Computer Music Conference</conf-name>
(
<conf-loc>New Orleans, LA</conf-loc>
:
<conf-sponsor>Tulane University</conf-sponsor>
),
<fpage>499</fpage>
<lpage>502</lpage>
</mixed-citation>
</ref>
<ref id="B19">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Keller</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Rieger</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>Special issue – musical movement and synchronization</article-title>
.
<source>Music Percept.</source>
<volume>26</volume>
,
<fpage>397</fpage>
<lpage>400</lpage>
<pub-id pub-id-type="doi">10.1525/mp.2009.26.3.289</pub-id>
</mixed-citation>
</ref>
<ref id="B20">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Lakoff</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Johnson</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>1980</year>
).
<source>Metaphors We Live By</source>
.
<publisher-loc>Chicago</publisher-loc>
:
<publisher-name>University of Chicago Press</publisher-name>
</mixed-citation>
</ref>
<ref id="B21">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Lakoff</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Johnson</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>1999</year>
).
<source>Philosophy in the Flesh: The Embodied Mind and Its Challenge to Western Thought</source>
.
<publisher-loc>New York, NY</publisher-loc>
:
<publisher-name>Basic Books</publisher-name>
</mixed-citation>
</ref>
<ref id="B22">
<mixed-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Lartillot</surname>
<given-names>O.</given-names>
</name>
<name>
<surname>Eerola</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Toiviainen</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Fornari</surname>
<given-names>J.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>“Multi-feature modeling of pulse clarity: design, validation, and optimization,”</article-title>
in
<conf-name>Proceedings of the 9th International Conference on Music Information Retrieval</conf-name>
, eds
<person-group person-group-type="editor">
<name>
<surname>Bello</surname>
<given-names>J. P.</given-names>
</name>
<name>
<surname>Chew</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Turnbull</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<conf-loc>Philadelphia, PA</conf-loc>
:
<conf-sponsor>Drexel University</conf-sponsor>
),
<fpage>521</fpage>
<lpage>526</lpage>
</mixed-citation>
</ref>
<ref id="B23">
<mixed-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Lartillot</surname>
<given-names>O.</given-names>
</name>
<name>
<surname>Toiviainen</surname>
<given-names>P.</given-names>
</name>
</person-group>
(
<year>2007</year>
).
<article-title>“A matlab toolbox for musical feature extraction from audio,”</article-title>
in
<conf-name>Proceedings of the 10th International Conference on Digital Audio Effects</conf-name>
(
<conf-loc>Bordeaux</conf-loc>
:
<conf-sponsor>University of Bordeaux</conf-sponsor>
),
<fpage>1</fpage>
<lpage>8</lpage>
</mixed-citation>
</ref>
<ref id="B24">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Leman</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>2007</year>
).
<source>Embodied Music Cognition and Mediation Technology</source>
.
<publisher-loc>Cambridge, MA</publisher-loc>
:
<publisher-name>MIT Press</publisher-name>
</mixed-citation>
</ref>
<ref id="B25">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Leman</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Godøy</surname>
<given-names>R. I.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>“Why Study Musical Gesture?”</article-title>
in
<source>Musical Gestures. Sound, Movement, and Meaning</source>
, eds
<person-group person-group-type="editor">
<name>
<surname>Godøy</surname>
<given-names>R. I.</given-names>
</name>
<name>
<surname>Leman</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<publisher-loc>New York, NY</publisher-loc>
:
<publisher-name>Routledge</publisher-name>
),
<fpage>3</fpage>
<lpage>11</lpage>
</mixed-citation>
</ref>
<ref id="B26">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Leman</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Naveda</surname>
<given-names>L.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>Basic gestures as spatiotemporal reference frames for repetitive dance/music patterns in Samba and Charleston</article-title>
.
<source>Music Percept.</source>
<volume>28</volume>
,
<fpage>71</fpage>
<lpage>92</lpage>
<pub-id pub-id-type="doi">10.1525/mp.2010.28.1.71</pub-id>
</mixed-citation>
</ref>
<ref id="B27">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lesaffre</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>De Voogdt</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Leman</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>De Baets</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>De Meyer</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Martens</surname>
<given-names>J.-P.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>How potential users of music search and retrieval systems describe the semantic quality of music</article-title>
.
<source>J. Am. Soc. Inf. Sci. Technol.</source>
<volume>59</volume>
,
<fpage>695</fpage>
<lpage>707</lpage>
<pub-id pub-id-type="doi">10.1002/asi.20731</pub-id>
</mixed-citation>
</ref>
<ref id="B28">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Luck</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Saarikallio</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Burger</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Thompson</surname>
<given-names>M. R.</given-names>
</name>
<name>
<surname>Toiviainen</surname>
<given-names>P.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>Effects of the Big Five and musical genre on music-induced movement</article-title>
.
<source>J. Res. Pers.</source>
<volume>44</volume>
,
<fpage>714</fpage>
<lpage>720</lpage>
<pub-id pub-id-type="doi">10.1016/j.jrp.2010.10.001</pub-id>
</mixed-citation>
</ref>
<ref id="B29">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>MacDougall</surname>
<given-names>H. G.</given-names>
</name>
<name>
<surname>Moore</surname>
<given-names>S. T.</given-names>
</name>
</person-group>
(
<year>2005</year>
).
<article-title>Marching to the beat of the same drummer: the spontaneous tempo of human locomotion</article-title>
.
<source>J. Appl. Physiol.</source>
<volume>99</volume>
,
<fpage>1164</fpage>
<lpage>1173</lpage>
<pub-id pub-id-type="doi">10.1152/japplphysiol.00138.2005</pub-id>
<pub-id pub-id-type="pmid">15890757</pub-id>
</mixed-citation>
</ref>
<ref id="B30">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Madison</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Gouyon</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Ullén</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Hörnström</surname>
<given-names>K.</given-names>
</name>
</person-group>
(
<year>2011</year>
).
<article-title>Modeling the tendency for music to induce movement in humans: first correlations with low-level audio descriptors across music genres</article-title>
.
<source>J. Exp. Psychol. Hum. Percept. Perform.</source>
<volume>37</volume>
,
<fpage>1578</fpage>
<lpage>1594</lpage>
<pub-id pub-id-type="doi">10.1037/a0024323</pub-id>
<pub-id pub-id-type="pmid">21728462</pub-id>
</mixed-citation>
</ref>
<ref id="B31">
<mixed-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Moelants</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<year>2002</year>
).
<article-title>“Preferred tempo reconsidered,”</article-title>
in
<conf-name>Proceedings of the 7th International Conference on Music Perception and Cognition</conf-name>
, eds
<person-group person-group-type="editor">
<name>
<surname>Stevens</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Burnham</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>McPherson</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Schubert</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Renwick</surname>
<given-names>J.</given-names>
</name>
</person-group>
(
<conf-loc>Adelaide</conf-loc>
:
<conf-sponsor>Causal Productions</conf-sponsor>
),
<fpage>580</fpage>
<lpage>583</lpage>
</mixed-citation>
</ref>
<ref id="B32">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Murray</surname>
<given-names>M. P.</given-names>
</name>
<name>
<surname>Drought</surname>
<given-names>A. B.</given-names>
</name>
<name>
<surname>Kory</surname>
<given-names>R. C.</given-names>
</name>
</person-group>
(
<year>1964</year>
).
<article-title>Walking patterns of normal men</article-title>
.
<source>J. Bone Joint Surg.</source>
<volume>46</volume>
,
<fpage>335</fpage>
<lpage>360</lpage>
<pub-id pub-id-type="pmid">14129683</pub-id>
</mixed-citation>
</ref>
<ref id="B33">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Naveda</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Leman</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>The spatiotemporal representation of dance and music gestures using Topological Gesture Analysis (TGA)</article-title>
.
<source>Music Percept.</source>
<volume>28</volume>
,
<fpage>93</fpage>
<lpage>112</lpage>
<pub-id pub-id-type="doi">10.1525/mp.2010.28.1.93</pub-id>
</mixed-citation>
</ref>
<ref id="B34">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Nettl</surname>
<given-names>B.</given-names>
</name>
</person-group>
(
<year>2000</year>
).
<article-title>“An ethnomusicologist contemplates universals in musical sound and musical culture,”</article-title>
in
<source>The Origins of Music</source>
, eds
<person-group person-group-type="editor">
<name>
<surname>Wallin</surname>
<given-names>N. L.</given-names>
</name>
<name>
<surname>Merker</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Brown</surname>
<given-names>S.</given-names>
</name>
</person-group>
(
<publisher-loc>Cambridge, MA</publisher-loc>
:
<publisher-name>MIT Press</publisher-name>
),
<fpage>463</fpage>
<lpage>472</lpage>
</mixed-citation>
</ref>
<ref id="B35">
<mixed-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Pampalk</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Rauber</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Merkl</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<year>2002</year>
).
<article-title>“Content-based organization and visualization of music archives,”</article-title>
in
<conf-name>Proceedings of the 10th ACM International Conference on Multimedia, Juan-les-Pins</conf-name>
(
<conf-loc>New York, NY</conf-loc>
:
<conf-sponsor>ACM Press</conf-sponsor>
),
<fpage>570</fpage>
<lpage>579</lpage>
</mixed-citation>
</ref>
<ref id="B36">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Parncutt</surname>
<given-names>R.</given-names>
</name>
</person-group>
(
<year>1994</year>
).
<article-title>A perceptual model of pulse salience and metrical accent in musical rhythms</article-title>
.
<source>Music Percept.</source>
<volume>11</volume>
,
<fpage>409</fpage>
<lpage>464</lpage>
<pub-id pub-id-type="doi">10.2307/40285633</pub-id>
</mixed-citation>
</ref>
<ref id="B37">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Phillips-Silver</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Trainor</surname>
<given-names>L. J.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>Vestibular influence on auditory metrical interpretation</article-title>
.
<source>Brain Cogn.</source>
<volume>67</volume>
,
<fpage>94</fpage>
<lpage>102</lpage>
<pub-id pub-id-type="doi">10.1016/j.bandc.2007.11.007</pub-id>
<pub-id pub-id-type="pmid">18234407</pub-id>
</mixed-citation>
</ref>
<ref id="B38">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Repp</surname>
<given-names>B. H.</given-names>
</name>
</person-group>
(
<year>2005</year>
).
<article-title>Sensorimotor synchronization: a review of the tapping literature</article-title>
.
<source>Psychon. Bull. Rev.</source>
<volume>12</volume>
,
<fpage>969</fpage>
<lpage>992</lpage>
<pub-id pub-id-type="doi">10.3758/BF03206433</pub-id>
<pub-id pub-id-type="pmid">16615317</pub-id>
</mixed-citation>
</ref>
<ref id="B39">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Savitzky</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Golay</surname>
<given-names>M. J. E.</given-names>
</name>
</person-group>
(
<year>1964</year>
).
<article-title>Smoothing and differentiation of data by simplified least squares procedures</article-title>
.
<source>Anal. Chem.</source>
<volume>36</volume>
,
<fpage>1627</fpage>
<lpage>1639</lpage>
<pub-id pub-id-type="doi">10.1021/ac60214a047</pub-id>
</mixed-citation>
</ref>
<ref id="B40">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Shrout</surname>
<given-names>P. E.</given-names>
</name>
<name>
<surname>Fleiss</surname>
<given-names>J. L.</given-names>
</name>
</person-group>
(
<year>1979</year>
).
<article-title>Intraclass correlations: uses in assessing rater reliability</article-title>
.
<source>Psychol. Bull.</source>
<volume>86</volume>
,
<fpage>420</fpage>
<lpage>428</lpage>
<pub-id pub-id-type="doi">10.1037/0033-2909.86.2.420</pub-id>
<pub-id pub-id-type="pmid">18839484</pub-id>
</mixed-citation>
</ref>
<ref id="B41">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Stevens</surname>
<given-names>C. J.</given-names>
</name>
<name>
<surname>Schubert</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Kroos</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Halovic</surname>
<given-names>S.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>Moving with and without music: scaling and lapsing in time in the performance of contemporary dance</article-title>
.
<source>Music Percept.</source>
<volume>26</volume>
,
<fpage>451</fpage>
<lpage>464</lpage>
<pub-id pub-id-type="doi">10.1525/mp.2009.26.5.451</pub-id>
</mixed-citation>
</ref>
<ref id="B42">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Styns</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>van Noorden</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Moelants</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Leman</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>2007</year>
).
<article-title>Walking on music</article-title>
.
<source>Hum. Mov. Sci.</source>
<volume>26</volume>
,
<fpage>769</fpage>
<lpage>785</lpage>
<pub-id pub-id-type="doi">10.1016/j.humov.2007.07.007</pub-id>
<pub-id pub-id-type="pmid">17910985</pub-id>
</mixed-citation>
</ref>
<ref id="B43">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Toiviainen</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Burger</surname>
<given-names>B.</given-names>
</name>
</person-group>
(
<year>2011</year>
).
<source>MoCap Toolbox Manual</source>
.
<publisher-loc>Jyväskylä</publisher-loc>
:
<publisher-name>University of Jyväskylä</publisher-name>
</mixed-citation>
</ref>
<ref id="B44">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Toiviainen</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Luck</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Thompson</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>Embodied meter: hierarchical eigenmodes in music-induced movement</article-title>
.
<source>Music Percept.</source>
<volume>28</volume>
,
<fpage>59</fpage>
<lpage>70</lpage>
<pub-id pub-id-type="doi">10.1525/mp.2010.28.1.1</pub-id>
</mixed-citation>
</ref>
<ref id="B45">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Trainor</surname>
<given-names>L. J.</given-names>
</name>
<name>
<surname>Gao</surname>
<given-names>X.</given-names>
</name>
<name>
<surname>Lei</surname>
<given-names>J.-J.</given-names>
</name>
<name>
<surname>Lehtovaara</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Harris</surname>
<given-names>L. R.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>The primal role of the vestibular system in determining musical rhythm</article-title>
.
<source>Cortex</source>
<volume>45</volume>
,
<fpage>35</fpage>
<lpage>43</lpage>
<pub-id pub-id-type="doi">10.1016/j.cortex.2007.10.014</pub-id>
<pub-id pub-id-type="pmid">19054504</pub-id>
</mixed-citation>
</ref>
<ref id="B46">
<mixed-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Van Dyck</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Moelants</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Demey</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Coussement</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Deweppe</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Leman</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>“The impact of the bass drum on body movement in spontaneous dance,”</article-title>
in
<conf-name>Proceedings of the 11th International Conference in Music Perception and Cognition</conf-name>
, eds
<person-group person-group-type="editor">
<name>
<surname>Demorest</surname>
<given-names>S. M.</given-names>
</name>
<name>
<surname>Morrison</surname>
<given-names>S. J.</given-names>
</name>
<name>
<surname>Campbell</surname>
<given-names>P. S.</given-names>
</name>
</person-group>
(
<conf-loc>Seattle, WA</conf-loc>
:
<conf-sponsor>University of Washington</conf-sponsor>
),
<fpage>429</fpage>
<lpage>434</lpage>
</mixed-citation>
</ref>
<ref id="B47">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Varela</surname>
<given-names>F. J.</given-names>
</name>
<name>
<surname>Thompson</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Rosch</surname>
<given-names>E.</given-names>
</name>
</person-group>
(
<year>1991</year>
).
<source>The Embodied Mind: Cognitive Science and Human Experience</source>
.
<publisher-loc>Cambridge, MA</publisher-loc>
:
<publisher-name>MIT Press</publisher-name>
</mixed-citation>
</ref>
<ref id="B48">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zentner</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Eerola</surname>
<given-names>T.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>Rhythmic engagement with music in infancy</article-title>
.
<source>Proc. Natl. Acad. Sci. U.S.A.</source>
<volume>107</volume>
,
<fpage>5768</fpage>
<lpage>5773</lpage>
<pub-id pub-id-type="doi">10.1073/pnas.1000121107</pub-id>
<pub-id pub-id-type="pmid">20231438</pub-id>
</mixed-citation>
</ref>
</ref-list>
</back>
</pmc>
</record>
