Mozart exploration server

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Rhythmic complexity and predictive coding: a novel approach to modeling rhythm and meter perception in music

Internal identifier: 000093 (Pmc/Checkpoint); previous: 000092; next: 000094


Authors: Peter Vuust [Denmark]; Maria A. G. Witek [Denmark]

Source:

RBID: PMC:4181238

Abstract

Musical rhythm, consisting of apparently abstract intervals of accented temporal events, has a remarkable capacity to move our minds and bodies. How does the cognitive system enable our experiences of rhythmically complex music? In this paper, we describe some common forms of rhythmic complexity in music and propose the theory of predictive coding (PC) as a framework for understanding how rhythm and rhythmic complexity are processed in the brain. We also consider why we feel so compelled by rhythmic tension in music. First, we consider theories of rhythm and meter perception, which provide hierarchical and computational approaches to modeling. Second, we present the theory of PC, which posits a hierarchical organization of brain responses reflecting fundamental, survival-related mechanisms associated with predicting future events. According to this theory, perception and learning is manifested through the brain’s Bayesian minimization of the error between the input to the brain and the brain’s prior expectations. Third, we develop a PC model of musical rhythm, in which rhythm perception is conceptualized as an interaction between what is heard (“rhythm”) and the brain’s anticipatory structuring of music (“meter”). Finally, we review empirical studies of the neural and behavioral effects of syncopation, polyrhythm and groove, and propose how these studies can be seen as special cases of the PC theory. We argue that musical rhythm exploits the brain’s general principles of prediction and propose that pleasure and desire for sensorimotor synchronization from musical rhythm may be a result of such mechanisms.
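
As an illustration of the Bayesian error minimization described in the abstract, the following toy sketch (ours, not from the article; all names and values are hypothetical, and NumPy is assumed) nudges a metric prior toward an observed onset pattern through precision-weighted prediction errors, with the residual error standing in for rhythmic tension.

import numpy as np

# Illustrative toy only: precision-weighted prediction-error updating in the
# spirit of the predictive coding account sketched in the abstract.
# A "meter" prior holds expected accent strengths for 8 metric positions;
# the "rhythm" is an observed binary onset pattern (here, syncopated).
meter_prior = np.array([1.0, 0.2, 0.5, 0.2, 0.8, 0.2, 0.5, 0.2])
rhythm = np.array([1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0])

precision = 0.5  # hypothetical weight on sensory evidence
for step in range(10):
    prediction_error = rhythm - meter_prior                    # mismatch: heard vs. expected
    meter_prior = meter_prior + precision * prediction_error   # prior moves toward the input
    tension = np.abs(prediction_error).sum()                   # residual error as "rhythmic tension"
    print(step, round(float(tension), 3))

In this reading, the summed error shrinking across iterations mirrors a listener's metric model adapting to what is heard, while a large, persistent error would correspond to the tension felt with syncopation or polyrhythm.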


URL:
DOI: 10.3389/fpsyg.2014.01111
PubMed: 25324813
PubMed Central: 4181238


Affiliations:



The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Rhythmic complexity and predictive coding: a novel approach to modeling rhythm and meter perception in music</title>
<author>
<name sortKey="Vuust, Peter" sort="Vuust, Peter" uniqKey="Vuust P" first="Peter" last="Vuust">Peter Vuust</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<institution>Center of Functionally Integrative Neuroscience, Aarhus University Hospital</institution>
<country>Aarhus, Denmark</country>
</nlm:aff>
<country xml:lang="fr">Danemark</country>
<wicri:regionArea></wicri:regionArea>
<wicri:regionArea># see nlm:aff region in country</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="aff2">
<institution>Royal Academy of Music</institution>
<country>Aarhus/Aalborg, Denmark</country>
</nlm:aff>
<country xml:lang="fr">Danemark</country>
<wicri:regionArea></wicri:regionArea>
<wicri:regionArea># see nlm:aff region in country</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Witek, Maria A G" sort="Witek, Maria A G" uniqKey="Witek M" first="Maria A. G." last="Witek">Maria A. G. Witek</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<institution>Center of Functionally Integrative Neuroscience, Aarhus University Hospital</institution>
<country>Aarhus, Denmark</country>
</nlm:aff>
<country xml:lang="fr">Danemark</country>
<wicri:regionArea></wicri:regionArea>
<wicri:regionArea># see nlm:aff region in country</wicri:regionArea>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">25324813</idno>
<idno type="pmc">4181238</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4181238</idno>
<idno type="RBID">PMC:4181238</idno>
<idno type="doi">10.3389/fpsyg.2014.01111</idno>
<date when="2014">2014</date>
<idno type="wicri:Area/Pmc/Corpus">000838</idno>
<idno type="wicri:Area/Pmc/Curation">000838</idno>
<idno type="wicri:Area/Pmc/Checkpoint">000093</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Rhythmic complexity and predictive coding: a novel approach to modeling rhythm and meter perception in music</title>
<author>
<name sortKey="Vuust, Peter" sort="Vuust, Peter" uniqKey="Vuust P" first="Peter" last="Vuust">Peter Vuust</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<institution>Center of Functionally Integrative Neuroscience, Aarhus University Hospital</institution>
<country>Aarhus, Denmark</country>
</nlm:aff>
<country xml:lang="fr">Danemark</country>
<wicri:regionArea></wicri:regionArea>
<wicri:regionArea># see nlm:aff region in country</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="aff2">
<institution>Royal Academy of Music</institution>
<country>Aarhus/Aalborg, Denmark</country>
</nlm:aff>
<country xml:lang="fr">Danemark</country>
<wicri:regionArea></wicri:regionArea>
<wicri:regionArea># see nlm:aff region in country</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Witek, Maria A G" sort="Witek, Maria A G" uniqKey="Witek M" first="Maria A. G." last="Witek">Maria A. G. Witek</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<institution>Center of Functionally Integrative Neuroscience, Aarhus University Hospital</institution>
<country>Aarhus, Denmark</country>
</nlm:aff>
<country xml:lang="fr">Danemark</country>
<wicri:regionArea></wicri:regionArea>
<wicri:regionArea># see nlm:aff region in country</wicri:regionArea>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Frontiers in Psychology</title>
<idno type="e-ISSN">1664-1078</idno>
<imprint>
<date when="2014">2014</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>Musical rhythm, consisting of apparently abstract intervals of accented temporal events, has a remarkable capacity to move our minds and bodies. How does the cognitive system enable our experiences of rhythmically complex music? In this paper, we describe some common forms of rhythmic complexity in music and propose the theory of predictive coding (PC) as a framework for understanding how rhythm and rhythmic complexity are processed in the brain. We also consider why we feel so compelled by rhythmic tension in music. First, we consider theories of rhythm and meter perception, which provide hierarchical and computational approaches to modeling. Second, we present the theory of PC, which posits a hierarchical organization of brain responses reflecting fundamental, survival-related mechanisms associated with predicting future events. According to this theory, perception and learning is manifested through the brain’s Bayesian minimization of the error between the input to the brain and the brain’s prior expectations. Third, we develop a PC model of musical rhythm, in which rhythm perception is conceptualized as an interaction between what is heard (“rhythm”) and the brain’s anticipatory structuring of music (“meter”). Finally, we review empirical studies of the neural and behavioral effects of syncopation, polyrhythm and groove, and propose how these studies can be seen as special cases of the PC theory. We argue that musical rhythm exploits the brain’s general principles of prediction and propose that pleasure and desire for sensorimotor synchronization from musical rhythm may be a result of such mechanisms.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Agawu, K" uniqKey="Agawu K">K. Agawu</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Altenmuller, E" uniqKey="Altenmuller E">E. Altenmuller</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Barnes, R" uniqKey="Barnes R">R. Barnes</name>
</author>
<author>
<name sortKey="Jones, M R" uniqKey="Jones M">M. R. Jones</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bastos, A M" uniqKey="Bastos A">A. M. Bastos</name>
</author>
<author>
<name sortKey="Usrey, W M" uniqKey="Usrey W">W. M. Usrey</name>
</author>
<author>
<name sortKey="Adams, R A" uniqKey="Adams R">R. A. Adams</name>
</author>
<author>
<name sortKey="Mangun, G R" uniqKey="Mangun G">G. R. Mangun</name>
</author>
<author>
<name sortKey="Fries, P" uniqKey="Fries P">P. Fries</name>
</author>
<author>
<name sortKey="Friston, K J" uniqKey="Friston K">K. J. Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bengtsson, S L" uniqKey="Bengtsson S">S. L. Bengtsson</name>
</author>
<author>
<name sortKey="Ullen, F" uniqKey="Ullen F">F. Ullén</name>
</author>
<author>
<name sortKey="Henrik Ehrsson, H" uniqKey="Henrik Ehrsson H">H. Henrik Ehrsson</name>
</author>
<author>
<name sortKey="Hashimoto, T" uniqKey="Hashimoto T">T. Hashimoto</name>
</author>
<author>
<name sortKey="Kito, T" uniqKey="Kito T">T. Kito</name>
</author>
<author>
<name sortKey="Naito, E" uniqKey="Naito E">E. Naito</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Berlyne, D E" uniqKey="Berlyne D">D. E. Berlyne</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Berniker, M" uniqKey="Berniker M">M. Berniker</name>
</author>
<author>
<name sortKey="Kording, K" uniqKey="Kording K">K. Körding</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Born, R T" uniqKey="Born R">R. T. Born</name>
</author>
<author>
<name sortKey="Tsui, J M" uniqKey="Tsui J">J. M. Tsui</name>
</author>
<author>
<name sortKey="Pack, C C" uniqKey="Pack C">C. C. Pack</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Brochard, R" uniqKey="Brochard R">R. Brochard</name>
</author>
<author>
<name sortKey="Abecasis, D" uniqKey="Abecasis D">D. Abecasis</name>
</author>
<author>
<name sortKey="Potter, D" uniqKey="Potter D">D. Potter</name>
</author>
<author>
<name sortKey="Ragot, R" uniqKey="Ragot R">R. Ragot</name>
</author>
<author>
<name sortKey="Drake, C" uniqKey="Drake C">C. Drake</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Brown, H" uniqKey="Brown H">H. Brown</name>
</author>
<author>
<name sortKey="Friston, K J" uniqKey="Friston K">K. J. Friston</name>
</author>
<author>
<name sortKey="Bestmann, S" uniqKey="Bestmann S">S. Bestmann</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Burr, D" uniqKey="Burr D">D. Burr</name>
</author>
<author>
<name sortKey="Tozzi, A" uniqKey="Tozzi A">A. Tozzi</name>
</author>
<author>
<name sortKey="Morrone, M C" uniqKey="Morrone M">M. C. Morrone</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Butler, M J" uniqKey="Butler M">M. J. Butler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cabeza, R" uniqKey="Cabeza R">R. Cabeza</name>
</author>
<author>
<name sortKey="Nyberg, L" uniqKey="Nyberg L">L. Nyberg</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Chapin, H L" uniqKey="Chapin H">H. L. Chapin</name>
</author>
<author>
<name sortKey="Zanto, T" uniqKey="Zanto T">T. Zanto</name>
</author>
<author>
<name sortKey="Jantzen, K J" uniqKey="Jantzen K">K. J. Jantzen</name>
</author>
<author>
<name sortKey="Kelso, S J A" uniqKey="Kelso S">S. J. A. Kelso</name>
</author>
<author>
<name sortKey="Steinberg, F" uniqKey="Steinberg F">F. Steinberg</name>
</author>
<author>
<name sortKey="Large, E W" uniqKey="Large E">E. W. Large</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Chen, J L" uniqKey="Chen J">J. L. Chen</name>
</author>
<author>
<name sortKey="Penhune, V B" uniqKey="Penhune V">V. B. Penhune</name>
</author>
<author>
<name sortKey="Zatorre, R J" uniqKey="Zatorre R">R. J. Zatorre</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Chen, J L" uniqKey="Chen J">J. L. Chen</name>
</author>
<author>
<name sortKey="Zatorre, R J" uniqKey="Zatorre R">R. J. Zatorre</name>
</author>
<author>
<name sortKey="Penhune, V B" uniqKey="Penhune V">V. B. Penhune</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cicchini, G M" uniqKey="Cicchini G">G. M. Cicchini</name>
</author>
<author>
<name sortKey="Arrighi, R" uniqKey="Arrighi R">R. Arrighi</name>
</author>
<author>
<name sortKey="Cecchetti, L" uniqKey="Cecchetti L">L. Cecchetti</name>
</author>
<author>
<name sortKey="Giusti, M" uniqKey="Giusti M">M. Giusti</name>
</author>
<author>
<name sortKey="Burr, D C" uniqKey="Burr D">D. C. Burr</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Clark, A" uniqKey="Clark A">A. Clark</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Clark, A" uniqKey="Clark A">A. Clark</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Clark, A" uniqKey="Clark A">A. Clark</name>
</author>
<author>
<name sortKey="Chalmers, D" uniqKey="Chalmers D">D. Chalmers</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Clarke, E F" uniqKey="Clarke E">E. F. Clarke</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Clayton, M" uniqKey="Clayton M">M. Clayton</name>
</author>
<author>
<name sortKey="Sager, R" uniqKey="Sager R">R. Sager</name>
</author>
<author>
<name sortKey="Will, U" uniqKey="Will U">U. Will</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Danielsen, A" uniqKey="Danielsen A">A. Danielsen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Danielsen, A" uniqKey="Danielsen A">A. Danielsen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Demos, A P" uniqKey="Demos A">A. P. Demos</name>
</author>
<author>
<name sortKey="Chaffin, R" uniqKey="Chaffin R">R. Chaffin</name>
</author>
<author>
<name sortKey="Begosh, K T" uniqKey="Begosh K">K. T. Begosh</name>
</author>
<author>
<name sortKey="Daniels, J R" uniqKey="Daniels J">J. R. Daniels</name>
</author>
<author>
<name sortKey="Marsh, K L" uniqKey="Marsh K">K. L. Marsh</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Desain, P" uniqKey="Desain P">P. Desain</name>
</author>
<author>
<name sortKey="Honing, H" uniqKey="Honing H">H. Honing</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Dixon, S" uniqKey="Dixon S">S. Dixon</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Elliott, M T" uniqKey="Elliott M">M. T. Elliott</name>
</author>
<author>
<name sortKey="Wing, A M" uniqKey="Wing A">A. M. Wing</name>
</author>
<author>
<name sortKey="Welchman, A E" uniqKey="Welchman A">A. E. Welchman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Feldman, H" uniqKey="Feldman H">H. Feldman</name>
</author>
<author>
<name sortKey="Friston, K J" uniqKey="Friston K">K. J. Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fiebach, C J" uniqKey="Fiebach C">C. J. Fiebach</name>
</author>
<author>
<name sortKey="Schubotz, R I" uniqKey="Schubotz R">R. I. Schubotz</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fiez, J A" uniqKey="Fiez J">J. A. Fiez</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fitch, W T" uniqKey="Fitch W">W. T. Fitch</name>
</author>
<author>
<name sortKey="Rosenfeld, A J" uniqKey="Rosenfeld A">A. J. Rosenfeld</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fraisse, P" uniqKey="Fraisse P">P. Fraisse</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fraisse, P" uniqKey="Fraisse P">P. Fraisse</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fraisse, P" uniqKey="Fraisse P">P. Fraisse</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friedman, D" uniqKey="Friedman D">D. Friedman</name>
</author>
<author>
<name sortKey="Cycowicz, Y M" uniqKey="Cycowicz Y">Y. M. Cycowicz</name>
</author>
<author>
<name sortKey="Gaeta, H" uniqKey="Gaeta H">H. Gaeta</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K. Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K. Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K. Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K. Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K. Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K. Friston</name>
</author>
<author>
<name sortKey="Daunizeau, J" uniqKey="Daunizeau J">J. Daunizeau</name>
</author>
<author>
<name sortKey="Kilner, J" uniqKey="Kilner J">J. Kilner</name>
</author>
<author>
<name sortKey="Kiebel, S J" uniqKey="Kiebel S">S. J. Kiebel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fujioka, T" uniqKey="Fujioka T">T. Fujioka</name>
</author>
<author>
<name sortKey="Zendel, B R" uniqKey="Zendel B">B. R. Zendel</name>
</author>
<author>
<name sortKey="Ross, B" uniqKey="Ross B">B. Ross</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gebauer, L" uniqKey="Gebauer L">L. Gebauer</name>
</author>
<author>
<name sortKey="Kringelbach, M L" uniqKey="Kringelbach M">M. L. Kringelbach</name>
</author>
<author>
<name sortKey="Vuust, P" uniqKey="Vuust P">P. Vuust</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gooch, C M" uniqKey="Gooch C">C. M. Gooch</name>
</author>
<author>
<name sortKey="Wiener, M" uniqKey="Wiener M">M. Wiener</name>
</author>
<author>
<name sortKey="Hamilton, A C" uniqKey="Hamilton A">A. C. Hamilton</name>
</author>
<author>
<name sortKey="Coslett, H B" uniqKey="Coslett H">H. B. Coslett</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Grahn, J A" uniqKey="Grahn J">J. A. Grahn</name>
</author>
<author>
<name sortKey="Brett, M" uniqKey="Brett M">M. Brett</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Grahn, J A" uniqKey="Grahn J">J. A. Grahn</name>
</author>
<author>
<name sortKey="Mcauley, J D" uniqKey="Mcauley J">J. D. McAuley</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Grahn, J A" uniqKey="Grahn J">J. A. Grahn</name>
</author>
<author>
<name sortKey="Rowe, J B" uniqKey="Rowe J">J. B. Rowe</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Grahn, J A" uniqKey="Grahn J">J. A. Grahn</name>
</author>
<author>
<name sortKey="Rowe, J B" uniqKey="Rowe J">J. B. Rowe</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Greenwald, J" uniqKey="Greenwald J">J. Greenwald</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hohwy, J" uniqKey="Hohwy J">J. Hohwy</name>
</author>
<author>
<name sortKey="Roepstorff, A" uniqKey="Roepstorff A">A. Roepstorff</name>
</author>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K. Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Honing, H" uniqKey="Honing H">H. Honing</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Honing, H" uniqKey="Honing H">H. Honing</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Huron, D" uniqKey="Huron D">D. Huron</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Iyer, V" uniqKey="Iyer V">V. Iyer</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jahanshahi, M" uniqKey="Jahanshahi M">M. Jahanshahi</name>
</author>
<author>
<name sortKey="Dirnberger, G" uniqKey="Dirnberger G">G. Dirnberger</name>
</author>
<author>
<name sortKey="Fuller, R" uniqKey="Fuller R">R. Fuller</name>
</author>
<author>
<name sortKey="Frith, C" uniqKey="Frith C">C. Frith</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Janata, P" uniqKey="Janata P">P. Janata</name>
</author>
<author>
<name sortKey="Tomic, S T" uniqKey="Tomic S">S. T. Tomic</name>
</author>
<author>
<name sortKey="Haberman, J M" uniqKey="Haberman J">J. M. Haberman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Johnston, A" uniqKey="Johnston A">A. Johnston</name>
</author>
<author>
<name sortKey="Arnold, D H" uniqKey="Arnold D">D. H. Arnold</name>
</author>
<author>
<name sortKey="Nishida, S" uniqKey="Nishida S">S. Nishida</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jones, M R" uniqKey="Jones M">M. R. Jones</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jones, M R" uniqKey="Jones M">M. R. Jones</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kalender, B" uniqKey="Kalender B">B. Kalender</name>
</author>
<author>
<name sortKey="Trehub, S E" uniqKey="Trehub S">S. E. Trehub</name>
</author>
<author>
<name sortKey="Schellenberg, E G" uniqKey="Schellenberg E">E. G. Schellenberg</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Keller, P E" uniqKey="Keller P">P. E. Keller</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kleinschmidt, A" uniqKey="Kleinschmidt A">A. Kleinschmidt</name>
</author>
<author>
<name sortKey="Buchel, C" uniqKey="Buchel C">C. Buchel</name>
</author>
<author>
<name sortKey="Zeki, S" uniqKey="Zeki S">S. Zeki</name>
</author>
<author>
<name sortKey="Frackowiak, R S" uniqKey="Frackowiak R">R. S. Frackowiak</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Koelsch, S" uniqKey="Koelsch S">S. Koelsch</name>
</author>
<author>
<name sortKey="Schroger, E" uniqKey="Schroger E">E. Schröger</name>
</author>
<author>
<name sortKey="Tervaniemi, M" uniqKey="Tervaniemi M">M. Tervaniemi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Konvalinka, I" uniqKey="Konvalinka I">I. Konvalinka</name>
</author>
<author>
<name sortKey="Vuust, P" uniqKey="Vuust P">P. Vuust</name>
</author>
<author>
<name sortKey="Roepstorff, A" uniqKey="Roepstorff A">A. Roepstorff</name>
</author>
<author>
<name sortKey="Frith, C D" uniqKey="Frith C">C. D. Frith</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kording, K P" uniqKey="Kording K">K. P. Körding</name>
</author>
<author>
<name sortKey="Beierholm, U" uniqKey="Beierholm U">U. Beierholm</name>
</author>
<author>
<name sortKey="Ma, W J" uniqKey="Ma W">W. J. Ma</name>
</author>
<author>
<name sortKey="Quartz, S" uniqKey="Quartz S">S. Quartz</name>
</author>
<author>
<name sortKey="Tenenbaum, J B" uniqKey="Tenenbaum J">J. B. Tenenbaum</name>
</author>
<author>
<name sortKey="Shams, L" uniqKey="Shams L">L. Shams</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kung, S J" uniqKey="Kung S">S.-J. Kung</name>
</author>
<author>
<name sortKey="Chen, J L" uniqKey="Chen J">J. L. Chen</name>
</author>
<author>
<name sortKey="Zatorre, R J" uniqKey="Zatorre R">R. J. Zatorre</name>
</author>
<author>
<name sortKey="Penhune, V B" uniqKey="Penhune V">V. B. Penhune</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ladinig, O" uniqKey="Ladinig O">O. Ladinig</name>
</author>
<author>
<name sortKey="Honing, H" uniqKey="Honing H">H. Honing</name>
</author>
<author>
<name sortKey="Haden, G" uniqKey="Haden G">G. Haden</name>
</author>
<author>
<name sortKey="Winkler, I" uniqKey="Winkler I">I. Winkler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Large, E W" uniqKey="Large E">E. W. Large</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Large, E W" uniqKey="Large E">E. W. Large</name>
</author>
<author>
<name sortKey="Jones, M R" uniqKey="Jones M">M. R. Jones</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Large, E W" uniqKey="Large E">E. W. Large</name>
</author>
<author>
<name sortKey="Kolen, J F" uniqKey="Kolen J">J. F. Kolen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lee, D N" uniqKey="Lee D">D. N. Lee</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lee, S H" uniqKey="Lee S">S.-H. Lee</name>
</author>
<author>
<name sortKey="Blake, R" uniqKey="Blake R">R. Blake</name>
</author>
<author>
<name sortKey="Heeger, D J" uniqKey="Heeger D">D. J. Heeger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Leman, M" uniqKey="Leman M">M. Leman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Leopold, D A" uniqKey="Leopold D">D. A. Leopold</name>
</author>
<author>
<name sortKey="Logothetis, N K" uniqKey="Logothetis N">N. K. Logothetis</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lerdahl, F" uniqKey="Lerdahl F">F. Lerdahl</name>
</author>
<author>
<name sortKey="Jackendoff, R" uniqKey="Jackendoff R">R. Jackendoff</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lewis, P" uniqKey="Lewis P">P. Lewis</name>
</author>
<author>
<name sortKey="Miall, R" uniqKey="Miall R">R. Miall</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lewis, P" uniqKey="Lewis P">P. Lewis</name>
</author>
<author>
<name sortKey="Miall, R" uniqKey="Miall R">R. Miall</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="London, J" uniqKey="London J">J. London</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Longuet Higgins, H C" uniqKey="Longuet Higgins H">H. C. Longuet-Higgins</name>
</author>
<author>
<name sortKey="Lee, C" uniqKey="Lee C">C. Lee</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lumer, E D" uniqKey="Lumer E">E. D. Lumer</name>
</author>
<author>
<name sortKey="Friston, K J" uniqKey="Friston K">K. J. Friston</name>
</author>
<author>
<name sortKey="Rees, G" uniqKey="Rees G">G. Rees</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Madison, G" uniqKey="Madison G">G. Madison</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Madison, G" uniqKey="Madison G">G. Madison</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Madison, G" uniqKey="Madison G">G. Madison</name>
</author>
<author>
<name sortKey="Gouyon, F" uniqKey="Gouyon F">F. Gouyon</name>
</author>
<author>
<name sortKey="Ullen, F" uniqKey="Ullen F">F. Ullén</name>
</author>
<author>
<name sortKey="Hornstrom, K" uniqKey="Hornstrom K">K. Hörnström</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Maloney, L T" uniqKey="Maloney L">L. T. Maloney</name>
</author>
<author>
<name sortKey="Mamassian, P" uniqKey="Mamassian P">P. Mamassian</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Margulis, E H" uniqKey="Margulis E">E. H. Margulis</name>
</author>
<author>
<name sortKey="Beatty, A P" uniqKey="Beatty A">A. P. Beatty</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mayville, J M" uniqKey="Mayville J">J. M. Mayville</name>
</author>
<author>
<name sortKey="Fuchs, A" uniqKey="Fuchs A">A. Fuchs</name>
</author>
<author>
<name sortKey="Ding, M" uniqKey="Ding M">M. Ding</name>
</author>
<author>
<name sortKey="Cheyne, D" uniqKey="Cheyne D">D. Cheyne</name>
</author>
<author>
<name sortKey="Deecke, L" uniqKey="Deecke L">L. Deecke</name>
</author>
<author>
<name sortKey="Kelso, J A S" uniqKey="Kelso J">J. A. S. Kelso</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mcauley, J D" uniqKey="Mcauley J">J. D. McAuley</name>
</author>
<author>
<name sortKey="Jones, M R" uniqKey="Jones M">M. R. Jones</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Meyer, L B" uniqKey="Meyer L">L. B. Meyer</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Molnar Szakacz, I" uniqKey="Molnar Szakacz I">I. Molnar-Szakacz</name>
</author>
<author>
<name sortKey="Overy, K" uniqKey="Overy K">K. Overy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Morrone, M C" uniqKey="Morrone M">M. C. Morrone</name>
</author>
<author>
<name sortKey="Ross, J" uniqKey="Ross J">J. Ross</name>
</author>
<author>
<name sortKey="Burr, D" uniqKey="Burr D">D. Burr</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mumford, D" uniqKey="Mumford D">D. Mumford</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mumford, D" uniqKey="Mumford D">D. Mumford</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Munte, T F" uniqKey="Munte T">T. F. Münte</name>
</author>
<author>
<name sortKey="Kohlmetz, C" uniqKey="Kohlmetz C">C. Kohlmetz</name>
</author>
<author>
<name sortKey="Nager, W" uniqKey="Nager W">W. Nager</name>
</author>
<author>
<name sortKey="Altenmuller, E" uniqKey="Altenmuller E">E. Altenmüller</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="N T Nen, R" uniqKey="N T Nen R">R. Näätänen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="N T Nen, R" uniqKey="N T Nen R">R. Näätänen</name>
</author>
<author>
<name sortKey="Paaviliainen, P" uniqKey="Paaviliainen P">P. Paaviliainen</name>
</author>
<author>
<name sortKey="Alho, K" uniqKey="Alho K">K. Alho</name>
</author>
<author>
<name sortKey="Reinikainen, K" uniqKey="Reinikainen K">K. Reinikainen</name>
</author>
<author>
<name sortKey="Sams, M" uniqKey="Sams M">M. Sams</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="N T Nen, R" uniqKey="N T Nen R">R. Näätänen</name>
</author>
<author>
<name sortKey="Tervaniemi, M" uniqKey="Tervaniemi M">M. Tervaniemi</name>
</author>
<author>
<name sortKey="Sussman, E S" uniqKey="Sussman E">E. S. Sussman</name>
</author>
<author>
<name sortKey="Paavilainen, P" uniqKey="Paavilainen P">P. Paavilainen</name>
</author>
<author>
<name sortKey="Winkler, I" uniqKey="Winkler I">I. Winkler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Nagarajan, S S" uniqKey="Nagarajan S">S. S. Nagarajan</name>
</author>
<author>
<name sortKey="Blake, D T" uniqKey="Blake D">D. T. Blake</name>
</author>
<author>
<name sortKey="Wright, B A" uniqKey="Wright B">B. A. Wright</name>
</author>
<author>
<name sortKey="Byl, N" uniqKey="Byl N">N. Byl</name>
</author>
<author>
<name sortKey="Merzenich, M M" uniqKey="Merzenich M">M. M. Merzenich</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Nikjeh, D A" uniqKey="Nikjeh D">D. A. Nikjeh</name>
</author>
<author>
<name sortKey="Lister, J J" uniqKey="Lister J">J. J. Lister</name>
</author>
<author>
<name sortKey="Frisch, S A" uniqKey="Frisch S">S. A. Frisch</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="North, A C" uniqKey="North A">A. C. North</name>
</author>
<author>
<name sortKey="Hargreaves, D J" uniqKey="Hargreaves D">D. J. Hargreaves</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Nozaradan, S" uniqKey="Nozaradan S">S. Nozaradan</name>
</author>
<author>
<name sortKey="Peretz, I" uniqKey="Peretz I">I. Peretz</name>
</author>
<author>
<name sortKey="Missal, M" uniqKey="Missal M">M. Missal</name>
</author>
<author>
<name sortKey="Mouraux, A" uniqKey="Mouraux A">A. Mouraux</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Orr, M G" uniqKey="Orr M">M. G. Orr</name>
</author>
<author>
<name sortKey="Ohlsson, S" uniqKey="Ohlsson S">S. Ohlsson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Overy, K" uniqKey="Overy K">K. Overy</name>
</author>
<author>
<name sortKey="Molnar Szakacs, I" uniqKey="Molnar Szakacs I">I. Molnar-Szakacs</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Owen, A M" uniqKey="Owen A">A. M. Owen</name>
</author>
<author>
<name sortKey="Mcmillan, K M" uniqKey="Mcmillan K">K. M. Mcmillan</name>
</author>
<author>
<name sortKey="Laird, A R" uniqKey="Laird A">A. R. Laird</name>
</author>
<author>
<name sortKey="Bullmore, E" uniqKey="Bullmore E">E. Bullmore</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Paavilainen, P" uniqKey="Paavilainen P">P. Paavilainen</name>
</author>
<author>
<name sortKey="Karlsson, M L" uniqKey="Karlsson M">M.-L. Karlsson</name>
</author>
<author>
<name sortKey="Reinikainen, K" uniqKey="Reinikainen K">K. Reinikainen</name>
</author>
<author>
<name sortKey="N T Nen, R" uniqKey="N T Nen R">R. Näätänen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Paavilainen, P" uniqKey="Paavilainen P">P. Paavilainen</name>
</author>
<author>
<name sortKey="Simola, J" uniqKey="Simola J">J. Simola</name>
</author>
<author>
<name sortKey="Jaramillo, M" uniqKey="Jaramillo M">M. Jaramillo</name>
</author>
<author>
<name sortKey="N T Nen, R" uniqKey="N T Nen R">R. Näätänen</name>
</author>
<author>
<name sortKey="Winkler, I" uniqKey="Winkler I">I. Winkler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pack, C C" uniqKey="Pack C">C. C. Pack</name>
</author>
<author>
<name sortKey="Born, R T" uniqKey="Born R">R. T. Born</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Palmer, C" uniqKey="Palmer C">C. Palmer</name>
</author>
<author>
<name sortKey="Krumhansl, C L" uniqKey="Krumhansl C">C. L. Krumhansl</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Parncutt, R" uniqKey="Parncutt R">R. Parncutt</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pearce, M T" uniqKey="Pearce M">M. T. Pearce</name>
</author>
<author>
<name sortKey="Ruiz, M H" uniqKey="Ruiz M">M. H. Ruiz</name>
</author>
<author>
<name sortKey="Kapasi, S" uniqKey="Kapasi S">S. Kapasi</name>
</author>
<author>
<name sortKey="Wiggins, G A" uniqKey="Wiggins G">G. A. Wiggins</name>
</author>
<author>
<name sortKey="Bhattacharya, J" uniqKey="Bhattacharya J">J. Bhattacharya</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pearce, M T" uniqKey="Pearce M">M. T. Pearce</name>
</author>
<author>
<name sortKey="Wiggins, G A" uniqKey="Wiggins G">G. A. Wiggins</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pecenka, N" uniqKey="Pecenka N">N. Pecenka</name>
</author>
<author>
<name sortKey="Keller, P E" uniqKey="Keller P">P. E. Keller</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Phillips Silver, J" uniqKey="Phillips Silver J">J. Phillips-Silver</name>
</author>
<author>
<name sortKey="Aktipis, A C" uniqKey="Aktipis A">A. C. Aktipis</name>
</author>
<author>
<name sortKey="Bryant, G" uniqKey="Bryant G">G. Bryant</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Phillips Silver, J" uniqKey="Phillips Silver J">J. Phillips-Silver</name>
</author>
<author>
<name sortKey="Trainor, L J" uniqKey="Trainor L">L. J. Trainor</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Phillips Silver, J" uniqKey="Phillips Silver J">J. Phillips-Silver</name>
</author>
<author>
<name sortKey="Trainor, L J" uniqKey="Trainor L">L. J. Trainor</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Phillips Silver, J" uniqKey="Phillips Silver J">J. Phillips-Silver</name>
</author>
<author>
<name sortKey="Trainor, L J" uniqKey="Trainor L">L. J. Trainor</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pressing, J" uniqKey="Pressing J">J. Pressing</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Raij, T" uniqKey="Raij T">T. Raij</name>
</author>
<author>
<name sortKey="Mcevoy, L" uniqKey="Mcevoy L">L. Mcevoy</name>
</author>
<author>
<name sortKey="M Kel, J P" uniqKey="M Kel J">J. P. Mäkelä</name>
</author>
<author>
<name sortKey="Hari, R" uniqKey="Hari R">R. Hari</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Rao, R P" uniqKey="Rao R">R. P. Rao</name>
</author>
<author>
<name sortKey="Ballard, D H" uniqKey="Ballard D">D. H. Ballard</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Repp, B" uniqKey="Repp B">B. Repp</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Repp, B" uniqKey="Repp B">B. Repp</name>
</author>
<author>
<name sortKey="Keller, P E" uniqKey="Keller P">P. E. Keller</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Robbins, H" uniqKey="Robbins H">H. Robbins</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Roepstorff, A" uniqKey="Roepstorff A">A. Roepstorff</name>
</author>
<author>
<name sortKey="Niewohner, J" uniqKey="Niewohner J">J. Niewohner</name>
</author>
<author>
<name sortKey="Beck, S" uniqKey="Beck S">S. Beck</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Rohrmeier, M A" uniqKey="Rohrmeier M">M. A. Rohrmeier</name>
</author>
<author>
<name sortKey="Koelsch, S" uniqKey="Koelsch S">S. Koelsch</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Rubin, E" uniqKey="Rubin E">E. Rubin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sadeghi, H" uniqKey="Sadeghi H">H. Sadeghi</name>
</author>
<author>
<name sortKey="Allard, P" uniqKey="Allard P">P. Allard</name>
</author>
<author>
<name sortKey="Prince, F" uniqKey="Prince F">F. Prince</name>
</author>
<author>
<name sortKey="Labelle, H" uniqKey="Labelle H">H. Labelle</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sams, M" uniqKey="Sams M">M. Sams</name>
</author>
<author>
<name sortKey="Paavilainen, P" uniqKey="Paavilainen P">P. Paavilainen</name>
</author>
<author>
<name sortKey="Alho, K" uniqKey="Alho K">K. Alho</name>
</author>
<author>
<name sortKey="N T Nen, R" uniqKey="N T Nen R">R. Näätänen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schaffrath, H" uniqKey="Schaffrath H">H. Schaffrath</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schmidt, R" uniqKey="Schmidt R">R. Schmidt</name>
</author>
<author>
<name sortKey="Fitzpatrick, P" uniqKey="Fitzpatrick P">P. Fitzpatrick</name>
</author>
<author>
<name sortKey="Caron, R" uniqKey="Caron R">R. Caron</name>
</author>
<author>
<name sortKey="Mergeche, J" uniqKey="Mergeche J">J. Mergeche</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schogler, B" uniqKey="Schogler B">B. Schogler</name>
</author>
<author>
<name sortKey="Pepping, G J" uniqKey="Pepping G">G.-J. Pepping</name>
</author>
<author>
<name sortKey="Lee, D N" uniqKey="Lee D">D. N. Lee</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schultz, W" uniqKey="Schultz W">W. Schultz</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schultz, W" uniqKey="Schultz W">W. Schultz</name>
</author>
<author>
<name sortKey="Preuschoff, K" uniqKey="Preuschoff K">K. Preuschoff</name>
</author>
<author>
<name sortKey="Camerer, C" uniqKey="Camerer C">C. Camerer</name>
</author>
<author>
<name sortKey="Hsu, M" uniqKey="Hsu M">M. Hsu</name>
</author>
<author>
<name sortKey="Fiorillo, C D" uniqKey="Fiorillo C">C. D. Fiorillo</name>
</author>
<author>
<name sortKey="Tobler, P N" uniqKey="Tobler P">P. N. Tobler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Simons, J S" uniqKey="Simons J">J. S. Simons</name>
</author>
<author>
<name sortKey="Scholvinck, M L" uniqKey="Scholvinck M">M. L. Schölvinck</name>
</author>
<author>
<name sortKey="Gilbert, S J" uniqKey="Gilbert S">S. J. Gilbert</name>
</author>
<author>
<name sortKey="Frith, C D" uniqKey="Frith C">C. D. Frith</name>
</author>
<author>
<name sortKey="Burgess, P W" uniqKey="Burgess P">P. W. Burgess</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Song, C" uniqKey="Song C">C. Song</name>
</author>
<author>
<name sortKey="Simpson, A J" uniqKey="Simpson A">A. J. Simpson</name>
</author>
<author>
<name sortKey="Harte, C A" uniqKey="Harte C">C. A. Harte</name>
</author>
<author>
<name sortKey="Pearce, M T" uniqKey="Pearce M">M. T. Pearce</name>
</author>
<author>
<name sortKey="Sandler, M B" uniqKey="Sandler M">M. B. Sandler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Stephan, K E" uniqKey="Stephan K">K. E. Stephan</name>
</author>
<author>
<name sortKey="Harrison, L M" uniqKey="Harrison L">L. M. Harrison</name>
</author>
<author>
<name sortKey="Kiebel, S J" uniqKey="Kiebel S">S. J. Kiebel</name>
</author>
<author>
<name sortKey="David, O" uniqKey="David O">O. David</name>
</author>
<author>
<name sortKey="Penny, W D" uniqKey="Penny W">W. D. Penny</name>
</author>
<author>
<name sortKey="Friston, K J" uniqKey="Friston K">K. J. Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sterzer, P" uniqKey="Sterzer P">P. Sterzer</name>
</author>
<author>
<name sortKey="Russ, M O" uniqKey="Russ M">M. O. Russ</name>
</author>
<author>
<name sortKey="Preibisch, C" uniqKey="Preibisch C">C. Preibisch</name>
</author>
<author>
<name sortKey="Kleinschmidt, A" uniqKey="Kleinschmidt A">A. Kleinschmidt</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Stupacher, J" uniqKey="Stupacher J">J. Stupacher</name>
</author>
<author>
<name sortKey="Hove, M J" uniqKey="Hove M">M. J. Hove</name>
</author>
<author>
<name sortKey="Novembre, G" uniqKey="Novembre G">G. Novembre</name>
</author>
<author>
<name sortKey="Schutz Bosbach, S" uniqKey="Schutz Bosbach S">S. Schütz-Bosbach</name>
</author>
<author>
<name sortKey="Keller, P E" uniqKey="Keller P">P. E. Keller</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Teki, S" uniqKey="Teki S">S. Teki</name>
</author>
<author>
<name sortKey="Grube, M" uniqKey="Grube M">M. Grube</name>
</author>
<author>
<name sortKey="Griffiths, T D" uniqKey="Griffiths T">T. D. Griffiths</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Teki, S" uniqKey="Teki S">S. Teki</name>
</author>
<author>
<name sortKey="Grube, M" uniqKey="Grube M">M. Grube</name>
</author>
<author>
<name sortKey="Kumar, S" uniqKey="Kumar S">S. Kumar</name>
</author>
<author>
<name sortKey="Griffiths, T D" uniqKey="Griffiths T">T. D. Griffiths</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Temperley, D" uniqKey="Temperley D">D. Temperley</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Temperley, D" uniqKey="Temperley D">D. Temperley</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Temperley, D" uniqKey="Temperley D">D. Temperley</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Temperley, D" uniqKey="Temperley D">D. Temperley</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Temperley, D" uniqKey="Temperley D">D. Temperley</name>
</author>
<author>
<name sortKey="Sleator, D" uniqKey="Sleator D">D. Sleator</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Trainor, L J" uniqKey="Trainor L">L. J. Trainor</name>
</author>
<author>
<name sortKey="Mcdonald, K L" uniqKey="Mcdonald K">K. L. Mcdonald</name>
</author>
<author>
<name sortKey="Alain, C" uniqKey="Alain C">C. Alain</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Trost, W" uniqKey="Trost W">W. Trost</name>
</author>
<author>
<name sortKey="Vuilleumier, P" uniqKey="Vuilleumier P">P. Vuilleumier</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Van Zuijen, T L" uniqKey="Van Zuijen T">T. L. Van Zuijen</name>
</author>
<author>
<name sortKey="Sussman, E" uniqKey="Sussman E">E. Sussman</name>
</author>
<author>
<name sortKey="Winkler, I" uniqKey="Winkler I">I. Winkler</name>
</author>
<author>
<name sortKey="N T Nen, R" uniqKey="N T Nen R">R. Näätänen</name>
</author>
<author>
<name sortKey="Tervaniemi, M" uniqKey="Tervaniemi M">M. Tervaniemi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Verschure, P F" uniqKey="Verschure P">P. F. Verschure</name>
</author>
<author>
<name sortKey="Voegtlin, T" uniqKey="Voegtlin T">T. Voegtlin</name>
</author>
<author>
<name sortKey="Douglas, R J" uniqKey="Douglas R">R. J. Douglas</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Volk, A" uniqKey="Volk A">A. Volk</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Vuust, P" uniqKey="Vuust P">P. Vuust</name>
</author>
<author>
<name sortKey="Brattico, E" uniqKey="Brattico E">E. Brattico</name>
</author>
<author>
<name sortKey="Sepp Nen, M" uniqKey="Sepp Nen M">M. Seppänen</name>
</author>
<author>
<name sortKey="N T Nen, R" uniqKey="N T Nen R">R. Näätänen</name>
</author>
<author>
<name sortKey="Tervaniemi, M" uniqKey="Tervaniemi M">M. Tervaniemi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Vuust, P" uniqKey="Vuust P">P. Vuust</name>
</author>
<author>
<name sortKey="Brattico, E" uniqKey="Brattico E">E. Brattico</name>
</author>
<author>
<name sortKey="Sepp Nen, M" uniqKey="Sepp Nen M">M. Seppänen</name>
</author>
<author>
<name sortKey="N T Nen, R" uniqKey="N T Nen R">R. Näätänen</name>
</author>
<author>
<name sortKey="Tervaniemi, M" uniqKey="Tervaniemi M">M. Tervaniemi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Vuust, P" uniqKey="Vuust P">P. Vuust</name>
</author>
<author>
<name sortKey="Frith, C D" uniqKey="Frith C">C. D. Frith</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Vuust, P" uniqKey="Vuust P">P. Vuust</name>
</author>
<author>
<name sortKey="Ostergaard, L" uniqKey="Ostergaard L">L. Ostergaard</name>
</author>
<author>
<name sortKey="Pallesen, K J" uniqKey="Pallesen K">K. J. Pallesen</name>
</author>
<author>
<name sortKey="Bailey, C" uniqKey="Bailey C">C. Bailey</name>
</author>
<author>
<name sortKey="Roepstorff, A" uniqKey="Roepstorff A">A. Roepstorff</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Vuust, P" uniqKey="Vuust P">P. Vuust</name>
</author>
<author>
<name sortKey="Pallesen, K J" uniqKey="Pallesen K">K. J. Pallesen</name>
</author>
<author>
<name sortKey="Bailey, C" uniqKey="Bailey C">C. Bailey</name>
</author>
<author>
<name sortKey="Van Zuijen, T L" uniqKey="Van Zuijen T">T. L. Van Zuijen</name>
</author>
<author>
<name sortKey="Gjedde, A" uniqKey="Gjedde A">A. Gjedde</name>
</author>
<author>
<name sortKey="Roepstorff, A" uniqKey="Roepstorff A">A. Roepstorff</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Vuust, P" uniqKey="Vuust P">P. Vuust</name>
</author>
<author>
<name sortKey="Roepstorff, A" uniqKey="Roepstorff A">A. Roepstorff</name>
</author>
<author>
<name sortKey="Wallentin, M" uniqKey="Wallentin M">M. Wallentin</name>
</author>
<author>
<name sortKey="Mouridsen, K" uniqKey="Mouridsen K">K. Mouridsen</name>
</author>
<author>
<name sortKey=" Stergaard, L" uniqKey=" Stergaard L">L. Østergaard</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Vuust, P" uniqKey="Vuust P">P. Vuust</name>
</author>
<author>
<name sortKey="Wallentin, M" uniqKey="Wallentin M">M. Wallentin</name>
</author>
<author>
<name sortKey="Mouridsen, K" uniqKey="Mouridsen K">K. Mouridsen</name>
</author>
<author>
<name sortKey=" Stergaard, L" uniqKey=" Stergaard L">L. Østergaard</name>
</author>
<author>
<name sortKey="Roepstorff, A" uniqKey="Roepstorff A">A. Roepstorff</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Waadeland, C H" uniqKey="Waadeland C">C. H. Waadeland</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Winkler, I" uniqKey="Winkler I">I. Winkler</name>
</author>
<author>
<name sortKey="Karmos, G" uniqKey="Karmos G">G. Karmos</name>
</author>
<author>
<name sortKey="N T Nen, R" uniqKey="N T Nen R">R. Näätänen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Witek, M A G" uniqKey="Witek M">M. A. G. Witek</name>
</author>
<author>
<name sortKey="Clarke, E F" uniqKey="Clarke E">E. F. Clarke</name>
</author>
<author>
<name sortKey="Kringelbach, M L" uniqKey="Kringelbach M">M. L. Kringelbach</name>
</author>
<author>
<name sortKey="Vuust, P" uniqKey="Vuust P">P. Vuust</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Witek, M A G" uniqKey="Witek M">M. A. G. Witek</name>
</author>
<author>
<name sortKey="Clarke, E F" uniqKey="Clarke E">E. F. Clarke</name>
</author>
<author>
<name sortKey="Wallentin, M" uniqKey="Wallentin M">M. Wallentin</name>
</author>
<author>
<name sortKey="Kringelbach, M L" uniqKey="Kringelbach M">M. L. Kringelbach</name>
</author>
<author>
<name sortKey="Vuust, P" uniqKey="Vuust P">P. Vuust</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wundt, W" uniqKey="Wundt W">W. Wundt</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Yu, A J" uniqKey="Yu A">A. J. Yu</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="review-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">Front Psychol</journal-id>
<journal-id journal-id-type="iso-abbrev">Front Psychol</journal-id>
<journal-id journal-id-type="publisher-id">Front. Psychol.</journal-id>
<journal-title-group>
<journal-title>Frontiers in Psychology</journal-title>
</journal-title-group>
<issn pub-type="epub">1664-1078</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">25324813</article-id>
<article-id pub-id-type="pmc">4181238</article-id>
<article-id pub-id-type="doi">10.3389/fpsyg.2014.01111</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychology</subject>
<subj-group>
<subject>Review Article</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Rhythmic complexity and predictive coding: a novel approach to modeling rhythm and meter perception in music</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Vuust</surname>
<given-names>Peter</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
<uri xlink:type="simple" xlink:href="http://community.frontiersin.org/people/u/97499"></uri>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Witek</surname>
<given-names>Maria A. G.</given-names>
</name>
<uri xlink:type="simple" xlink:href="http://community.frontiersin.org/people/u/51754"></uri>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="author-notes" rid="fn001">
<sup>*</sup>
</xref>
</contrib>
</contrib-group>
<aff id="aff1">
<sup>1</sup>
<institution>Center of Functionally Integrative Neuroscience, Aarhus University Hospital</institution>
<country>Aarhus, Denmark</country>
</aff>
<aff id="aff2">
<sup>2</sup>
<institution>Royal Academy of Music</institution>
<country>Aarhus/Aalborg, Denmark</country>
</aff>
<author-notes>
<fn fn-type="edited-by">
<p>Edited by:
<italic>Jessica A. Grahn, University of Western Ontario, Canada</italic>
</p>
</fn>
<fn fn-type="edited-by">
<p>Reviewed by:
<italic>Shinya Fujii, Sunnybrook Research Institute, Canada; Richard Ashley, Northwestern University, USA</italic>
</p>
</fn>
<corresp id="fn001">*Correspondence:
<italic>Maria A. G. Witek, Center of Functionally Integrative Neuroscience, Aarhus University Hospital, Noerrebrogade 44, Building 10G, 5th Floor, 8000 Aarhus C, Denmark e-mail:
<email xlink:type="simple">maria.witek@cfin.au.dk</email>
</italic>
</corresp>
<fn fn-type="other" id="fn002">
<p>This article was submitted to Auditory Cognitive Neuroscience, a section of the journal Frontiers in Psychology.</p>
</fn>
</author-notes>
<pub-date pub-type="epub">
<day>01</day>
<month>10</month>
<year>2014</year>
</pub-date>
<pub-date pub-type="collection">
<year>2014</year>
</pub-date>
<volume>5</volume>
<elocation-id>1111</elocation-id>
<history>
<date date-type="received">
<day>24</day>
<month>4</month>
<year>2014</year>
</date>
<date date-type="accepted">
<day>12</day>
<month>9</month>
<year>2014</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright © 2014 Vuust and Witek.</copyright-statement>
<copyright-year>2014</copyright-year>
<license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by/4.0/">
<license-p> This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</license-p>
</license>
</permissions>
<abstract>
<p>Musical rhythm, consisting of apparently abstract intervals of accented temporal events, has a remarkable capacity to move our minds and bodies. How does the cognitive system enable our experiences of rhythmically complex music? In this paper, we describe some common forms of rhythmic complexity in music and propose the theory of predictive coding (PC) as a framework for understanding how rhythm and rhythmic complexity are processed in the brain. We also consider why we feel so compelled by rhythmic tension in music. First, we consider theories of rhythm and meter perception, which provide hierarchical and computational approaches to modeling. Second, we present the theory of PC, which posits a hierarchical organization of brain responses reflecting fundamental, survival-related mechanisms associated with predicting future events. According to this theory, perception and learning is manifested through the brain’s Bayesian minimization of the error between the input to the brain and the brain’s prior expectations. Third, we develop a PC model of musical rhythm, in which rhythm perception is conceptualized as an interaction between what is heard (“rhythm”) and the brain’s anticipatory structuring of music (“meter”). Finally, we review empirical studies of the neural and behavioral effects of syncopation, polyrhythm and groove, and propose how these studies can be seen as special cases of the PC theory. We argue that musical rhythm exploits the brain’s general principles of prediction and propose that pleasure and desire for sensorimotor synchronization from musical rhythm may be a result of such mechanisms.</p>
</abstract>
<kwd-group>
<kwd>rhythm</kwd>
<kwd>meter</kwd>
<kwd>rhythmic complexity</kwd>
<kwd>predictive coding</kwd>
<kwd>pleasure</kwd>
</kwd-group>
<counts>
<fig-count count="4"></fig-count>
<table-count count="0"></table-count>
<equation-count count="0"></equation-count>
<ref-count count="162"></ref-count>
<page-count count="14"></page-count>
<word-count count="0"></word-count>
</counts>
</article-meta>
</front>
<body>
<sec sec-type="intro">
<title>INTRODUCTION</title>
<p>Music can move us, both emotionally and corporeally. It can send shivers down our spines and make us tap our feet in time with the beat. How does the brain facilitate the rich and complex experiences we have of rhythm in music? Here, we propose the theory of predictive coding (PC) as a framework for understanding the ways in which complex rhythms are processed in the brain and discuss why we derive pleasure from rhythm in music. First, we point to the theories of rhythm and meter which allow for hierarchical and computational modeling. Second, we present the theory of PC, which posits a hierarchical organization of neural functioning, reflecting fundamental mechanisms associated with predicting future events. The theory proposes that perception and learning occur in a recursive Bayesian process by which the brain tries to minimize the error between the input and the brain’s expectation. Third, we view rhythm perception in light of this theory as an interaction between what is heard (“rhythm”) and the brain’s anticipatory model (“meter”). We describe the experience of rhythm in music as depending on the degree of tension or discrepancy between rhythm and meter. Finally, we review some empirical studies of different forms of tension between rhythm and meter – syncopation, polyrhythm and groove – and propose that these can be seen as special cases of PC. Our examples illustrate a number of fundamental principles of its mechanisms: the effects of prior experience, model comparison, and the relationship between prediction error and affective and embodied responses.</p>
</sec>
<sec>
<title>HIERARCHICAL MODELS OF RHYTHM AND METER</title>
<p>Theories of rhythmic perception often contrast rhythm with meter. Broadly,
<italic>rhythm</italic>
is a pattern of discrete durations and is largely thought to depend on the underlying perceptual mechanisms of grouping (
<xref rid="B33" ref-type="bibr">Fraisse, 1963</xref>
,
<xref rid="B34" ref-type="bibr">1982</xref>
,
<xref rid="B35" ref-type="bibr">1984</xref>
;
<xref rid="B21" ref-type="bibr">Clarke, 1999</xref>
).
<italic>Meter</italic>
, again broadly, is the temporal framework according to which rhythm is perceived. More specifically, as defined by
<xref rid="B79" ref-type="bibr">London (2012</xref>
, p. 4): “meter involves our initial perception as well as subsequent anticipation of a series of beats that we abstract from the rhythmic surface of the music as it unfolds in time.” At the most basic level, the perception of meter involves a sense of pulse, i.e., a pattern of beats at isochronously spaced intervals (
<xref rid="B52" ref-type="bibr">Honing, 2012</xref>
,
<xref rid="B53" ref-type="bibr">2013</xref>
). When such beats are hierarchically differentiated into strong and weak accents, it is thought that we perceive meter (
<xref rid="B76" ref-type="bibr">Lerdahl and Jackendoff, 1983</xref>
;
<xref rid="B79" ref-type="bibr">London, 2012</xref>
). Because of its hierarchical nature, meter allows for rhythmic expectations in music (
<xref rid="B71" ref-type="bibr">Large and Kolen, 1994</xref>
;
<xref rid="B60" ref-type="bibr">Jones, 2009</xref>
;
<xref rid="B68" ref-type="bibr">Ladinig et al., 2009</xref>
;
<xref rid="B124" ref-type="bibr">Rohrmeier and Koelsch, 2012</xref>
). In other words, meter provides the listener with an expectancy structure underlying the perception of music according to which each musical time-point encompasses a conjoint perception of time and salience.</p>
<p>However, there are instances in which this sharp distinction between rhythm (perceived) and meter (induced) becomes blurred.
<xref rid="B138" ref-type="bibr">Teki et al. (2011a</xref>
,
<xref rid="B139" ref-type="bibr">b</xref>
) distinguish between duration-based and beat-based timing associated with the coordination and temporal patterning of body-movements (see also
<xref rid="B88" ref-type="bibr">McAuley and Jones, 2003</xref>
;
<xref rid="B47" ref-type="bibr">Grahn and McAuley, 2009</xref>
). In the former, time is organized sequentially and relies on absolute intervallic relationships between discrete events. In the latter, time intervals are organized relative to overall temporal regularity. In other words, beat-based rhythms subserve and enable hierarchical meter. In such cases, the rhythm is perceived as reflecting its underlying metric organization. This is the most common form of timing perceived in music. As we shall see, the theory of PC offers a way of understanding what goes on in our brains when the beats do not seem to uniformly correspond to one single regularity framework.</p>
<p>In formal music-theory terms, meter is often specified in the time signature traditionally given at the beginning of a musical score. Some common time signatures in Western tonal and metric music are 4/4, 2/4, and 3/4. In these time signatures, the first digit indicates the number of pulses in the bar, and the second indicates their durational value. Hierarchical meters are organized by the recursive subdivision of each metric level, both above and below the main pulse (or tactus).
<bold>Figure
<xref ref-type="fig" rid="F1">1</xref>
</bold>
shows how metric levels and their corresponding note durations are organized hierarchically in a 4/4 bar. In 4/4, the metric hierarchy is duple. Each level – from the whole-note level to the level of 16th notes
<sup>
<xref ref-type="fn" rid="fn01">1</xref>
</sup>
– is recursively subdivided into two equal parts. The ways of subdividing each metrical level vary in other time signatures, such as compound meters like 6/8 in which the duple tactus is divided into three at the eighth note level, or more complex meters like 5/4 in which the tactus is quintuple, but other subdivisions are duple. However, the time signature of a given piece can often be notated in more than one way, and the subjective experience of its meter may be at odds with its formal time signature. There is, more generally, greater disagreement about the perceptual definition of meter, compared to formal metric categories. While most agree on the particular salience of the tactus (
<xref rid="B109" ref-type="bibr">Parncutt, 1994</xref>
;
<xref rid="B1" ref-type="bibr">Agawu, 2003</xref>
;
<xref rid="B60" ref-type="bibr">Jones, 2009</xref>
;
<xref rid="B52" ref-type="bibr">Honing, 2012</xref>
), the extent of hierarchical differentiation of pulse sequences beyond the tactus (i.e., at higher or lower levels) is still unknown (
<xref rid="B68" ref-type="bibr">Ladinig et al., 2009</xref>
;
<xref rid="B159" ref-type="bibr">Witek et al., in press</xref>
).
<xref rid="B76" ref-type="bibr">Lerdahl and Jackendoff (1983)</xref>
have proposed a highly hierarchical theory of meter, in which rhythm perception is thought to be underpinned by a metric framework organized in a tree-like structure (similar to that of
<bold>Figure
<xref ref-type="fig" rid="F1">1</xref>
</bold>
). This hierarchical structure is derived from the representation of the musical input which interacts with a small number of top-down cognitive rules. Similar tree-like organizations of meter feature in
<xref rid="B80" ref-type="bibr">Longuet-Higgins and Lee’s (1984)</xref>
computational model of rhythmic syncopation. Here, each metric level is associated with a metric weight – the higher the level, the more salient its metric values. Although
<xref rid="B108" ref-type="bibr">Palmer and Krumhansl (1990)</xref>
found such highly hierarchical structures reflected in the rhythmic perception of musicians, more recent studies have found it difficult to empirically demonstrate that listeners’ (both musicians and non-musicians) metric hierarchies extend beyond the salience of the downbeat (
<xref rid="B68" ref-type="bibr">Ladinig et al., 2009</xref>
;
<xref rid="B134" ref-type="bibr">Song et al., 2013</xref>
;
<xref rid="B159" ref-type="bibr">Witek et al., in press</xref>
).</p>
<fig id="F1" position="float">
<label>FIGURE 1</label>
<caption>
<p>
<bold>Hierarchical model of 4/4 meter.</bold>
Each metric level (or value) is recursively subdivided into equally spaced parts (or values) at the level below, determining the metric salience of positions within the metric framework. The higher the level in the hierarchy, the more salient the position in the meter. Numbers designate serial positions within the meter, at 16th note resolution. The dashed line specifies the level of the tactus.</p>
</caption>
<graphic xlink:href="fpsyg-05-01111-g001"></graphic>
</fig>
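<p>As a minimal illustration of the recursive subdivision shown in <bold>Figure <xref ref-type="fig" rid="F1">1</xref></bold> (a toy example of ours, not drawn from any of the cited models), the following Python sketch assigns each of the 16 sixteenth-note positions of a 4/4 bar a salience weight equal to the number of metric levels on which it falls; the resolution, level set and duple hierarchy are assumptions made purely for illustration.</p>
<preformat>
# Minimal sketch (illustration only): metric salience of the 16 sixteenth-note
# positions in a 4/4 bar. A position's weight is the number of metric levels
# (whole note, half, quarter, eighth, sixteenth) on which it falls; a higher
# weight means a metrically more salient position, as in Figure 1.

def metric_weights(positions_per_bar=16, levels=(1, 2, 4, 8, 16)):
    weights = []
    for pos in range(positions_per_bar):
        weight = sum(1 for level in levels
                     if pos % (positions_per_bar // level) == 0)
        weights.append(weight)
    return weights

if __name__ == "__main__":
    # Duple hierarchy: downbeat strongest, then the half-bar, quarters,
    # eighths, and sixteenths.
    print(metric_weights())
    # -> [5, 1, 2, 1, 3, 1, 2, 1, 4, 1, 2, 1, 3, 1, 2, 1]
</preformat>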
<p>In another influential model of meter, dynamic attending theory (DAT), different metric levels are also thought to vary in salience in relation to each other, but such hierarchical relationships are seen as much more dynamic, adaptive and flexible (
<xref rid="B71" ref-type="bibr">Large and Kolen, 1994</xref>
;
<xref rid="B70" ref-type="bibr">Large and Jones, 1999</xref>
;
<xref rid="B3" ref-type="bibr">Barnes and Jones, 2000</xref>
;
<xref rid="B59" ref-type="bibr">Jones, 2004</xref>
,
<xref rid="B60" ref-type="bibr">2009</xref>
). Originally proposed as a model for temporal expectations more generally (
<xref rid="B70" ref-type="bibr">Large and Jones, 1999</xref>
;
<xref rid="B3" ref-type="bibr">Barnes and Jones, 2000</xref>
;
<xref rid="B59" ref-type="bibr">Jones, 2004</xref>
,
<xref rid="B60" ref-type="bibr">2009</xref>
), DAT has since been specifically applied to music (
<xref rid="B22" ref-type="bibr">Clayton et al., 2004</xref>
;
<xref rid="B90" ref-type="bibr">Molnar-Szakacz and Overy, 2006</xref>
;
<xref rid="B113" ref-type="bibr">Phillips-Silver et al., 2010</xref>
;
<xref rid="B79" ref-type="bibr">London, 2012</xref>
;
<xref rid="B146" ref-type="bibr">Trost and Vuilleumier, 2013</xref>
). DAT posits that metric frameworks are perceived in rhythm by way of entrainment. The listener’s attention is captured and driven by the periodicities (or oscillations) in the rhythmic pattern, and the experience of metric accents corresponds to the relative strength of attention directed toward each rhythmic event, distributed hierarchically and isochronously across a rhythmic measure. In this way, meter emerges as a consequence of the reciprocal relationship between external periodicities and internal attending processes. Although bottom-up and top-down processes are acknowledged in both theories (albeit not explicitly in
<xref rid="B76" ref-type="bibr">Lerdahl and Jackendoff, 1983</xref>
),
<xref rid="B76" ref-type="bibr">Lerdahl and Jackendoff (1983)</xref>
focus on final-state representations of meter, while DAT (
<xref rid="B71" ref-type="bibr">Large and Kolen, 1994</xref>
;
<xref rid="B70" ref-type="bibr">Large and Jones, 1999</xref>
;
<xref rid="B3" ref-type="bibr">Barnes and Jones, 2000</xref>
;
<xref rid="B59" ref-type="bibr">Jones, 2004</xref>
,
<xref rid="B60" ref-type="bibr">2009</xref>
) treats bottom-up and top-down processing simultaneously and is more concerned with the dynamic process underlying meter perception. As will soon become clear, such equal emphasis on bottom-up and top-down is one aspect that DAT shares with PC.</p>
<p>It is becoming increasingly common to model metrical perception using wholly computational models (e.g.,
<xref rid="B26" ref-type="bibr">Desain and Honing, 1999</xref>
;
<xref rid="B144" ref-type="bibr">Temperley and Sleator, 1999</xref>
;
<xref rid="B27" ref-type="bibr">Dixon, 2001</xref>
;
<xref rid="B86" ref-type="bibr">Margulis and Beatty, 2008</xref>
;
<xref rid="B149" ref-type="bibr">Volk, 2008</xref>
).
<xref rid="B140" ref-type="bibr">Temperley’s (2004</xref>
,
<xref rid="B141" ref-type="bibr">2007</xref>
,
<xref rid="B142" ref-type="bibr">2009</xref>
,
<xref rid="B143" ref-type="bibr">2010</xref>
) influential model of rhythm and meter uses “Bayes’ rule,” a computational theorem that allows the calculation of probabilities of certain observations based on prior statistical information. Through a generative process similar to that proposed by
<xref rid="B76" ref-type="bibr">Lerdahl and Jackendoff (1983)</xref>
, Temperley proposes that meter is inferred from the probabilities of different patterns of regularity generated by a given rhythmic input. In one study (
<xref rid="B143" ref-type="bibr">Temperley, 2010</xref>
), he tested the performance of six probabilistic models of meter, calculated using Bayes’ rule, on two corpora of music: the Essen Folk Song Collection (
<xref rid="B128" ref-type="bibr">Schaffrath, 1995</xref>
) and a collection of string quartets by Haydn and Mozart. The Bayesian model allowed Temperley to draw conclusions about how well a sample of data (e.g., a rhythmic pattern) fits with other samples of the same type of data more generally (a model of rhythm or meter). As will become clear below, such Bayesian approaches can also be seen as the basis of perceptual processing more generally, from the level of individual neurons to subjective affective experience.</p>
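<p>The flavor of such Bayesian model comparison can be conveyed with a toy sketch (our own simplification, not Temperley’s implementation): two hypothetical metric models assign onset probabilities to the positions of a 12-step grid, and Bayes’ rule turns the likelihood of an observed onset pattern under each model into a posterior over the two models. The grid size, onset probabilities and priors below are illustrative assumptions only.</p>
<preformat>
# Toy Bayesian comparison of two metric models for a 12-position rhythmic grid
# (illustrative values only). p(model | rhythm) is proportional to
# p(rhythm | model) * p(model); onsets are treated as independent Bernoulli
# events whose probability is higher on the model's beat positions.

GRID = 12  # subdivisions per cycle (assumed resolution)

def onset_probs(beat_positions, p_beat=0.9, p_offbeat=0.2):
    return [p_beat if i in beat_positions else p_offbeat for i in range(GRID)]

def likelihood(onsets, probs):
    p = 1.0
    for has_onset, q in zip(onsets, probs):
        p *= q if has_onset else (1.0 - q)
    return p

models = {
    "three beats of four subdivisions": onset_probs({0, 4, 8}),
    "four beats of three subdivisions": onset_probs({0, 3, 6, 9}),
}
priors = {name: 0.5 for name in models}  # flat prior over the two meters

# Observed rhythm: onsets on grid positions 0, 4, 8 and 10.
rhythm = [1 if i in {0, 4, 8, 10} else 0 for i in range(GRID)]

unnormalised = {name: likelihood(rhythm, probs) * priors[name]
                for name, probs in models.items()}
total = sum(unnormalised.values())
for name, value in unnormalised.items():
    print(f"p({name} | rhythm) = {value / total:.3f}")
</preformat>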
</sec>
<sec>
<title>PREDICTIVE CODING</title>
<p>The idea that perception can be modeled as a Bayesian process is the point of departure for a relatively novel way of understanding fundamental brain function. The theory of PC tries to explain how areas in the brain share and integrate information. It was first proposed by
<xref rid="B37" ref-type="bibr">Friston (2002</xref>
,
<xref rid="B39" ref-type="bibr">2005</xref>
), but preceded by several similar theories about fundamental brain processing centered on prediction (
<xref rid="B92" ref-type="bibr">Mumford, 1992</xref>
,
<xref rid="B93" ref-type="bibr">1994</xref>
;
<xref rid="B119" ref-type="bibr">Rao and Ballard, 1999</xref>
). Via Bayesian inference, the brain predicts the causes and sources of its internal states from the actual sensory input as compared with previous “knowledge,” accumulated through experience (
<xref rid="B39" ref-type="bibr">Friston, 2005</xref>
). In this way, the brain is a “hypothesis-tester” and its goal is to “explain away” prediction error by adapting its
<italic>a priori</italic>
predictions. Mathematically speaking, it uses Bayes’ rule recursively (i.e., from level to level in the nested neural networks) to infer the probability of its hypothesis, given the equation p(a|b) = p(b|a) × p(a)/p(b), where b is the input and a is the hypothesis (see
<xref rid="B141" ref-type="bibr">Temperley, 2007</xref>
for a very accessible and music-oriented explanation of Bayes’ theorem). Note that Bayesian inference is assumed to take place at every level of brain processing so that higher levels of processing provide priors for lower levels, thus creating nested and hierarchical links across the entire brain. The PC theory assumes a multi-level cascade of processing at different time-scales, in which each level attempts to predict the activity at the level below it via backward connections. The higher-level predictions act as priors for the lower-level processing (so-called “empirical Bayes,”
<xref rid="B122" ref-type="bibr">Robbins, 1956</xref>
). These priors are influenced by previous experience and culture (
<xref rid="B123" ref-type="bibr">Roepstorff et al., 2010</xref>
), often termed hyper-priors (
<xref rid="B40" ref-type="bibr">Friston, 2008</xref>
). However, it is not only experiences from the lifetime scale that affect the process; more short-term priors also influence predictions that are made on a moment-to-moment basis. For example, while the experience of a metrically complex rhythmic pattern will depend on whether one has been exposed to such rhythms in playing (
<xref rid="B151" ref-type="bibr">Vuust et al., 2012b</xref>
) or listening (
<xref rid="B61" ref-type="bibr">Kalender et al., 2013</xref>
), the perception of it will also depend on how frequently this pattern is featured within the current musical context (
<xref rid="B54" ref-type="bibr">Huron, 2006</xref>
).</p>
<p>Bottom-up (input) and top-down (prediction) processes are entirely mutually dependent, and the comparison between them is essential to the system, since a variety of environmental causes can theoretically result in similar sensory input (e.g., a cat vs. an image of a cat). The top-down models provide the brain with context-sensitive ways of selecting the correct interpretation of the incoming information. The predictive models continuously predict the causal relationship between sensory input and environmental events. In changing environments, the models are gradually updated (as a result of the bottom-up prediction error) to maximize the correspondence between the sensory input and the predictions, and minimize prediction error. In this way, the causes of sensations are not solely backtracked from the sensory input, but also inferred and anticipated based on contextual cues and previous sensations. Thus, perception is a process that is mutually manifested between the perceiver and the environment, reflecting the bottom-up/top-down reciprocity that is also central to DAT, as mentioned above.</p>
<p>According to PC, the process of comparing input to predictions occurs hierarchically at every level of processing, from the interaction between individual neurons, to communication between large populations of neurons (i.e., brain areas or networks). Furthermore, there are both forward and backward projections between the different layers in the system (
<xref rid="B119" ref-type="bibr">Rao and Ballard, 1999</xref>
). Using a simplified physiological model of PC we can assume that mainly superficial layers in the cortex, rich in pyramidal cells, are responsible for forwarding prediction error upward in the system (driving), whereas mainly modulatory feedback connections from deeper layers provide predictions from higher cortical areas to suppress prediction errors at the lower levels (
<xref rid="B4" ref-type="bibr">Bastos et al., 2012</xref>
). In this way, specific neuronal populations are associated with specific computational roles, disclosing the correspondence between the microcircuitry of the cortical column and the connectivity implied by PC. Hence, at any given level, the input is compared with the prediction from the level above (backward projection). If there is any discrepancy between the two, the difference, i.e., the
<italic>prediction error</italic>
, is fed forward to the next level (forward projections). At the original level, predictions are changed to comply with the input. Depending on the degree of violation, the brain does this by either updating the model, or changing the way it samples information from the environment. This dynamic and continuous updating of models and sampling methods is the basis for the system’s adaptive learning and plasticity (
<xref rid="B38" ref-type="bibr">Friston, 2003</xref>
). When predictions change, the connectivity between the neurons is believed to change accordingly. According to PC, the brain’s task is to minimize prediction error and its ultimate goal is to attain a fully predicted representation of the world. This results in a system which is highly efficient, since only the prediction error and no redundant (predicted) information needs to be processed. This is a key component of PC that sets it apart from previous theories of prediction and Bayesian inference. The only information that needs to be communicated “upward” is the prediction error, making it a kind of proxy (
<xref rid="B29" ref-type="bibr">Feldman and Friston, 2010</xref>
) for sensory information itself.</p>
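<p>A very rough computational caricature of this message passing (our own simplification, not Friston’s full scheme) can make the error-minimization idea concrete: a single predictive unit compares each input with its current prediction, forwards only the discrepancy, and nudges its prediction toward the input by a fixed learning rate, so that fully predicted input eventually generates no further error. The learning rate and stimulus values below are arbitrary assumptions.</p>
<preformat>
# Minimal caricature of predictive coding at a single processing level
# (simplified illustration, not a full implementation). The unit's prediction
# is updated by a fraction of the prediction error, so that a stable input is
# gradually "explained away" and only residual error would be passed upward.

def predictive_unit(inputs, learning_rate=0.3, initial_prediction=0.0):
    prediction = initial_prediction
    errors = []
    for x in inputs:
        error = x - prediction                # bottom-up prediction error
        prediction += learning_rate * error   # top-down model update
        errors.append(error)
    return prediction, errors

if __name__ == "__main__":
    # A regular "environment" is quickly predicted; a sudden change produces
    # a transient burst of error before the model adapts again.
    stimulus = [1.0] * 10 + [2.0] * 10
    _, errors = predictive_unit(stimulus)
    print([round(e, 2) for e in errors])
</preformat>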
<p>Predictive coding is notoriously difficult to prove by imaging or recording in the human brain due to the spatial and temporal limitations of the available methods, such as functional magnetic resonance imaging (fMRI), positron emission tomography (PET), electroencephalography (EEG) and magnetoencephalography (MEG). Thus, it remains a theory whose empirical validation is yet to be completed. Nonetheless, PC is supported by recent developments in our understanding of brain physiology (see
<xref rid="B4" ref-type="bibr">Bastos et al., 2012</xref>
for a summary), and this physiological implementation of PC conforms to what we know about error processing in the brain. One particularly well-understood neural marker for error processing (or change detection), which has been frequently employed in auditory and music experiments, is the mismatch negativity (MMN) as recorded with EEG/MEG. The neuronal origins of MMN share physiological features with the error-units suggested by PC, originating in dense superficial layers of the auditory cortices.</p>
<p>Furthermore, there are recent behavioral studies indicating that humans act as rational Bayesian estimators, in perception and action, across different domains (
<xref rid="B66" ref-type="bibr">Körding et al., 2007</xref>
;
<xref rid="B162" ref-type="bibr">Yu, 2007</xref>
;
<xref rid="B7" ref-type="bibr">Berniker and Körding, 2008</xref>
). Recent research into rhythmic tapping is closely related to such studies:
<xref rid="B65" ref-type="bibr">Konvalinka et al. (2010)</xref>
showed that when two participants tap together (instructed to keep the tempo and synchronize with each other), they adapt to each other at a tap-by-tap basis, meaning that each tapper speeds up when the other has been faster on the last tap, and slows down if the other has been slower. In other words, interactive tappers seem to be trying to minimize prediction error at a microtemporal level (although the authors do not strictly use PC in interpreting their results). More recently,
<xref rid="B28" ref-type="bibr">Elliott et al. (2014)</xref>
provided evidence that, compared to alternative models, Bayesian modeling could better account for the behavior of participants instructed to “tap in time” with two irregular metronomes separated by a lag, suggesting that humans exploit Bayesian inference to control movement timing in situations where the underlying beat structure of auditory signals needs to be resolved (i.e., beat-based timing). Specifically, compared with models based exclusively on separation or integration of cues, the Bayesian inference model better predicted participants’ behavior and motor timing errors, since it infers the choice of separation vs. integration based on the likelihood of the onsets of the competing cues and the prior expectation of the beats’ occurrence.
<xref rid="B17" ref-type="bibr">Cicchini et al. (2012)</xref>
found that the accuracy with which percussionists were able to reproduce isolated timing intervals (duration-based timing) was more successfully predicted using a Bayesian model whose prior was estimated from statistical information about mean and standard deviation of interval distribution, compared with a model which ignored such priors. It should be noted, however, that if the system from the outside looks as if it applies Bayesian inference, it does not necessarily mean that its intrinsic mechanisms are guided by Bayesian computational principles (
<xref rid="B85" ref-type="bibr">Maloney and Mamassian, 2009</xref>
). Furthermore, even if the architecture of the brain is governed by Bayes’ rule, it does not mean that all behavior and conscious experience should reflect it (
<xref rid="B19" ref-type="bibr">Clark, 2013</xref>
). Human rhythmic behavior and sensorimotor synchronization, both in musical (e.g.,
<xref rid="B69" ref-type="bibr">Large, 2000</xref>
;
<xref rid="B120" ref-type="bibr">Repp, 2005</xref>
;
<xref rid="B62" ref-type="bibr">Keller, 2008</xref>
;
<xref rid="B121" ref-type="bibr">Repp and Keller, 2008</xref>
;
<xref rid="B130" ref-type="bibr">Schogler et al., 2008</xref>
;
<xref rid="B112" ref-type="bibr">Pecenka and Keller, 2011</xref>
;
<xref rid="B25" ref-type="bibr">Demos et al., 2012</xref>
) and non-musical domains (
<xref rid="B72" ref-type="bibr">Lee, 1998</xref>
;
<xref rid="B70" ref-type="bibr">Large and Jones, 1999</xref>
;
<xref rid="B87" ref-type="bibr">Mayville et al., 2001</xref>
;
<xref rid="B129" ref-type="bibr">Schmidt et al., 2011</xref>
), have been theorized in a number of ways. As mentioned, DAT has proved a particularly useful framework for understanding dynamic temporal and motor processes. We are not claiming that this and other theories are wrong, but rather that PC provides a broader framework according to which they can be understood. The findings of DAT and other compatible research build a strong case for PC, and as we shall see below, several examples of perception of rhythmic complexity in music seem to support it as well.</p>
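<p>The tap-by-tap logic reported by <xref rid="B65" ref-type="bibr">Konvalinka et al. (2010)</xref> can be illustrated with a toy simulation (ours, not their model): each virtual tapper shifts its next inter-tap interval a fraction of the way toward the interval its partner just produced, so that the two drift toward a mutually predicted common tempo. The initial intervals and correction gain below are arbitrary assumptions.</p>
<preformat>
# Toy simulation of two mutually adapting tappers (illustration only): each
# tapper corrects its next inter-tap interval a fraction of the way toward
# the interval its partner just produced, mimicking tap-by-tap error
# correction toward a shared tempo.

def simulate_duet(interval_a=0.60, interval_b=0.52, gain=0.4, n_taps=12):
    history = []
    for _ in range(n_taps):
        history.append((interval_a, interval_b))
        new_a = interval_a + gain * (interval_b - interval_a)
        new_b = interval_b + gain * (interval_a - interval_b)
        interval_a, interval_b = new_a, new_b
    return history

if __name__ == "__main__":
    for tap, (a, b) in enumerate(simulate_duet()):
        print(f"tap {tap:2d}: A = {a:.3f} s, B = {b:.3f} s")
</preformat>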
<p>Predictive coding has received wide recognition in the cognitive sciences and remains a frequently discussed topic (
<xref rid="B119" ref-type="bibr">Rao and Ballard, 1999</xref>
;
<xref rid="B41" ref-type="bibr">Friston, 2010</xref>
;
<xref rid="B10" ref-type="bibr">Brown et al., 2011</xref>
;
<xref rid="B19" ref-type="bibr">Clark, 2013</xref>
). Recently, cognitive philosopher
<xref rid="B19" ref-type="bibr">Clark (2013)</xref>
proposed that the theory could provide the much sought after “grand unifying theory” of cognition. Advocating embodied approaches to cognition (
<xref rid="B20" ref-type="bibr">Clark and Chalmers, 1998</xref>
;
<xref rid="B18" ref-type="bibr">Clark, 2008</xref>
), PC appeals to Clark particularly due to the close relationship it posits between action and perception (
<xref rid="B42" ref-type="bibr">Friston et al., 2010</xref>
). By emphasizing what he calls “action-oriented predictive processing,” he asserts that action follows the same computational strategies as perception, namely Bayesian inference. The only difference is that in motor systems, the perceiver’s own movements and active engagement with the environment constitute the prediction error minimization (
<xref rid="B38" ref-type="bibr">Friston, 2003</xref>
). Ultimately, action-oriented predictive processing is a way to mold the world and actively elicit, via body-movement, the brain’s sensory input. Thus, action and perception work together in a loop to selectively sample and actively sculpt the environment, a principle that has important commonalities with theories of situated and embodied cognition (
<xref rid="B148" ref-type="bibr">Verschure et al., 2003</xref>
;
<xref rid="B74" ref-type="bibr">Leman, 2007</xref>
;
<xref rid="B18" ref-type="bibr">Clark, 2008</xref>
). Furthermore,
<xref rid="B19" ref-type="bibr">Clark (2013)</xref>
suggests that such a principle easily allows for extensions into theories of social action and cultural environments. He also notes how interpersonal music-making can be seen as a form of multi-agent cooperation to collectively shape sensory input through sensorimotor synchronization (
<xref rid="B90" ref-type="bibr">Molnar-Szakacz and Overy, 2006</xref>
;
<xref rid="B121" ref-type="bibr">Repp and Keller, 2008</xref>
;
<xref rid="B103" ref-type="bibr">Overy and Molnar-Szakacs, 2009</xref>
;
<xref rid="B113" ref-type="bibr">Phillips-Silver et al., 2010</xref>
). But to what extent can PC help us understand rhythm and meter perception at a more basic level? Can the way we perceive and produce complex rhythm in music be seen as a Bayesian process? And to what extent can our affective responses to rhythm in music be seen as a result of predictive mechanisms? In the following discussion we will use special cases of rhythmic complexity in music to demonstrate how the relationship between rhythm and meter, one of the most fundamental premises for music perception, is an expression of input vs. model, bottom-up vs. top-down, action-perception loops, and Bayesian PC.</p>
</sec>
<sec>
<title>PREDICTIVE CODING IN MUSICAL RHYTHM</title>
<p>The principles of PC align closely with the statistical learning account of melodic perception proposed by
<xref rid="B111" ref-type="bibr">Pearce and Wiggins (2006)</xref>
and
<xref rid="B110" ref-type="bibr">Pearce et al. (2010)</xref>
. Their notion that initial neuronal error messages are followed by synchronized activity in various brain areas in response to low-probability sequences corresponds to the local prediction error at a low hierarchical level posited in PC. The ensuing neural synchronization across various brain areas is analogous with the integration of new information into the models at higher hierarchical layers. Recently,
<xref rid="B152" ref-type="bibr">Vuust and Frith (2008)</xref>
,
<xref rid="B153" ref-type="bibr">Vuust et al. (2009)</xref>
, and
<xref rid="B44" ref-type="bibr">Gebauer et al. (2012)</xref>
have suggested that PC can provide a useful framework for understanding music perception in general, and rhythm perception in particular.</p>
<p>Central to this claim is that meter constitutes a key predictive model for the musical brain, shaped by statistical learning, and repeatedly challenged by the sensory input from rhythmic patterns in the actual music. Perception of rhythm is thus heavily dependent on the metrical prior.
<xref rid="B9" ref-type="bibr">Brochard et al. (2003)</xref>
have demonstrated the automaticity of this process in a remarkably simple experiment. They showed that listening to an undifferentiated metronome pattern caused the brain to register some beats as automatically more salient than others. Specifically, it arranged them into an alternating strong-weak pattern, i.e., according to duple meter. In PC terms, the brain interpreted the input – in this case the metronomic beats – based on its own predictions. Duple meters are statistically the most common in Western metric music (
<xref rid="B143" ref-type="bibr">Temperley, 2010</xref>
) and are embodied in human locomotion (
<xref rid="B126" ref-type="bibr">Sadeghi et al., 2000</xref>
). Thus the brain maximizes successful prediction by expecting rhythms to be duple (as opposed to, e.g., triple or compound). These predictive brain mechanisms are dependent on long-term learning, familiarity with a particular piece of music, deliberate listening strategies and short-term memory during listening (
<xref rid="B2" ref-type="bibr">Altenmuller, 2001</xref>
). In this way, neural structures underlying musical expectation are influenced by culture, personal listening history, musical training, mood, listening situation, and biology (
<bold>Figure
<xref ref-type="fig" rid="F2">2</xref>
</bold>
).</p>
<fig id="F2" position="float">
<label>FIGURE 2</label>
<caption>
<p>
<bold>Predictive coding of music.</bold>
The experience and learning of music takes place in a dynamic interplay between anticipatory structures in music, such as the build-up and relief of tension in rhythm, melody, harmony, form and other intra-musical features on one side, and the predictive brain on the other. The real time brain model is dependent on cultural background, personal listening history, musical competence, context (e.g., social environment), brain state (including attentional state and mood), and innate biological factors. The brain is constantly trying to minimize the discrepancy between its interpretation model and the musical input by iteratively updating the real time brain model (or prior) by weighting this model with the likelihood (musical input) through Bayes’ theorem. This leads to a constantly changing musical experience and long-term learning.</p>
</caption>
<graphic xlink:href="fpsyg-05-01111-g002"></graphic>
</fig>
<p>The proposed hierarchical processing in PC makes the theory particularly illustrative of the mechanisms behind meter perception in music. Although the extent of the hierarchical differentiation between pulse levels in meter is debated (
<xref rid="B159" ref-type="bibr">Witek et al., in press</xref>
), one cannot define meter without acknowledging at least
<italic>some</italic>
degree of hierarchy (e.g., between the whole-note level and the subsequent levels, as evidenced by the increased metric salience of the downbeat,
<xref rid="B68" ref-type="bibr">Ladinig et al., 2009</xref>
;
<xref rid="B134" ref-type="bibr">Song et al., 2013</xref>
;
<xref rid="B159" ref-type="bibr">Witek et al., in press</xref>
). For meter perception, PC can explain how lower levels, e.g., events at the eighth-note level, provide metric information about the whole-note level and the salience of the downbeat (feed forward). At the same time, the whole-note level, as marked by the most salient beat, the downbeat, provides a metric framework according to which the eighth-notes at the lower level are heard (feed back). This PC way of understanding metric hierarchies emphasizes the mutual relationship between bottom-up and top-down processes.</p>
<p>The influence of top-down processes has been demonstrated in neuroimaging studies of rhythm and beat perception. During passive listening to rhythms (i.e., with no direct priming for body-movement),
<xref rid="B15" ref-type="bibr">Chen et al. (2008a)</xref>
found activations of cortical secondary motor areas, such as the supplementary motor area and premotor area, indicating inherent coupling in the brain between action and perception.
<xref rid="B48" ref-type="bibr">Grahn and Rowe (2009)</xref>
showed that connections between such secondary motor areas and the auditory cortex were more strongly coupled during duration-beat (rhythms whose underlying beat was induced through varying rhythmic interval) than during volume-beat (rhythms whose underlying beat was induced through alternating dynamics). This suggests that secondary motor areas increase their feedback to primary sensory areas during meter perception. Similar findings were reported by
<xref rid="B5" ref-type="bibr">Bengtsson et al. (2009)</xref>
, who observed parametric modulation of activity in cortical motor areas as a function of stimulus predictability (isochronous, metric or non-metric), suggesting that these areas are involved in prediction. In accordance with previous research (
<xref rid="B118" ref-type="bibr">Raij et al., 1997</xref>
;
<xref rid="B145" ref-type="bibr">Trainor et al., 2002</xref>
), they also found increased activity in response to stimulus predictability in a number of frontal areas (medial-frontal gyrus, dorsal-prefrontal cortex, and superior-frontal gyrus). Many such studies have also found that musical training modulates activity patterns and connections between areas, illustrating the importance of previous experience, exposure and expertise in perception of rhythm and meter (
<xref rid="B154" ref-type="bibr">Vuust et al., 2005</xref>
,
<xref rid="B155" ref-type="bibr">2006</xref>
;
<xref rid="B15" ref-type="bibr">Chen et al., 2008a</xref>
;
<xref rid="B48" ref-type="bibr">Grahn and Rowe, 2009</xref>
;
<xref rid="B137" ref-type="bibr">Stupacher et al., 2013</xref>
). These and other studies show a rhythm-related expertise-dependent action-perception reciprocity in the brain (
<xref rid="B46" ref-type="bibr">Grahn and Brett, 2007</xref>
;
<xref rid="B16" ref-type="bibr">Chen et al., 2008b</xref>
;
<xref rid="B14" ref-type="bibr">Chapin et al., 2010</xref>
;
<xref rid="B67" ref-type="bibr">Kung et al., 2013</xref>
), which may reflect the top-down/bottom-up mutuality and action-oriented perception posited by PC.</p>
<p>Neurophysiological research into rhythm and meter suggests similar mechanisms. Using EEG,
<xref rid="B101" ref-type="bibr">Nozaradan et al. (2011)</xref>
recorded neuronal entrainment during listening to a musical beat whose meter was imagined rather than manifested acoustically. Importantly, they found that properties of the beat which were only imagined (acoustically silent) elicited sustained oscillations tuned to the appropriate beat frequency, providing neural evidence of musical entrainment and induced meter.
<xref rid="B43" ref-type="bibr">Fujioka et al. (2010)</xref>
used MEG to show that identical metronome clicks were spatiotemporally encoded in musicians’ brains in different ways, depending on the metric context in which the clicks had been heard, either duple or triple. Specifically, the right hippocampus showed temporally differentiated peaks to both conditions, suggesting that this chiefly memory-related area may act as a predictor (or anticipator) during metric encoding of temporal structures. In the left basal ganglia, peaks corresponded to the duple condition only. As mentioned, duple meter is thought to be more salient than triple due to the inherent symmetry in human locomotion and, at least for certain populations, because of the bias toward duple meters in Western music. Therefore, the authors propose that the basal ganglia may be involved in the generation of metric hierarchies (duple is more salient than triple;
<xref rid="B46" ref-type="bibr">Grahn and Brett, 2007</xref>
;
<xref rid="B49" ref-type="bibr">Grahn and Rowe, 2012</xref>
). Finally, they speculate that the hippocampal memory system and striatal metric hierarchy system facilitate endogenous activation in auditory and auditory association areas through feedback loops. Studies such as these tap into the hierarchical yet dynamic nature of the brain’s functional organization at the millisecond level. Neurophysiological indications of entrainment, prediction, hierarchy and reciprocity in the brain are therefore highly compatible with the theory of PC.</p>
<p>It has recently been suggested that a hierarchical PC framework can help explain the differential neural processing of timing at different time scales (Madison, in commentary to
<xref rid="B19" ref-type="bibr">Clark, 2013</xref>
). Whereas time representation at the level of milliseconds will typically be encoded close to the action output (e.g., cortical motor areas and the cerebellum), observations and actions that are more detached in time should involve more prefrontal processing. This is supported by studies showing processing distinctions between intervals above and below
<italic>circa</italic>
one second (
<xref rid="B82" ref-type="bibr">Madison, 2001</xref>
;
<xref rid="B77" ref-type="bibr">Lewis and Miall, 2003</xref>
;
<xref rid="B45" ref-type="bibr">Gooch et al., 2011</xref>
), as well as by indications that time representations for sub-second intervals are to some extent sensory specific (
<xref rid="B98" ref-type="bibr">Nagarajan et al., 1998</xref>
;
<xref rid="B91" ref-type="bibr">Morrone et al., 2005</xref>
) and under some conditions even limited to spatial locations (
<xref rid="B58" ref-type="bibr">Johnston et al., 2006</xref>
;
<xref rid="B11" ref-type="bibr">Burr et al., 2007</xref>
). For longer time periods, a larger part of the prefrontal cortex is activated (
<xref rid="B78" ref-type="bibr">Lewis and Miall, 2006</xref>
;
<xref rid="B133" ref-type="bibr">Simons et al., 2006</xref>
). This timing-related frontal lobe network overlaps with working memory and executive control networks (
<xref rid="B56" ref-type="bibr">Jahanshahi et al., 2000</xref>
;
<xref rid="B104" ref-type="bibr">Owen et al., 2005</xref>
), suggesting that timing constitutes a general cognitive control problem at longer time durations. As we shall see below, this division of labor persists in relation to the different time scales at which perceived rhythms can contradict the metrical framework. Whereas syncopations occurring at a single instance in drum rhythms with a clearly defined meter can be dealt with by the auditory cortices alone, polyrhythms that persist for several bars employ more frontally located (supposedly higher level) neuronal resources.</p>
</sec>
<sec>
<title>NEURAL PROCESSING OF SYNCOPATION AND MUSICAL EXPERTISE</title>
<p>A key factor in our experience of rhythm is the extent to which a rhythmic pattern challenges our perception of meter. The most common example of such tension between rhythm and meter is
<italic>syncopation</italic>
. Syncopations are generally defined as rhythmic events which violate metric expectations (
<xref rid="B80" ref-type="bibr">Longuet-Higgins and Lee, 1984</xref>
;
<xref rid="B32" ref-type="bibr">Fitch and Rosenfeld, 2007</xref>
;
<xref rid="B68" ref-type="bibr">Ladinig et al., 2009</xref>
;
<xref rid="B159" ref-type="bibr">Witek et al., in press</xref>
). Generally, it is thought that listeners expect the majority of onsets in a rhythm to coincide with metrically salient positions, while rests or tied notes are expected to occur at metrically less salient positions (
<xref rid="B80" ref-type="bibr">Longuet-Higgins and Lee, 1984</xref>
;
<xref rid="B143" ref-type="bibr">Temperley, 2010</xref>
;
<xref rid="B160" ref-type="bibr">Witek et al., 2014</xref>
). A syncopation occurs when these expectations are violated, when onsets occur on metrically weak accents and rests or tied notes occur on metrically strong accents. Such expectations can be conceptualized in Bayesian terms (
<xref rid="B141" ref-type="bibr">Temperley, 2007</xref>
,
<xref rid="B143" ref-type="bibr">2010</xref>
). The model assigns relative probabilities to all notes and rests of a pattern based on prior information about statistical frequencies and a hierarchical model of meter. A syncopation’s perceptual effect is thus a consequence of its predictability within the context of music as a whole. For a syncopation to obtain its characteristic effect, it must be experienced as contradicting the meter, but not so strongly that it undermines the meter. Syncopations can also be thought of as phase-shifts, where the rhythmic onset, rather than occurring in phase with its metric reference point, has a negative lag and occurs before it.</p>
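<p>A simplified, weight-based sketch in the spirit of <xref rid="B80" ref-type="bibr">Longuet-Higgins and Lee’s (1984)</xref> measure (an illustrative reimplementation, not their original algorithm) makes this definition operational: each grid position of a 4/4 bar receives a metric weight from the duple hierarchy, and an onset is flagged as syncopated when the immediately following position is silent but metrically stronger, with the weight difference taken as its strength. The eighth-note resolution and weight values below are assumptions chosen for brevity.</p>
<preformat>
# Simplified, weight-based syncopation detector in the spirit of
# Longuet-Higgins and Lee (1984); an illustrative reimplementation, not their
# original algorithm. A 4/4 bar is represented as 8 eighth-note positions; an
# onset counts as syncopated when the immediately following position is
# silent (a rest or tied note) but metrically stronger, and its strength is
# the difference in metric weight.

METRIC_WEIGHTS = [4, 1, 2, 1, 3, 1, 2, 1]  # duple hierarchy, eighth-note grid

def syncopations(onsets):
    """onsets: list of 8 ints, 1 = sounded onset, 0 = rest or tied note."""
    found = []
    n = len(onsets)
    for i, sounded in enumerate(onsets):
        j = (i + 1) % n  # next grid position, wrapping into the next bar
        if sounded and not onsets[j] and METRIC_WEIGHTS[j] > METRIC_WEIGHTS[i]:
            found.append({"onset": i, "silent": j,
                          "strength": METRIC_WEIGHTS[j] - METRIC_WEIGHTS[i]})
    return found

if __name__ == "__main__":
    # Onsets on the "and" of beats 2 and 4, with silence on beats 3 and 1:
    # both events anticipate a stronger position and are flagged.
    pattern = [0, 0, 0, 1, 0, 0, 0, 1]
    print(syncopations(pattern))
</preformat>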
<p>Auditory expectancy violations have been extensively studied via the “MMN” response in the brain (
<xref rid="B127" ref-type="bibr">Sams et al., 1985</xref>
;
<xref rid="B96" ref-type="bibr">Näätänen et al., 1987</xref>
,
<xref rid="B97" ref-type="bibr">2001</xref>
;
<xref rid="B105" ref-type="bibr">Paavilainen et al., 1989</xref>
), a component of the auditory event-related potential (ERP), measurable with EEG and MEG. MMNs relate to change in different sound features, such as pitch, timbre, location of sound source, intensity, rhythm or other more abstract auditory changes, such as streams of ascending intervals (
<xref rid="B96" ref-type="bibr">Näätänen et al., 1987</xref>
,
<xref rid="B97" ref-type="bibr">2001</xref>
;
<xref rid="B95" ref-type="bibr">Näätänen, 1992</xref>
;
<xref rid="B36" ref-type="bibr">Friedman et al., 2001</xref>
). The MMN is an effective way to measure pre-attentive prediction processes in the brain, and thus provides a particularly suitable tool to investigate PC. The MMN appears to have properties analogous to the error signal in a PC framework. It is dependent on the establishment of a pattern (or model) and responds only when the predictive pattern is broken. MMNs have been found in response to pattern deviations determined by physical parameters such as frequency (
<xref rid="B127" ref-type="bibr">Sams et al., 1985</xref>
), intensity (
<xref rid="B96" ref-type="bibr">Näätänen et al., 1987</xref>
), spatial localization, and duration (
<xref rid="B105" ref-type="bibr">Paavilainen et al., 1989</xref>
), but also to patterns with more abstract properties (
<xref rid="B106" ref-type="bibr">Paavilainen et al., 2001</xref>
;
<xref rid="B147" ref-type="bibr">Van Zuijen et al., 2004</xref>
). Importantly for our comparison with PC theories, the size of the MMN adjusts as the pattern adapts (
<xref rid="B158" ref-type="bibr">Winkler et al., 1996</xref>
), hence the size of the error message is dependent on the brain’s model of the incoming input as well as on the input itself.</p>
<p>The MMN is also strongly dependent on expertise. Musicians who adjust the tuning of their instruments during performance, such as violinists, display a greater sensitivity to small differences in pitch compared to non-musicians and other musicians playing other instruments (
<xref rid="B64" ref-type="bibr">Koelsch et al., 1999</xref>
); singers respond with a stronger MMN than instrumentalists to small pitch changes (
<xref rid="B99" ref-type="bibr">Nikjeh et al., 2008</xref>
); and conductors process spatial sound information more accurately than professional pianists and non-musicians (
<xref rid="B94" ref-type="bibr">Münte et al., 2001</xref>
). Recently, it was shown that performing musicians’ characteristics of style and genre influence their perceptual skills and their brains’ processing of sound features embedded in a musical context, as indexed by larger MMN (
<xref rid="B150" ref-type="bibr">Vuust et al., 2012a</xref>
,
<xref rid="B151" ref-type="bibr">b</xref>
). Such influences of training on low-level, pre-attentive neural processing exemplify the longer-term contextual, environmental and cultural aspects of PC.</p>
<p>To address the effects of expertise on metric perception,
<xref rid="B153" ref-type="bibr">Vuust et al. (2009)</xref>
investigated whether differential violations of the hierarchical prediction model provided by musical meter would produce error messages indexed as MMN. They compared rhythmically unskilled non-musicians with expert jazz musicians on two different types of metric violations: syncopations in the bass drum of a drum-kit (a musically common violation), and a more general (across all instruments of the drum-kit) disruption of meter (a musically less common violation). Jazz musicians frequently produce highly complex rhythmic music and are therefore ideal candidates for identifying putative competence-dependent differences in the processing of metric violations. MMNm (the magnetic equivalent to the MMN, measured with MEG) in response to metric disruption was found in both participant groups. All expert jazz musicians, and some of the unskilled non-musicians, also exhibited the P3am after the MMNm. The P3am is the magnetic equivalent of the P3a, an event-related response usually associated with the evaluation of salient change for subsequent behavioral action. The study also showed that responses to syncopation were found in most of the expert musicians. The MMNms were localized to the auditory cortices, whereas the P3am showed greater variance in localization between individual subjects. MMNms of expert musicians were stronger in the left hemisphere than in the right hemisphere, in contrast to P3ams showing a slight, non-significant right-lateralization.</p>
<p>The MMNm and P3am can be interpreted as reflecting an error term generated in the auditory cortex and its subsequent evaluation in a broader network of generators in the auditory cortex and higher-level neuronal sources. Consistent with this point of view is the fact that the MMN signal is mainly generated by pyramidal cells in the superficial layers of the cortex, as the canonical microcircuit implementation of PC suggests (
<xref rid="B4" ref-type="bibr">Bastos et al., 2012</xref>
). The study by
<xref rid="B153" ref-type="bibr">Vuust et al. (2009)</xref>
also showed indications of model adjustment in two of the jazz musicians, since their finger-tapping suggested a shift in metric framework (e.g., shifting of the position of the downbeat). These findings are thus in keeping with the PC theory and suggest that there is a congruous relationship between perceptual experience of rhythmic incongruities and the way that these are processed by the brain. However, PC is yet to determine the precise physiological localization and computations of the networks underlying such metric violations. Dynamic causal modeling (
<xref rid="B135" ref-type="bibr">Stephan et al., 2007</xref>
) is a relatively new neural network analysis tool that may help specify some of the unknowns in PC of rhythm and meter in music. Nonetheless, the study by
<xref rid="B153" ref-type="bibr">Vuust et al. (2009)</xref>
showed quantitative and qualitative differences in brain processing between two participant groups with different musical experience, indicating that prediction error generated by meter violation correlates positively with musical competence. A PC interpretation of these findings would posit that the metric models of musicians are stronger than those of non-musicians, leading to greater prediction error.</p>
</sec>
<sec>
<title>PREDICTIVE CODING OF POLYRHYTHM</title>
<p>In some styles of music, the meter may at times be only weakly (or not at all) acoustically actualized, a situation which creates extreme instances of perceptual rhythmic complexity. The pervasive use of
<italic>polyrhythm</italic>
, or even polymeter, throughout musical compositions is a radically complex rhythmic practice that occurs especially in (but is not restricted to) jazz music (
<xref rid="B117" ref-type="bibr">Pressing, 2002</xref>
). During polyrhythm the formal meter may be completely absent in the actual acoustic signal, and musicians must rely on listeners’ ability to predict the formal metric framework. One example of polyrhythm is “cross-rhythm,” in which different overlaid rhythmic patterns can be perceived as suggesting different meters (
<xref rid="B23" ref-type="bibr">Danielsen, 2006</xref>
). A typical example is the so-called “three-against-four” pattern, which may be illustrated by tapping three equally spaced beats in one hand and four equally spaced beats in the other at the same time, so that the periods of both patterns are synchronized. It is possible to perceive the meter of such a pattern in two ways, either as triple or duple. In triple meter, the formal time signature is 3/4 and the four-beat pattern acts as a counter-metric pattern (
<bold>Figure
<xref ref-type="fig" rid="F3">3A</xref>
</bold>
). In duple meter, the time signature is 4/4 and the three-beat pattern is the counter-metric pattern (
<bold>Figure
<xref ref-type="fig" rid="F3">3B</xref>
</bold>
). The rhythmic organization of the two interpretations in
<bold>Figure
<xref ref-type="fig" rid="F3">3</xref>
</bold>
is exactly the same; that is, in each pattern the cross-rhythmic relationship between the two streams is identical. The pattern notated in the lower part of the staves expresses the meter while the pattern in the higher part is the counter-rhythm. The phenomenological experience of this polyrhythm therefore depends on which of the patterns in the cross-rhythm is defined as the meter. The three-against-four polyrhythm is thus analogous to ambiguous images such as Rubin’s vase, which can be seen either as a vase on a black background or as faces on a white background (
<bold>Figure
<xref ref-type="fig" rid="F3">3C</xref>
</bold>
). In the case of the cross-rhythms, the meter is the background and the counter-metric rhythm is the foreground. As with
<xref rid="B125" ref-type="bibr">Rubin’s (1918)</xref>
vase, cross-rhythm in music can sometimes cause perceptual shifts in which the metric model can be reinterpreted in a different way. In music, such metric shifts can be supported by sensorimotor synchronization, e.g., foot-tapping emphasizing the tactus of the meter.
<xref rid="B114" ref-type="bibr">Phillips-Silver and Trainor (2005)</xref>
found that after an initial period of listening to metrically ambiguous rhythms while being bounced according to either a duple or triple meter, 7-month old babies preferred (i.e., listened longer to) rhythms with accent patterns (i.e., meter) to which they had previously been bounced. Similar patterns were found in adults, suggesting that auditory and vestibular information affects rhythm and meter perception (
<xref rid="B115" ref-type="bibr">Phillips-Silver and Trainor, 2007</xref>
,
<xref rid="B116" ref-type="bibr">2008</xref>
). Viewed as PC, their findings indicate that body-movement shapes perception, suggesting action-oriented perception (
<xref rid="B19" ref-type="bibr">Clark, 2013</xref>
). Polyrhythms and otherwise ambiguous rhythms can thus be seen as presenting to the listener a bistable percept (
<xref rid="B117" ref-type="bibr">Pressing, 2002</xref>
) that affords rhythmic tension and embodied engagement.</p>
<fig id="F3" position="float">
<label>FIGURE 3</label>
<caption>
<p>
<bold>Cross-rhythms. (A)</bold>
Three-beat triple meter with four-beat pattern as counter-rhythm.
<bold>(B)</bold>
Four-beat duple meter with three-beat counter-rhythm. Dots below the staves designate the tactus.
<bold>(C)</bold>
The bistable percept of Rubin’s vase.</p>
</caption>
<graphic xlink:href="fpsyg-05-01111-g003"></graphic>
</fig>
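<p>The arithmetic behind the three-against-four pattern in <bold>Figure <xref ref-type="fig" rid="F3">3</xref></bold> can be laid out on a shared grid (a simple illustration of the notation, not a model from the literature): twelve subdivisions per cycle (the least common multiple of three and four) let one stream articulate every fourth subdivision and the other every third, so that the two streams coincide only on the downbeat.</p>
<preformat>
# The three-against-four cross-rhythm on a shared 12-subdivision cycle,
# 12 being the least common multiple of 3 and 4 (illustration only). One
# stream articulates every 4th subdivision (three beats per cycle), the
# other every 3rd (four beats per cycle); they coincide only on position 0.

CYCLE = 12

three_stream = ["x" if i % 4 == 0 else "." for i in range(CYCLE)]
four_stream = ["x" if i % 3 == 0 else "." for i in range(CYCLE)]

print("three beats per cycle: " + " ".join(three_stream))
print("four beats per cycle:  " + " ".join(four_stream))

shared = [i for i in range(CYCLE) if i % 3 == 0 and i % 4 == 0]
print("streams coincide at subdivision(s):", shared)  # [0] -> the downbeat only
</preformat>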
<p>Bistable percepts and other types of perceptual illusions have been suggested to provide particularly revealing illustrations of PC (
<xref rid="B51" ref-type="bibr">Hohwy et al., 2008</xref>
;
<xref rid="B19" ref-type="bibr">Clark, 2013</xref>
). A common example is binocular rivalry, a perceptual scenario in which, using a special experimental setup, each eye is shown a different image simultaneously – for example, a house and a face (
<xref rid="B75" ref-type="bibr">Leopold and Logothetis, 1999</xref>
;
<xref rid="B51" ref-type="bibr">Hohwy et al., 2008</xref>
). In such experiments, the experienced image is not a combination of the two images – some morphed structure with both house- and face-features – but rather a bistable percept in which the image shifts from one to the other, but never the two at the same time. According to PC, such artificially induced experiences illustrate how our perceptual system deals with situations in which there is more than one predictive model. The bottom-up input presents two equally plausible models – it is just as common to see a house as it is to see a face – but they are temporally and spatially incompatible, i.e., the hyper-prior is that we never see a face and a house as coming from the same source at the same time. However, no one stable model can be said to be more likely or more expected than the other. In choosing one hypothesis over the other, the top-down signals will “explain away” only those elements of the driving signal that conform to this hypothesis, causing the prediction error of the alternative hypothesis to be forwarded upward in the system. Therefore, no single prediction can account for all the incoming information or reduce all prediction error, and the brain alternates between the two semi-stable percepts. While non-Bayesian feed-forward accounts of such scenarios posit that switching is caused by attention alone (e.g.,
<xref rid="B73" ref-type="bibr">Lee et al., 2005</xref>
), PC posits a “top-down” competition between linked sets of hypotheses.</p>
<p>In a similar way, we can perceive two alternative rhythms in cross-rhythmic patterns of the kind depicted in
<bold>Figure
<xref ref-type="fig" rid="F3">3</xref>
</bold>
, but never both at the same time. In such complex cases, perceptually alternating and prediction-switching processes are the best way for the brain to minimize prediction error and maintain a statistically viable representation of its environment. However, cross-rhythmic patterns differ from binocular rivalry in one important way: in cross-rhythms such as the three-against-four pattern, it is possible for musically trained individuals to consciously “hear” one interpretation of the pattern, despite the perceptual input advocating for the other. In such cases, the perceiver must devote considerable effort to sustain his or her internal metric model while the rhythmic input deviates from it.</p>
<p>
<xref rid="B155" ref-type="bibr">Vuust et al. (2006</xref>
,
<xref rid="B156" ref-type="bibr">2011</xref>
) have taken advantage of these alternative perceptual consequences of polyrhythm in two studies, using fMRI to measure blood-oxygenation-level-dependent (BOLD) responses in “rhythm section” musicians (drummers, bassists, pianists, and guitarists). The musical example used was the soprano saxophone solo in Sting’s “Lazarus Heart,” in which the rhythm suddenly changes to a different meter for six measures, leaving no acoustic trace of the original meter. However, despite the shift in the musical surface, it is still possible to infer the original meter since the subdivisions and metric frameworks of the two eventually align at the end of the six measures. In other words, a listener could, depending on his or her musical-temporal abilities, consciously maintain the counter-meter. During the first experiment, participants were asked to tap along to the main meter of the music while mentally focusing first on the main meter and then on the counter-meter (
<xref rid="B155" ref-type="bibr">Vuust et al., 2006</xref>
,
<xref rid="B156" ref-type="bibr">2011</xref>
). In the second experiment, they listened to the main meter throughout the study and were asked to tap both the main meter and the counter-meter. Here, it was found that Brodmann’s area (BA) 40 showed increased activity during tapping to the counter-meter compared to the original meter (
<bold>Figure
<xref ref-type="fig" rid="F4">4</xref>
</bold>
). This brain area has been associated with language prosody and, with particular relevance for our discussion, with bistable percepts (
<xref rid="B63" ref-type="bibr">Kleinschmidt et al., 1998</xref>
;
<xref rid="B81" ref-type="bibr">Lumer et al., 1998</xref>
;
<xref rid="B136" ref-type="bibr">Sterzer et al., 2002</xref>
). Furthermore, in both experiments, the counter-metric tasks showed increased activity in a part of the inferior frontal gyrus corresponding to BA 47, most strongly in the right hemisphere (
<bold>Figure
<xref ref-type="fig" rid="F4">4</xref>
</bold>
). This area is typically associated with language, particularly semantic processing (for reviews, see
<xref rid="B31" ref-type="bibr">Fiez, 1997</xref>
;
<xref rid="B13" ref-type="bibr">Cabeza and Nyberg, 2000</xref>
).
<xref rid="B155" ref-type="bibr">Vuust et al.’s (2006</xref>
,
<xref rid="B156" ref-type="bibr">2011</xref>
) studies thus suggest that these areas may serve more general purposes than formerly believed, such as sequencing or hierarchical ordering of perceptual information (BA 47) (
<xref rid="B30" ref-type="bibr">Fiebach and Schubotz, 2006</xref>
), and predictive model comparisons (BA 40). Interestingly, BA 47 was found to be active in relation to both the experience (experiment 1) and the production (experiment 2) of polyrhythmic tension. Therefore, it is possible that this area, bilaterally, is involved in the processing of prediction error in polyrhythm
<italic>per se</italic>
. The findings may thus provide evidence of action-oriented predictive processing and the close relationship posited between action and perception in PC (
<xref rid="B19" ref-type="bibr">Clark, 2013</xref>
). Furthermore, activity in BA 47 was inversely related to rhythmic expertise, as measured by the standard deviation of finger-tapping accuracy. In other words, maintaining a counter-metric model during polyrhythm requires less brain activity the greater the performer’s rhythmic expertise. This finding supports the PC hypothesis that the more accurate the prediction, the less processing is needed by the perceptual system. According to PC, the continuous effort needed to sustain a counter-metric model should lead to sustained activity in the relevant brain areas (e.g., BA 47) and networks, including areas at higher levels than those primarily generating the prediction errors. At these higher levels, the experts’ models should be more successful at predicting the incoming rhythmic information and should therefore require less “processing power” to maintain a competing metric model. In this way, the decreased neural activity associated with increased musical ability in expert musicians is an expression of the hierarchical, bidirectional, and context-sensitive mechanisms posited by PC.</p>
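<p>One way to unpack the claim that more accurate predictions demand less “processing” is to read neural effort as proportional to the surprisal of the heard onsets under the listener’s internal timing model. The sketch below is a minimal illustration of that reading (the deviations and the width of the timing model are made-up numbers, not values from Vuust et al.’s data): the same counter-metric onsets are less surprising, and thus cheaper to account for, under a better-calibrated model.</p>
<preformat>
import math

def surprisal(deviation_ms, sigma_ms):
    """Negative log-probability of an onset under a Gaussian timing prediction."""
    var = sigma_ms ** 2
    return 0.5 * math.log(2 * math.pi * var) + deviation_ms ** 2 / (2 * var)

# Hypothetical timing errors (ms) between predicted and heard counter-meter
# onsets over one phrase; smaller for the better-calibrated, "expert-like" model.
expert_deviations = [4, -6, 5, -3, 6, -4]
novice_deviations = [25, -30, 35, -20, 28, -32]
SIGMA_MS = 20.0   # assumed width of the internal timing prediction

expert_cost = sum(surprisal(d, SIGMA_MS) for d in expert_deviations)
novice_cost = sum(surprisal(d, SIGMA_MS) for d in novice_deviations)

# Reading total surprisal as a crude proxy for processing demand, the
# expert-like model accounts for the same input more cheaply.
print(f"expert-like cost: {expert_cost:.1f}   novice-like cost: {novice_cost:.1f}")
</preformat>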
<fig id="F4" position="float">
<label>FIGURE 4</label>
<caption>
<p>
<bold>Areas of activity in the brain during tapping to polyrhythm.</bold>
Activations of Brodmann’s areas (BA) 40 and 47 in the parietal and prefrontal cortices, respectively, associated with tapping to polyrhythms. See
<xref rid="B155" ref-type="bibr">Vuust et al. (2006)</xref>
for more detail.</p>
</caption>
<graphic xlink:href="fpsyg-05-01111-g004"></graphic>
</fig>
</sec>
<sec>
<title>PREDICTIVE CODING IN GROOVE</title>
<p>In certain styles of music, such as funk (
<xref rid="B23" ref-type="bibr">Danielsen, 2006</xref>
), hip-hop (
<xref rid="B50" ref-type="bibr">Greenwald, 2002</xref>
) and electronic dance music (
<xref rid="B12" ref-type="bibr">Butler, 2006</xref>
), continuous rhythmic complexity is the basis for structural development. Such music is often referred to as “groove-based” (
<xref rid="B24" ref-type="bibr">Danielsen, 2010</xref>
). Groove is primarily defined as a psychological construct, characterized by a pleasurable drive toward body-movement in response to rhythmically entraining music (
<xref rid="B83" ref-type="bibr">Madison, 2006</xref>
;
<xref rid="B84" ref-type="bibr">Madison et al., 2011</xref>
;
<xref rid="B57" ref-type="bibr">Janata et al., 2012</xref>
;
<xref rid="B137" ref-type="bibr">Stupacher et al., 2013</xref>
;
<xref rid="B160" ref-type="bibr">Witek et al., 2014</xref>
). Such behavioral effects require that the rhythmically complex musical structures, such as syncopation and cross-rhythm, are continuously repeated. Other examples of repeated rhythmic complexity in groove are metric displacement (
<xref rid="B12" ref-type="bibr">Butler, 2006</xref>
;
<xref rid="B23" ref-type="bibr">Danielsen, 2006</xref>
) and microtiming (
<xref rid="B157" ref-type="bibr">Waadeland, 2001</xref>
;
<xref rid="B55" ref-type="bibr">Iyer, 2002</xref>
;
<xref rid="B23" ref-type="bibr">Danielsen, 2006</xref>
).</p>
<p>In recent experiments,
<xref rid="B160" ref-type="bibr">Witek et al. (2014)</xref>
investigated the relationship between syncopation in groove, the desire to move, and feelings of pleasure. Their stimuli consisted of 50 groove-based (funk) drum-breaks, in which two-bar rhythmic phrases featuring varying degrees of syncopation were repeated four times, continuously. Using a web-based survey, participants were asked to listen to the drum-breaks and rate the extent to which they felt like moving and experienced pleasure. The results showed an inverted U-shaped relationship between degree of syncopation and ratings, indicating that intermediate degrees of rhythmic complexity afford optimal pleasure and desire for body-movement. The inverted U is a familiar function (
<xref rid="B161" ref-type="bibr">Wundt, 1874</xref>
) in music psychology (
<xref rid="B100" ref-type="bibr">North and Hargreaves, 1995</xref>
;
<xref rid="B102" ref-type="bibr">Orr and Ohlsson, 2005</xref>
) and has been suggested to describe the relationship between perceptual complexity and arousal in art more broadly (
<xref rid="B6" ref-type="bibr">Berlyne, 1971</xref>
). Interestingly, rather than being affected by participants’ formal musical training,
<xref rid="B160" ref-type="bibr">Witek et al. (2014)</xref>
) found that those who enjoyed dancing and often danced to music rated the drum-breaks as eliciting more pleasure and more desire to move overall. Thus, it seems that not only formal, institutionalized musical training but also more informal, embodied experience with music may affect subjective experiences of rhythmic complexity such as groove.</p>
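<p>The inverted-U claim can be checked on rating data with a simple quadratic fit: a negative coefficient on the squared term, with the implied peak at an intermediate syncopation level, is the signature of the curve. The sketch below shows the procedure on made-up numbers; it is not Witek et al.’s data or analysis.</p>
<preformat>
import numpy as np

# Hypothetical (made-up) data: mean "wanting to move" ratings for drum-breaks
# at increasing degrees of syncopation.
syncopation = np.array([0, 10, 20, 30, 40, 50, 60, 70])
rating = np.array([2.1, 3.0, 3.9, 4.4, 4.3, 3.7, 2.8, 2.0])

# Fit rating = a*s**2 + b*s + c; an inverted U shows up as a negative a.
a, b, c = np.polyfit(syncopation, rating, deg=2)
peak = -b / (2 * a)   # syncopation level at which the fitted curve peaks

print(f"quadratic term a = {a:.4f} (negative, i.e., an inverted U)")
print(f"fitted peak at syncopation = {peak:.1f}")
</preformat>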
<p>The inverted U-shape relating degree of syncopation in groove to wanting to move and to feelings of pleasure can be seen as consistent with a hierarchical perceptual system at its higher and more subjectively manifested levels. At this higher level, prediction in perception and action facilitates affective and embodied experiences. At low degrees of syncopation, there is little incongruence between the rhythm of the groove (the input) and the meter (the predicted metrical model). Thus, little prediction error is fed forward from the lower to the higher levels, and the experiential effect is weak – there is little pleasure and little desire to move. At high degrees of syncopation, the complexity is so great, and the rhythmic input deviates from the metric framework to such an extent, that the predicted model breaks down. Affective and embodied responses are decreased since the system is in the process of “learning” and adjusting its internal models. Here, too, there is little prediction error, since the brain is unable to provide an appropriate predictive model against which to compare the incoming input. This uncertainty of the system in the initial phase of perception is widely reported in the literature (
<xref rid="B107" ref-type="bibr">Pack and Born, 2001</xref>
;
<xref rid="B8" ref-type="bibr">Born et al., 2010</xref>
) and is what one would expect if perception involved recruiting top-level models to explain away sensory data. At intermediate degrees of syncopation in groove, however, the balance between the rhythm and the meter is such that the tension is sufficient to produce prediction error and to engage the perceptual system in generating a predictive model, but not so great as to cause the metric model to break down. The input and the model are incongruent, but not incompatible, and the prediction error affords a string of hierarchical encoding and evaluation from lower to higher levels in the brain, ultimately facilitating feelings of pleasure and the desire to move. In fact, synchronized body-movement in groove-directed dance is a good example of action-oriented perception, since the body essentially emphasizes the predictive model by moving to the beat and hence actively tries to minimize prediction error.</p>
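<p>The balance described above can also be made quantitative. A common approach in the groove literature is to score syncopation against a set of metric weights, so that an onset on a weak position that carries over a stronger, silent position contributes to the score in proportion to the weight difference; the score then behaves like the prediction error between the rhythm and the metric model. The sketch below is a simplified variant of such a weight-based measure, in Python; the weights and the scoring rule are our illustrative assumptions, not the exact index used by Witek et al. (2014).</p>
<preformat>
# Simplified syncopation score on a 16-step grid in 4/4 (illustrative variant
# of metric-weight measures). The weights stand in for the listener's metric
# model: downbeat strongest (0), 16th-note offbeats weakest (-4).
WEIGHTS = [0, -4, -3, -4, -2, -4, -3, -4, -1, -4, -3, -4, -2, -4, -3, -4]

def syncopation_score(onsets):
    """onsets: 16 ints (1 = onset, 0 = rest), one bar of 16ths, treated cyclically."""
    n = len(onsets)
    score = 0
    for i, hit in enumerate(onsets):
        if not hit:
            continue
        j = (i + 1) % n
        while not onsets[j] and j != i:
            # The note starting at i sounds through silent position j; if j is
            # metrically stronger, the mismatch adds to the score.
            if WEIGHTS[j] > WEIGHTS[i]:
                score += WEIGHTS[j] - WEIGHTS[i]
            j = (j + 1) % n
    return score

on_the_beat = [1, 0, 0, 0] * 4   # quarter notes on the beat: score 0
off_beat_groove = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0]
print(syncopation_score(on_the_beat), syncopation_score(off_beat_groove))
</preformat>
<p>On such a measure, the low-, medium-, and high-syncopation cases discussed here correspond to low, intermediate, and high scores, with the caveat that the experienced prediction error tracks the score only as long as a metric model can be maintained at all.</p>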
<p>These nested levels of input-model comparison can also explain why, despite persistent repetition, the rhythmic complexity in groove does not lose its characteristic perceptual effect. That is, higher levels in the groove-processing hierarchy not only provide basic perceptual metric models (i.e., rhythmic onsets should occur on strong and not weak accents), but also model expected deviations from the meter (i.e., in groove, rhythmic onsets often occur on metrically weak accents). In this way, groove remains complex, and there is constant tension between rhythm and meter, despite the same rhythmically complex patterns being repeated time and time again. The relationship between lower and higher models can thus be one of tension itself.</p>
<p>Prediction and expectation have been proposed as the primary mechanisms for emotion and pleasure in music (
<xref rid="B89" ref-type="bibr">Meyer, 1956</xref>
;
<xref rid="B54" ref-type="bibr">Huron, 2006</xref>
). The general idea in Huron’s theory is that the brain rewards behavior that stimulates prediction, since prediction is an evolutionarily adaptive cognitive ability. However, it should be noted that although PC has been claimed to provide a “grand unifying theory” of cognition and brain processing, offering explanations from low-level firing in individual neurons to high-level conscious experience, perceptual inputs are of course not necessarily evaluated and consciously perceived in terms of prediction (
<xref rid="B19" ref-type="bibr">Clark, 2013</xref>
). That is, when we listen to groove-based music, we may not consciously perceive violations of expectation or prediction errors; in experience, affective and embodied responses are, of course, more readily available to evaluation. Rather, PC should be seen as the system “working in the background” to facilitate the characteristic affective and embodied experiences of groove.</p>
<p>This discussion highlights how music, and the relationship between rhythm and meter in particular, can illustrate the PC theory. The apparent paradox of the pleasure felt in relation to moderate amounts of syncopation is an example of the so-called “dark room problem,” which was recently highlighted in Schaefer et al.’s commentary on Clark’s paper and in his subsequent response (Schaefer et al., in commentary to
<xref rid="B19" ref-type="bibr">Clark, 2013</xref>
). What is clearly consistent with PC is that the prediction error between the meter representation and the syncopated rhythm depends on the brain’s ability to infer a meter. Rhythm with low syncopation should entail only small or no prediction errors related to the meter. Rhythm with medium syncopation – i.e., rhythm that the brain can still reconcile with a certain metric interpretation – will lead to larger prediction errors. Rhythm with too much syncopation, however, could lead to less prediction error if the brain cannot find the meter, even though the complexity of the stimulus is objectively greater. In other words, there cannot be an increase in prediction error if there is no model against which to compare the input. What is not evident is why prediction error should lead to a heightened experience of pleasure. The “dark room problem” in this situation is how to bridge the gap between neuronal activity and organization, on the one hand, and conscious, subjective experience, on the other.</p>
<p>Clark addresses this problem by stating that the brain’s end goal is to maximize prediction, rather than minimize prediction error. Thus, the brain may be rewarding prediction error since it leads to learning (i.e., maximizing future prediction). A likely candidate for mediating this effect is the neurotransmitter dopamine in the mesolimbic pathway, as suggested by
<xref rid="B44" ref-type="bibr">Gebauer et al. (2012)</xref>
. Research in animal models (
<xref rid="B131" ref-type="bibr">Schultz, 2007</xref>
;
<xref rid="B132" ref-type="bibr">Schultz et al., 2008</xref>
) has shown dopamine release to both expected and unexpected stimuli, suggesting that the complex interaction between dopamine release and predictions ensures a balance between “explaining away” prediction error in the short term and maintaining an incentive to engage in novel activities (of potentially high risk) that lead to adaptive learning in the long term. A next step would be to test empirically whether the relationship between syncopation in groove and pleasure is modulated by the dopamine system, and to what extent prediction describes the underlying system at both behavioral and neural levels.</p>
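<p>The dopaminergic account rests on the classic reward-prediction-error logic: a fully predicted reward eventually elicits little or no error signal, whereas unpredicted rewards, and omissions of expected ones, keep generating a teaching signal. The minimal Rescorla-Wagner/temporal-difference-style sketch below illustrates that general principle from the animal literature; it is not a model of groove, and the learning rate is an arbitrary assumption.</p>
<preformat>
# Minimal reward-prediction-error update (Rescorla-Wagner / TD(0)-style).
ALPHA = 0.2        # learning rate (arbitrary)
value = 0.0        # current expectation of reward for a cue

def trial(reward, value):
    delta = reward - value           # reward prediction error
    return delta, value + ALPHA * delta

# A repeatedly rewarded cue: the error shrinks toward zero as the
# reward becomes fully expected.
for t in range(10):
    delta, value = trial(reward=1.0, value=value)
    print(f"trial {t}: prediction error = {delta:.3f}")

# An unexpected omission after learning yields a negative error,
# the signature dip reported for dopamine neurons.
delta, value = trial(reward=0.0, value=value)
print(f"omission: prediction error = {delta:.3f}")
</preformat>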
</sec>
<sec sec-type="conclusions">
<title>CONCLUSION</title>
<p>The hierarchical nature of meter and the relationship between rhythm and meter in rhythmic complexity provide particularly suitable examples of predictive coding in music. Predictive coding posits that perception and action rely on hierarchical processing of information in Bayesian terms, by which perceptual input, modulated by motor action, is compared with predictive models in the brain. In music, rhythm (the input) is heard in relation to meter (the model). When these are at odds, the difference between them (the prediction error) is fed forward into the system and is subjected to a string of computational evaluations at each level of the perceptual hierarchy, from low-level neuronal firing to high-level perception and cognition. The predictive models are inferred from previous experience, and thus the system always operates through an interplay between bottom-up and top-down processes. We suggest that during syncopation – a rhythmic structure that violates metric expectations – the listener’s previous musical training determines the strength of the metric model, and thus the size of the prediction error. Polyrhythm is a type of bistable percept in the auditory domain, which relies on competition between different predictive models to achieve its perceptually characteristic effect. In groove, medium degrees of syncopation provide the optimal balance between prediction and complexity, allowing for just enough prediction error to stimulate the cascade of model comparisons at the nested levels of the perceptual hierarchy and elicit the characteristic pleasurable desire to dance. Further, the constantly repeated rhythmic complexity in groove resists permanent model shifts of low-level metric frameworks, because higher-level models predict that groove should be complex. These instances of rhythmic complexity in music thus provide unique examples of several different properties of predictive coding, and present us with ecologically valid stimuli for studying human perception, action, prediction, and the brain.</p>
</sec>
<sec>
<title>Conflict of Interest Statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</body>
<back>
<fn-group>
<fn id="fn01">
<label>1</label>
<p>More levels can be defined above the bar level (e.g., the hyper-bar level) and below the 16th-note level (e.g., the 32nd- and 64th-note levels). Theoretically, metric levels are relative and can be subdivided indefinitely. In practice, however, the metric levels we perceive are limited by our perceptual system: time-spans that are too long or too short are not detectable to the human ear (
<xref rid="B35" ref-type="bibr">Fraisse, 1984</xref>
).</p>
</fn>
</fn-group>
<ref-list>
<title>REFERENCES</title>
<ref id="B1">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Agawu</surname>
<given-names>K.</given-names>
</name>
</person-group>
(
<year>2003</year>
).
<article-title>
<italic>Representing African Music: Postcolonial Notes, Queries, Positions</italic>
.</article-title>
<publisher-loc>New York/London</publisher-loc>
:
<publisher-name>Routledge</publisher-name>
</mixed-citation>
</ref>
<ref id="B2">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Altenmuller</surname>
<given-names>E.</given-names>
</name>
</person-group>
(
<year>2001</year>
).
<article-title>How many music centers are in the brain?</article-title>
<source>
<italic>Ann. N. Y. Acad. Sci.</italic>
</source>
<volume>930</volume>
<fpage>273</fpage>
<lpage>280</lpage>
<pub-id pub-id-type="doi">10.1111/j.1749-6632.2001.tb05738.x</pub-id>
<pub-id pub-id-type="pmid">11458834</pub-id>
</mixed-citation>
</ref>
<ref id="B3">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Barnes</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Jones</surname>
<given-names>M. R.</given-names>
</name>
</person-group>
(
<year>2000</year>
).
<article-title>Expectancy, attention and time.</article-title>
<source>
<italic>Cogn. Psychol.</italic>
</source>
<volume>41</volume>
<fpage>254</fpage>
<lpage>311</lpage>
<pub-id pub-id-type="doi">10.1006/cogp.2000.0738</pub-id>
<pub-id pub-id-type="pmid">11032658</pub-id>
</mixed-citation>
</ref>
<ref id="B4">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bastos</surname>
<given-names>A. M.</given-names>
</name>
<name>
<surname>Usrey</surname>
<given-names>W. M.</given-names>
</name>
<name>
<surname>Adams</surname>
<given-names>R. A.</given-names>
</name>
<name>
<surname>Mangun</surname>
<given-names>G. R.</given-names>
</name>
<name>
<surname>Fries</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Friston</surname>
<given-names>K. J.</given-names>
</name>
</person-group>
(
<year>2012</year>
).
<article-title>Canonical microcircuits for predictive coding.</article-title>
<source>
<italic>Neuron</italic>
</source>
<volume>76</volume>
<fpage>695</fpage>
<lpage>711</lpage>
<pub-id pub-id-type="doi">10.1016/j.neuron.2012.10.038</pub-id>
<pub-id pub-id-type="pmid">23177956</pub-id>
</mixed-citation>
</ref>
<ref id="B5">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bengtsson</surname>
<given-names>S. L.</given-names>
</name>
<name>
<surname>Ullén</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Henrik Ehrsson</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Hashimoto</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Kito</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Naito</surname>
<given-names>E.</given-names>
</name>
<etal></etal>
</person-group>
(
<year>2009</year>
).
<article-title>Listening to rhythms activates motor and premotor cortices.</article-title>
<source>
<italic>Cortex</italic>
</source>
<volume>45</volume>
<fpage>62</fpage>
<lpage>71</lpage>
<pub-id pub-id-type="doi">10.1016/j.cortex.2008.07.002</pub-id>
<pub-id pub-id-type="pmid">19041965</pub-id>
</mixed-citation>
</ref>
<ref id="B6">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Berlyne</surname>
<given-names>D. E.</given-names>
</name>
</person-group>
(
<year>1971</year>
).
<source>
<italic>Aesthetics and Psychobiology.</italic>
</source>
<publisher-loc>East Norwalk, CT</publisher-loc>
:
<publisher-name>Appleton-Century-Crofts</publisher-name>
</mixed-citation>
</ref>
<ref id="B7">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Berniker</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Körding</surname>
<given-names>K.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>Estimating the sources of motor errors for adaptation and generalization.</article-title>
<source>
<italic>Nat. Neurosci.</italic>
</source>
<volume>11</volume>
<fpage>1454</fpage>
<lpage>1461</lpage>
<pub-id pub-id-type="doi">10.1038/nn.2229</pub-id>
<pub-id pub-id-type="pmid">19011624</pub-id>
</mixed-citation>
</ref>
<ref id="B8">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Born</surname>
<given-names>R. T.</given-names>
</name>
<name>
<surname>Tsui</surname>
<given-names>J. M.</given-names>
</name>
<name>
<surname>Pack</surname>
<given-names>C. C.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>“Temporal dynamics of motion integration,” in</article-title>
<source>
<italic>Dynamics of Visual Motion Processing</italic>
</source>
<role>eds</role>
<person-group person-group-type="editor">
<name>
<surname>Masson</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Ilg</surname>
<given-names>U. J.</given-names>
</name>
</person-group>
(
<publisher-loc>New York</publisher-loc>
:
<publisher-name>Springer</publisher-name>
)
<fpage>37</fpage>
<lpage>54</lpage>
</mixed-citation>
</ref>
<ref id="B9">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Brochard</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Abecasis</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Potter</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Ragot</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Drake</surname>
<given-names>C.</given-names>
</name>
</person-group>
(
<year>2003</year>
).
<article-title>The ‘tick-tock’ of our internal clock: direct brain evidence of subjective accents in isochronous sequences.</article-title>
<source>
<italic>Psychol. Sci.</italic>
</source>
<volume>14</volume>
<fpage>362</fpage>
<lpage>366</lpage>
<pub-id pub-id-type="doi">10.1111/1467-9280.24441</pub-id>
<pub-id pub-id-type="pmid">12807411</pub-id>
</mixed-citation>
</ref>
<ref id="B10">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Brown</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Friston</surname>
<given-names>K. J.</given-names>
</name>
<name>
<surname>Bestmann</surname>
<given-names>S.</given-names>
</name>
</person-group>
(
<year>2011</year>
).
<article-title>Active inference, attention, and motor preparation.</article-title>
<source>
<italic>Front. Psychol.</italic>
</source>
<volume>2</volume>
:
<issue>218</issue>
<pub-id pub-id-type="doi">10.3389/fpsyg.2011.00218</pub-id>
</mixed-citation>
</ref>
<ref id="B11">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Burr</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Tozzi</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Morrone</surname>
<given-names>M. C.</given-names>
</name>
</person-group>
(
<year>2007</year>
).
<article-title>Neural mechanisms for timing visual events are spatially selective in real-world coordinates.</article-title>
<source>
<italic>Nat. Neurosci.</italic>
</source>
<volume>10</volume>
<fpage>423</fpage>
<lpage>425</lpage>
<pub-id pub-id-type="doi">10.1038/nn1874</pub-id>
<pub-id pub-id-type="pmid">17369824</pub-id>
</mixed-citation>
</ref>
<ref id="B12">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Butler</surname>
<given-names>M. J.</given-names>
</name>
</person-group>
(
<year>2006</year>
).
<source>
<italic>Unlocking the Groove: Rhythm, Meter, and Musical Design in Electronic Dance Music</italic>
.</source>
<publisher-loc>Bloomington</publisher-loc>
:
<publisher-name>Indiana University Press</publisher-name>
</mixed-citation>
</ref>
<ref id="B13">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cabeza</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Nyberg</surname>
<given-names>L.</given-names>
</name>
</person-group>
(
<year>2000</year>
).
<article-title>Imaging cognition II: an empirical review of 275 PET and fMRI studies.</article-title>
<source>
<italic>J. Cogn. Neurosci.</italic>
</source>
<volume>12</volume>
<fpage>1</fpage>
<lpage>47</lpage>
<pub-id pub-id-type="doi">10.1162/08989290051137585</pub-id>
<pub-id pub-id-type="pmid">10769304</pub-id>
</mixed-citation>
</ref>
<ref id="B14">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Chapin</surname>
<given-names>H. L.</given-names>
</name>
<name>
<surname>Zanto</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Jantzen</surname>
<given-names>K. J.</given-names>
</name>
<name>
<surname>Kelso</surname>
<given-names>S. J. A.</given-names>
</name>
<name>
<surname>Steinberg</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Large</surname>
<given-names>E. W.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>Neural responses to complex auditory rhythms: the role of attending.</article-title>
<source>
<italic>Front. Psychol.</italic>
</source>
<volume>1</volume>
:
<issue>224</issue>
<pub-id pub-id-type="doi">10.3389/fpsyg.2010.00224</pub-id>
</mixed-citation>
</ref>
<ref id="B15">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Chen</surname>
<given-names>J. L.</given-names>
</name>
<name>
<surname>Penhune</surname>
<given-names>V. B.</given-names>
</name>
<name>
<surname>Zatorre</surname>
<given-names>R. J.</given-names>
</name>
</person-group>
(
<year>2008a</year>
).
<article-title>Listening to musical rhythms recruits motor regions of the brain</article-title>
<source>
<italic>Cereb. Cortex</italic>
</source>
<volume>18</volume>
<fpage>2844</fpage>
<lpage>2854</lpage>
<pub-id pub-id-type="doi">10.1093/cercor/bhn042</pub-id>
<pub-id pub-id-type="pmid">18388350</pub-id>
</mixed-citation>
</ref>
<ref id="B16">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Chen</surname>
<given-names>J. L.</given-names>
</name>
<name>
<surname>Zatorre</surname>
<given-names>R. J.</given-names>
</name>
<name>
<surname>Penhune</surname>
<given-names>V. B.</given-names>
</name>
</person-group>
(
<year>2008b</year>
).
<article-title>Moving on time: brain network for auditory-motor synchronization is modulated by rhythm complexity and musical training.</article-title>
<source>
<italic>J. Cogn. Neurosci.</italic>
</source>
<volume>20</volume>
<fpage>226</fpage>
<lpage>239</lpage>
<pub-id pub-id-type="doi">10.1162/jocn.2008.20018</pub-id>
<pub-id pub-id-type="pmid">18275331</pub-id>
</mixed-citation>
</ref>
<ref id="B17">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cicchini</surname>
<given-names>G. M.</given-names>
</name>
<name>
<surname>Arrighi</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Cecchetti</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Giusti</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Burr</surname>
<given-names>D. C.</given-names>
</name>
</person-group>
(
<year>2012</year>
).
<article-title>Optimal encoding of interval timing in expert percussionists.</article-title>
<source>
<italic>J. Neurosci.</italic>
</source>
<volume>32</volume>
<fpage>1056</fpage>
<lpage>1060</lpage>
<pub-id pub-id-type="doi">10.1523/JNEUROSCI.3411-11.2012</pub-id>
<pub-id pub-id-type="pmid">22262903</pub-id>
</mixed-citation>
</ref>
<ref id="B18">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Clark</surname>
<given-names>A.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>
<italic>Supersizing the Mind: Embodiment, Action, and Cognitive Extension</italic>
.</article-title>
<publisher-loc>New York, NY</publisher-loc>
:
<publisher-name>Oxford University Press</publisher-name>
<pub-id pub-id-type="doi">10.1093/acprof:oso/9780195333213.001.0001</pub-id>
</mixed-citation>
</ref>
<ref id="B19">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Clark</surname>
<given-names>A.</given-names>
</name>
</person-group>
(
<year>2013</year>
).
<article-title>Whatever next? Predictive brains, situated agents, and the future of cognitive science.</article-title>
<source>
<italic>Behav. Brain Sci.</italic>
</source>
<volume>36</volume>
<fpage>181</fpage>
<lpage>204</lpage>
<pub-id pub-id-type="doi">10.1017/S0140525X12000477</pub-id>
<pub-id pub-id-type="pmid">23663408</pub-id>
</mixed-citation>
</ref>
<ref id="B20">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Clark</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Chalmers</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<year>1998</year>
).
<article-title>The extended mind.</article-title>
<source>
<italic>Analysis</italic>
</source>
<volume>58</volume>
<fpage>7</fpage>
<lpage>19</lpage>
<pub-id pub-id-type="doi">10.1093/analys/58.1.7</pub-id>
</mixed-citation>
</ref>
<ref id="B21">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Clarke</surname>
<given-names>E. F.</given-names>
</name>
</person-group>
(
<year>1999</year>
).
<article-title>“Rhythm and timing in music,” in</article-title>
<source>
<italic>The Psychology of Music</italic>
</source>
<edition>2nd Edn</edition>
<role>ed.</role>
<person-group person-group-type="editor">
<name>
<surname>Deutsch</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<publisher-loc>New York</publisher-loc>
:
<publisher-name>Academic Press</publisher-name>
).</mixed-citation>
</ref>
<ref id="B22">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Clayton</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Sager</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Will</surname>
<given-names>U.</given-names>
</name>
</person-group>
(
<year>2004</year>
).
<article-title>In time with the music: the concept of entrainment and its significance for ethnomusicology.</article-title>
<source>
<italic>Eur. Meet. Ethnomusicol.</italic>
</source>
<volume>11</volume>
<fpage>3</fpage>
<lpage>75</lpage>
</mixed-citation>
</ref>
<ref id="B23">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Danielsen</surname>
<given-names>A.</given-names>
</name>
</person-group>
(
<year>2006</year>
).
<source>
<italic>Presence and Pleasure. The Funk Grooves of James Brown and Parliament.</italic>
</source>
<publisher-loc>Middletown, CT</publisher-loc>
:
<publisher-name>Wesleyan University Press</publisher-name>
</mixed-citation>
</ref>
<ref id="B24">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Danielsen</surname>
<given-names>A.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<source>
<italic>Musical Rhythm in the Age of Digital Reproduction.</italic>
</source>
<publisher-loc>Farnham</publisher-loc>
:
<publisher-name>Ashgate</publisher-name>
</mixed-citation>
</ref>
<ref id="B25">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Demos</surname>
<given-names>A. P.</given-names>
</name>
<name>
<surname>Chaffin</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Begosh</surname>
<given-names>K. T.</given-names>
</name>
<name>
<surname>Daniels</surname>
<given-names>J. R.</given-names>
</name>
<name>
<surname>Marsh</surname>
<given-names>K. L.</given-names>
</name>
</person-group>
(
<year>2012</year>
).
<article-title>Rocking to the beat: effects of music and partner’s movements on spontaneous interpersonal coordination.</article-title>
<source>
<italic>J. Exp. Psychol. Gen.</italic>
</source>
<volume>141</volume>
<issue>49</issue>
<pub-id pub-id-type="doi">10.1037/a0023843</pub-id>
</mixed-citation>
</ref>
<ref id="B26">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Desain</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Honing</surname>
<given-names>H.</given-names>
</name>
</person-group>
(
<year>1999</year>
).
<article-title>Computational model of beat induction: the rule-based approach.</article-title>
<source>
<italic>J. New Music Res.</italic>
</source>
<volume>28</volume>
<fpage>29</fpage>
<lpage>42</lpage>
<pub-id pub-id-type="doi">10.1076/jnmr.28.1.29.3123</pub-id>
</mixed-citation>
</ref>
<ref id="B27">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Dixon</surname>
<given-names>S.</given-names>
</name>
</person-group>
(
<year>2001</year>
).
<article-title>Automatic extraction of tempo and beat from expressive performances.</article-title>
<source>
<italic>J. New Music Res.</italic>
</source>
<volume>30</volume>
<fpage>39</fpage>
<lpage>58</lpage>
<pub-id pub-id-type="doi">10.1076/jnmr.30.1.39.7119</pub-id>
</mixed-citation>
</ref>
<ref id="B28">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Elliott</surname>
<given-names>M. T.</given-names>
</name>
<name>
<surname>Wing</surname>
<given-names>A. M.</given-names>
</name>
<name>
<surname>Welchman</surname>
<given-names>A. E.</given-names>
</name>
</person-group>
(
<year>2014</year>
).
<article-title>Moving in time: Bayesian causal inference explains movement coordination to auditory beats.</article-title>
<source>
<italic>Proc. Royal Soc. B Biol. Sci.</italic>
</source>
<volume>281</volume>
<issue>20140751</issue>
<pub-id pub-id-type="doi">10.1098/rspb.2014.0751</pub-id>
</mixed-citation>
</ref>
<ref id="B29">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Feldman</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Friston</surname>
<given-names>K. J.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>Attention, uncertainty, and free-energy.</article-title>
<source>
<italic>Front. Hum. Neurosci.</italic>
</source>
<volume>4</volume>
:
<issue>215</issue>
<pub-id pub-id-type="doi">10.3389/fnhum.2010.00215</pub-id>
</mixed-citation>
</ref>
<ref id="B30">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fiebach</surname>
<given-names>C. J.</given-names>
</name>
<name>
<surname>Schubotz</surname>
<given-names>R. I.</given-names>
</name>
</person-group>
(
<year>2006</year>
).
<article-title>Dynamic anticipatory processing of hierarchical sequential events: a common role for Broca’s area and ventral premotor cortex across domains?</article-title>
<source>
<italic>Cortex</italic>
</source>
<volume>42</volume>
<fpage>499</fpage>
<lpage>502</lpage>
<pub-id pub-id-type="doi">10.1016/S0010-9452(08)70386-1</pub-id>
<pub-id pub-id-type="pmid">16881258</pub-id>
</mixed-citation>
</ref>
<ref id="B31">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fiez</surname>
<given-names>J. A.</given-names>
</name>
</person-group>
(
<year>1997</year>
).
<article-title>Phonology, semantics, and the role of the left inferior prefrontal cortex.</article-title>
<source>
<italic>Hum. Brain Mapp.</italic>
</source>
<volume>5</volume>
<fpage>79</fpage>
<lpage>83</lpage>
<pub-id pub-id-type="doi">10.1002/(SICI)1097-0193(1997)5:2<79::AID-HBM1>3.0.CO;2-J</pub-id>
<pub-id pub-id-type="pmid">10096412</pub-id>
</mixed-citation>
</ref>
<ref id="B32">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fitch</surname>
<given-names>W. T.</given-names>
</name>
<name>
<surname>Rosenfeld</surname>
<given-names>A. J.</given-names>
</name>
</person-group>
(
<year>2007</year>
).
<article-title>Perception and production of syncopated rhythms.</article-title>
<source>
<italic>Music Percept.</italic>
</source>
<volume>25</volume>
<fpage>43</fpage>
<lpage>58</lpage>
<pub-id pub-id-type="doi">10.1525/mp.2007.25.1.43</pub-id>
</mixed-citation>
</ref>
<ref id="B33">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Fraisse</surname>
<given-names>P.</given-names>
</name>
</person-group>
(
<year>1963</year>
).
<source>
<italic>The Psychology of Time.</italic>
</source>
<publisher-loc>New York</publisher-loc>
:
<publisher-name>Harper & Row</publisher-name>
</mixed-citation>
</ref>
<ref id="B34">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Fraisse</surname>
<given-names>P.</given-names>
</name>
</person-group>
(
<year>1982</year>
).
<article-title>“Rhythm and tempo,” in</article-title>
<source>
<italic>The Psychology of Music</italic>
</source>
<edition>1st Edn</edition>
<role>ed.</role>
<person-group person-group-type="editor">
<name>
<surname>Deutsch</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<publisher-loc>New York</publisher-loc>
:
<publisher-name>Academic Press</publisher-name>
).</mixed-citation>
</ref>
<ref id="B35">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fraisse</surname>
<given-names>P.</given-names>
</name>
</person-group>
(
<year>1984</year>
).
<article-title>Perception and estimation of time.</article-title>
<source>
<italic>Annu. Rev. Psychol.</italic>
</source>
<volume>35</volume>
<fpage>1</fpage>
<lpage>37</lpage>
<pub-id pub-id-type="doi">10.1146/annurev.ps.35.020184.000245</pub-id>
<pub-id pub-id-type="pmid">6367623</pub-id>
</mixed-citation>
</ref>
<ref id="B36">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Friedman</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Cycowicz</surname>
<given-names>Y. M.</given-names>
</name>
<name>
<surname>Gaeta</surname>
<given-names>H.</given-names>
</name>
</person-group>
(
<year>2001</year>
).
<article-title>The novelty P3: an event-related brain potential (ERP) sign of the brain’s evaluation of novelty.</article-title>
<source>
<italic>Neurosci. Biobehav. Rev.</italic>
</source>
<volume>25</volume>
<fpage>355</fpage>
<lpage>373</lpage>
<pub-id pub-id-type="doi">10.1016/S0149-7634(01)00019-7</pub-id>
<pub-id pub-id-type="pmid">11445140</pub-id>
</mixed-citation>
</ref>
<ref id="B37">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Friston</surname>
<given-names>K.</given-names>
</name>
</person-group>
(
<year>2002</year>
).
<article-title>Beyond phrenology: what can neuroimaging tell us about distributed circuitry?</article-title>
<source>
<italic>Annu. Rev. Neurosci.</italic>
</source>
<volume>25</volume>
<fpage>221</fpage>
<lpage>250</lpage>
<pub-id pub-id-type="doi">10.1146/annurev.neuro.25.112701.142846</pub-id>
<pub-id pub-id-type="pmid">12052909</pub-id>
</mixed-citation>
</ref>
<ref id="B38">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Friston</surname>
<given-names>K.</given-names>
</name>
</person-group>
(
<year>2003</year>
).
<article-title>Learning and inference in the brain.</article-title>
<source>
<italic>Neural Netw.</italic>
</source>
<volume>16</volume>
<fpage>1325</fpage>
<lpage>1352</lpage>
<pub-id pub-id-type="doi">10.1016/j.neunet.2003.06.005</pub-id>
<pub-id pub-id-type="pmid">14622888</pub-id>
</mixed-citation>
</ref>
<ref id="B39">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Friston</surname>
<given-names>K.</given-names>
</name>
</person-group>
(
<year>2005</year>
).
<article-title>A theory of cortical responses.</article-title>
<source>
<italic>Philos. Trans. R. Soc. B Biol. Sci.</italic>
</source>
<volume>360</volume>
<fpage>815</fpage>
<lpage>836</lpage>
<pub-id pub-id-type="doi">10.1098/rstb.2005.1622</pub-id>
</mixed-citation>
</ref>
<ref id="B40">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Friston</surname>
<given-names>K.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>Hierarchical models in the brain.</article-title>
<source>
<italic>PLoS Comput. Biol.</italic>
</source>
<volume>4</volume>
:
<issue>e1000211</issue>
<pub-id pub-id-type="doi">10.1371/journal.pcbi.1000211</pub-id>
</mixed-citation>
</ref>
<ref id="B41">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Friston</surname>
<given-names>K.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>The free-energy principle: a unified brain theory?</article-title>
<source>
<italic>Nat. Rev. Neurosci.</italic>
</source>
<volume>11</volume>
<fpage>127</fpage>
<lpage>138</lpage>
<pub-id pub-id-type="doi">10.1038/nrn2787</pub-id>
<pub-id pub-id-type="pmid">20068583</pub-id>
</mixed-citation>
</ref>
<ref id="B42">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Friston</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Daunizeau</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Kilner</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Kiebel</surname>
<given-names>S. J.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>Action and behavior: a free-energy formulation.</article-title>
<source>
<italic>Biol. Cybern.</italic>
</source>
<volume>102</volume>
<fpage>227</fpage>
<lpage>260</lpage>
<pub-id pub-id-type="doi">10.1007/s00422-010-0364-z</pub-id>
<pub-id pub-id-type="pmid">20148260</pub-id>
</mixed-citation>
</ref>
<ref id="B43">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fujioka</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Zendel</surname>
<given-names>B. R.</given-names>
</name>
<name>
<surname>Ross</surname>
<given-names>B.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>Endogenous neuromagnetic activity for mental hierarchy of timing.</article-title>
<source>
<italic>J. Neurosci.</italic>
</source>
<volume>30</volume>
<fpage>3458</fpage>
<lpage>3466</lpage>
<pub-id pub-id-type="doi">10.1523/JNEUROSCI.3086-09.2010</pub-id>
<pub-id pub-id-type="pmid">20203205</pub-id>
</mixed-citation>
</ref>
<ref id="B44">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gebauer</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Kringelbach</surname>
<given-names>M. L.</given-names>
</name>
<name>
<surname>Vuust</surname>
<given-names>P.</given-names>
</name>
</person-group>
(
<year>2012</year>
).
<article-title>Ever-changing cycles of musical pleasure: the role of dopamine and anticipation.</article-title>
<source>
<italic>Psychomusicology</italic>
</source>
<volume>22</volume>
<fpage>152</fpage>
<lpage>167</lpage>
<pub-id pub-id-type="doi">10.1037/a0031126</pub-id>
</mixed-citation>
</ref>
<ref id="B45">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gooch</surname>
<given-names>C. M.</given-names>
</name>
<name>
<surname>Wiener</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Hamilton</surname>
<given-names>A. C.</given-names>
</name>
<name>
<surname>Coslett</surname>
<given-names>H. B.</given-names>
</name>
</person-group>
(
<year>2011</year>
).
<article-title>Temporal discrimination of sub-and suprasecond time intervals: a voxel-based lesion mapping analysis.</article-title>
<source>
<italic>Front. Integr. Neurosci.</italic>
</source>
<volume>5</volume>
:
<issue>59</issue>
<pub-id pub-id-type="doi">10.3389/fnint.2011.00059</pub-id>
</mixed-citation>
</ref>
<ref id="B46">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Grahn</surname>
<given-names>J. A.</given-names>
</name>
<name>
<surname>Brett</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>2007</year>
).
<article-title>Rhythm and beat perception in motor areas of the brain.</article-title>
<source>
<italic>J. Cogn. Neurosci.</italic>
</source>
<volume>19</volume>
<fpage>893</fpage>
<lpage>906</lpage>
<pub-id pub-id-type="doi">10.1162/jocn.2007.19.5.893</pub-id>
<pub-id pub-id-type="pmid">17488212</pub-id>
</mixed-citation>
</ref>
<ref id="B47">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Grahn</surname>
<given-names>J. A.</given-names>
</name>
<name>
<surname>McAuley</surname>
<given-names>J. D.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>Neural bases of individual differences in beat perception.</article-title>
<source>
<italic>Neuroimage</italic>
</source>
<volume>47</volume>
<fpage>1894</fpage>
<lpage>1903</lpage>
<pub-id pub-id-type="doi">10.1016/j.neuroimage.2009.04.039</pub-id>
<pub-id pub-id-type="pmid">19376241</pub-id>
</mixed-citation>
</ref>
<ref id="B48">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Grahn</surname>
<given-names>J. A.</given-names>
</name>
<name>
<surname>Rowe</surname>
<given-names>J. B.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>Feeling the beat: premotor and striatal interactions in musicians and nonmusicians during beat perception.</article-title>
<source>
<italic>J. Neurosci.</italic>
</source>
<volume>29</volume>
<fpage>7540</fpage>
<lpage>7548</lpage>
<pub-id pub-id-type="doi">10.1523/jneurosci.2018-08.2009</pub-id>
<pub-id pub-id-type="pmid">19515922</pub-id>
</mixed-citation>
</ref>
<ref id="B49">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Grahn</surname>
<given-names>J. A.</given-names>
</name>
<name>
<surname>Rowe</surname>
<given-names>J. B.</given-names>
</name>
</person-group>
(
<year>2012</year>
).
<article-title>Finding and feeling the musical beat: striatal dissociations between detection and prediction of regularity.</article-title>
<source>
<italic>Cereb. Cortex</italic>
</source>
<volume>23</volume>
<fpage>913</fpage>
<lpage>921</lpage>
<pub-id pub-id-type="doi">10.1093/cercor/bhs083</pub-id>
<pub-id pub-id-type="pmid">22499797</pub-id>
</mixed-citation>
</ref>
<ref id="B50">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Greenwald</surname>
<given-names>J.</given-names>
</name>
</person-group>
(
<year>2002</year>
).
<article-title>Hip-hop drumming: the rhyme may define, but the groove makes you move.</article-title>
<source>
<italic>Black Music Res. J.</italic>
</source>
<volume>22</volume>
<fpage>259</fpage>
<lpage>271</lpage>
<pub-id pub-id-type="doi">10.2307/1519959</pub-id>
</mixed-citation>
</ref>
<ref id="B51">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hohwy</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Roepstorff</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Friston</surname>
<given-names>K.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>Predictive coding explains binocular rivalry: an epistemological review.</article-title>
<source>
<italic>Cognition</italic>
</source>
<volume>108</volume>
<fpage>687</fpage>
<lpage>701</lpage>
<pub-id pub-id-type="doi">10.1016/j.cognition.2008.05.010</pub-id>
<pub-id pub-id-type="pmid">18649876</pub-id>
</mixed-citation>
</ref>
<ref id="B52">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Honing</surname>
<given-names>H.</given-names>
</name>
</person-group>
(
<year>2012</year>
).
<article-title>Without it no music: beat induction as a fundamental musical trait.</article-title>
<source>
<italic>Ann. N. Y. Acad. Sci.</italic>
</source>
<volume>1252</volume>
<fpage>85</fpage>
<lpage>91</lpage>
<pub-id pub-id-type="doi">10.1111/j.1749-6632.2011.06402.x</pub-id>
<pub-id pub-id-type="pmid">22524344</pub-id>
</mixed-citation>
</ref>
<ref id="B53">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Honing</surname>
<given-names>H.</given-names>
</name>
</person-group>
(
<year>2013</year>
).
<article-title>“Structure and interpretation of rhythm in music,” in</article-title>
<source>
<italic>The Psychology of Music</italic>
</source>
<edition>3rd Edn</edition>
<role>ed.</role>
<person-group person-group-type="editor">
<name>
<surname>Deutsch</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<publisher-loc>Amsterdam</publisher-loc>
:
<publisher-name>Academic Press</publisher-name>
)
<fpage>369</fpage>
<lpage>404</lpage>
</mixed-citation>
</ref>
<ref id="B54">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Huron</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<year>2006</year>
).
<source>
<italic>Sweet Anticipation: Music and the Psychology of Expectation</italic>
.</source>
<publisher-loc>Cambridge, MA</publisher-loc>
:
<publisher-name>The MIT Press</publisher-name>
</mixed-citation>
</ref>
<ref id="B55">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Iyer</surname>
<given-names>V.</given-names>
</name>
</person-group>
(
<year>2002</year>
).
<article-title>Embodied mind, situated cognition, and expressive microtiming in African–American music.</article-title>
<source>
<italic>Music Percept.</italic>
</source>
<volume>19</volume>
<fpage>387</fpage>
<lpage>414</lpage>
<pub-id pub-id-type="doi">10.1525/mp.2002.19.3.387</pub-id>
</mixed-citation>
</ref>
<ref id="B56">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Jahanshahi</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Dirnberger</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Fuller</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Frith</surname>
<given-names>C.</given-names>
</name>
</person-group>
(
<year>2000</year>
).
<article-title>The role of the dorsolateral prefrontal cortex in random number generation: a study with positron emission tomography.</article-title>
<source>
<italic>Neuroimage</italic>
</source>
<volume>12</volume>
<fpage>713</fpage>
<lpage>725</lpage>
<pub-id pub-id-type="doi">10.1006/nimg.2000.0647</pub-id>
<pub-id pub-id-type="pmid">11112403</pub-id>
</mixed-citation>
</ref>
<ref id="B57">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Janata</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Tomic</surname>
<given-names>S. T.</given-names>
</name>
<name>
<surname>Haberman</surname>
<given-names>J. M.</given-names>
</name>
</person-group>
(
<year>2012</year>
).
<article-title>Sensorimotor coupling in music and the psychology of the groove.</article-title>
<source>
<italic>J. Exp. Psychol. Gen.</italic>
</source>
<volume>141</volume>
<fpage>54</fpage>
<lpage>75</lpage>
<pub-id pub-id-type="doi">10.1037/a0024208</pub-id>
<pub-id pub-id-type="pmid">21767048</pub-id>
</mixed-citation>
</ref>
<ref id="B58">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Johnston</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Arnold</surname>
<given-names>D. H.</given-names>
</name>
<name>
<surname>Nishida</surname>
<given-names>S.</given-names>
</name>
</person-group>
(
<year>2006</year>
).
<article-title>Spatially localized distortions of event time.</article-title>
<source>
<italic>Curr. Biol.</italic>
</source>
<volume>16</volume>
<fpage>472</fpage>
<lpage>479</lpage>
<pub-id pub-id-type="doi">10.1016/j.cub.2006.01.032</pub-id>
<pub-id pub-id-type="pmid">16527741</pub-id>
</mixed-citation>
</ref>
<ref id="B59">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Jones</surname>
<given-names>M. R.</given-names>
</name>
</person-group>
(
<year>2004</year>
).
<article-title>“Attention and timing,” in</article-title>
<source>
<italic>Ecological Psychoacoustics</italic>
</source>
<role>ed.</role>
<person-group person-group-type="editor">
<name>
<surname>Neuoff</surname>
<given-names>J. G.</given-names>
</name>
</person-group>
(
<publisher-loc>Amsterdam</publisher-loc>
:
<publisher-name>Elsevier Academic Press</publisher-name>
)
<fpage>49</fpage>
<lpage>85</lpage>
</mixed-citation>
</ref>
<ref id="B60">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Jones</surname>
<given-names>M. R.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>“Musical time,” in</article-title>
<source>
<italic>The Oxford Handbook of Music Psychology</italic>
</source>
<role>eds</role>
<person-group person-group-type="editor">
<name>
<surname>Hallam</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Cross</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Thaut</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<publisher-loc>New York</publisher-loc>
:
<publisher-name>Oxford University Press</publisher-name>
)
<fpage>81</fpage>
<lpage>92</lpage>
</mixed-citation>
</ref>
<ref id="B61">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kalender</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Trehub</surname>
<given-names>S. E.</given-names>
</name>
<name>
<surname>Schellenberg</surname>
<given-names>E. G.</given-names>
</name>
</person-group>
(
<year>2013</year>
).
<article-title>Cross-cultural differences in meter perception.</article-title>
<source>
<italic>Psychol. Res.</italic>
</source>
<volume>77</volume>
<fpage>196</fpage>
<lpage>203</lpage>
<pub-id pub-id-type="doi">10.1007/s00426-012-0427-y</pub-id>
<pub-id pub-id-type="pmid">22367155</pub-id>
</mixed-citation>
</ref>
<ref id="B62">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Keller</surname>
<given-names>P. E.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>“Joint action in music performance,” in</article-title>
<source>
<italic>Enacting Intersubjectivity: A Cognitive and Social Perspective to the Study of Interactions</italic>
</source>
<role>eds</role>
<person-group person-group-type="editor">
<name>
<surname>Morganti</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Carassa</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Riva</surname>
<given-names>G.</given-names>
</name>
</person-group>
(
<publisher-loc>Amsterdam</publisher-loc>
:
<publisher-name>IOS Press</publisher-name>
)
<fpage>205</fpage>
<lpage>221</lpage>
</mixed-citation>
</ref>
<ref id="B63">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kleinschmidt</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Buchel</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Zeki</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Frackowiak</surname>
<given-names>R. S.</given-names>
</name>
</person-group>
(
<year>1998</year>
).
<article-title>Human brain activity during spontaneous reversing perception of ambiguous figures.</article-title>
<source>
<italic>Proc. R. Soc. Lond. B Biol. Sci.</italic>
</source>
<volume>265</volume>
<fpage>2427</fpage>
<lpage>2433</lpage>
<pub-id pub-id-type="doi">10.1098/rspb.1998.0594</pub-id>
</mixed-citation>
</ref>
<ref id="B64">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Koelsch</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Schröger</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Tervaniemi</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>1999</year>
).
<article-title>Superior pre-attentive auditory processing in musicians.</article-title>
<source>
<italic>Neuroreport</italic>
</source>
<volume>10</volume>
<fpage>1309</fpage>
<lpage>1313</lpage>
<pub-id pub-id-type="doi">10.1097/00001756-199904260-00029</pub-id>
<pub-id pub-id-type="pmid">10363945</pub-id>
</mixed-citation>
</ref>
<ref id="B65">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Konvalinka</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Vuust</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Roepstorff</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Frith</surname>
<given-names>C. D.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>Follow you, follow me: continuous mutual prediction and adaptation in joint tapping.</article-title>
<source>
<italic>Q. J. Exp. Psychol.</italic>
</source>
<volume>63</volume>
<fpage>2220</fpage>
<lpage>2230</lpage>
<pub-id pub-id-type="doi">10.1080/17470218.2010.497843</pub-id>
</mixed-citation>
</ref>
<ref id="B66">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Körding</surname>
<given-names>K. P.</given-names>
</name>
<name>
<surname>Beierholm</surname>
<given-names>U.</given-names>
</name>
<name>
<surname>Ma</surname>
<given-names>W. J.</given-names>
</name>
<name>
<surname>Quartz</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Tenenbaum</surname>
<given-names>J. B.</given-names>
</name>
<name>
<surname>Shams</surname>
<given-names>L.</given-names>
</name>
</person-group>
(
<year>2007</year>
).
<article-title>Causal inference in multisensory perception.</article-title>
<source>
<italic>PLoS ONE</italic>
</source>
<volume>2</volume>
:
<issue>e943</issue>
<pub-id pub-id-type="doi">10.1371/journal.pone.0000943</pub-id>
</mixed-citation>
</ref>
<ref id="B67">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kung</surname>
<given-names>S.-J.</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>J. L.</given-names>
</name>
<name>
<surname>Zatorre</surname>
<given-names>R. J.</given-names>
</name>
<name>
<surname>Penhune</surname>
<given-names>V. B.</given-names>
</name>
</person-group>
(
<year>2013</year>
).
<article-title>Interacting cortical and basal ganglia networks underlying finding and tapping to the musical beat.</article-title>
<source>
<italic>J. Cogn. Neurosci.</italic>
</source>
<volume>25</volume>
<fpage>401</fpage>
<lpage>420</lpage>
<pub-id pub-id-type="doi">10.1162/jocn_a_00325</pub-id>
<pub-id pub-id-type="pmid">23163420</pub-id>
</mixed-citation>
</ref>
<ref id="B68">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ladinig</surname>
<given-names>O.</given-names>
</name>
<name>
<surname>Honing</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Haden</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Winkler</surname>
<given-names>I.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>Probing attentive and preattentive emergent meter in adult listeners without extensive musical training.</article-title>
<source>
<italic>Music Percept.</italic>
</source>
<volume>26</volume>
<fpage>377</fpage>
<lpage>386</lpage>
<pub-id pub-id-type="doi">10.1525/mp.2009.26.4.377</pub-id>
</mixed-citation>
</ref>
<ref id="B69">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Large</surname>
<given-names>E. W.</given-names>
</name>
</person-group>
(
<year>2000</year>
).
<article-title>On synchronizing movements to music.</article-title>
<source>
<italic>Hum. Mov. Sci.</italic>
</source>
<volume>19</volume>
<fpage>527</fpage>
<lpage>566</lpage>
<pub-id pub-id-type="doi">10.1016/S0167-9457%2800%2900026-9</pub-id>
</mixed-citation>
</ref>
<ref id="B70">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Large</surname>
<given-names>E. W.</given-names>
</name>
<name>
<surname>Jones</surname>
<given-names>M. R.</given-names>
</name>
</person-group>
(
<year>1999</year>
).
<article-title>The dynamics of attending: how people track time-varying events.</article-title>
<source>
<italic>Psychol. Rev.</italic>
</source>
<volume>106</volume>
<fpage>119</fpage>
<lpage>159</lpage>
<pub-id pub-id-type="doi">10.1037/0033-295X.106.1.119</pub-id>
</mixed-citation>
</ref>
<ref id="B71">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Large</surname>
<given-names>E. W.</given-names>
</name>
<name>
<surname>Kolen</surname>
<given-names>J. F.</given-names>
</name>
</person-group>
(
<year>1994</year>
).
<article-title>Resonance and the perception of musical meter.</article-title>
<source>
<italic>Conn. Sci.</italic>
</source>
<volume>6</volume>
<fpage>177</fpage>
<lpage>208</lpage>
<pub-id pub-id-type="doi">10.1080/09540099408915723</pub-id>
</mixed-citation>
</ref>
<ref id="B72">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lee</surname>
<given-names>D. N.</given-names>
</name>
</person-group>
(
<year>1998</year>
).
<article-title>Guiding movement by coupling taus.</article-title>
<source>
<italic>Ecol. Psychol.</italic>
</source>
<volume>10</volume>
<fpage>221</fpage>
<lpage>250</lpage>
<pub-id pub-id-type="doi">10.1080/10407413.1998.9652683</pub-id>
</mixed-citation>
</ref>
<ref id="B73">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lee</surname>
<given-names>S.-H.</given-names>
</name>
<name>
<surname>Blake</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Heeger</surname>
<given-names>D. J.</given-names>
</name>
</person-group>
(
<year>2005</year>
).
<article-title>Travelling waves of activity in primary visual cortex during binocular rivalry.</article-title>
<source>
<italic>Nat. Neurosci.</italic>
</source>
<volume>8</volume>
<fpage>22</fpage>
<pub-id pub-id-type="doi">10.1038/nn1365</pub-id>
</mixed-citation>
</ref>
<ref id="B74">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Leman</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>2007</year>
).
<source>
<italic>Embodied Music Cognition and Mediation Technology.</italic>
</source>
<publisher-loc>Cambridge, MA</publisher-loc>
:
<publisher-name>MIT Press</publisher-name>
</mixed-citation>
</ref>
<ref id="B75">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Leopold</surname>
<given-names>D. A.</given-names>
</name>
<name>
<surname>Logothetis</surname>
<given-names>N. K.</given-names>
</name>
</person-group>
(
<year>1999</year>
).
<article-title>Multistable phenomena: changing views in perception.</article-title>
<source>
<italic>Trends Cogn. Sci.</italic>
</source>
<volume>3</volume>
<fpage>254</fpage>
<lpage>264</lpage>
<pub-id pub-id-type="doi">10.1016/S1364-6613(99)01332-7</pub-id>
<pub-id pub-id-type="pmid">10377540</pub-id>
</mixed-citation>
</ref>
<ref id="B76">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Lerdahl</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Jackendoff</surname>
<given-names>R.</given-names>
</name>
</person-group>
(
<year>1983</year>
).
<source>
<italic>A Generative Theory of Tonal Music.</italic>
</source>
<publisher-loc>Cambridge, MA</publisher-loc>
:
<publisher-name>MIT Press</publisher-name>
</mixed-citation>
</ref>
<ref id="B77">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lewis</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Miall</surname>
<given-names>R.</given-names>
</name>
</person-group>
(
<year>2003</year>
).
<article-title>Brain activation patterns during measurement of sub-and supra-second intervals.</article-title>
<source>
<italic>Neuropsychologia</italic>
</source>
<volume>41</volume>
<fpage>1583</fpage>
<lpage>1592</lpage>
<pub-id pub-id-type="doi">10.1016/S0028-3932(03)00118-0</pub-id>
<pub-id pub-id-type="pmid">12887983</pub-id>
</mixed-citation>
</ref>
<ref id="B78">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lewis</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Miall</surname>
<given-names>R.</given-names>
</name>
</person-group>
(
<year>2006</year>
).
<article-title>A right hemispheric prefrontal system for cognitive time measurement.</article-title>
<source>
<italic>Behav. Processes</italic>
</source>
<volume>71</volume>
<fpage>226</fpage>
<lpage>234</lpage>
<pub-id pub-id-type="doi">10.1016/j.beproc.2005.12.009</pub-id>
<pub-id pub-id-type="pmid">16434151</pub-id>
</mixed-citation>
</ref>
<ref id="B79">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>London</surname>
<given-names>J.</given-names>
</name>
</person-group>
(
<year>2012</year>
).
<source>
<italic>Hearing in Time.</italic>
</source>
<publisher-loc>New York</publisher-loc>
:
<publisher-name>Oxford University Press</publisher-name>
<pub-id pub-id-type="doi">10.1093/acprof:oso/9780199744374.001.0001</pub-id>
</mixed-citation>
</ref>
<ref id="B80">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Longuet-Higgins</surname>
<given-names>H. C.</given-names>
</name>
<name>
<surname>Lee</surname>
<given-names>C.</given-names>
</name>
</person-group>
(
<year>1984</year>
).
<article-title>The rhythmic interpretation of monophonic music.</article-title>
<source>
<italic>Music Percept.</italic>
</source>
<volume>1</volume>
<fpage>424</fpage>
<lpage>440</lpage>
<pub-id pub-id-type="doi">10.2307/40285271</pub-id>
</mixed-citation>
</ref>
<ref id="B81">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lumer</surname>
<given-names>E. D.</given-names>
</name>
<name>
<surname>Friston</surname>
<given-names>K. J.</given-names>
</name>
<name>
<surname>Rees</surname>
<given-names>G.</given-names>
</name>
</person-group>
(
<year>1998</year>
).
<article-title>Neural correlates of perceptual rivalry in the human brain.</article-title>
<source>
<italic>Science</italic>
</source>
<volume>280</volume>
<fpage>1930</fpage>
<lpage>1934</lpage>
<pub-id pub-id-type="doi">10.1126/science.280.5371.1930</pub-id>
<pub-id pub-id-type="pmid">9632390</pub-id>
</mixed-citation>
</ref>
<ref id="B82">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Madison</surname>
<given-names>G.</given-names>
</name>
</person-group>
(
<year>2001</year>
).
<article-title>Variability in isochronous tapping: higher order dependencies as a function of intertap interval.</article-title>
<source>
<italic>J. Exp. Psychol. Hum. Percept. Perform.</italic>
</source>
<volume>27</volume>
<fpage>411</fpage>
<pub-id pub-id-type="doi">10.1037/0096-1523.27.2.411</pub-id>
</mixed-citation>
</ref>
<ref id="B83">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Madison</surname>
<given-names>G.</given-names>
</name>
</person-group>
(
<year>2006</year>
).
<article-title>Experiencing groove induced by music: consistency and phenomenology.</article-title>
<source>
<italic>Music Percept.</italic>
</source>
<volume>24</volume>
<fpage>201</fpage>
<lpage>208</lpage>
<pub-id pub-id-type="doi">10.1525/mp.2006.24.2.201</pub-id>
</mixed-citation>
</ref>
<ref id="B84">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Madison</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Gouyon</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Ullén</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Hörnström</surname>
<given-names>K.</given-names>
</name>
</person-group>
(
<year>2011</year>
).
<article-title>Modeling the tendency for music to induce movement in humans: first correlations with low-level audio descriptors across music genres.</article-title>
<source>
<italic>J. Exp. Psychol. Hum. Percept. Perform.</italic>
</source>
<volume>37</volume>
<fpage>1578</fpage>
<lpage>1594</lpage>
<pub-id pub-id-type="doi">10.1037/a0024323</pub-id>
<pub-id pub-id-type="pmid">21728462</pub-id>
</mixed-citation>
</ref>
<ref id="B85">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Maloney</surname>
<given-names>L. T.</given-names>
</name>
<name>
<surname>Mamassian</surname>
<given-names>P.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>Bayesian decision theory as a model of human visual perception: testing Bayesian transfer.</article-title>
<source>
<italic>Vis. Neurosci.</italic>
</source>
<volume>26</volume>
<fpage>147</fpage>
<lpage>155</lpage>
<pub-id pub-id-type="doi">10.1017/S0952523808080905</pub-id>
<pub-id pub-id-type="pmid">19193251</pub-id>
</mixed-citation>
</ref>
<ref id="B86">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Margulis</surname>
<given-names>E. H.</given-names>
</name>
<name>
<surname>Beatty</surname>
<given-names>A. P.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>Musical style, psychoaesthetics, and prospects for entropy as an analytical tool.</article-title>
<source>
<italic>Comput. Music J.</italic>
</source>
<volume>32</volume>
<fpage>64</fpage>
<lpage>78</lpage>
<pub-id pub-id-type="doi">10.1162/comj.2008.32.4.64</pub-id>
</mixed-citation>
</ref>
<ref id="B87">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Mayville</surname>
<given-names>J. M.</given-names>
</name>
<name>
<surname>Fuchs</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Ding</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Cheyne</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Deecke</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Kelso</surname>
<given-names>J. A. S.</given-names>
</name>
</person-group>
(
<year>2001</year>
).
<article-title>Event-related changes in neuromagnetic activity associated with syncopation and synchronization timing tasks.</article-title>
<source>
<italic>Hum. Brain Mapp.</italic>
</source>
<volume>14</volume>
<fpage>65</fpage>
<lpage>80</lpage>
<pub-id pub-id-type="doi">10.1002/hbm.1042</pub-id>
<pub-id pub-id-type="pmid">11500991</pub-id>
</mixed-citation>
</ref>
<ref id="B88">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>McAuley</surname>
<given-names>J. D.</given-names>
</name>
<name>
<surname>Jones</surname>
<given-names>M. R.</given-names>
</name>
</person-group>
(
<year>2003</year>
).
<article-title>Modeling effects of rhythmic context on perceived duration: a comparison of interval and entrainment approaches to short-interval timing.</article-title>
<source>
<italic>J. Exp. Psychol. Hum. Percept. Perform.</italic>
</source>
<volume>29</volume>
<fpage>1102</fpage>
<pub-id pub-id-type="doi">10.1037/0096-1523.29.6.1102</pub-id>
</mixed-citation>
</ref>
<ref id="B89">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Meyer</surname>
<given-names>L. B.</given-names>
</name>
</person-group>
(
<year>1956</year>
).
<source>
<italic>Emotion and Meaning in Music.</italic>
</source>
<publisher-loc>Chicago</publisher-loc>
:
<publisher-name>University of Chicago Press</publisher-name>
</mixed-citation>
</ref>
<ref id="B90">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Molnar-Szakacs</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Overy</surname>
<given-names>K.</given-names>
</name>
</person-group>
(
<year>2006</year>
).
<article-title>Music and mirror neurons: from motion to ‘e’motion.</article-title>
<source>
<italic>Soc. Cogn. Affect. Neurosci.</italic>
</source>
<volume>1</volume>
<fpage>235</fpage>
<lpage>241</lpage>
<pub-id pub-id-type="doi">10.1093/scan/nsl029</pub-id>
<pub-id pub-id-type="pmid">18985111</pub-id>
</mixed-citation>
</ref>
<ref id="B91">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Morrone</surname>
<given-names>M. C.</given-names>
</name>
<name>
<surname>Ross</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Burr</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<year>2005</year>
).
<article-title>Saccadic eye movements cause compression of time as well as space.</article-title>
<source>
<italic>Nat. Neurosci.</italic>
</source>
<volume>8</volume>
<fpage>950</fpage>
<lpage>954</lpage>
<pub-id pub-id-type="doi">10.1038/nn1488</pub-id>
<pub-id pub-id-type="pmid">15965472</pub-id>
</mixed-citation>
</ref>
<ref id="B92">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Mumford</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<year>1992</year>
).
<article-title>On the computational architecture of the neocortex.</article-title>
<source>
<italic>Biol. Cybern.</italic>
</source>
<volume>66</volume>
<fpage>241</fpage>
<lpage>251</lpage>
<pub-id pub-id-type="doi">10.1007/BF00198477</pub-id>
<pub-id pub-id-type="pmid">1540675</pub-id>
</mixed-citation>
</ref>
<ref id="B93">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Mumford</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<year>1994</year>
).
<article-title>“Neuronal architectures for pattern-theoretic problems,” in</article-title>
<source>
<italic>Large-Scale Neuronal Theories of the Brain</italic>
</source>
<role>eds</role>
<person-group person-group-type="editor">
<name>
<surname>Koch</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Davis</surname>
<given-names>J.</given-names>
</name>
</person-group>
(
<publisher-loc>Cambridge, MA</publisher-loc>
:
<publisher-name>MIT Press</publisher-name>
)
<fpage>125</fpage>
<lpage>152</lpage>
</mixed-citation>
</ref>
<ref id="B94">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Münte</surname>
<given-names>T. F.</given-names>
</name>
<name>
<surname>Kohlmetz</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Nager</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Altenmüller</surname>
<given-names>E.</given-names>
</name>
</person-group>
(
<year>2001</year>
).
<article-title>Superior auditory spatial tuning in conductors.</article-title>
<source>
<italic>Nature</italic>
</source>
<volume>409</volume>
<fpage>580</fpage>
<pub-id pub-id-type="doi">10.1038/35054668</pub-id>
</mixed-citation>
</ref>
<ref id="B95">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Näätänen</surname>
<given-names>R.</given-names>
</name>
</person-group>
(
<year>1992</year>
).
<source>
<italic>Attention and Brain Function.</italic>
</source>
<publisher-loc>Hillsdale, NJ</publisher-loc>
:
<publisher-name>Erlbaum</publisher-name>
</mixed-citation>
</ref>
<ref id="B96">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Näätänen</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Paavilainen</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Alho</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Reinikainen</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Sams</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>1987</year>
).
<article-title>The mismatch negativity to intensity changes in an auditory stimulus sequence.</article-title>
<source>
<italic>Electroencephalogr. Clin. Neurophysiol.</italic>
</source>
<volume>40</volume>
<fpage>125</fpage>
<lpage>131</lpage>
</mixed-citation>
</ref>
<ref id="B97">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Näätänen</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Tervaniemi</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Sussman</surname>
<given-names>E. S.</given-names>
</name>
<name>
<surname>Paavilainen</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Winkler</surname>
<given-names>I.</given-names>
</name>
</person-group>
(
<year>2001</year>
).
<article-title>‘Primitive intelligence’ in the auditory cortex.</article-title>
<source>
<italic>Trends Neurosci.</italic>
</source>
<volume>24</volume>
<fpage>283</fpage>
<lpage>288</lpage>
<pub-id pub-id-type="doi">10.1016/S0166-2236(00)01790-2</pub-id>
<pub-id pub-id-type="pmid">11311381</pub-id>
</mixed-citation>
</ref>
<ref id="B98">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Nagarajan</surname>
<given-names>S. S.</given-names>
</name>
<name>
<surname>Blake</surname>
<given-names>D. T.</given-names>
</name>
<name>
<surname>Wright</surname>
<given-names>B. A.</given-names>
</name>
<name>
<surname>Byl</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Merzenich</surname>
<given-names>M. M.</given-names>
</name>
</person-group>
(
<year>1998</year>
).
<article-title>Practice-related improvements in somatosensory interval discrimination are temporally specific but generalize across skin location, hemisphere, and modality.</article-title>
<source>
<italic>J. Neurosci.</italic>
</source>
<volume>18</volume>
<fpage>1559</fpage>
<lpage>1570</lpage>
<pub-id pub-id-type="pmid">9454861</pub-id>
</mixed-citation>
</ref>
<ref id="B99">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Nikjeh</surname>
<given-names>D. A.</given-names>
</name>
<name>
<surname>Lister</surname>
<given-names>J. J.</given-names>
</name>
<name>
<surname>Frisch</surname>
<given-names>S. A.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>Hearing of note: an electrophysiologic and psychoacoustic comparison of pitch discrimination between vocal and instrumental musicians.</article-title>
<source>
<italic>Psychophysiology</italic>
</source>
<volume>45</volume>
<fpage>994</fpage>
<lpage>1007</lpage>
<pub-id pub-id-type="doi">10.1111/j.1469-8986.2008.00689.x</pub-id>
<pub-id pub-id-type="pmid">18778322</pub-id>
</mixed-citation>
</ref>
<ref id="B100">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>North</surname>
<given-names>A. C.</given-names>
</name>
<name>
<surname>Hargreaves</surname>
<given-names>D. J.</given-names>
</name>
</person-group>
(
<year>1995</year>
).
<article-title>Subjective complexity, familiarity, and liking for popular music.</article-title>
<source>
<italic>Psychomusicology</italic>
</source>
<volume>14</volume>
<fpage>77</fpage>
<lpage>93</lpage>
<pub-id pub-id-type="doi">10.1037/h0094090</pub-id>
</mixed-citation>
</ref>
<ref id="B101">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Nozaradan</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Peretz</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Missal</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Mouraux</surname>
<given-names>A.</given-names>
</name>
</person-group>
(
<year>2011</year>
).
<article-title>Tagging the neuronal entrainment to beat and meter.</article-title>
<source>
<italic>J. Neurosci.</italic>
</source>
<volume>31</volume>
<fpage>10234</fpage>
<lpage>10240</lpage>
<pub-id pub-id-type="doi">10.1523/JNEUROSCI.0411-11.2011</pub-id>
<pub-id pub-id-type="pmid">21753000</pub-id>
</mixed-citation>
</ref>
<ref id="B102">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Orr</surname>
<given-names>M. G.</given-names>
</name>
<name>
<surname>Ohlsson</surname>
<given-names>S.</given-names>
</name>
</person-group>
(
<year>2005</year>
).
<article-title>Relationship between complexity and liking as a function of expertise.</article-title>
<source>
<italic>Music Percept.</italic>
</source>
<volume>22</volume>
<fpage>583</fpage>
<lpage>611</lpage>
<pub-id pub-id-type="doi">10.1525/mp.2005.22.4.583</pub-id>
</mixed-citation>
</ref>
<ref id="B103">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Overy</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Molnar-Szakacs</surname>
<given-names>I.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>Being together in time: musical experience and the mirror neuron system.</article-title>
<source>
<italic>Music Percept.</italic>
</source>
<volume>26</volume>
<fpage>489</fpage>
<lpage>504</lpage>
<pub-id pub-id-type="doi">10.1525/mp.2009.26.5.489</pub-id>
</mixed-citation>
</ref>
<ref id="B104">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Owen</surname>
<given-names>A. M.</given-names>
</name>
<name>
<surname>McMillan</surname>
<given-names>K. M.</given-names>
</name>
<name>
<surname>Laird</surname>
<given-names>A. R.</given-names>
</name>
<name>
<surname>Bullmore</surname>
<given-names>E.</given-names>
</name>
</person-group>
(
<year>2005</year>
).
<article-title>N-back working memory paradigm: a meta-analysis of normative functional neuroimaging studies.</article-title>
<source>
<italic>Hum. Brain Mapp.</italic>
</source>
<volume>25</volume>
<fpage>46</fpage>
<lpage>59</lpage>
<pub-id pub-id-type="doi">10.1002/hbm.20131</pub-id>
<pub-id pub-id-type="pmid">15846822</pub-id>
</mixed-citation>
</ref>
<ref id="B105">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Paavilainen</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Karlsson</surname>
<given-names>M.-L.</given-names>
</name>
<name>
<surname>Reinikainen</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Näätänen</surname>
<given-names>R.</given-names>
</name>
</person-group>
(
<year>1989</year>
).
<article-title>Mismatch negativity to change in spatial location of an auditory stimulus.</article-title>
<source>
<italic>Electroencephalogr. Clin. Neurophysiol.</italic>
</source>
<volume>73</volume>
<fpage>129</fpage>
<lpage>141</lpage>
<pub-id pub-id-type="doi">10.1016/0013-4694(89)90192-2</pub-id>
<pub-id pub-id-type="pmid">2473880</pub-id>
</mixed-citation>
</ref>
<ref id="B106">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Paavilainen</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Simola</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Jaramillo</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Näätänen</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Winkler</surname>
<given-names>I.</given-names>
</name>
</person-group>
(
<year>2001</year>
).
<article-title>Preattentive extraction of abstract feature conjunctions from auditory stimulation as reflected by the mismatch negativity (MMN).</article-title>
<source>
<italic>Psychophysiology</italic>
</source>
<volume>38</volume>
<fpage>359</fpage>
<lpage>365</lpage>
<pub-id pub-id-type="doi">10.1111/1469-8986.3820359</pub-id>
<pub-id pub-id-type="pmid">11347880</pub-id>
</mixed-citation>
</ref>
<ref id="B107">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pack</surname>
<given-names>C. C.</given-names>
</name>
<name>
<surname>Born</surname>
<given-names>R. T.</given-names>
</name>
</person-group>
(
<year>2001</year>
).
<article-title>Temporal dynamics of a neural solution to the aperture problem in visual area MT of macaque brain.</article-title>
<source>
<italic>Nature</italic>
</source>
<volume>409</volume>
<fpage>1040</fpage>
<lpage>1042</lpage>
<pub-id pub-id-type="doi">10.1038/35059085</pub-id>
<pub-id pub-id-type="pmid">11234012</pub-id>
</mixed-citation>
</ref>
<ref id="B108">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Palmer</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Krumhansl</surname>
<given-names>C. L.</given-names>
</name>
</person-group>
(
<year>1990</year>
).
<article-title>Mental representation for musical meter.</article-title>
<source>
<italic>J. Exp. Psychol. Hum. Percept. Perform.</italic>
</source>
<volume>16</volume>
<fpage>728</fpage>
<lpage>741</lpage>
<pub-id pub-id-type="doi">10.1037/0096-1523.16.4.728</pub-id>
<pub-id pub-id-type="pmid">2148588</pub-id>
</mixed-citation>
</ref>
<ref id="B109">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Parncutt</surname>
<given-names>R.</given-names>
</name>
</person-group>
(
<year>1994</year>
).
<article-title>A perceptual model of pulse salience and metrical accent in musical rhythms.</article-title>
<source>
<italic>Music Percept.</italic>
</source>
<volume>11</volume>
<fpage>409</fpage>
<lpage>464</lpage>
<pub-id pub-id-type="doi">10.2307/40285633</pub-id>
</mixed-citation>
</ref>
<ref id="B110">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pearce</surname>
<given-names>M. T.</given-names>
</name>
<name>
<surname>Ruiz</surname>
<given-names>M. H.</given-names>
</name>
<name>
<surname>Kapasi</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Wiggins</surname>
<given-names>G. A.</given-names>
</name>
<name>
<surname>Bhattacharya</surname>
<given-names>J.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>Unsupervised statistical learning underpins computational, behavioural and neural manifestations of musical expectation.</article-title>
<source>
<italic>Neuroimage</italic>
</source>
<volume>50</volume>
<fpage>302</fpage>
<lpage>313</lpage>
<pub-id pub-id-type="doi">10.1016/j.neuroimage.2009.12.019</pub-id>
<pub-id pub-id-type="pmid">20005297</pub-id>
</mixed-citation>
</ref>
<ref id="B111">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pearce</surname>
<given-names>M. T.</given-names>
</name>
<name>
<surname>Wiggins</surname>
<given-names>G. A.</given-names>
</name>
</person-group>
(
<year>2006</year>
).
<article-title>Expectation in melody: the influence of context and learning.</article-title>
<source>
<italic>Music Percept.</italic>
</source>
<volume>23</volume>
<fpage>377</fpage>
<lpage>405</lpage>
<pub-id pub-id-type="doi">10.1525/mp.2006.23.5.377</pub-id>
</mixed-citation>
</ref>
<ref id="B112">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pecenka</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Keller</surname>
<given-names>P. E.</given-names>
</name>
</person-group>
(
<year>2011</year>
).
<article-title>The role of temporal prediction abilities in interpersonal sensorimotor synchronization.</article-title>
<source>
<italic>Exp. Brain Res.</italic>
</source>
<volume>211</volume>
<fpage>505</fpage>
<lpage>515</lpage>
<pub-id pub-id-type="doi">10.1007/s00221-011-2616-0</pub-id>
<pub-id pub-id-type="pmid">21424257</pub-id>
</mixed-citation>
</ref>
<ref id="B113">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Phillips-Silver</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Aktipis</surname>
<given-names>A. C.</given-names>
</name>
<name>
<surname>Bryant</surname>
<given-names>G.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>The ecology of entrainment: foundations of coordinated rhythmic movement.</article-title>
<source>
<italic>Music Percept.</italic>
</source>
<volume>28</volume>
<fpage>3</fpage>
<lpage>14</lpage>
<pub-id pub-id-type="doi">10.1525/mp.2010.28.1.3</pub-id>
<pub-id pub-id-type="pmid">21776183</pub-id>
</mixed-citation>
</ref>
<ref id="B114">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Phillips-Silver</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Trainor</surname>
<given-names>L. J.</given-names>
</name>
</person-group>
(
<year>2005</year>
).
<article-title>Feeling the beat: movement influences infant rhythm perception.</article-title>
<source>
<italic>Science</italic>
</source>
<volume>308</volume>
<fpage>1430</fpage>
<lpage>1430</lpage>
<pub-id pub-id-type="doi">10.1126/science.1110922</pub-id>
<pub-id pub-id-type="pmid">15933193</pub-id>
</mixed-citation>
</ref>
<ref id="B115">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Phillips-Silver</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Trainor</surname>
<given-names>L. J.</given-names>
</name>
</person-group>
(
<year>2007</year>
).
<article-title>Hearing what the body feels: auditory encoding of rhythmic movement.</article-title>
<source>
<italic>Cognition</italic>
</source>
<volume>105</volume>
<fpage>533</fpage>
<lpage>546</lpage>
<pub-id pub-id-type="doi">10.1016/j.cognition.2006.11.006</pub-id>
<pub-id pub-id-type="pmid">17196580</pub-id>
</mixed-citation>
</ref>
<ref id="B116">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Phillips-Silver</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Trainor</surname>
<given-names>L. J.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>Vestibular influence on auditory metrical interpretation.</article-title>
<source>
<italic>Brain Cogn.</italic>
</source>
<volume>67</volume>
<fpage>94</fpage>
<lpage>102</lpage>
<pub-id pub-id-type="doi">10.1016/j.bandc.2007.11.007</pub-id>
<pub-id pub-id-type="pmid">18234407</pub-id>
</mixed-citation>
</ref>
<ref id="B117">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pressing</surname>
<given-names>J.</given-names>
</name>
</person-group>
(
<year>2002</year>
).
<article-title>Black Atlantic rhythm: its computational and transcultural foundations.</article-title>
<source>
<italic>Music Percept.</italic>
</source>
<volume>19</volume>
<fpage>285</fpage>
<lpage>310</lpage>
<pub-id pub-id-type="doi">10.1525/mp.2002.19.3.285</pub-id>
</mixed-citation>
</ref>
<ref id="B118">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Raij</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>McEvoy</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Mäkelä</surname>
<given-names>J. P.</given-names>
</name>
<name>
<surname>Hari</surname>
<given-names>R.</given-names>
</name>
</person-group>
(
<year>1997</year>
).
<article-title>Human auditory cortex is activated by omissions of auditory stimuli.</article-title>
<source>
<italic>Brain Res.</italic>
</source>
<volume>745</volume>
<fpage>134</fpage>
<lpage>143</lpage>
<pub-id pub-id-type="doi">10.1016/S0006-8993(96)01140-7</pub-id>
<pub-id pub-id-type="pmid">9037402</pub-id>
</mixed-citation>
</ref>
<ref id="B119">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Rao</surname>
<given-names>R. P.</given-names>
</name>
<name>
<surname>Ballard</surname>
<given-names>D. H.</given-names>
</name>
</person-group>
(
<year>1999</year>
).
<article-title>Predictive coding in the visual cortex: a functional interpretation of some extra-classical receptive-field effects.</article-title>
<source>
<italic>Nat. Neurosci.</italic>
</source>
<volume>2</volume>
<fpage>79</fpage>
<lpage>87</lpage>
<pub-id pub-id-type="doi">10.1038/4580</pub-id>
<pub-id pub-id-type="pmid">10195184</pub-id>
</mixed-citation>
</ref>
<ref id="B120">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Repp</surname>
<given-names>B.</given-names>
</name>
</person-group>
(
<year>2005</year>
).
<article-title>Sensorimotor synchronization: a review of the tapping literature.</article-title>
<source>
<italic>Psychon. Bull. Rev.</italic>
</source>
<volume>12</volume>
<fpage>969</fpage>
<lpage>992</lpage>
<pub-id pub-id-type="doi">10.3758/bf03206433</pub-id>
<pub-id pub-id-type="pmid">16615317</pub-id>
</mixed-citation>
</ref>
<ref id="B121">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Repp</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Keller</surname>
<given-names>P. E.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>Sensorimotor synchronisation with adaptively timed sequences.</article-title>
<source>
<italic>Hum. Mov. Sci.</italic>
</source>
<volume>27</volume>
<fpage>423</fpage>
<lpage>456</lpage>
<pub-id pub-id-type="doi">10.1016/j.humov.2008.02.016</pub-id>
<pub-id pub-id-type="pmid">18405989</pub-id>
</mixed-citation>
</ref>
<ref id="B122">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Robbins</surname>
<given-names>H.</given-names>
</name>
</person-group>
(
<year>1956</year>
).
<article-title>“An empirical Bayes approach to statistics,” in</article-title>
<source>
<italic>Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability: Contributions to the Theory of Statistics</italic>
</source>
(Berkeley, CA: University of California Press),
<fpage>157</fpage>
<lpage>163</lpage>
</mixed-citation>
</ref>
<ref id="B123">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Roepstorff</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Niewohner</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Beck</surname>
<given-names>S.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>Enculturing brains through patterned practices.</article-title>
<source>
<italic>Neural Netw.</italic>
</source>
<volume>23</volume>
<fpage>1051</fpage>
<lpage>1059</lpage>
<pub-id pub-id-type="doi">10.1016/j.neunet.2010.08.002</pub-id>
<pub-id pub-id-type="pmid">20813499</pub-id>
</mixed-citation>
</ref>
<ref id="B124">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Rohrmeier</surname>
<given-names>M. A.</given-names>
</name>
<name>
<surname>Koelsch</surname>
<given-names>S.</given-names>
</name>
</person-group>
(
<year>2012</year>
).
<article-title>Predictive information processing in music cognition. A critical review.</article-title>
<source>
<italic>Int. J. Psychophysiol.</italic>
</source>
<volume>83</volume>
<fpage>164</fpage>
<lpage>175</lpage>
<pub-id pub-id-type="doi">10.1016/j.ijpsycho.2011.12.010</pub-id>
<pub-id pub-id-type="pmid">22245599</pub-id>
</mixed-citation>
</ref>
<ref id="B125">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Rubin</surname>
<given-names>E.</given-names>
</name>
</person-group>
(
<year>1918</year>
).
<source>
<italic>Synsoplevede Figurer. Studier i Psykologisk Analyse</italic>
</source>
<comment>Part I.</comment>
<publisher-loc>Copenhagen</publisher-loc>
:
<publisher-name>Gyldendal</publisher-name>
</mixed-citation>
</ref>
<ref id="B126">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sadeghi</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Allard</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Prince</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Labelle</surname>
<given-names>H.</given-names>
</name>
</person-group>
(
<year>2000</year>
).
<article-title>Symmetry and limb dominance in able-bodied gait: a review.</article-title>
<source>
<italic>Gait Posture</italic>
</source>
<volume>12</volume>
<fpage>34</fpage>
<lpage>45</lpage>
<pub-id pub-id-type="doi">10.1016/S0966-6362(00)00070-9</pub-id>
<pub-id pub-id-type="pmid">10996295</pub-id>
</mixed-citation>
</ref>
<ref id="B127">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sams</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Paavilainen</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Alho</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Näätänen</surname>
<given-names>R.</given-names>
</name>
</person-group>
(
<year>1985</year>
).
<article-title>Auditory frequency discrimination and event-related potentials.</article-title>
<source>
<italic>Electroencephalogr. Clin. Neurophysiol.</italic>
</source>
<volume>62</volume>
<fpage>437</fpage>
<lpage>448</lpage>
<pub-id pub-id-type="doi">10.1016/0168-5597(85)90054-1</pub-id>
<pub-id pub-id-type="pmid">2415340</pub-id>
</mixed-citation>
</ref>
<ref id="B128">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Schaffrath</surname>
<given-names>H.</given-names>
</name>
</person-group>
(
<year>1995</year>
).
<article-title>“The Essen folksong collection,”</article-title>
<role>ed.</role>
<person-group person-group-type="editor">
<name>
<surname>Huron</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<publisher-loc>Stanford, CA</publisher-loc>
:
<publisher-name>Center for Computer-Assisted Research in the Humanities</publisher-name>
).</mixed-citation>
</ref>
<ref id="B129">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Schmidt</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Fitzpatrick</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Caron</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Mergeche</surname>
<given-names>J.</given-names>
</name>
</person-group>
(
<year>2011</year>
).
<article-title>Understanding social motor coordination.</article-title>
<source>
<italic>Hum. Mov. Sci.</italic>
</source>
<volume>30</volume>
<fpage>834</fpage>
<lpage>845</lpage>
<pub-id pub-id-type="doi">10.1016/j.humov.2010.05.014</pub-id>
<pub-id pub-id-type="pmid">20817320</pub-id>
</mixed-citation>
</ref>
<ref id="B130">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Schogler</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Pepping</surname>
<given-names>G.-J.</given-names>
</name>
<name>
<surname>Lee</surname>
<given-names>D. N.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>TauG-guidance of transients in expressive musical performance.</article-title>
<source>
<italic>Exp. Brain Res.</italic>
</source>
<volume>189</volume>
<fpage>361</fpage>
<lpage>372</lpage>
<pub-id pub-id-type="doi">10.1007/s00221-008-1431-8</pub-id>
<pub-id pub-id-type="pmid">18560815</pub-id>
</mixed-citation>
</ref>
<ref id="B131">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Schultz</surname>
<given-names>W.</given-names>
</name>
</person-group>
(
<year>2007</year>
).
<article-title>Behavioral dopamine signals.</article-title>
<source>
<italic>Trends Neurosci.</italic>
</source>
<volume>30</volume>
<fpage>203</fpage>
<lpage>210</lpage>
<pub-id pub-id-type="doi">10.1016/j.tins.2007.03.007</pub-id>
<pub-id pub-id-type="pmid">17400301</pub-id>
</mixed-citation>
</ref>
<ref id="B132">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Schultz</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Preuschoff</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Camerer</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Hsu</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Fiorillo</surname>
<given-names>C. D.</given-names>
</name>
<name>
<surname>Tobler</surname>
<given-names>P. N.</given-names>
</name>
<etal></etal>
</person-group>
(
<year>2008</year>
).
<article-title>Explicit neural signals reflecting reward uncertainty.</article-title>
<source>
<italic>Philos. Trans. R. Soc. B Biol. Sci.</italic>
</source>
<volume>363</volume>
<fpage>3801</fpage>
<lpage>3811</lpage>
<pub-id pub-id-type="doi">10.1098/rstb.2008.0152</pub-id>
</mixed-citation>
</ref>
<ref id="B133">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Simons</surname>
<given-names>J. S.</given-names>
</name>
<name>
<surname>Schölvinck</surname>
<given-names>M. L.</given-names>
</name>
<name>
<surname>Gilbert</surname>
<given-names>S. J.</given-names>
</name>
<name>
<surname>Frith</surname>
<given-names>C. D.</given-names>
</name>
<name>
<surname>Burgess</surname>
<given-names>P. W.</given-names>
</name>
</person-group>
(
<year>2006</year>
).
<article-title>Differential components of prospective memory? Evidence from fMRI.</article-title>
<source>
<italic>Neuropsychologia</italic>
</source>
<volume>44</volume>
<fpage>1388</fpage>
<lpage>1397</lpage>
<pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2006.01.005</pub-id>
<pub-id pub-id-type="pmid">16513147</pub-id>
</mixed-citation>
</ref>
<ref id="B134">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Song</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Simpson</surname>
<given-names>A. J.</given-names>
</name>
<name>
<surname>Harte</surname>
<given-names>C. A.</given-names>
</name>
<name>
<surname>Pearce</surname>
<given-names>M. T.</given-names>
</name>
<name>
<surname>Sandler</surname>
<given-names>M. B.</given-names>
</name>
</person-group>
(
<year>2013</year>
).
<article-title>Syncopation and the Score.</article-title>
<source>
<italic>PLoS ONE</italic>
</source>
<volume>8</volume>
:
<issue>e74692</issue>
<pub-id pub-id-type="doi">10.1371/journal.pone.0074692</pub-id>
</mixed-citation>
</ref>
<ref id="B135">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Stephan</surname>
<given-names>K. E.</given-names>
</name>
<name>
<surname>Harrison</surname>
<given-names>L. M.</given-names>
</name>
<name>
<surname>Kiebel</surname>
<given-names>S. J.</given-names>
</name>
<name>
<surname>David</surname>
<given-names>O.</given-names>
</name>
<name>
<surname>Penny</surname>
<given-names>W. D.</given-names>
</name>
<name>
<surname>Friston</surname>
<given-names>K. J.</given-names>
</name>
</person-group>
(
<year>2007</year>
).
<article-title>Dynamic causal models of neural system dynamics: current state and future extensions.</article-title>
<source>
<italic>J. Biosci.</italic>
</source>
<volume>32</volume>
<fpage>129</fpage>
<lpage>144</lpage>
<pub-id pub-id-type="doi">10.1007/s12038-007-0012-5</pub-id>
<pub-id pub-id-type="pmid">17426386</pub-id>
</mixed-citation>
</ref>
<ref id="B136">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sterzer</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Russ</surname>
<given-names>M. O.</given-names>
</name>
<name>
<surname>Preibisch</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Kleinschmidt</surname>
<given-names>A.</given-names>
</name>
</person-group>
(
<year>2002</year>
).
<article-title>Neural correlates of spontaneous direction reversals in ambiguous apparent visual motion.</article-title>
<source>
<italic>Neuroimage</italic>
</source>
<volume>15</volume>
<fpage>908</fpage>
<lpage>916</lpage>
<pub-id pub-id-type="doi">10.1006/nimg.2001.1030</pub-id>
<pub-id pub-id-type="pmid">11906231</pub-id>
</mixed-citation>
</ref>
<ref id="B137">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Stupacher</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Hove</surname>
<given-names>M. J.</given-names>
</name>
<name>
<surname>Novembre</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Schütz-Bosbach</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Keller</surname>
<given-names>P. E.</given-names>
</name>
</person-group>
(
<year>2013</year>
).
<article-title>Musical groove modulates motor cortex excitability: a TMS investigation.</article-title>
<source>
<italic>Brain Cogn.</italic>
</source>
<volume>82</volume>
<fpage>127</fpage>
<lpage>136</lpage>
<pub-id pub-id-type="doi">10.1016/j.bandc.2013.03.003</pub-id>
<pub-id pub-id-type="pmid">23660433</pub-id>
</mixed-citation>
</ref>
<ref id="B138">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Teki</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Grube</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Griffiths</surname>
<given-names>T. D.</given-names>
</name>
</person-group>
(
<year>2011a</year>
).
<article-title>A unified model of time perception accounts for duration-based and beat-based timing mechanisms.</article-title>
<source>
<italic>Front. Integr. Neurosci.</italic>
</source>
<volume>5</volume>
:
<issue>90</issue>
<pub-id pub-id-type="doi">10.3389/fnint.2011.00090</pub-id>
</mixed-citation>
</ref>
<ref id="B139">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Teki</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Grube</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Kumar</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Griffiths</surname>
<given-names>T. D.</given-names>
</name>
</person-group>
(
<year>2011b</year>
).
<article-title>Distinct neural substrates of duration-based and beat-based auditory timing.</article-title>
<source>
<italic>J. Neurosci.</italic>
</source>
<volume>31</volume>
<fpage>3805</fpage>
<lpage>3812</lpage>
<pub-id pub-id-type="doi">10.1523/JNEUROSCI.5561-10.2011</pub-id>
<pub-id pub-id-type="pmid">21389235</pub-id>
</mixed-citation>
</ref>
<ref id="B140">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Temperley</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<year>2004</year>
).
<article-title>An evaluation system for metrical models.</article-title>
<source>
<italic>Comput. Music J.</italic>
</source>
<volume>28</volume>
<fpage>28</fpage>
<lpage>44</lpage>
<pub-id pub-id-type="doi">10.1162/0148926041790621</pub-id>
</mixed-citation>
</ref>
<ref id="B141">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Temperley</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<year>2007</year>
).
<source>
<italic>Music and Probability.</italic>
</source>
<publisher-loc>Cambridge, MA</publisher-loc>
:
<publisher-name>MIT Press</publisher-name>
</mixed-citation>
</ref>
<ref id="B142">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Temperley</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>A unified probabilistic model for polyphonic music analysis.</article-title>
<source>
<italic>J. New Music Res.</italic>
</source>
<volume>38</volume>
<fpage>3</fpage>
<lpage>18</lpage>
<pub-id pub-id-type="doi">10.1080/09298210902928495</pub-id>
</mixed-citation>
</ref>
<ref id="B143">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Temperley</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>Modeling common-practice rhythm.</article-title>
<source>
<italic>Music Percept.</italic>
</source>
<volume>27</volume>
<fpage>355</fpage>
<lpage>376</lpage>
<pub-id pub-id-type="doi">10.1525/mp.2010.27.5.355</pub-id>
</mixed-citation>
</ref>
<ref id="B144">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Temperley</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Sleator</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<year>1999</year>
).
<article-title>Modeling meter and harmony: a preference rule approach.</article-title>
<source>
<italic>Comput. Music J.</italic>
</source>
<volume>23</volume>
<fpage>10</fpage>
<lpage>27</lpage>
<pub-id pub-id-type="doi">10.1162/014892699559616</pub-id>
</mixed-citation>
</ref>
<ref id="B145">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Trainor</surname>
<given-names>L. J.</given-names>
</name>
<name>
<surname>Mcdonald</surname>
<given-names>K. L.</given-names>
</name>
<name>
<surname>Alain</surname>
<given-names>C.</given-names>
</name>
</person-group>
(
<year>2002</year>
).
<article-title>Automatic and controlled processing of melodic contour and interval information measured by electrical brain activity.</article-title>
<source>
<italic>J. Cogn. Neurosci.</italic>
</source>
<volume>14</volume>
<fpage>430</fpage>
<lpage>442</lpage>
<pub-id pub-id-type="doi">10.1162/089892902317361949</pub-id>
<pub-id pub-id-type="pmid">11970802</pub-id>
</mixed-citation>
</ref>
<ref id="B146">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Trost</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Vuilleumier</surname>
<given-names>P.</given-names>
</name>
</person-group>
(
<year>2013</year>
).
<article-title>“Rhythmic entrainment as a mechanism for emotion induction by music: A neurophysiological perspective,” in</article-title>
<source>
<italic>The Emotional Power of Music: Multidisciplinary Perspectives on Musical Arousal, Expression, and Social Control</italic>
</source>
<role>eds</role>
<person-group person-group-type="editor">
<name>
<surname>Cochrane</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Fantini</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Scherer</surname>
<given-names>K. R.</given-names>
</name>
</person-group>
(
<publisher-loc>New York</publisher-loc>
:
<publisher-name>Oxford University Press</publisher-name>
)
<fpage>213</fpage>
<lpage>225</lpage>
<pub-id pub-id-type="doi">10.1093/acprof:oso/9780199654888.003.0016</pub-id>
</mixed-citation>
</ref>
<ref id="B147">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Van Zuijen</surname>
<given-names>T. L.</given-names>
</name>
<name>
<surname>Sussman</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Winkler</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Näätänen</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Tervaniemi</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>2004</year>
).
<article-title>Grouping of sequential sounds – an event-related potential study comparing musicians and nonmusicians.</article-title>
<source>
<italic>J. Cogn. Neurosci.</italic>
</source>
<volume>16</volume>
<fpage>331</fpage>
<lpage>338</lpage>
<pub-id pub-id-type="doi">10.1162/089892904322984607</pub-id>
<pub-id pub-id-type="pmid">15068601</pub-id>
</mixed-citation>
</ref>
<ref id="B148">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Verschure</surname>
<given-names>P. F.</given-names>
</name>
<name>
<surname>Voegtlin</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Douglas</surname>
<given-names>R. J.</given-names>
</name>
</person-group>
(
<year>2003</year>
).
<article-title>Environmentally mediated synergy between perception and behaviour in mobile robots.</article-title>
<source>
<italic>Nature</italic>
</source>
<volume>425</volume>
<fpage>620</fpage>
<lpage>624</lpage>
<pub-id pub-id-type="doi">10.1038/nature02024</pub-id>
<pub-id pub-id-type="pmid">14534588</pub-id>
</mixed-citation>
</ref>
<ref id="B149">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Volk</surname>
<given-names>A.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>The study of syncopation using inner metric analysis: linking theoretical and experimental analysis of metre in music.</article-title>
<source>
<italic>J. New Music Res.</italic>
</source>
<volume>37</volume>
<fpage>259</fpage>
<lpage>273</lpage>
<pub-id pub-id-type="doi">10.1080/09298210802680758</pub-id>
</mixed-citation>
</ref>
<ref id="B150">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Vuust</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Brattico</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Seppänen</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Näätänen</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Tervaniemi</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>2012a</year>
).
<article-title>Practiced musical style shapes auditory skills.</article-title>
<source>
<italic>Ann. N. Y. Acad. Sci.</italic>
</source>
<volume>1252</volume>
<fpage>139</fpage>
<lpage>146</lpage>
<pub-id pub-id-type="doi">10.1111/j.1749-6632.2011.06409.x</pub-id>
<pub-id pub-id-type="pmid">22524351</pub-id>
</mixed-citation>
</ref>
<ref id="B151">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Vuust</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Brattico</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Seppänen</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Näätänen</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Tervaniemi</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>2012b</year>
).
<article-title>The sound of music: differentiating musicians using a fast, musical multi-feature mismatch negativity paradigm.</article-title>
<source>
<italic>Neuropsychologia</italic>
</source>
<volume>50</volume>
<fpage>1432</fpage>
<lpage>1443</lpage>
<pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2012.02.028</pub-id>
<pub-id pub-id-type="pmid">22414595</pub-id>
</mixed-citation>
</ref>
<ref id="B152">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Vuust</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Frith</surname>
<given-names>C. D.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>Anticipation is the key to understanding music and the effects of music on emotion.</article-title>
<source>
<italic>Behav. Brain Sci.</italic>
</source>
<volume>31</volume>
<fpage>599</fpage>
<lpage>600</lpage>
<pub-id pub-id-type="doi">10.1017/S0140525X08005542</pub-id>
</mixed-citation>
</ref>
<ref id="B153">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Vuust</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Ostergaard</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Pallesen</surname>
<given-names>K. J.</given-names>
</name>
<name>
<surname>Bailey</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Roepstorff</surname>
<given-names>A.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>Predictive coding of music – brain responses to rhythmic incongruity.</article-title>
<source>
<italic>Cortex</italic>
</source>
<volume>45</volume>
<fpage>80</fpage>
<lpage>92</lpage>
<pub-id pub-id-type="doi">10.1016/j.cortex.2008.05.014</pub-id>
<pub-id pub-id-type="pmid">19054506</pub-id>
</mixed-citation>
</ref>
<ref id="B154">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Vuust</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Pallesen</surname>
<given-names>K. J.</given-names>
</name>
<name>
<surname>Bailey</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Van Zuijen</surname>
<given-names>T. L.</given-names>
</name>
<name>
<surname>Gjedde</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Roepstorff</surname>
<given-names>A.</given-names>
</name>
<etal></etal>
</person-group>
(
<year>2005</year>
).
<article-title>To musicians, the message is in the meter: pre-attentive neuronal responses to incongruent rhythm are left-lateralized in musicians.</article-title>
<source>
<italic>Neuroimage</italic>
</source>
<volume>24</volume>
<fpage>560</fpage>
<lpage>564</lpage>
<pub-id pub-id-type="doi">10.1016/j.neuroimage.2004.08.039</pub-id>
<pub-id pub-id-type="pmid">15627598</pub-id>
</mixed-citation>
</ref>
<ref id="B155">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Vuust</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Roepstorff</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Wallentin</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Mouridsen</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Østergaard</surname>
<given-names>L.</given-names>
</name>
</person-group>
(
<year>2006</year>
).
<article-title>It don’t mean a thing... keeping the rhythm during polyrhythmic tension, activates language areas (BA47).</article-title>
<source>
<italic>Neuroimage</italic>
</source>
<volume>31</volume>
<fpage>832</fpage>
<lpage>841</lpage>
<pub-id pub-id-type="doi">10.1016/j.neuroimage.2005.12.037</pub-id>
<pub-id pub-id-type="pmid">16516496</pub-id>
</mixed-citation>
</ref>
<ref id="B156">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Vuust</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Wallentin</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Mouridsen</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Østergaard</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Roepstorff</surname>
<given-names>A.</given-names>
</name>
</person-group>
(
<year>2011</year>
).
<article-title>Tapping polyrhythms in music activates language areas.</article-title>
<source>
<italic>Neurosci. Lett.</italic>
</source>
<volume>494</volume>
<fpage>211</fpage>
<lpage>216</lpage>
<pub-id pub-id-type="doi">10.1016/j.neulet.2011.03.015</pub-id>
<pub-id pub-id-type="pmid">21397659</pub-id>
</mixed-citation>
</ref>
<ref id="B157">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Waadeland</surname>
<given-names>C. H.</given-names>
</name>
</person-group>
(
<year>2001</year>
).
<article-title>“It don’t mean a thing if it ain’t got that swing” – Simulating expressive timing by modulated movements.</article-title>
<source>
<italic>J. New Music Res.</italic>
</source>
<volume>30</volume>
<fpage>23</fpage>
<lpage>37</lpage>
<pub-id pub-id-type="doi">10.1076/jnmr.30.1.23.7123</pub-id>
</mixed-citation>
</ref>
<ref id="B158">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Winkler</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Karmos</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Näätänen</surname>
<given-names>R.</given-names>
</name>
</person-group>
(
<year>1996</year>
).
<article-title>Adaptive modeling of the unattended acoustic environment reflected in the mismatch negativity event-related potential.</article-title>
<source>
<italic>Brain Res.</italic>
</source>
<volume>742</volume>
<fpage>239</fpage>
<lpage>252</lpage>
<pub-id pub-id-type="doi">10.1016/S0006-8993(96)01008-6</pub-id>
<pub-id pub-id-type="pmid">9117400</pub-id>
</mixed-citation>
</ref>
<ref id="B159">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Witek</surname>
<given-names>M. A. G.</given-names>
</name>
<name>
<surname>Clarke</surname>
<given-names>E. F.</given-names>
</name>
<name>
<surname>Kringelbach</surname>
<given-names>M. L.</given-names>
</name>
<name>
<surname>Vuust</surname>
<given-names>P.</given-names>
</name>
</person-group>
<comment>(in press)</comment>
<article-title>Effects of polyphonic context, instrumentation and metric location on syncopation in music.</article-title>
<source>
<italic>Music Percept.</italic>
</source>
</mixed-citation>
</ref>
<ref id="B160">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Witek</surname>
<given-names>M. A. G.</given-names>
</name>
<name>
<surname>Clarke</surname>
<given-names>E. F.</given-names>
</name>
<name>
<surname>Wallentin</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Kringelbach</surname>
<given-names>M. L.</given-names>
</name>
<name>
<surname>Vuust</surname>
<given-names>P.</given-names>
</name>
</person-group>
(
<year>2014</year>
).
<article-title>Syncopation, body-movement and pleasure in groove music.</article-title>
<source>
<italic>PLoS ONE</italic>
</source>
<volume>9</volume>
:
<issue>e94446</issue>
<pub-id pub-id-type="doi">10.1371/journal.pone.0094446</pub-id>
</mixed-citation>
</ref>
<ref id="B161">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Wundt</surname>
<given-names>W.</given-names>
</name>
</person-group>
(
<year>1874</year>
).
<source>
<italic>Grundzüge der physiologischen Psychologie.</italic>
</source>
<publisher-loc>Leipzig</publisher-loc>
:
<publisher-name>Engelmann</publisher-name>
</mixed-citation>
</ref>
<ref id="B162">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Yu</surname>
<given-names>A. J.</given-names>
</name>
</person-group>
(
<year>2007</year>
).
<article-title>Adaptive behavior: humans act as Bayesian learners.</article-title>
<source>
<italic>Curr. Biol.</italic>
</source>
<volume>17</volume>
<fpage>R977</fpage>
<lpage>R980</lpage>
<pub-id pub-id-type="doi">10.1016/j.cub.2007.09.007</pub-id>
<pub-id pub-id-type="pmid">18029257</pub-id>
</mixed-citation>
</ref>
</ref-list>
</back>
</pmc>
<affiliations>
<list>
<country>
<li>Danemark</li>
</country>
</list>
<tree>
<country name="Danemark">
<noRegion>
<name sortKey="Vuust, Peter" sort="Vuust, Peter" uniqKey="Vuust P" first="Peter" last="Vuust">Peter Vuust</name>
</noRegion>
<name sortKey="Vuust, Peter" sort="Vuust, Peter" uniqKey="Vuust P" first="Peter" last="Vuust">Peter Vuust</name>
<name sortKey="Witek, Maria A G" sort="Witek, Maria A G" uniqKey="Witek M" first="Maria A. G." last="Witek">Maria A. G. Witek</name>
</country>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

# Point EXPLOR_STEP at the Pmc/Checkpoint base of this exploration area, then
# extract record 000093 from biblio.hfd and page through it with XML indentation.
EXPLOR_STEP=$WICRI_ROOT/Wicri/Musique/explor/MozartV1/Data/Pmc/Checkpoint
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000093 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Checkpoint/biblio.hfd -nk 000093 | SxmlIndent | more
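
Both commands assume that the Dilib environment variables are already defined in the shell. A minimal sketch of how they could be set, assuming a hypothetical installation root (adapt the first path to your local Wicri installation):

export WICRI_ROOT=/srv/wicri                                   # hypothetical root of the Wicri installation
export EXPLOR_AREA=$WICRI_ROOT/Wicri/Musique/explor/MozartV1   # exploration area for this corpus
export EXPLOR_STEP=$EXPLOR_AREA/Data/Pmc/Checkpoint            # checkpoint step used by the commands above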

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Wicri/Musique
   |area=    MozartV1
   |flux=    Pmc
   |étape=   Checkpoint
   |type=    RBID
   |clé=     PMC:4181238
   |texte=   Rhythmic complexity and predictive coding: a novel approach to modeling rhythm and meter perception in music
}}

To generate wiki pages

# Look this record up by its RBID ("pubmed:25324813") in the index, fetch the
# full entry from biblio.hfd, and convert it to wiki markup for the MozartV1 area.
HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Checkpoint/RBID.i   -Sk "pubmed:25324813" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Checkpoint/biblio.hfd   \
       | NlmPubMed2Wicri -a MozartV1
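
To keep the generated wikitext instead of printing it to the terminal, the same pipeline can be redirected to a file; the file name below is only an illustrative example:

HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Checkpoint/RBID.i   -Sk "pubmed:25324813" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Checkpoint/biblio.hfd   \
       | NlmPubMed2Wicri -a MozartV1  > MozartV1_000093.wiki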

Wicri

This area was generated with Dilib version V0.6.20.
Data generation: Sun Apr 10 15:06:14 2016. Site generation: Tue Feb 7 15:40:35 2023