Exploration server on the relations between France and Australia

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.
***** Access problem with record *****

Internal identifier: 0028599 (Pmc/Corpus); previous: 0028598; next: 0028600 ***** probable XML problem with record *****



The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">A Neurocomputational Model of the Mismatch Negativity</title>
<author>
<name sortKey="Lieder, Falk" sort="Lieder, Falk" uniqKey="Lieder F" first="Falk" last="Lieder">Falk Lieder</name>
<affiliation>
<nlm:aff id="aff1">
<addr-line>Translational Neuromodeling Unit (TNU), Institute of Biomedical Engineering, University of Zurich &amp; ETH Zurich, Zurich, Switzerland</addr-line>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="aff2">
<addr-line>Laboratory for Social and Neuronal Systems Research, Dept. of Economics, University of Zurich, Zurich, Switzerland</addr-line>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="aff3">
<addr-line>Helen Wills Neuroscience Institute, University of California at Berkeley, Berkeley, California, United States of America</addr-line>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Stephan, Klaas E" sort="Stephan, Klaas E" uniqKey="Stephan K" first="Klaas E." last="Stephan">Klaas E. Stephan</name>
<affiliation>
<nlm:aff id="aff1">
<addr-line>Translational Neuromodeling Unit (TNU), Institute of Biomedical Engineering, University of Zurich &amp; ETH Zurich, Zurich, Switzerland</addr-line>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="aff2">
<addr-line>Laboratory for Social and Neuronal Systems Research, Dept. of Economics, University of Zurich, Zurich, Switzerland</addr-line>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="aff4">
<addr-line>Wellcome Trust Centre for Neuroimaging, Institute of Neurology, University College London, London, United Kingdom</addr-line>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Daunizeau, Jean" sort="Daunizeau, Jean" uniqKey="Daunizeau J" first="Jean" last="Daunizeau">Jean Daunizeau</name>
<affiliation>
<nlm:aff id="aff1">
<addr-line>Translational Neuromodeling Unit (TNU), Institute of Biomedical Engineering, University of Zurich &amp; ETH Zurich, Zurich, Switzerland</addr-line>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="aff2">
<addr-line>Laboratory for Social and Neuronal Systems Research, Dept. of Economics, University of Zurich, Zurich, Switzerland</addr-line>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="aff4">
<addr-line>Wellcome Trust Centre for Neuroimaging, Institute of Neurology, University College London, London, United Kingdom</addr-line>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="aff5">
<addr-line>Brain and Spine Institute (ICM), Paris, France</addr-line>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Garrido, Marta I" sort="Garrido, Marta I" uniqKey="Garrido M" first="Marta I." last="Garrido">Marta I. Garrido</name>
<affiliation>
<nlm:aff id="aff4">
<addr-line>Wellcome Trust Centre for Neuroimaging, Institute of Neurology, University College London, London, United Kingdom</addr-line>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="aff6">
<addr-line>Queensland Brain Institute, The University of Queensland, St Lucia, Australia</addr-line>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Friston, Karl J" sort="Friston, Karl J" uniqKey="Friston K" first="Karl J." last="Friston">Karl J. Friston</name>
<affiliation>
<nlm:aff id="aff4">
<addr-line>Wellcome Trust Centre for Neuroimaging, Institute of Neurology, University College London, London, United Kingdom</addr-line>
</nlm:aff>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">24244118</idno>
<idno type="pmc">3820518</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3820518</idno>
<idno type="RBID">PMC:3820518</idno>
<idno type="doi">10.1371/journal.pcbi.1003288</idno>
<date when="2013">2013</date>
<idno type="wicri:Area/Pmc/Corpus">002859</idno>
<idno type="wicri:explorRef" wicri:stream="Pmc" wicri:step="Corpus" wicri:corpus="PMC">002859</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">A Neurocomputational Model of the Mismatch Negativity</title>
<author>
<name sortKey="Lieder, Falk" sort="Lieder, Falk" uniqKey="Lieder F" first="Falk" last="Lieder">Falk Lieder</name>
<affiliation>
<nlm:aff id="aff1">
<addr-line>Translational Neuromodeling Unit (TNU), Institute of Biomedical Engineering, University of Zurich &amp; ETH Zurich, Zurich, Switzerland</addr-line>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="aff2">
<addr-line>Laboratory for Social and Neuronal Systems Research, Dept. of Economics, University of Zurich, Zurich, Switzerland</addr-line>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="aff3">
<addr-line>Helen Wills Neuroscience Institute, University of California at Berkeley, Berkeley, California, United States of America</addr-line>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Stephan, Klaas E" sort="Stephan, Klaas E" uniqKey="Stephan K" first="Klaas E." last="Stephan">Klaas E. Stephan</name>
<affiliation>
<nlm:aff id="aff1">
<addr-line>Translational Neuromodeling Unit (TNU), Institute of Biomedical Engineering, University of Zurich &amp; ETH Zurich, Zurich, Switzerland</addr-line>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="aff2">
<addr-line>Laboratory for Social and Neuronal Systems Research, Dept. of Economics, University of Zurich, Zurich, Switzerland</addr-line>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="aff4">
<addr-line>Wellcome Trust Centre for Neuroimaging, Institute of Neurology, University College London, London, United Kingdom</addr-line>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Daunizeau, Jean" sort="Daunizeau, Jean" uniqKey="Daunizeau J" first="Jean" last="Daunizeau">Jean Daunizeau</name>
<affiliation>
<nlm:aff id="aff1">
<addr-line>Translational Neuromodeling Unit (TNU), Institute of Biomedical Engineering, University of Zurich &amp; ETH Zurich, Zurich, Switzerland</addr-line>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="aff2">
<addr-line>Laboratory for Social and Neuronal Systems Research, Dept. of Economics, University of Zurich, Zurich, Switzerland</addr-line>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="aff4">
<addr-line>Wellcome Trust Centre for Neuroimaging, Institute of Neurology, University College London, London, United Kingdom</addr-line>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="aff5">
<addr-line>Brain and Spine Institute (ICM), Paris, France</addr-line>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Garrido, Marta I" sort="Garrido, Marta I" uniqKey="Garrido M" first="Marta I." last="Garrido">Marta I. Garrido</name>
<affiliation>
<nlm:aff id="aff4">
<addr-line>Wellcome Trust Centre for Neuroimaging, Institute of Neurology, University College London, London, United Kingdom</addr-line>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="aff6">
<addr-line>Queensland Brain Institute, The University of Queensland, St Lucia, Australia</addr-line>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Friston, Karl J" sort="Friston, Karl J" uniqKey="Friston K" first="Karl J." last="Friston">Karl J. Friston</name>
<affiliation>
<nlm:aff id="aff4">
<addr-line>Wellcome Trust Centre for Neuroimaging, Institute of Neurology, University College London, London, United Kingdom</addr-line>
</nlm:aff>
</affiliation>
</author>
</analytic>
<series>
<title level="j">PLoS Computational Biology</title>
<idno type="ISSN">1553-734X</idno>
<idno type="eISSN">1553-7358</idno>
<imprint>
<date when="2013">2013</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>The mismatch negativity (MMN) is an event related potential evoked by violations of regularity. Here, we present a model of the underlying neuronal dynamics based upon the idea that auditory cortex continuously updates a generative model to predict its sensory inputs. The MMN is then modelled as the superposition of the electric fields evoked by neuronal activity reporting prediction errors. The process by which auditory cortex generates predictions and resolves prediction errors was simulated using generalised (Bayesian) filtering – a biologically plausible scheme for probabilistic inference on the hidden states of hierarchical dynamical models. The resulting scheme generates realistic MMN waveforms, explains the qualitative effects of deviant probability and magnitude on the MMN – in terms of latency and amplitude – and makes quantitative predictions about the interactions between deviant probability and magnitude. This work advances a formal understanding of the MMN and – more generally – illustrates the potential for developing computationally informed dynamic causal models of empirical electromagnetic responses.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
<author>
<name sortKey="Dolan, R" uniqKey="Dolan R">R Dolan</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="David, O" uniqKey="David O">O David</name>
</author>
<author>
<name sortKey="Kiebel, S" uniqKey="Kiebel S">S Kiebel</name>
</author>
<author>
<name sortKey="Harrison, L" uniqKey="Harrison L">L Harrison</name>
</author>
<author>
<name sortKey="Mattout, J" uniqKey="Mattout J">J Mattout</name>
</author>
<author>
<name sortKey="Kilner, J" uniqKey="Kilner J">J Kilner</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
<author>
<name sortKey="Kiebel, S" uniqKey="Kiebel S">S Kiebel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Garrido, M" uniqKey="Garrido M">M Garrido</name>
</author>
<author>
<name sortKey="Kilner, J" uniqKey="Kilner J">J Kilner</name>
</author>
<author>
<name sortKey="Stephan, K" uniqKey="Stephan K">K Stephan</name>
</author>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Tiitinen, H" uniqKey="Tiitinen H">H Tiitinen</name>
</author>
<author>
<name sortKey="May, P" uniqKey="May P">P May</name>
</author>
<author>
<name sortKey="Reinikainen, K" uniqKey="Reinikainen K">K Reinikainen</name>
</author>
<author>
<name sortKey="N T Nen, R" uniqKey="N T Nen R">R Näätänen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sinkkonen, J" uniqKey="Sinkkonen J">J Sinkkonen</name>
</author>
<author>
<name sortKey="Kaski, S" uniqKey="Kaski S">S Kaski</name>
</author>
<author>
<name sortKey="Huotilainen, M" uniqKey="Huotilainen M">M Huotilainen</name>
</author>
<author>
<name sortKey="Ilmoniemi, Rj" uniqKey="Ilmoniemi R">RJ Ilmoniemi</name>
</author>
<author>
<name sortKey="N T Nen, R" uniqKey="N T Nen R">R Näätänen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Javitt, Dc" uniqKey="Javitt D">DC Javitt</name>
</author>
<author>
<name sortKey="Grochowski, S" uniqKey="Grochowski S">S Grochowski</name>
</author>
<author>
<name sortKey="Shelley, Am" uniqKey="Shelley A">AM Shelley</name>
</author>
<author>
<name sortKey="Ritter, W" uniqKey="Ritter W">W Ritter</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="N T Nen, R" uniqKey="N T Nen R">R Näätänen</name>
</author>
<author>
<name sortKey="Paavilainen, P" uniqKey="Paavilainen P">P Paavilainen</name>
</author>
<author>
<name sortKey="Rinne, T" uniqKey="Rinne T">T Rinne</name>
</author>
<author>
<name sortKey="Alho, K" uniqKey="Alho K">K Alho</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="May, P" uniqKey="May P">P May</name>
</author>
<author>
<name sortKey="Tiitinen, H" uniqKey="Tiitinen H">H Tiitinen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kujala, T" uniqKey="Kujala T">T Kujala</name>
</author>
<author>
<name sortKey="Tervaniemi, M" uniqKey="Tervaniemi M">M Tervaniemi</name>
</author>
<author>
<name sortKey="Schroger, E" uniqKey="Schroger E">E Schröger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lieder, F" uniqKey="Lieder F">F Lieder</name>
</author>
<author>
<name sortKey="Daunizeau, J" uniqKey="Daunizeau J">J Daunizeau</name>
</author>
<author>
<name sortKey="Garrido, M" uniqKey="Garrido M">M Garrido</name>
</author>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
<author>
<name sortKey="Stephan, Ke" uniqKey="Stephan K">KE Stephan</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Opitz, B" uniqKey="Opitz B">B Opitz</name>
</author>
<author>
<name sortKey="Rinne, T" uniqKey="Rinne T">T Rinne</name>
</author>
<author>
<name sortKey="Mecklinger, D" uniqKey="Mecklinger D">D Mecklinger</name>
</author>
<author>
<name sortKey="Van Cramon, Y" uniqKey="Van Cramon Y">Y van Cramon</name>
</author>
<author>
<name sortKey="Schroger, E" uniqKey="Schroger E">E Schröger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Baldeweg, T" uniqKey="Baldeweg T">T Baldeweg</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Garrido, M" uniqKey="Garrido M">M Garrido</name>
</author>
<author>
<name sortKey="Kilner, J" uniqKey="Kilner J">J Kilner</name>
</author>
<author>
<name sortKey="Kiebel, S" uniqKey="Kiebel S">S Kiebel</name>
</author>
<author>
<name sortKey="Stephan, K" uniqKey="Stephan K">K Stephan</name>
</author>
<author>
<name sortKey="Baldeweg, T" uniqKey="Baldeweg T">T Baldeweg</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Garrido, M" uniqKey="Garrido M">M Garrido</name>
</author>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
<author>
<name sortKey="Kiebel, S" uniqKey="Kiebel S">S Kiebel</name>
</author>
<author>
<name sortKey="Stephan, K" uniqKey="Stephan K">K Stephan</name>
</author>
<author>
<name sortKey="Baldeweg, T" uniqKey="Baldeweg T">T Baldeweg</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Daunizeau, J" uniqKey="Daunizeau J">J Daunizeau</name>
</author>
<author>
<name sortKey="Den Ouden, H" uniqKey="Den Ouden H">H den Ouden</name>
</author>
<author>
<name sortKey="Pessiglione, M" uniqKey="Pessiglione M">M Pessiglione</name>
</author>
<author>
<name sortKey="Kiebel, S" uniqKey="Kiebel S">S Kiebel</name>
</author>
<author>
<name sortKey="Stephan, K" uniqKey="Stephan K">K Stephan</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
<author>
<name sortKey="Kiebel, S" uniqKey="Kiebel S">S Kiebel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
<author>
<name sortKey="Trujillobarreto, N" uniqKey="Trujillobarreto N">N Trujillobarreto</name>
</author>
<author>
<name sortKey="Daunizeau, J" uniqKey="Daunizeau J">J Daunizeau</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, Kj" uniqKey="Friston K">KJ Friston</name>
</author>
<author>
<name sortKey="Stephan, Ke" uniqKey="Stephan K">KE Stephan</name>
</author>
<author>
<name sortKey="Li, B" uniqKey="Li B">B Li</name>
</author>
<author>
<name sortKey="Daunizeau, J" uniqKey="Daunizeau J">J Daunizeau</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
<author>
<name sortKey="Kiebel, S" uniqKey="Kiebel S">S Kiebel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
<author>
<name sortKey="Kilner, J" uniqKey="Kilner J">J Kilner</name>
</author>
<author>
<name sortKey="Harrison, L" uniqKey="Harrison L">L Harrison</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Balaguer Ballester, E" uniqKey="Balaguer Ballester E">E Balaguer-Ballester</name>
</author>
<author>
<name sortKey="Clark, Nr" uniqKey="Clark N">NR Clark</name>
</author>
<author>
<name sortKey="Coath, M" uniqKey="Coath M">M Coath</name>
</author>
<author>
<name sortKey="Krumbholz, K" uniqKey="Krumbholz K">K Krumbholz</name>
</author>
<author>
<name sortKey="Denham, Sl" uniqKey="Denham S">SL Denham</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kiebel, S" uniqKey="Kiebel S">S Kiebel</name>
</author>
<author>
<name sortKey="Daunizeau, J" uniqKey="Daunizeau J">J Daunizeau</name>
</author>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Rauschecker, J" uniqKey="Rauschecker J">J Rauschecker</name>
</author>
<author>
<name sortKey="Scott, S" uniqKey="Scott S">S Scott</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Formisano, E" uniqKey="Formisano E">E Formisano</name>
</author>
<author>
<name sortKey="Kim, D S" uniqKey="Kim D">D-S Kim</name>
</author>
<author>
<name sortKey="Di Salle, F" uniqKey="Di Salle F">F Di Salle</name>
</author>
<author>
<name sortKey="Van De Moortele, P F" uniqKey="Van De Moortele P">P-F van de Moortele</name>
</author>
<author>
<name sortKey="Ugurbil, K" uniqKey="Ugurbil K">K Ugurbil</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bendor, D" uniqKey="Bendor D">D Bendor</name>
</author>
<author>
<name sortKey="Wang, X" uniqKey="Wang X">X Wang</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Patterson, R" uniqKey="Patterson R">R Patterson</name>
</author>
<author>
<name sortKey="Uppenkamp, S" uniqKey="Uppenkamp S">S Uppenkamp</name>
</author>
<author>
<name sortKey="Johnsrude, I" uniqKey="Johnsrude I">I Johnsrude</name>
</author>
<author>
<name sortKey="Griffiths, T" uniqKey="Griffiths T">T Griffiths</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hall, D" uniqKey="Hall D">D Hall</name>
</author>
<author>
<name sortKey="Plack, C" uniqKey="Plack C">C Plack</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schonwiesner, M" uniqKey="Schonwiesner M">M Schönwiesner</name>
</author>
<author>
<name sortKey="Zatorre, R" uniqKey="Zatorre R">R Zatorre</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Koulakov, A" uniqKey="Koulakov A">A Koulakov</name>
</author>
<author>
<name sortKey="Raghavachari, S" uniqKey="Raghavachari S">S Raghavachari</name>
</author>
<author>
<name sortKey="Kepecs, A" uniqKey="Kepecs A">A Kepecs</name>
</author>
<author>
<name sortKey="Lisman, J" uniqKey="Lisman J">J Lisman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kiebel, S" uniqKey="Kiebel S">S Kiebel</name>
</author>
<author>
<name sortKey="Daunizeau, J" uniqKey="Daunizeau J">J Daunizeau</name>
</author>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bastos, A" uniqKey="Bastos A">A Bastos</name>
</author>
<author>
<name sortKey="Usrey, M" uniqKey="Usrey M">M Usrey</name>
</author>
<author>
<name sortKey="Adams, R" uniqKey="Adams R">R Adams</name>
</author>
<author>
<name sortKey="Mangun, G" uniqKey="Mangun G">G Mangun</name>
</author>
<author>
<name sortKey="Fries, P" uniqKey="Fries P">P Fries</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Peter, V" uniqKey="Peter V">V Peter</name>
</author>
<author>
<name sortKey="Mcarthur, G" uniqKey="Mcarthur G">G McArthur</name>
</author>
<author>
<name sortKey="Thompson, Wf" uniqKey="Thompson W">WF Thompson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ugray, Z" uniqKey="Ugray Z">Z Ugray</name>
</author>
<author>
<name sortKey="Lasdon, L" uniqKey="Lasdon L">L Lasdon</name>
</author>
<author>
<name sortKey="Plummer, J" uniqKey="Plummer J">J Plummer</name>
</author>
<author>
<name sortKey="Glover, F" uniqKey="Glover F">F Glover</name>
</author>
<author>
<name sortKey="Kelly, J" uniqKey="Kelly J">J Kelly</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Woodman, G" uniqKey="Woodman G">G Woodman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kiesel, A" uniqKey="Kiesel A">A Kiesel</name>
</author>
<author>
<name sortKey="Miller, J" uniqKey="Miller J">J Miller</name>
</author>
<author>
<name sortKey="Jolicoeur, P" uniqKey="Jolicoeur P">P Jolicoeur</name>
</author>
<author>
<name sortKey="Brisson, B" uniqKey="Brisson B">B Brisson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schroger, E" uniqKey="Schroger E">E Schröger</name>
</author>
<author>
<name sortKey="Wolff, C" uniqKey="Wolff C">C Wolff</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jacobsen, T" uniqKey="Jacobsen T">T Jacobsen</name>
</author>
<author>
<name sortKey="Schroger, E" uniqKey="Schroger E">E Schröger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jacobsen, T" uniqKey="Jacobsen T">T Jacobsen</name>
</author>
<author>
<name sortKey="Horenkamp, T" uniqKey="Horenkamp T">T Horenkamp</name>
</author>
<author>
<name sortKey="Schroger, E" uniqKey="Schroger E">E Schröger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jacobsen, T" uniqKey="Jacobsen T">T Jacobsen</name>
</author>
<author>
<name sortKey="Schroger, E" uniqKey="Schroger E">E Schröger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Horvath, J" uniqKey="Horvath J">J Horváth</name>
</author>
<author>
<name sortKey="Czigler, I" uniqKey="Czigler I">I Czigler</name>
</author>
<author>
<name sortKey="Jacobsen, T" uniqKey="Jacobsen T">T Jacobsen</name>
</author>
<author>
<name sortKey="Maess, B" uniqKey="Maess B">B Maess</name>
</author>
<author>
<name sortKey="Schroger, E" uniqKey="Schroger E">E Schröger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Taaseh, N" uniqKey="Taaseh N">N Taaseh</name>
</author>
<author>
<name sortKey="Yaron, A" uniqKey="Yaron A">A Yaron</name>
</author>
<author>
<name sortKey="Nelken, I" uniqKey="Nelken I">I Nelken</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mill, R" uniqKey="Mill R">R Mill</name>
</author>
<author>
<name sortKey="Coath, M" uniqKey="Coath M">M Coath</name>
</author>
<author>
<name sortKey="Wennekers, T" uniqKey="Wennekers T">T Wennekers</name>
</author>
<author>
<name sortKey="Denham, S" uniqKey="Denham S">S Denham</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wacongne, C" uniqKey="Wacongne C">C Wacongne</name>
</author>
<author>
<name sortKey="Changeux, J P" uniqKey="Changeux J">J-P Changeux</name>
</author>
<author>
<name sortKey="Dehaene, S" uniqKey="Dehaene S">S Dehaene</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Garrido, M" uniqKey="Garrido M">M Garrido</name>
</author>
<author>
<name sortKey="Kilner, J" uniqKey="Kilner J">J Kilner</name>
</author>
<author>
<name sortKey="Kiebel, S" uniqKey="Kiebel S">S Kiebel</name>
</author>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="May, P" uniqKey="May P">P May</name>
</author>
<author>
<name sortKey="Tiitinen, H" uniqKey="Tiitinen H">H Tiitinen</name>
</author>
<author>
<name sortKey="Ilmoniemi, Rj" uniqKey="Ilmoniemi R">RJ Ilmoniemi</name>
</author>
<author>
<name sortKey="Nyman, G" uniqKey="Nyman G">G Nyman</name>
</author>
<author>
<name sortKey="Taylor, Jg" uniqKey="Taylor J">JG Taylor</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Slabu, L" uniqKey="Slabu L">L Slabu</name>
</author>
<author>
<name sortKey="Grimm, S" uniqKey="Grimm S">S Grimm</name>
</author>
<author>
<name sortKey="Escera, C" uniqKey="Escera C">C Escera</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Slabu, L" uniqKey="Slabu L">L Slabu</name>
</author>
<author>
<name sortKey="Escera, C" uniqKey="Escera C">C Escera</name>
</author>
<author>
<name sortKey="Grimm, S" uniqKey="Grimm S">S Grimm</name>
</author>
<author>
<name sortKey="Costa Faidella, J" uniqKey="Costa Faidella J">J Costa-Faidella</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Grimm, S" uniqKey="Grimm S">S Grimm</name>
</author>
<author>
<name sortKey="Escera, C" uniqKey="Escera C">C Escera</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Penny, W" uniqKey="Penny W">W Penny</name>
</author>
<author>
<name sortKey="Stephan, K" uniqKey="Stephan K">K Stephan</name>
</author>
<author>
<name sortKey="Mechelli, A" uniqKey="Mechelli A">A Mechelli</name>
</author>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Penny, W" uniqKey="Penny W">W Penny</name>
</author>
<author>
<name sortKey="Stephan, K" uniqKey="Stephan K">K Stephan</name>
</author>
<author>
<name sortKey="Daunizeau, J" uniqKey="Daunizeau J">J Daunizeau</name>
</author>
<author>
<name sortKey="Rosa, M" uniqKey="Rosa M">M Rosa</name>
</author>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Stephan, Ke" uniqKey="Stephan K">KE Stephan</name>
</author>
<author>
<name sortKey="Penny, W" uniqKey="Penny W">W Penny</name>
</author>
<author>
<name sortKey="Daunizeau, J" uniqKey="Daunizeau J">J Daunizeau</name>
</author>
<author>
<name sortKey="Moran, R" uniqKey="Moran R">R Moran</name>
</author>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Winkler, I" uniqKey="Winkler I">I Winkler</name>
</author>
<author>
<name sortKey="Czigler, I" uniqKey="Czigler I">I Czigler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schroger, E" uniqKey="Schroger E">E Schröger</name>
</author>
<author>
<name sortKey="Bendixen, A" uniqKey="Bendixen A">A Bendixen</name>
</author>
<author>
<name sortKey="Trujillo Barreto, N" uniqKey="Trujillo Barreto N">N Trujillo-Barreto</name>
</author>
<author>
<name sortKey="Roeber, U" uniqKey="Roeber U">U Roeber</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="N T Nen, R" uniqKey="N T Nen R">R Näätänen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bendixen, A" uniqKey="Bendixen A">A Bendixen</name>
</author>
<author>
<name sortKey="Schroger, E" uniqKey="Schroger E">E Schröger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Umbricht, D" uniqKey="Umbricht D">D Umbricht</name>
</author>
<author>
<name sortKey="Krljes, S" uniqKey="Krljes S">S Krljes</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Stephan, Ke" uniqKey="Stephan K">KE Stephan</name>
</author>
<author>
<name sortKey="Baldeweg, T" uniqKey="Baldeweg T">T Baldeweg</name>
</author>
<author>
<name sortKey="Friston, Kj" uniqKey="Friston K">KJ Friston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Baldeweg, T" uniqKey="Baldeweg T">T Baldeweg</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Umbricht, D" uniqKey="Umbricht D">D Umbricht</name>
</author>
<author>
<name sortKey="Schmid, L" uniqKey="Schmid L">L Schmid</name>
</author>
<author>
<name sortKey="Koller, R" uniqKey="Koller R">R Koller</name>
</author>
<author>
<name sortKey="Vollenweider, Fx" uniqKey="Vollenweider F">FX Vollenweider</name>
</author>
<author>
<name sortKey="Hell, D" uniqKey="Hell D">D Hell</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schmidt, A" uniqKey="Schmidt A">A Schmidt</name>
</author>
<author>
<name sortKey="Diaconescu, A" uniqKey="Diaconescu A">A Diaconescu</name>
</author>
<author>
<name sortKey="Kometer, M" uniqKey="Kometer M">M Kometer</name>
</author>
<author>
<name sortKey="Friston, K" uniqKey="Friston K">K Friston</name>
</author>
<author>
<name sortKey="Stephan, K" uniqKey="Stephan K">K Stephan</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Baldeweg, T" uniqKey="Baldeweg T">T Baldeweg</name>
</author>
<author>
<name sortKey="Wong, D" uniqKey="Wong D">D Wong</name>
</author>
<author>
<name sortKey="Stephan, K" uniqKey="Stephan K">K Stephan</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">PLoS Comput Biol</journal-id>
<journal-id journal-id-type="iso-abbrev">PLoS Comput. Biol</journal-id>
<journal-id journal-id-type="publisher-id">plos</journal-id>
<journal-id journal-id-type="pmc">ploscomp</journal-id>
<journal-title-group>
<journal-title>PLoS Computational Biology</journal-title>
</journal-title-group>
<issn pub-type="ppub">1553-734X</issn>
<issn pub-type="epub">1553-7358</issn>
<publisher>
<publisher-name>Public Library of Science</publisher-name>
<publisher-loc>San Francisco, USA</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">24244118</article-id>
<article-id pub-id-type="pmc">3820518</article-id>
<article-id pub-id-type="publisher-id">PCOMPBIOL-D-13-00782</article-id>
<article-id pub-id-type="doi">10.1371/journal.pcbi.1003288</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Research Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>A Neurocomputational Model of the Mismatch Negativity</article-title>
<alt-title alt-title-type="running-head">Modelling the MMN Waveform</alt-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Lieder</surname>
<given-names>Falk</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
<xref ref-type="aff" rid="aff3">
<sup>3</sup>
</xref>
<xref ref-type="corresp" rid="cor1">
<sup>*</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Stephan</surname>
<given-names>Klaas E.</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
<xref ref-type="aff" rid="aff4">
<sup>4</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Daunizeau</surname>
<given-names>Jean</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
<xref ref-type="aff" rid="aff4">
<sup>4</sup>
</xref>
<xref ref-type="aff" rid="aff5">
<sup>5</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Garrido</surname>
<given-names>Marta I.</given-names>
</name>
<xref ref-type="aff" rid="aff4">
<sup>4</sup>
</xref>
<xref ref-type="aff" rid="aff6">
<sup>6</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Friston</surname>
<given-names>Karl J.</given-names>
</name>
<xref ref-type="aff" rid="aff4">
<sup>4</sup>
</xref>
</contrib>
</contrib-group>
<aff id="aff1">
<label>1</label>
<addr-line>Translational Neuromodeling Unit (TNU), Institute of Biomedical Engineering, University of Zurich &amp; ETH Zurich, Zurich, Switzerland</addr-line>
</aff>
<aff id="aff2">
<label>2</label>
<addr-line>Laboratory for Social and Neuronal Systems Research, Dept. of Economics, University of Zurich, Zurich, Switzerland</addr-line>
</aff>
<aff id="aff3">
<label>3</label>
<addr-line>Helen Wills Neuroscience Institute, University of California at Berkeley, Berkeley, California, United States of America</addr-line>
</aff>
<aff id="aff4">
<label>4</label>
<addr-line>Wellcome Trust Centre for Neuroimaging, Institute of Neurology, University College London, London, United Kingdom</addr-line>
</aff>
<aff id="aff5">
<label>5</label>
<addr-line>Brain and Spine Institute (ICM), Paris, France</addr-line>
</aff>
<aff id="aff6">
<label>6</label>
<addr-line>Queensland Brain Institute, The University of Queensland, St Lucia, Australia</addr-line>
</aff>
<contrib-group>
<contrib contrib-type="editor">
<name>
<surname>Sporns</surname>
<given-names>Olaf</given-names>
</name>
<role>Editor</role>
<xref ref-type="aff" rid="edit1"></xref>
</contrib>
</contrib-group>
<aff id="edit1">
<addr-line>Indiana University, United States of America</addr-line>
</aff>
<author-notes>
<corresp id="cor1">* E-mail:
<email>lieder@biomed.ee.ethz.ch</email>
</corresp>
<fn fn-type="COI-statement">
<p>The authors have declared that no competing interests exist.</p>
</fn>
<fn fn-type="con">
<p>Conceived and designed the experiments: FL KES JD MIG KJF. Performed the experiments: FL. Analyzed the data: FL. Contributed reagents/materials/analysis tools: JD KJF. Wrote the paper: FL KES JD MIG KJF. Provided experimental data: MIG.</p>
</fn>
</author-notes>
<pub-date pub-type="collection">
<month>11</month>
<year>2013</year>
</pub-date>
<pub-date pub-type="epub">
<day>7</day>
<month>11</month>
<year>2013</year>
</pub-date>
<volume>9</volume>
<issue>11</issue>
<elocation-id>e1003288</elocation-id>
<history>
<date date-type="received">
<day>6</day>
<month>5</month>
<year>2013</year>
</date>
<date date-type="accepted">
<day>3</day>
<month>9</month>
<year>2013</year>
</date>
</history>
<permissions>
<copyright-statement>© 2013 Lieder et al</copyright-statement>
<copyright-year>2013</copyright-year>
<copyright-holder>Lieder et al</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<license-p>This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.</license-p>
</license>
</permissions>
<abstract>
<p>The mismatch negativity (MMN) is an event related potential evoked by violations of regularity. Here, we present a model of the underlying neuronal dynamics based upon the idea that auditory cortex continuously updates a generative model to predict its sensory inputs. The MMN is then modelled as the superposition of the electric fields evoked by neuronal activity reporting prediction errors. The process by which auditory cortex generates predictions and resolves prediction errors was simulated using generalised (Bayesian) filtering – a biologically plausible scheme for probabilistic inference on the hidden states of hierarchical dynamical models. The resulting scheme generates realistic MMN waveforms, explains the qualitative effects of deviant probability and magnitude on the MMN – in terms of latency and amplitude – and makes quantitative predictions about the interactions between deviant probability and magnitude. This work advances a formal understanding of the MMN and – more generally – illustrates the potential for developing computationally informed dynamic causal models of empirical electromagnetic responses.</p>
</abstract>
<abstract abstract-type="summary">
<title>Author Summary</title>
<p>Computational neuroimaging enables quantitative inferences from non-invasive measures of brain activity on the underlying mechanisms. Ultimately, we would like to understand these mechanisms not only in terms of physiology but also in terms of computation. So far, this has not been addressed by mathematical models of neuroimaging data (e.g., dynamic causal models), which have rather focused on ever more detailed inferences about physiology. Here we present the first instance of a dynamic causal model that explains electrophysiological data in terms of computation rather than physiology. Concretely, we predict the mismatch negativity – an event-related potential elicited by regularity violation – from the dynamics of perceptual inference as prescribed by the free energy principle. The resulting model explains the waveform of the mismatch negativity and some of its phenomenological properties at a level of precision that has not been attempted before. This highlights the potential of neurocomputational dynamic causal models to enable inferences from neuroimaging data on neurocomputational mechanisms.</p>
</abstract>
<funding-group>
<funding-statement>This work was supported by SystemsX.ch (FL, JD, KES), the René and Susanne Braginsky Foundation (KES), the Clinical Research Priority Program “Multiple Sclerosis” (KES), the European Research Council (JD), the Wellcome Trust (KJF, MIG), and the Australian Research Council (MIG). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.</funding-statement>
</funding-group>
<counts>
<page-count count="14"></page-count>
</counts>
</article-meta>
</front>
<body>
<sec id="s1">
<title>Introduction</title>
<p>Recent advances in computational neuroimaging
<xref rid="pcbi.1003288-Friston1" ref-type="bibr">[1]</xref>
have enabled inferences about the neurophysiological mechanisms that generate non-invasive measures of task or stimulus-evoked neuronal responses, as measured by functional magnetic resonance imaging (fMRI) or electroencephalography (EEG). One such approach is dynamic causal modelling
<xref rid="pcbi.1003288-David1" ref-type="bibr">[2]</xref>
that tries to explain EEG data in terms of synaptic coupling within a network of interacting neuronal populations or sources. However, this description is at the level of physiological processes that do not have a direct interpretation in terms of information processing. Cognitive scientists have been using formal models of cognitive processes to infer on information processing from behaviour for decades
<xref rid="pcbi.1003288-Sun1" ref-type="bibr">[3]</xref>
, but it has remained largely unclear how such inferences should be informed by neurophysiological data. We argue that one may overcome the limitations of both approaches by integrating normative models of information processing (e.g.,
<xref rid="pcbi.1003288-Friston2" ref-type="bibr">[4]</xref>
,
<xref rid="pcbi.1003288-Friston3" ref-type="bibr">[5]</xref>
) with physiologically grounded models of neuroimaging data
<xref rid="pcbi.1003288-Friston2" ref-type="bibr">[4]</xref>
,
<xref rid="pcbi.1003288-Friston3" ref-type="bibr">[5]</xref>
. This approach may produce computationally informed neuronal models – or neurocomputational models – enabling one to test hypotheses about how the brain processes information to generate adaptive behaviour. Here, we provide a proof-of-concept for this approach by jointly modelling a cognitive process – perceptual inference – and the event related potential (ERP) that it may generate – the mismatch negativity (MMN). Specifically, we ask whether the MMN can be modelled by a neuronal system performing perceptual inference, as prescribed by predictive coding
<xref rid="pcbi.1003288-Friston2" ref-type="bibr">[4]</xref>
,
<xref rid="pcbi.1003288-Friston3" ref-type="bibr">[5]</xref>
.</p>
<p>The MMN is an event-related potential that is evoked by the violation of a regular stream of sensory events. By convention, the MMN is estimated by subtracting the ERP elicited by
<italic>standards</italic>
, i.e. events that established the regularity, from the ERP elicited by
<italic>deviants</italic>
, i.e. events violating this regularity. Depending on the specific type of regularity, the MMN is usually expressed most strongly at fronto-central electrodes, with a peak latency between 100 and 250 milliseconds after deviant onset
<xref rid="pcbi.1003288-Friston1" ref-type="bibr">[1]</xref>
. More precisely, the MMN has been shown to depend upon deviant probability and magnitude. Deviant probability is the relative frequency of tones that violate an established regularity. In studies of the MMN evoked by changes in sound frequency, deviance magnitude is the (proportional) difference between the deviant frequency and the standard frequency. The effects of these factors are usually summarized in terms of changes in the MMN peak amplitude and its latency (see
<xref ref-type="table" rid="pcbi-1003288-t001">Table 1</xref>
). While increasing the deviance magnitude makes the MMN peak earlier and with a larger amplitude
<xref rid="pcbi.1003288-Friston2" ref-type="bibr">[4]</xref>
,
<xref rid="pcbi.1003288-Friston4" ref-type="bibr">[6]</xref>
,
<xref rid="pcbi.1003288-Friston5" ref-type="bibr">[7]</xref>
, decreasing deviant probability only increases the MMN peak amplitude
<xref rid="pcbi.1003288-Garrido1" ref-type="bibr">[8]</xref>
but does not change its latency
<xref rid="pcbi.1003288-Tiitinen1" ref-type="bibr">[9]</xref>
.</p>
<table-wrap id="pcbi-1003288-t001" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pcbi.1003288.t001</object-id>
<label>Table 1</label>
<caption>
<title>Overview of the Phenomenological Properties of the MMN.</title>
</caption>
<alternatives>
<graphic id="pcbi-1003288-t001-1" xlink:href="pcbi.1003288.t001"></graphic>
<table frame="hsides" rules="groups">
<colgroup span="1">
<col align="left" span="1"></col>
<col align="center" span="1"></col>
<col align="center" span="1"></col>
</colgroup>
<thead>
<tr>
<td align="left" rowspan="1" colspan="1">Effect of ↓ on →</td>
<td align="left" rowspan="1" colspan="1">|MMN Amplitude|</td>
<td align="left" rowspan="1" colspan="1">MMN Latency</td>
</tr>
</thead>
<tbody>
<tr>
<td align="left" rowspan="1" colspan="1">Higher Deviance Magnitude</td>
<td align="left" rowspan="1" colspan="1">
<bold>↑</bold>
<xref rid="pcbi.1003288-Tiitinen1" ref-type="bibr">[9]</xref>
</td>
<td align="left" rowspan="1" colspan="1">
<bold>
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e108.jpg"></inline-graphic>
</inline-formula>
</bold>
<xref rid="pcbi.1003288-Javitt1" ref-type="bibr">[11]</xref>
</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Lower Deviant Probability</td>
<td align="left" rowspan="1" colspan="1">
<bold>↑</bold>
<xref rid="pcbi.1003288-Tiitinen1" ref-type="bibr">[9]</xref>
</td>
<td align="left" rowspan="1" colspan="1">no effect
<xref rid="pcbi.1003288-Tiitinen1" ref-type="bibr">[9]</xref>
</td>
</tr>
</tbody>
</table>
</alternatives>
</table-wrap>
<p>The question as to which neurophysiological mechanisms generate the MMN remains controversial (cf.
<xref rid="pcbi.1003288-Sinkkonen1" ref-type="bibr">[10]</xref>
vs.
<xref rid="pcbi.1003288-Javitt1" ref-type="bibr">[11]</xref>
), even though this issue has been addressed by a large number of studies over the last thirty years
<xref rid="pcbi.1003288-Ntnen1" ref-type="bibr">[12]</xref>
. One reason for this enduring controversy could be that the MMN's latency and amplitude contain insufficient information to disambiguate between competing hypotheses (but see
<xref rid="pcbi.1003288-May1" ref-type="bibr">[13]</xref>
). While the MMN is the sum of overlapping subcomponents that are generated in temporal and frontal brain areas
<xref rid="pcbi.1003288-Ntnen1" ref-type="bibr">[12]</xref>
,
<xref rid="pcbi.1003288-Kujala1" ref-type="bibr">[14]</xref>
– and are differentially affected by experimental manipulations
<xref rid="pcbi.1003288-Lieder1" ref-type="bibr">[15]</xref>
– it is a continuous function of time. This means that the underlying ERP waveforms may contain valuable information about MMN subcomponents, the physiological mechanisms that generate them and, critically, their functional correlates (see e.g.
<xref rid="pcbi.1003288-Opitz1" ref-type="bibr">[16]</xref>
). Predictive coding offers a unique and unified explanation of the MMN's neurophysiological features. In brief, predictive coding is a computational mechanism that formally links perception and learning processes to neural activity and synaptic plasticity, respectively
<xref rid="pcbi.1003288-Baldeweg1" ref-type="bibr">[17]</xref>
. More precisely, event-related electrophysiological responses are thought to arise from the brain's attempt to minimize prediction errors (i.e. differences between actual and predicted sensory input) through hierarchical Bayesian inference. In this context, the MMN simply reflects neuronal activity reporting these prediction errors in a hierarchically organized network of auditory cortical sources. If this is true, then the rise and fall of the MMN may reflect the appearance of a discrepancy between sensory input and top-down predictions – and its resolution through perceptual inference. These ideas have been used to interpret the results of experimental studies of the MMN
<xref rid="pcbi.1003288-Garrido1" ref-type="bibr">[8]</xref>
,
<xref rid="pcbi.1003288-Garrido2" ref-type="bibr">[18]</xref>
and computational treatments of trial-wise changes in amplitude
<xref rid="pcbi.1003288-Friston4" ref-type="bibr">[6]</xref>
. However, no attempt has been made to quantitatively relate predictive coding models to empirical MMN waveforms. Here, we extend these efforts by explicitly modelling the physiological mechanisms underlying the MMN in terms of a computational mechanism: predictive coding. In other words, our model is both an extension to dynamic causal models of observed electrophysiological responses
<xref rid="pcbi.1003288-Garrido2" ref-type="bibr">[18]</xref>
,
<xref rid="pcbi.1003288-Garrido3" ref-type="bibr">[19]</xref>
to information processing, and a neurophysiological view on meta-Bayesian approaches to cognitive processes
<xref rid="pcbi.1003288-Lieder1" ref-type="bibr">[15]</xref>
. We establish the face validity of this neurocomputational model in terms of its ability to explain the observed MMN and its dependence on deviant frequency and deviance magnitude.</p>
<p>This paper comprises two sections. In the first section, we summarize mathematical models of predictive coding (as derived from the free energy principle), and describe the particular perceptual model that we assume the brain uses in the context of a predictable stream of auditory stimuli. The resulting scheme provides a model of neuronal responses in auditory oddball paradigms. In line with the DCM framework, we then augment this model with a mapping from (hidden) neuronal dynamics to (observed) scalp electrophysiological data. In the second section, we use empirical ERPs acquired during an oddball paradigm to tune the parameters of the observation model. Equipped with these parameters, we then simulate MMN waveforms under different levels of deviant probability and deviance magnitude – and compare the resulting latency and amplitude changes with findings reported in the literature. This serves to provide a proof of principle that dynamic causal models can have a computational form – and establish the face validity of predictive coding theories of brain function.</p>
</sec>
<sec id="s2">
<title>Models</title>
<p>To simulate the MMN under the predictive coding hypothesis, we simulated the processing of standard and deviant stimuli using generalised (Bayesian) filtering (or predictive coding) – under a hierarchical dynamic model of repeated stimuli. This generates time-continuous trajectories, encoding beliefs (posterior expectations and predictions) and prediction errors. These prediction errors were then used to explain the MMN, via a forward model of the mapping between neuronal representations of prediction error and observed scalp potentials. In this section, we describe the steps entailed by this sort of modelling. See
<xref ref-type="fig" rid="pcbi-1003288-g001">Figure 1</xref>
for an overview.</p>
<fig id="pcbi-1003288-g001" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pcbi.1003288.g001</object-id>
<label>Figure 1</label>
<caption>
<title>Flow Chart of MMN simulations.</title>
<p>Sensory input was generated from a Hierarchical Dynamic Model (true HDM) for a standard or deviant stimulus. This stimulus was produced by inputs controlling the temporal evolution of loudness and frequency (hidden causes). We simulated perception with the inversion of the internal model (internal HDM) of a subject – who anticipates the standard event with a certain degree of confidence (prior beliefs) – with Generalised Filtering (GF). This produces a simulated trajectory of the prediction errors that are minimised during perceptual inference. These prediction errors were weighted by their precisions and used to predict event-related potentials. Model parameters are listed on the left and model equations are provided on the right. First, to map prediction errors to empirical responses, they were shifted and scaled so that the simulated stimulus duration was 70 ms. Second, a sigmoid function was applied to model nonlinearities in the relationship between prediction error and equivalent current dipole activity. Third, the scalp potential at the simulated electrode location was modelled as a linear superposition of the ensuing local field potentials. Finally, the simulated EEG data were down-sampled and filtered.</p>
</caption>
<graphic xlink:href="pcbi.1003288.g001"></graphic>
</fig>
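The forward mapping summarised in the caption above (sigmoid nonlinearity, linear superposition, down-sampling) can be sketched in a few lines. This is a minimal illustration, not the authors' code: the sigmoid gain, source weights, and sampling rates below are placeholder assumptions.

```python
import numpy as np

def prediction_errors_to_erp(pe, weights, gain=4.0, fs_sim=1000, fs_eeg=250):
    """Map precision-weighted prediction-error traces (sources x time)
    to a simulated scalp potential at a single electrode."""
    # Sigmoid nonlinearity: models saturation between prediction error
    # and equivalent current dipole activity (centred so zero error
    # yields zero dipole activity).
    dipole = 1.0 / (1.0 + np.exp(-gain * pe)) - 0.5
    # Linear superposition of the ensuing local field potentials.
    scalp = weights @ dipole
    # Down-sample to the EEG sampling rate.
    step = fs_sim // fs_eeg
    return scalp[::step]

# Toy example: two prediction-error traces over one simulated second.
t = np.linspace(0, 1, 1000)
pe = np.vstack([np.exp(-((t - 0.15) / 0.03) ** 2),        # early positive error
                -0.5 * np.exp(-((t - 0.20) / 0.05) ** 2)])  # later negative error
erp = prediction_errors_to_erp(pe, weights=np.array([1.0, 0.8]))
```

The superposition of oppositely signed, temporally offset error traces is what produces a biphasic difference waveform of the kind compared to the empirical MMN.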
<sec id="s2a">
<title>Predictive coding and hierarchical dynamic models</title>
<p>Perception estimates the causes (
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e001.jpg"></inline-graphic>
</inline-formula>
) of the sensory inputs (
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e002.jpg"></inline-graphic>
</inline-formula>
) that the brain receives. In other words, to recognise causal structure in the world, the brain has to invert the process by which its sensory consequences are generated from causes in the environment. This view of perception as unconscious inference was introduced by Helmholtz
<xref rid="pcbi.1003288-David1" ref-type="bibr">[2]</xref>
in the 19
<sup>th</sup>
century. More recently, it has been formalized as the inversion of a generative model
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e003.jpg"></inline-graphic>
</inline-formula>
of sensory inputs
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e004.jpg"></inline-graphic>
</inline-formula>
<xref rid="pcbi.1003288-Daunizeau1" ref-type="bibr">[20]</xref>
. In the language of probability theory, this means that the percept corresponds to the posterior belief
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e005.jpg"></inline-graphic>
</inline-formula>
about the putative causes
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e006.jpg"></inline-graphic>
</inline-formula>
of sensory input
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e007.jpg"></inline-graphic>
</inline-formula>
and any hidden states
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e008.jpg"></inline-graphic>
</inline-formula>
that mediate their effect. This means that any perceptual experience depends on the model
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e009.jpg"></inline-graphic>
</inline-formula>
of how sensory input is generated. To capture the rich structure of natural sounds, the model
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e010.jpg"></inline-graphic>
</inline-formula>
has to be dynamic, hierarchical, and nonlinear. Hierarchical dynamic models (HDMs)
<xref rid="pcbi.1003288-Helmholtz1" ref-type="bibr">[21]</xref>
accommodate these attributes and can be used to model sounds as complex as birdsong
<xref rid="pcbi.1003288-Knill1" ref-type="bibr">[22]</xref>
.</p>
<p>HDMs generate time-continuous data
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e011.jpg"></inline-graphic>
</inline-formula>
as noisy observations of a nonlinear transformation
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e012.jpg"></inline-graphic>
</inline-formula>
of hidden states
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e013.jpg"></inline-graphic>
</inline-formula>
and hidden causes
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e014.jpg"></inline-graphic>
</inline-formula>
:
<disp-formula id="pcbi.1003288.e015">
<graphic xlink:href="pcbi.1003288.e015.jpg" position="anchor" orientation="portrait"></graphic>
<label>(1)</label>
</disp-formula>
where the temporal evolution of hidden states
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e016.jpg"></inline-graphic>
</inline-formula>
is given by the differential equation:
<disp-formula id="pcbi.1003288.e017">
<graphic xlink:href="pcbi.1003288.e017.jpg" position="anchor" orientation="portrait"></graphic>
<label>(2)</label>
</disp-formula>
This equation models the change in
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e018.jpg"></inline-graphic>
</inline-formula>
as a nonlinear function
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e019.jpg"></inline-graphic>
</inline-formula>
of the hidden states
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e020.jpg"></inline-graphic>
</inline-formula>
and hidden causes
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e021.jpg"></inline-graphic>
</inline-formula>
plus state noise
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e022.jpg"></inline-graphic>
</inline-formula>
. The hidden causes
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e023.jpg"></inline-graphic>
</inline-formula>
of the change in
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e024.jpg"></inline-graphic>
</inline-formula>
are modelled as the outputs of a hidden process at the second level. This second process is modelled in the same way as the hidden process at the first level, but with new nonlinear functions
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e025.jpg"></inline-graphic>
</inline-formula>
and
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e026.jpg"></inline-graphic>
</inline-formula>
:
<disp-formula id="pcbi.1003288.e027">
<graphic xlink:href="pcbi.1003288.e027.jpg" position="anchor" orientation="portrait"></graphic>
<label>(3)</label>
</disp-formula>
As in the first level, the hidden dynamics of the second level are driven by hidden causes
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e028.jpg"></inline-graphic>
</inline-formula>
that are modelled as the output of a hidden process at the next higher level, and so forth. This composition can be repeated as often as necessary to model the system under consideration – up to the last level, whose input is usually modelled as a known function of time plus noise:
<disp-formula id="pcbi.1003288.e029">
<graphic xlink:href="pcbi.1003288.e029.jpg" position="anchor" orientation="portrait"></graphic>
<label>(4)</label>
</disp-formula>
The (Bayesian) inversion of HDMs is a difficult issue, which calls for appropriate approximation schemes. To explain how the brain is nevertheless able to recognise the causes of natural sounds, we assume that it performs
<italic>approximate Bayesian inference</italic>
by minimising variational free energy
<xref rid="pcbi.1003288-Friston6" ref-type="bibr">[23]</xref>
. More generally, the free-energy principle is a mathematical framework for modelling how organisms perceive, learn, and make decisions in a parsimonious and biologically plausible fashion. In brief, it assumes that biological systems like the brain solve complex inference problems by adopting a parametric approximation
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e030.jpg"></inline-graphic>
</inline-formula>
to a posterior belief over hidden causes and states
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e031.jpg"></inline-graphic>
</inline-formula>
. It then optimises this approximation by minimizing the variational free-energy:
<disp-formula id="pcbi.1003288.e032">
<graphic xlink:href="pcbi.1003288.e032.jpg" position="anchor" orientation="portrait"></graphic>
<label>(5)</label>
</disp-formula>
One can think of this free-energy as an information theoretic measure of the discrepancy between the brain's approximate belief about the causes of sensory input and the true posterior density. According to the free-energy principle, cognitive processes and their neurophysiological mechanisms serve to minimize free-energy
<xref rid="pcbi.1003288-Friston7" ref-type="bibr">[24]</xref>
– generally by a gradient descent with respect to the sufficient statistics
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e033.jpg"></inline-graphic>
</inline-formula>
of the brain's approximate posterior
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e034.jpg"></inline-graphic>
</inline-formula>
<xref rid="pcbi.1003288-Friston3" ref-type="bibr">[5]</xref>
:
<disp-formula id="pcbi.1003288.e035">
<graphic xlink:href="pcbi.1003288.e035.jpg" position="anchor" orientation="portrait"></graphic>
<label>(6)</label>
</disp-formula>
This idea that the brain implements perceptual inference by free-energy minimization is supported by a substantial amount of anatomical, physiological, and neuroimaging evidence
<xref rid="pcbi.1003288-Friston2" ref-type="bibr">[4]</xref>
. Algorithms that invert HDMs by minimizing free-energy, such as dynamic expectation maximization
<xref rid="pcbi.1003288-Friston8" ref-type="bibr">[25]</xref>
,
<xref rid="pcbi.1003288-Friston9" ref-type="bibr">[26]</xref>
and generalized filtering (GF)
<xref rid="pcbi.1003288-Friston2" ref-type="bibr">[4]</xref>
,
<xref rid="pcbi.1003288-Friston3" ref-type="bibr">[5]</xref>
,
<xref rid="pcbi.1003288-Friston6" ref-type="bibr">[23]</xref>
,
<xref rid="pcbi.1003288-Friston10" ref-type="bibr">[27]</xref>
,
<xref rid="pcbi.1003288-Friston11" ref-type="bibr">[28]</xref>
, are therefore attractive candidates for simulating and understanding perceptual inference in the brain.</p>
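To make the gradient descent of Equation 6 concrete, consider a minimal static analogue: a single Gaussian observation under a Gaussian prior, where the posterior expectation is updated in the direction of precision-weighted prediction errors until they balance. The learning rate, precisions, and inputs below are illustrative assumptions, not values from the model.

```python
def perceptual_update(y, mu_prior, pi_y, pi_mu, lr=0.05, n_iter=200):
    """Gradient descent of a posterior expectation mu on the free energy
    F = 0.5 * (pi_y * (y - mu)**2 + pi_mu * (mu - mu_prior)**2)."""
    mu = mu_prior
    traj = []
    for _ in range(n_iter):
        eps_y = y - mu            # sensory prediction error
        eps_mu = mu - mu_prior    # prior prediction error
        # Update mu along the precision-weighted prediction-error gradient.
        mu += lr * (pi_y * eps_y - pi_mu * eps_mu)
        traj.append(mu)
    return mu, traj

# A "deviant" input y = 2 under a prior expectation of 0, equal precisions:
mu_hat, traj = perceptual_update(y=2.0, mu_prior=0.0, pi_y=1.0, pi_mu=1.0)
# Fixed point: mu* = (pi_y * y + pi_mu * mu_prior) / (pi_y + pi_mu)
```

At the fixed point the sensory and prior prediction errors balance; raising the sensory precision pulls the expectation closer to the input, which is how precision weighting shapes the simulated error dynamics.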
<p>Importantly, algorithmic implementations of this gradient descent are formally equivalent to predictive coding schemes. In brief, representations (sufficient statistics encoding approximate posterior expectations) generate top-down predictions to produce prediction errors. These prediction errors are then passed up the hierarchy in the reverse direction, to update posterior expectations. This ensures an accurate prediction of sensory input and all its intermediate representations. This hierarchical message passing can be expressed mathematically as a gradient descent on the (sum of squared) prediction errors
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e036.jpg"></inline-graphic>
</inline-formula>
which are weighted by their precisions (inverse variances)
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e037.jpg"></inline-graphic>
</inline-formula>
:
<disp-formula id="pcbi.1003288.e038">
<graphic xlink:href="pcbi.1003288.e038.jpg" position="anchor" orientation="portrait"></graphic>
<label>(6b)</label>
</disp-formula>
where
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e039.jpg"></inline-graphic>
</inline-formula>
are prediction errors and
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e040.jpg"></inline-graphic>
</inline-formula>
are their precisions (inverse variances). Here and below, the ∼ notation denotes generalised variables (state, velocity, acceleration and so on). The first pair of equalities just says that posterior expectations about hidden causes and states
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e041.jpg"></inline-graphic>
</inline-formula>
change according to a mixture of prior prediction – the first term – and an update term in the direction of the gradient of (precision-weighted) prediction error. The second pair of equations expresses precision-weighted prediction error
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e042.jpg"></inline-graphic>
</inline-formula>
as the difference between posterior expectations about hidden causes and (the changes in) hidden states and their predicted values (
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e043.jpg"></inline-graphic>
</inline-formula>
,
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e044.jpg"></inline-graphic>
</inline-formula>
), weighted by their precisions
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e045.jpg"></inline-graphic>
</inline-formula>
. The predictions are nonlinear functions of expectations at each level of the hierarchy and the level above. In what follows, this predictive coding formulation will serve to simulate perceptual recognition. We will then use prediction errors as a proxy for neuronal activity producing ERPs. To simulate neuronal processing using
<xref ref-type="disp-formula" rid="pcbi.1003288.e035">Equation 6</xref>
, we need to specify the form of the functions
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e046.jpg"></inline-graphic>
</inline-formula>
that constitute the generative model:</p>
</sec>
<sec id="s2b">
<title>The generative (auditory) model</title>
<p>To model auditory cortical responses, we assume that cortical sources embody a hierarchical model of repeated stimuli. In other words, the hierarchical structure of the auditory cortex recapitulates the hierarchical structure of sound generation (cf.
<xref rid="pcbi.1003288-Friston8" ref-type="bibr">[25]</xref>
). This hierarchical structure was modelled using the HDM illustrated in
<xref ref-type="fig" rid="pcbi-1003288-g002">Figure 2</xref>
. Note that this model was used to both generate stimuli and simulate predictive coding – assuming the brain is using the same model. The model's sensory prediction
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e047.jpg"></inline-graphic>
</inline-formula>
took the form of a vector of loudness modulated frequency channels (spectrogram) at the lowest level. The level above models temporal fluctuations in instantaneous loudness (
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e048.jpg"></inline-graphic>
</inline-formula>
) and frequency (
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e049.jpg"></inline-graphic>
</inline-formula>
). The hidden causes
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e050.jpg"></inline-graphic>
</inline-formula>
and
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e051.jpg"></inline-graphic>
</inline-formula>
of these fluctuations are produced by the highest level. These three levels of representation can be mapped onto three hierarchically organized areas of auditory cortex: primary auditory cortex (A1), lateral Heschl's gyrus, and inferior frontal gyrus.</p>
<fig id="pcbi-1003288-g002" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pcbi.1003288.g002</object-id>
<label>Figure 2</label>
<caption>
<title>Hierarchical dynamical model of stimulus generation.</title>
<p>This figure shows the form of the hierarchical dynamic model used to generate and subsequently recognise stimuli. The sensory input (
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e052.jpg"></inline-graphic>
</inline-formula>
) is modelled as a vector of amplitude-modulated frequency channels
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e053.jpg"></inline-graphic>
</inline-formula>
whose values are nonlinear functions of the hidden states
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e054.jpg"></inline-graphic>
</inline-formula>
plus observation noise. The hidden states represent the instantaneous loudness (
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e055.jpg"></inline-graphic>
</inline-formula>
and
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e056.jpg"></inline-graphic>
</inline-formula>
) and frequency (
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e057.jpg"></inline-graphic>
</inline-formula>
). The temporal evolution of these hidden states is determined by a nonlinear random differential equation that is driven by hidden causes (
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e058.jpg"></inline-graphic>
</inline-formula>
). The mean of the subject's belief (posterior expectation) about hidden causes and states is denoted by
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e059.jpg"></inline-graphic>
</inline-formula>
. The tilde denotes variables in generalised coordinates of motion.</p>
</caption>
<graphic xlink:href="pcbi.1003288.g002"></graphic>
</fig>
<p>Both A1 and lateral Heschl's gyrus contain neuronal units encoding posterior expectations and prediction errors. The activity of the expectation units encodes the time course of
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e060.jpg"></inline-graphic>
</inline-formula>
for A1 and expectations about hidden states
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e061.jpg"></inline-graphic>
</inline-formula>
for Heschl's gyrus. Error units encode prediction error, i.e. the difference between posterior expectations and top-down predictions. Top-down connections therefore convey predictions, whereas bottom-up connections convey prediction errors. Expectations about the hidden causes
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e062.jpg"></inline-graphic>
</inline-formula>
are conveyed by top-down projections from units in inferior frontal gyrus.</p>
<p>Our model respects the tonotopic organization of primary auditory cortex (see e.g.
<xref rid="pcbi.1003288-Friston9" ref-type="bibr">[26]</xref>
) by considering 50 frequency channels
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e063.jpg"></inline-graphic>
</inline-formula>
. It also captures the fact that, while most neurons in A1 have a preferred frequency, their response also increases with loudness
<xref rid="pcbi.1003288-BalaguerBallester1" ref-type="bibr">[29]</xref>–
<xref rid="pcbi.1003288-Rauschecker1" ref-type="bibr">[31]</xref>
. Specifically, we assume that the activity
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e064.jpg"></inline-graphic>
</inline-formula>
of neurons selective for frequency
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e065.jpg"></inline-graphic>
</inline-formula>
is given by:
<disp-formula id="pcbi.1003288.e066">
<graphic xlink:href="pcbi.1003288.e066.jpg" position="anchor" orientation="portrait"></graphic>
<label>(7)</label>
</disp-formula>
We can rewrite this equation in terms of the loudness
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e067.jpg"></inline-graphic>
</inline-formula>
and a tuning function
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e068.jpg"></inline-graphic>
</inline-formula>
that measures how close the log-frequency
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e069.jpg"></inline-graphic>
</inline-formula>
is to the neuron's preferred log-frequency
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e070.jpg"></inline-graphic>
</inline-formula>
:
<disp-formula id="pcbi.1003288.e071">
<graphic xlink:href="pcbi.1003288.e071.jpg" position="anchor" orientation="portrait"></graphic>
<label>(8)</label>
</disp-formula>
This is our (perceptual) model of how frequency and loudness are encoded by frequency-selective neurons in primary auditory cortex. We use it to simulate the activity of A1 neurons.</p>
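As a sketch of Equations 7 and 8, the response of each frequency channel can be written as the instantaneous loudness weighted by a tuning function of the distance between the stimulus log-frequency and the channel's preferred log-frequency. The Gaussian tuning shape, its width, and the channel range below are illustrative assumptions, not parameter values from the paper:

```python
import numpy as np

def a1_channel_activity(loudness, freq_hz, preferred_freqs_hz, sigma=0.1):
    """Sketch of Equations 7-8: each frequency-selective A1 channel responds
    with the instantaneous loudness weighted by a tuning function of the
    distance between the stimulus log-frequency and the channel's preferred
    log-frequency. Gaussian shape and width sigma are assumptions."""
    log_f = np.log(freq_hz)
    log_pref = np.log(preferred_freqs_hz)
    tuning = np.exp(-0.5 * ((log_f - log_pref) / sigma) ** 2)
    return loudness * tuning

# 50 log-spaced channels, mirroring the model's tonotopic map
channels = np.logspace(np.log10(500.0), np.log10(2000.0), 50)
y = a1_channel_activity(loudness=1.0, freq_hz=1000.0, preferred_freqs_hz=channels)
```

For a pure tone, the resulting activity profile peaks at the channel whose preferred frequency is closest to the stimulus frequency and scales linearly with loudness.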
<p>Note that a neuronal representation of
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e072.jpg"></inline-graphic>
</inline-formula>
depends only on frequency. In the brain, frequency representations that are invariant to the sound level (and other sound attributes) are found in higher auditory areas; for instance in marmoset auditory cortex
<xref rid="pcbi.1003288-Formisano1" ref-type="bibr">[32]</xref>
. Neuroimaging in humans suggests that periodicity is represented in lateral Heschl's gyrus and planum temporale
<xref rid="pcbi.1003288-Schnupp1" ref-type="bibr">[33]</xref>
, and LFP recordings from patients again implicate lateral Heschl's gyrus
<xref rid="pcbi.1003288-Bendor1" ref-type="bibr">[34]</xref>
. We therefore assume that
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e073.jpg"></inline-graphic>
</inline-formula>
is represented in lateral Heschl's gyrus. The dynamics of the instantaneous frequency
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e074.jpg"></inline-graphic>
</inline-formula>
is given by
<disp-formula id="pcbi.1003288.e075">
<graphic xlink:href="pcbi.1003288.e075.jpg" position="anchor" orientation="portrait"></graphic>
<label>(9)</label>
</disp-formula>
This equation says that the instantaneous frequency converges towards the current target frequency
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e076.jpg"></inline-graphic>
</inline-formula>
at a rate of
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e077.jpg"></inline-graphic>
</inline-formula>
. In the context of communication, one can think of the target frequency as the frequency that an agent intends to generate, where the instantaneous frequency
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e078.jpg"></inline-graphic>
</inline-formula>
is the frequency that is currently being produced. The motivation for this is that deviations from the target frequency will be corrected dynamically over time. The agent's belief about
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e079.jpg"></inline-graphic>
</inline-formula>
reflects its expectation about the frequency of the perceived tone and its subjective certainty or confidence about that expectation. Therefore, the effect of the deviant probability – in an oddball paradigm – can be modelled via the precision of this prior belief.</p>
<p>The temporal evolution of the hidden states
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e080.jpg"></inline-graphic>
</inline-formula>
and
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e081.jpg"></inline-graphic>
</inline-formula>
(encoding loudness) was modelled with the following linear dynamical system:
<disp-formula id="pcbi.1003288.e082">
<graphic xlink:href="pcbi.1003288.e082.jpg" position="anchor" orientation="portrait"></graphic>
<label>(10)</label>
</disp-formula>
In this equation the first hidden cause
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e083.jpg"></inline-graphic>
</inline-formula>
drives the dynamics of the hidden states, which spiral (decay) towards zero in its absence. Finally, our model makes the realistic assumption that the stochastic perturbations are smooth functions of time. This is achieved by assuming that the derivatives of the stochastic perturbations are drawn from a multivariate Gaussian with zero mean:
<disp-formula id="pcbi.1003288.e084">
<graphic xlink:href="pcbi.1003288.e084.jpg" position="anchor" orientation="portrait"></graphic>
<label>(11)</label>
</disp-formula>
The parameters of this model were chosen according to the biological and psychological considerations explained in Supplementary
<xref ref-type="supplementary-material" rid="pcbi.1003288.s001">Text S1</xref>
.</p>
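The spiralling decay described for Equation 10 can be illustrated with a two-dimensional linear system whose coupling matrix has complex eigenvalues with negative real part. The specific matrix entries, initial state, and integration settings below are illustrative assumptions, not the model's fitted parameters:

```python
import numpy as np

# Any matrix with complex eigenvalues of negative real part produces the
# spiralling decay towards zero described in the text; this one combines
# a decay rate of 1 with a rotation rate of 4 (illustrative values).
A = np.array([[-1.0,  4.0],
              [-4.0, -1.0]])

def simulate_loudness_states(v1, dt=0.001, T=5.0):
    """Euler integration of a linear system in the spirit of Equation 10:
    the hidden (loudness) states are driven by the hidden cause v1 and
    spiral towards zero in its absence."""
    n = int(T / dt)
    x = np.zeros((n, 2))
    x[0] = [1.0, 0.0]  # initial perturbation
    for t in range(1, n):
        drive = np.array([v1(t * dt), 0.0])
        x[t] = x[t - 1] + dt * (A @ x[t - 1] + drive)
    return x

# With no input, the states rotate and decay (spiral) towards zero
x = simulate_loudness_states(lambda t: 0.0)
```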
</sec>
<sec id="s2c">
<title>Modelling perception</title>
<p>Having posited the relevant part of the generative model embodied by auditory cortex, one can now proceed to its inversion by the Bayesian generalized filtering scheme described in
<xref ref-type="disp-formula" rid="pcbi.1003288.e035">Equation 6</xref>
. This is the focus of the next section, which recapitulates how auditory cortex might perceive sound frequency and amplitude using predictive coding mechanisms, given the above hierarchical dynamic model.</p>
<sec id="s2c1">
<title>Perception as model inversion by generalised filtering</title>
<p>Generalised filtering or predictive coding (
<xref ref-type="disp-formula" rid="pcbi.1003288.e035">Equation 6</xref>
) provides a process model of how auditory cortex might invert the model above, yielding posterior estimates of (hidden) sensory causes
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e085.jpg"></inline-graphic>
</inline-formula>
from their noisy consequences
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e086.jpg"></inline-graphic>
</inline-formula>
. Generalised filtering (GF)
<xref rid="pcbi.1003288-Patterson1" ref-type="bibr">[35]</xref>
,
<xref rid="pcbi.1003288-Hall1" ref-type="bibr">[36]</xref>
is a computationally efficient scheme for variational Bayesian inference on hierarchical dynamical systems. This makes it a likely candidate mechanism for typical recognition problems that the brain solves when perceiving stimulus sequences.</p>
<p>Generalised filtering effectively updates posterior expectations by accumulating evidence over time. Since it is well known that neuronal population activity integrates inputs in a similar way
<xref rid="pcbi.1003288-Schnwiesner1" ref-type="bibr">[37]</xref>
, we take generalised filtering as a model of neuronal evidence accumulation or predictive coding (cf.
<xref rid="pcbi.1003288-Friston9" ref-type="bibr">[26]</xref>
). The neuronal implementation of this filtering is based on the anatomy of cortical microcircuits and rests on the interaction between error units and expectation units implicit in
<xref ref-type="disp-formula" rid="pcbi.1003288.e035">Equation 6</xref>
. Irrespective of the neuronal details of the implementation, prediction error units are likely to play a key role, because (precision weighted) prediction errors determine the free-energy gradients that update posterior beliefs about hidden states. It has been argued that prediction error units correspond to pyramidal neurons in the superficial layers of cortex
<xref rid="pcbi.1003288-Koulakov1" ref-type="bibr">[38]</xref>
. Since these neurons are the primary source of local field potentials (LFP) and EEG signals, the time course of prediction errors can – in principle – be used to model event related potentials such as the MMN.</p>
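To convey the gist of this evidence accumulation, the following is a minimal static analogue of the generalised-filtering update: a posterior expectation follows a free-energy gradient given by precision-weighted prediction errors from the level below (sensory) and the level above (prior). The learning rate, iteration count, and scalar setting are illustrative simplifications; the full scheme operates on hierarchical dynamics in generalised coordinates:

```python
def predictive_coding_estimate(y, prior_mu, pi_prior, pi_sensory,
                               lr=0.05, n_iter=500):
    """Minimal static analogue of the generalised-filtering update: the
    posterior expectation mu ascends a free-energy gradient composed of
    precision-weighted prediction errors from below (sensory) and above
    (prior). All numeric settings here are illustrative assumptions."""
    mu = prior_mu
    for _ in range(n_iter):
        eps_y = y - mu          # sensory prediction error
        eps_p = mu - prior_mu   # prior prediction error
        mu += lr * (pi_sensory * eps_y - pi_prior * eps_p)
    return mu

# A precise prior (high pi_prior) pulls the estimate towards the standard,
# mimicking the effect of a low deviant probability:
mu_precise = predictive_coding_estimate(y=1.2, prior_mu=1.0,
                                        pi_prior=4.0, pi_sensory=1.0)
mu_vague = predictive_coding_estimate(y=1.2, prior_mu=1.0,
                                      pi_prior=0.25, pi_sensory=1.0)
```

The estimate converges to the precision-weighted average of prior mean and sensory input; the more precise the prior, the larger the residual sensory prediction error for a fixed deviant input.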
</sec>
<sec id="s2c2">
<title>Modelling expectations and perception in MMN experiments</title>
<p>To simulate how MMN features (such as amplitude and latency) depend upon deviant probability and magnitude, we assumed that the subject has heard a sequence of standard stimuli (presented at regular intervals) and therefore expects the next stimulus to be a standard. Under Gaussian assumptions this prior belief is fully characterized by its mean – the expected attributes of the anticipated stimulus – and precision (inverse variance). The precision determines the subject's certainty about the sound it expects to hear; in other words, the subjective probability that the stimulus will have the attributes of a standard. This means one can use the expected precision to model the effect of the deviant probability in oddball paradigms – as well as the effects of the number of preceding standards. The effect of deviance magnitude was simulated by varying the difference between the expected and observed frequency. Sensory inputs to A1 were spectrograms generated by sampling from the hierarchical dynamic model described in the previous section (
<xref ref-type="fig" rid="pcbi-1003288-g002">Figure 2</xref>
). First, the hidden cause at the 2
<sup>nd</sup>
level, i.e. the target log-frequency
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e087.jpg"></inline-graphic>
</inline-formula>
, was sampled from a normal distribution; for standards this distribution was centred on the standard frequency and for deviants it was centred on the standard frequency plus the deviance magnitude. Then the sensory input (
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e088.jpg"></inline-graphic>
</inline-formula>
) was generated by integrating the HDM's random differential equations with
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e089.jpg"></inline-graphic>
</inline-formula>
equal to the sampled target frequency. All simulated sensory inputs were generated with low levels of noise, i.e. the precisions were set to
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e090.jpg"></inline-graphic>
</inline-formula>
. The subject's probabilistic expectation was modelled by a Gaussian prior on the target log-frequency
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e091.jpg"></inline-graphic>
</inline-formula>
. Perception was simulated with generalised filtering of the ensuing sensory input. The generative model of the subject was identical to the model used to generate the inputs, except for the prior belief about the target frequency. The prior belief about the target frequency models prior expectations established by the preceding events, where the mean was set to the standard frequency – and its precision was set according to the deviant probability of the simulated oddball experiment: see
<xref ref-type="supplementary-material" rid="pcbi.1003288.s001">Text S1</xref>
. The noise precisions were chosen to reflect epistemic uncertainty about the process generating the sensory inputs: see
<xref ref-type="supplementary-material" rid="pcbi.1003288.s001">Text S1</xref>
. Note that since we are dealing with dynamic generative models, the prior belief is not just about the initial value, but about the entire trajectory of the target frequency.</p>
<p>
<xref ref-type="fig" rid="pcbi-1003288-g003">Figure 3</xref>
shows an example of stimulus generation and recognition. This figure shows that the predictive coding scheme correctly inferred the frequency of the tone. In these simulations, the loudness of the stimulus was modulated by a Gaussian bump function that peaks at about 70 ms and has a standard deviation of about 30 ms. The sensory evidence is therefore only transient, whereas prior beliefs are in place before, during, and after sensory evidence is available. As a consequence, the inferred target frequency drops back to the prior mean, when sensory input ceases. Although we are now in a position to simulate neuronal responses to standard and oddball stimuli, we still have to complete the model of observed electromagnetic responses:</p>
<fig id="pcbi-1003288-g003" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pcbi.1003288.g003</object-id>
<label>Figure 3</label>
<caption>
<title>Simulation of perceptual inference.</title>
<p>This figure shows the simulated time course of the perceived frequency for four different deviants. The expected frequency was
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e092.jpg"></inline-graphic>
</inline-formula>
and the frequency of the simulated deviant varied between
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e093.jpg"></inline-graphic>
</inline-formula>
and
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e094.jpg"></inline-graphic>
</inline-formula>
. The simulated recognition scheme correctly inferred the deviant frequency, despite its discrepancy with the prior expectation. The prior certainty was chosen to correspond to a deviant probability of 0.05.</p>
</caption>
<graphic xlink:href="pcbi.1003288.g003"></graphic>
</fig>
</sec>
</sec>
<sec id="s2d">
<title>From prediction errors to ERPs</title>
<p>The production of the MMN from prediction errors was modelled as a two-stage process: the generation of scalp potentials from neuronal responses and subsequent data processing (see
<xref ref-type="fig" rid="pcbi-1003288-g001">Figure 1</xref>
). We modelled the scalp potentials (at one fronto-central electrode) as the linear superposition of electromagnetic fields caused by the activity of prediction error units in the three simulated cortical sources – plus background activity. Specifically, prediction error units in the A1 source are assumed to encode
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e095.jpg"></inline-graphic>
</inline-formula>
– the precision weighted sensory error; error units in lateral Heschl's gyrus were assumed to encode
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e096.jpg"></inline-graphic>
</inline-formula>
– the precision weighted errors in the motion of hidden (log-frequency and amplitude) states; and prediction error units in the inferior frontal gyrus were assumed to encode
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e097.jpg"></inline-graphic>
</inline-formula>
– the precision weighted errors in their inferred causes. The prediction errors were transformed into event related potentials by three transformations. First, the time axis was shifted (to accommodate conduction delays from the ear) and scaled so that the simulated stimulus duration was 70 ms. Second, a sigmoidal transformation was applied to capture the presumably non-linear mappings from signed precision-weighted prediction error to neural activity (i.e. the firing rate cannot be negative and saturates for high prediction error) and from neuronal activity to equivalent current dipole activity; these first two steps are summarized by
<disp-formula id="pcbi.1003288.e098">
<graphic xlink:href="pcbi.1003288.e098.jpg" position="anchor" orientation="portrait"></graphic>
<label>(12)</label>
</disp-formula>
Finally, the scalp potential
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e099.jpg"></inline-graphic>
</inline-formula>
is simulated with a linear combination of the three local field potentials
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e100.jpg"></inline-graphic>
</inline-formula>
plus a constant:
<disp-formula id="pcbi.1003288.e101">
<graphic xlink:href="pcbi.1003288.e101.jpg" position="anchor" orientation="portrait"></graphic>
<label>(13)</label>
</disp-formula>
Data processing was simulated by down-sampling to 200 Hz and applying a 3
<sup>rd</sup>
order Butterworth low-pass filter with a cut-off frequency of 40 Hz, cf.
<xref rid="pcbi.1003288-Friston4" ref-type="bibr">[6]</xref>
,
<xref rid="pcbi.1003288-Friston6" ref-type="bibr">[23]</xref>
,
<xref rid="pcbi.1003288-Friston11" ref-type="bibr">[28]</xref>
,
<xref rid="pcbi.1003288-Kiebel2" ref-type="bibr">[39]</xref>
. We performed two simulations for each condition. In the first simulation the subject expected stimulus A but was presented with stimulus B (deviant). In the second simulation, the subject expected stimulus B and was presented with stimulus B (standard). The MMN was estimated by the difference wave (deviant ERP – standard ERP). This procedure reproduces the analysis used in electrophysiology
<xref rid="pcbi.1003288-Friston5" ref-type="bibr">[7]</xref>
,
<xref rid="pcbi.1003288-Bastos1" ref-type="bibr">[40]</xref>
.</p>
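The sigmoidal transformation (Equation 12) and the lead-field mixing (Equation 13) can be sketched as follows. The toy prediction-error time courses, slope values, lead-field weights, and offset are illustrative stand-ins for the parameters fitted to data in the paper:

```python
import numpy as np

def sigmoid(x, slope):
    """Equation 12 sketch: non-negative, saturating mapping from signed
    precision-weighted prediction error to local dipole activity."""
    return 1.0 / (1.0 + np.exp(-slope * x))

def scalp_potential(pe_sources, slopes, lead_field, offset):
    """Equation 13 sketch: the LFP of each simulated source is a sigmoid of
    its prediction error; the scalp channel is their weighted sum plus a
    constant. All parameter values passed below are illustrative."""
    lfps = [sigmoid(pe, s) for pe, s in zip(pe_sources, slopes)]
    return lead_field @ np.vstack(lfps) + offset

t = np.linspace(0.0, 0.4, 400)  # 400 ms of peristimulus time
# Toy prediction-error time courses for the three simulated sources
pe = [np.exp(-((t - 0.15) / 0.03) ** 2) * a for a in (1.0, 0.6, 0.3)]
v = scalp_potential(pe, slopes=(4.0, 4.0, 4.0),
                    lead_field=np.array([-1.0, -0.8, -0.5]), offset=0.2)

v_ds = v[::5]  # crude down-sampling from 1 kHz to 200 Hz
```

Applying a 3rd-order Butterworth low-pass at 40 Hz (e.g. scipy.signal.butter followed by filtfilt) to the down-sampled trace would complete the simulated preprocessing described in the text.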
<p>This completes the specification of our computationally informed dynamic causal model of the MMN.</p>
<p>To explore the predictions of this model under different levels of deviant probability and magnitude, we first estimated the biophysical parameters (i.e. the slope parameters
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e102.jpg"></inline-graphic>
</inline-formula>
in (12) and the lead field
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e103.jpg"></inline-graphic>
</inline-formula>
in (13)) from the empirical ERPs described in
<xref rid="pcbi.1003288-Garrido3" ref-type="bibr">[19]</xref>
, using standard nonlinear least-squares techniques (i.e. the GlobalSearch algorithm
<xref rid="pcbi.1003288-Peter1" ref-type="bibr">[41]</xref>
from the Matlab Global Optimization toolbox). We then used the estimated parameters to predict the MMN under different combinations of deviant probability and magnitude.</p>
<p>In particular, the simulated MMN waveforms were used to reproduce the descriptive statistics typically reported in MMN experiments, i.e. MMN amplitude and latency. MMN latency was estimated by the fractional area technique
<xref rid="pcbi.1003288-Garrido3" ref-type="bibr">[19]</xref>
, because it is regarded as one of the most robust methods for measuring ERP latencies
<xref rid="pcbi.1003288-Ugray1" ref-type="bibr">[42]</xref>
. Specifically, we estimated the MMN latency as the time point at which 50% of the area of the MMN trough lies on either side. The trough area was computed on the difference wave between the first and last points at which the amplitude was at least half the MMN amplitude, and this analysis was performed on the unfiltered MMN waveforms as recommended by
<xref rid="pcbi.1003288-Luck1" ref-type="bibr">[43]</xref>
. MMN amplitude was estimated by the average voltage of the low-pass filtered MMN difference wave within a ±10 ms window around the estimated latency.</p>
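The fractional-area latency and the amplitude measure can be sketched on a toy MMN-like trough as follows; the Gaussian shape and timing of the test wave are illustrative assumptions:

```python
import numpy as np

def fractional_area_latency(t, wave):
    """Fractional-area latency sketch: the time point splitting the area of
    the MMN trough into two equal halves. The trough is delimited by the
    first and last samples whose negativity reaches half the peak
    amplitude, as described in the text."""
    peak = wave.min()                      # the MMN is a negativity
    idx = np.where(wave <= 0.5 * peak)[0]  # samples at >= half amplitude
    lo, hi = idx[0], idx[-1]
    area = np.cumsum(-wave[lo:hi + 1])     # cumulative trough area
    half = np.searchsorted(area, 0.5 * area[-1])
    return t[lo + half]

def mmn_amplitude(t, wave, latency, window=0.010):
    """Mean voltage within +/-10 ms of the estimated latency."""
    mask = np.abs(t - latency) <= window
    return wave[mask].mean()

t = np.linspace(0.0, 0.4, 401)
wave = -2.7 * np.exp(-((t - 0.165) / 0.045) ** 2)  # toy MMN-like trough
lat = fractional_area_latency(t, wave)
amp = mmn_amplitude(t, wave, lat)
```

For a symmetric trough the fractional-area latency coincides with the trough minimum, but unlike a simple peak picker it remains stable for noisy or flat-bottomed waveforms, which is why it is preferred for ERP latencies.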
</sec>
</sec>
<sec id="s3">
<title>Results</title>
<sec id="s3a">
<title>Simulated ERPs</title>
<p>
<xref ref-type="fig" rid="pcbi-1003288-g004">Figure 4</xref>
shows that the waveforms generated by our model reproduce the characteristic shape of the MMN, the positivity evoked by the standard and the negativity evoked by the deviant. The latency of the simulated MMN (164 ms) was almost identical to the latency of the empirical MMN (166 ms). Its peak amplitude (−2.71 µV) was slightly larger than that of the empirically measured MMN (
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e104.jpg"></inline-graphic>
</inline-formula>
), and its width at half-maximum amplitude (106 ms) was also very similar to the width of the empirical MMN waveform (96 ms). In short, having optimised the parameters mapping from the simulated neuronal activity to empirically observed responses, we were able to reproduce empirical MMNs remarkably accurately. This is nontrivial because the underlying neuronal dynamics are effectively solving a very difficult Bayesian model inversion or filtering problem. Using these optimised parameters, we proceeded to quantify how the MMN waveform would change with deviance magnitude and probability.</p>
<fig id="pcbi-1003288-g004" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pcbi.1003288.g004</object-id>
<label>Figure 4</label>
<caption>
<title>Simulated ERPs vs. empirical ERPs.</title>
<p>This figure compares the simulated ERPs evoked by the standard and the deviant, and their difference – the MMN – to the empirical ERPs from
<xref rid="pcbi.1003288-Umbricht2" ref-type="bibr">[70]</xref>
,
<xref rid="pcbi.1003288-Schmidt1" ref-type="bibr">[71]</xref>
to which the model was fitted. The simulation captures both the positivity evoked by the standard and the negativity evoked by the deviant.</p>
</caption>
<graphic xlink:href="pcbi.1003288.g004"></graphic>
</fig>
<p>To simulate the effect of deviant probability, we simulated the responses to a deviant under different degrees of prior certainty. To simulate the effect of deviance magnitude, we varied the discrepancy between the expected and observed frequency, while keeping the deviant probability constant. Finally, we investigated potential interactions between deviance magnitude and deviant probability by simulating the effect of magnitude under different prior certainties and
<italic>vice versa</italic>
.</p>
<sec id="s3a1">
<title>Qualitative comparisons to empirical data</title>
<p>To establish the model's face validity, we asked whether it could replicate the empirically established qualitative effects of deviant probability and magnitude summarized in
<xref ref-type="table" rid="pcbi-1003288-t001">Table 1</xref>
.
<xref ref-type="fig" rid="pcbi-1003288-g005">Figure 5a</xref>
shows the simulated effects of deviance magnitude on the MMN for a deviant probability of 0.05. As the deviance magnitude increases from 2% to 32%, the MMN trough deepens. Interestingly, this deepening is not uniform across peristimulus time but is more pronounced at the beginning. In effect, the shape of the MMN changes, such that an early peak emerges and the MMN latency decreases. The effects of deviance magnitude on MMN peak amplitude and latency hold irrespective of the deviant probability: see
<xref ref-type="fig" rid="pcbi-1003288-g006">Figure 6</xref>
. In short, our model correctly predicts the empirical effects of deviance magnitude on MMN amplitude and latency (
<xref ref-type="table" rid="pcbi-1003288-t001">Table 1</xref>
).</p>
<fig id="pcbi-1003288-g005" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pcbi.1003288.g005</object-id>
<label>Figure 5</label>
<caption>
<title>Simulated effects of deviance magnitude and deviant probability.</title>
<p>This figure shows the simulated effect of deviance magnitude (panel A) and deviant probability (panel B) on the MMN waveform. As the deviance magnitude increases, the trough becomes deeper and wider and an early peak emerges (panel A). As deviant probability is decreased, the depth of the MMN's trough increases, whereas its latency does not change (panel B). In panel A, the standard frequency was 1000 Hz, the corresponding deviance frequencies were 1020 Hz, 1040 Hz, 1270 Hz, and 1320 Hz, and the simulated deviant probability was 0.05. In panel B, the deviance magnitude was 12.7% (standard: 1000 Hz, deviant 1270 Hz).</p>
</caption>
<graphic xlink:href="pcbi.1003288.g005"></graphic>
</fig>
<fig id="pcbi-1003288-g006" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pcbi.1003288.g006</object-id>
<label>Figure 6</label>
<caption>
<title>Simulated MMN phenomenology.</title>
<p>Our simulations predict that deviance magnitude increases the MMN peak amplitude and shortens its latency. Furthermore, our simulations suggest that when the deviant probability is decreased, the peak amplitude increases, while its latency does not change. The deviance magnitude is specified relative to the standard frequency of 1000 Hz.</p>
</caption>
<graphic xlink:href="pcbi.1003288.g006"></graphic>
</fig>
<p>
<xref ref-type="fig" rid="pcbi-1003288-g005">Figure 5b</xref>
shows the effect of deviant probability on the MMN for a deviance magnitude of 12.7%. As the probability of a deviant decreases, the MMN trough deepens, but its shape and centre remain unchanged. As with empirical findings (
<xref ref-type="table" rid="pcbi-1003288-t001">Table 1</xref>
), our simulations suggest that the amplitude of the MMN's peak increases with decreasing deviant probability, but its latency is unaffected.
<xref ref-type="fig" rid="pcbi-1003288-g006">Figure 6</xref>
summarizes the peak amplitudes and latencies of the simulated MMN as a function of deviance magnitude and probability. As the upper plot shows, the MMN peak amplitude increases with deviance magnitude and decreases with deviant probability. Furthermore, deviance magnitude appears to amplify the effect of deviant probability and vice versa. The lower plot shows that the MMN latency is shorter when the deviance magnitude is 32% than when it is 12.7%. These results also suggest that deviant probability has no systematic effect on MMN latency when the deviance magnitude is at most 12.7% and the deviant probability is below 40%. However, they predict that the MMN latency shortens with decreasing deviant probability when the deviance magnitude is increased to 32% or the deviant probability is increased to 40%.</p>
<p>Furthermore, our model predicts that MMN amplitude is higher when the deviant is embedded in a stream of standards (deviant condition) than when the same tone is embedded in a random sequence of equiprobable tones (control condition)
<xref rid="pcbi.1003288-Woodman1" ref-type="bibr">[44]</xref>
,
<xref rid="pcbi.1003288-Kiesel1" ref-type="bibr">[45]</xref>
: In the control condition – with its equiprobable tones – the trial-wise prediction about the target frequency is necessarily less precise. As a result, the neural activity encoding the precision weighted prediction error about the target frequency will be lower, so that the deviant negativity will be reduced relative to the deviant condition. This phenomenon cannot be explained by spike-frequency adaptation in narrow frequency channels [44], but see
<xref rid="pcbi.1003288-Schrger1" ref-type="bibr">[46]</xref>
-
<xref rid="pcbi.1003288-Horvth1" ref-type="bibr">[50]</xref>
for a demonstration that it can be explained by synaptic depression.</p>
</sec>
<sec id="s3a2">
<title>Quantitative comparisons to empirical data</title>
<p>Having established that the model reproduces the effects of deviant probability and magnitude on MMN amplitude and latency in a qualitative sense, we went one step further and assessed quantitative predictions. For this purpose, we simulated three MMN experiments and reproduced the analyses reported in the corresponding empirical studies. We found that the effects of deviance magnitude and probability on the MMN peak amplitude matched the empirical data of
<xref rid="pcbi.1003288-Taaseh1" ref-type="bibr">[51]</xref>
and
<xref rid="pcbi.1003288-Mill1" ref-type="bibr">[52]</xref>
not only qualitatively but also quantitatively (see
<xref ref-type="fig" rid="pcbi-1003288-g007">Figure 7a</xref>
). Our model explained 93.6% of the variance due to deviance magnitude reported in
<xref rid="pcbi.1003288-Tiitinen1" ref-type="bibr">[9]</xref>
(
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e105.jpg"></inline-graphic>
</inline-formula>
) and 93.2% of the variance due to deviant probability reported in
<xref rid="pcbi.1003288-Sinkkonen1" ref-type="bibr">[10]</xref>
(
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e106.jpg"></inline-graphic>
</inline-formula>
). Furthermore, we simulated two experiments that investigated how the MMN latency depends on deviance magnitude
<xref rid="pcbi.1003288-Tiitinen1" ref-type="bibr">[9]</xref>
and probability
<xref rid="pcbi.1003288-Sinkkonen1" ref-type="bibr">[10]</xref>
(see
<xref ref-type="fig" rid="pcbi-1003288-g007">Figure 7b</xref>
). The model correctly predicted the absence of an effect of deviant probability on MMN latency in a study where the deviance magnitude was 20%
<xref rid="pcbi.1003288-Tiitinen1" ref-type="bibr">[9]</xref>
. While our model predicted that the MMN latency is shorter for high deviance magnitudes than for low deviance magnitudes, it also predicted a sharp transition between long MMN latencies (195 ms) for deviance magnitudes up to 12.7% and a substantially shorter MMN latency (125 ms) for a deviance magnitude of 32%. By contrast, the results reported in
<xref rid="pcbi.1003288-Javitt1" ref-type="bibr">[11]</xref>
appear to suggest a gradual transition between long and short MMN latencies. In effect, the model's predictions explained only 51.9% of the variance of MMN latency as a function of deviance magnitude
<xref rid="pcbi.1003288-Javitt1" ref-type="bibr">[11]</xref>
(
<inline-formula>
<inline-graphic xlink:href="pcbi.1003288.e107.jpg"></inline-graphic>
</inline-formula>
).</p>
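The "percentage of variance explained" used above is the coefficient of determination between the empirical mean amplitudes and the model's predictions. A minimal sketch of that computation, using invented numbers rather than the data of the cited studies:

```python
import numpy as np

def variance_explained(empirical, predicted):
    """Coefficient of determination: R^2 = 1 - SS_res / SS_tot."""
    empirical = np.asarray(empirical, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    ss_res = np.sum((empirical - predicted) ** 2)          # residual variance
    ss_tot = np.sum((empirical - empirical.mean()) ** 2)   # total variance
    return 1.0 - ss_res / ss_tot

# Illustrative (invented) MMN peak amplitudes in microvolts for four
# deviance magnitudes: empirical means vs. model predictions.
empirical = [-1.0, -2.1, -3.0, -3.8]
predicted = [-1.1, -2.0, -3.2, -3.7]
r2 = variance_explained(empirical, predicted)
```

A perfect fit gives R^2 = 1; the measure penalizes any systematic deviation of the predicted from the empirical means.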
<fig id="pcbi-1003288-g007" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pcbi.1003288.g007</object-id>
<label>Figure 7</label>
<caption>
<title>Quantitative model fit of MMN amplitude and latency.</title>
<p>This figure compares predictions about the MMN amplitude (panel A) and latency (panel B) with empirical data from auditory oddball experiments. The upper plot in panel A is based on
<xref rid="pcbi.1003288-Baldeweg3" ref-type="bibr">[72]</xref>
, where deviance magnitude was varied for a fixed deviant probability of 0.05. The lower plot in panel A is based on
<xref rid="pcbi.1003288-Garrido3" ref-type="bibr">[19]</xref>
, where deviant probability was varied for a fixed deviance magnitude of 15% (deviant frequency: 1150 Hz, standard frequency: 1000 Hz). The upper plot in panel B is based on the same experiment
<xref rid="pcbi.1003288-Tiitinen1" ref-type="bibr">[9]</xref>
as the upper plot in panel A. The lower plot in panel B is based on
<xref rid="pcbi.1003288-Sinkkonen1" ref-type="bibr">[10]</xref>
, where deviant probability was varied with a fixed deviance magnitude of 20% (deviant frequency 1200 Hz, standard frequency 1000 Hz). The error bars indicate the standard error of the mean.</p>
</caption>
<graphic xlink:href="pcbi.1003288.g007"></graphic>
</fig>
</sec>
</sec>
</sec>
<sec id="s4">
<title>Discussion</title>
<p>We have described a process model of the MMN and its dependence on deviant stimulus (deviance magnitude) and context (deviant probability). Together with the study presented in
<xref rid="pcbi.1003288-Tiitinen1" ref-type="bibr">[9]</xref>
, this work demonstrates the potential of predictive coding to provide a comprehensive explanation of MMN phenomenology. More precisely, our model explains the effects of deviant probability and magnitude on the MMN amplitude under the assumption that evoked responses reflect the neuronal encoding of (precision weighted) prediction errors. The simulated MMN was a superposition of the electrical fields generated by prediction errors at different hierarchical levels of representation (see
<xref ref-type="fig" rid="pcbi-1003288-g002">Figure 2</xref>
), where their relative contributions (i.e. the coefficients in equation (13)) differed: the errors in the predictions at the highest level of representation (inferior frontal gyrus) were weighted most strongly, followed by prediction errors at the sensory level (A1) and prediction errors at the intermediate level (lateral Heschl's gyrus). As a result, the simulated MMN primarily reflected prediction errors on the hidden causes (attributes), rather than prediction errors on their physical features.</p>
<p>Our model offers a simple explanation as to why the MMN amplitude decreases with deviant probability and increases with deviance magnitude. Precision weighted prediction errors are the product of a prediction error and the precision of the top-down prediction. Hence, according to our model, deviance magnitude increases the MMN amplitude because it increases prediction errors. Similarly, decreasing the probability of the deviant increases the MMN amplitude by increasing the precision of the (learned) top-down predictions. Furthermore, since precision and prediction error interact multiplicatively, the precision determines the gain of the effect of prediction error and
<italic>vice versa</italic>
.</p>
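This multiplicative interaction can be made concrete with a toy calculation. In the sketch below, the functional forms and constants are illustrative assumptions (not the model's actual equations): the prediction error grows with deviance magnitude, and the precision of the learned top-down prediction grows as the deviant becomes rarer.

```python
def precision_weighted_pe(deviance_magnitude, deviant_probability):
    """Toy precision-weighted prediction error (illustrative only).

    deviance_magnitude: frequency difference in % of the standard tone.
    deviant_probability: relative frequency of the deviant tone.
    """
    prediction_error = deviance_magnitude            # larger deviants -> larger error
    precision = 1.0 / (0.1 + deviant_probability)    # rarer deviants -> tighter beliefs
    return precision * prediction_error              # multiplicative interaction

# The effect of deviance magnitude is amplified when the deviant is rare:
effect_rare = precision_weighted_pe(30, 0.05) - precision_weighted_pe(10, 0.05)
effect_common = precision_weighted_pe(30, 0.20) - precision_weighted_pe(10, 0.20)
```

Because the two factors multiply, each one acts as a gain on the other: the same increase in deviance magnitude produces a larger change in the precision-weighted error when the deviant is rare.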
<p>This model explains the shortening of the MMN latency with deviance magnitude by a selective amplification of frequency-related prediction errors that are only transiently expressed – because they are explained away quickly by top-down predictions. These prediction errors increase with deviance magnitude. However, there are also prediction errors that are not explained away by perceptual inference. These errors are sustained throughout the duration of the stimulus (as the stimulus amplitude fluctuates) and do not depend on the difference between the standard and the deviant event. Hence, according to our model, deviance magnitude selectively increases the early prediction error component, but not sustained errors. In effect, as deviance magnitude increases, an early trough emerges within the MMN, so that the MMN latency shortens (see
<xref ref-type="fig" rid="pcbi-1003288-g005">Figure 5a</xref>
and
<xref ref-type="fig" rid="pcbi-1003288-g006">Figure 6</xref>
). By contrast, increasing the precision of high-level beliefs increases all precision weighted frequency prediction errors – the transient and the sustained – equally. Thus the MMN deepens uniformly, and no early trough emerges. This is why – according to the model – the deviant probability has no effect on the MMN latency for moderate deviance magnitudes. However, if the deviance magnitude is so large that the transient component dominates the frequency-related prediction error, the situation is different. In this case, increasing the weight of the frequency-related prediction errors relative to the loudness-related prediction errors can shorten the latency, because the frequency-related prediction error predominates at the beginning of perception, whereas the loudness-related prediction error is constant throughout perception. This is why our model predicts that the MMN latency becomes dependent on deviant probability at higher levels of deviance magnitude.</p>
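The latency mechanism described above can be sketched numerically: superimpose a transient early trough (the quickly explained-away frequency prediction error, scaled by deviance magnitude) on a later sustained trough, and read off the time of the most negative deflection. The waveform shapes, time constants, and amplitudes below are illustrative assumptions, not the model's fitted parameters.

```python
import numpy as np

t = np.linspace(0, 400, 401)  # peristimulus time in ms

def simulated_mmn(deviance_magnitude, precision=1.0):
    # Transient frequency prediction error: explained away quickly,
    # grows with deviance magnitude (early trough near 125 ms).
    transient = -deviance_magnitude * np.exp(-((t - 125.0) / 30.0) ** 2)
    # Sustained prediction error: independent of deviance magnitude
    # (later trough near 195 ms).
    sustained = -np.exp(-((t - 195.0) / 60.0) ** 2)
    return precision * (transient + sustained)

def latency(wave):
    return t[np.argmin(wave)]  # time of the most negative deflection
```

For a small deviance magnitude the sustained trough dominates and the latency sits near 195 ms; for a large one the transient trough takes over and the latency jumps to near 125 ms. Scaling the whole waveform by precision deepens it uniformly without moving the peak, reproducing the absence of a probability effect on latency at moderate deviance magnitudes.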
<sec id="s4a">
<title>Novel predictions</title>
<p>Our MMN simulations predict a nonlinear interaction between the effects of deviant probability and magnitude. The upper plot in
<xref ref-type="fig" rid="pcbi-1003288-g006">Figure 6</xref>
suggests that the effect of deviant probability on MMN peak amplitude increases with increasing deviance magnitude. Conversely, the effect of deviance magnitude increases with decreasing deviant probability. Furthermore, the lower plot in
<xref ref-type="fig" rid="pcbi-1003288-g006">Figure 6</xref>
suggests that the effect of deviant probability on MMN latency depends on deviance magnitude: if deviance magnitude is at most 12.7%, the MMN latency does not depend on deviant probability, but when deviance magnitude is as large as 32%, the MMN latency increases with deviant probability. Conversely, the size of the effect of deviance magnitude on MMN latency depends on deviant probability. Hence, our simulations predict a number of interaction effects that can be tested empirically.</p>
</sec>
<sec id="s4b">
<title>Relation to previous work</title>
<p>Although the physiological mechanisms generating the MMN have been modelled previously
<xref rid="pcbi.1003288-Tiitinen1" ref-type="bibr">[9]</xref>
, the model presented here is the first to bridge the gap between the computations implicit in perceptual inference and the neurophysiology of ERP waveforms. In terms of Marr's levels of analysis
<xref rid="pcbi.1003288-Wacongne1" ref-type="bibr">[53]</xref>
, our model provides an explanation at both the algorithmic and implementational levels of analysis – and represents a step towards full meta-Bayesian inference – namely inferring from measurements of brain activity on how the brain computes (cf.
<xref rid="pcbi.1003288-May1" ref-type="bibr">[13]</xref>
,
<xref rid="pcbi.1003288-Garrido3" ref-type="bibr">[19]</xref>
,
<xref rid="pcbi.1003288-Taaseh1" ref-type="bibr">[51]</xref>
<xref rid="pcbi.1003288-May2" ref-type="bibr">[55]</xref>
).</p>
<p>Our model builds upon the proposal that the brain inverts hierarchical dynamic models of its sensory inputs by minimizing free-energy in a hierarchy of predictive coding circuits
<xref rid="pcbi.1003288-Marr1" ref-type="bibr">[56]</xref>
. Specifically, we asked whether the computational principles proposed in
<xref rid="pcbi.1003288-Lieder1" ref-type="bibr">[15]</xref>
,
<xref rid="pcbi.1003288-Daunizeau1" ref-type="bibr">[20]</xref>
are sufficient to generate realistic MMN waveforms and account for their dependence on deviant probability and deviance magnitude. In doing so, we have provided a more realistic account of the algorithmic nature of the brain's implementation of these computational principles: While previous simulations have explored the dynamics of perceptual inference prescribed by the free-energy principle using dynamic expectation maximization (DEM)
<xref rid="pcbi.1003288-Friston6" ref-type="bibr">[23]</xref>
,
<xref rid="pcbi.1003288-Kiebel2" ref-type="bibr">[39]</xref>
, the simulations presented here are based on generalised filtering (GF)
<xref rid="pcbi.1003288-Friston9" ref-type="bibr">[26]</xref>
. Arguably, GF provides a more realistic model of learning and inference in the brain than DEM, because it is an online algorithm that can be run in real time to simultaneously infer hidden states and learn the model, as sensory inputs arrive. In contrast to DEM, it does not have to iterate between inferring hidden states, learning parameters, and learning hyperparameters. This is possible because GF dispenses with the mean-field assumption made by DEM. Another difference from previous work is that we have modelled the neural representation of precision weighted prediction error by sigmoidal activation functions, whereas previous simulations ignored potential nonlinear effects by assuming that the activity of prediction error units is a linear function of precision weighted prediction error
<xref rid="pcbi.1003288-Friston4" ref-type="bibr">[6]</xref>
,
<xref rid="pcbi.1003288-Friston7" ref-type="bibr">[24]</xref>
,
<xref rid="pcbi.1003288-Friston10" ref-type="bibr">[27]</xref>
,
<xref rid="pcbi.1003288-Kiebel2" ref-type="bibr">[39]</xref>
. Most importantly, the model presented here connects the theory of free-energy minimisation and predictive coding to empirical measurements of the MMN in human subjects.</p>
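The sigmoidal activation function mentioned above can be illustrated as follows; the gain, threshold, and maximum rate are arbitrary placeholders, not the values used in our simulations.

```python
import numpy as np

def unit_activity(pw_pe, gain=1.0, threshold=0.0, max_rate=1.0):
    """Sigmoidal response of a prediction-error unit: activity saturates
    for large precision-weighted prediction errors (pw_pe) rather than
    growing without bound, as a linear mapping would."""
    return max_rate / (1.0 + np.exp(-gain * (pw_pe - threshold)))
```

Doubling an already large prediction error barely changes the unit's activity, whereas the same increment near threshold changes it substantially; this saturation is exactly the nonlinearity that a linear read-out of precision weighted prediction error ignores.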
<p>To our knowledge, our model is the first to provide a computational explanation of the MMN's dependence on deviance magnitude, deviant probability, and their interaction. While
<xref rid="pcbi.1003288-Friston9" ref-type="bibr">[26]</xref>
modelled the effect of deviance magnitude, they did not consider the effect of deviant probability. Although
<xref rid="pcbi.1003288-Friston4" ref-type="bibr">[6]</xref>
,
<xref rid="pcbi.1003288-Friston7" ref-type="bibr">[24]</xref>
modelled the effect of deviant probability, they did not simulate the effect of deviance magnitude, nor did they make quantitative predictions of MMN latency or amplitude. Mill et al.
<xref rid="pcbi.1003288-May1" ref-type="bibr">[13]</xref>
,
<xref rid="pcbi.1003288-May2" ref-type="bibr">[55]</xref>
simulated the effects of deviance magnitude and deviant probability on the firing rate of single auditory neurons in anaesthetized rats. While their simulations captured the qualitative effects of deviance magnitude and deviant probability on response amplitude, they did not capture the shortening of the MMN latency with decreasing deviant probability. By contrast, our model generates realistic MMN
<italic>waveforms</italic>
and explains the qualitative effects of deviant probability and magnitude on the amplitude and latency of the MMN. Beyond this, our model makes remarkably accurate quantitative predictions of the MMN amplitude across two experiments
<xref rid="pcbi.1003288-Wacongne1" ref-type="bibr">[53]</xref>
examining several combinations of deviance magnitude and deviant probability.</p>
</sec>
<sec id="s4c">
<title>Limitations</title>
<p>The simulations reported in this paper demonstrate that predictive coding can explain the MMN and certain aspects of its dependence on the deviant stimulus and its context. However, they do not imply that the assumptions of predictive coding are necessary to explain the MMN. Instead, the simulations are a proof-of-concept that it is possible to relate the MMN to a process model of how prediction errors are encoded dynamically by superficial pyramidal cells during perceptual inference. For parsimony, our model includes only those three intermediate levels of the auditory hierarchy that are assumed to be the primary sources of the MMN. In particular, we do not model the subcortical levels of the auditory system. However, our model does
<italic>not</italic>
assume that predictive coding starts in primary auditory cortex. On the contrary, the input to A1 is assumed to be the prediction error from the auditory thalamus. This is consistent with the recent discovery of subcortical precursors of the MMN
<xref rid="pcbi.1003288-Mill1" ref-type="bibr">[52]</xref>
. Since MMN waveforms were simulated using the parameters estimated from the average ERPs reported in
<xref rid="pcbi.1003288-Tiitinen1" ref-type="bibr">[9]</xref>
,
<xref rid="pcbi.1003288-Sinkkonen1" ref-type="bibr">[10]</xref>
, the waveforms shown in
<xref ref-type="fig" rid="pcbi-1003288-g004">Figure 4</xref>
are merely a demonstration that our model can fit empirical data. However, the model's ability to predict how the MMN waveform changes as a function of deviance magnitude and deviant probability speaks to its face validity.</p>
<p>The model's most severe failure concerned latency: while it correctly predicted that the MMN latency shortens with deviance magnitude, it failed to predict that this shortening occurs gradually for deviance magnitudes between 2.5% and 7.5%. The model does predict a gradual shortening within a certain range of deviance magnitudes, but this range did not coincide with the one observed empirically.</p>
<p>There are clearly many explanations for this failure – for example, an inappropriate generative model or incorrect forms for the mapping between prediction errors and local field potentials. Perhaps the more important point here is that these failures generally represent opportunities. This is because one can revise or extend the model and compare the evidence for an alternative model with the evidence for the original model using Bayesian model comparison of dynamic causal models in the usual way
<xref rid="pcbi.1003288-Slabu1" ref-type="bibr">[57]</xref>
<xref rid="pcbi.1003288-Grimm1" ref-type="bibr">[59]</xref>
. Indeed, this is one of the primary motivations for developing dynamic causal models that are computationally informed or constrained. In other words, one can test competing hypotheses or models about both the computational (and biophysical) processes underlying observed brain responses.</p>
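Bayesian model comparison of the kind invoked above reduces, once each candidate model has been fitted, to turning log model evidences into posterior model probabilities. A minimal sketch, assuming flat priors over models and that a log evidence has already been obtained for each candidate:

```python
import numpy as np

def posterior_model_probabilities(log_evidences):
    """Posterior probability of each model under a flat prior, computed
    from log model evidences with a max-shift for numerical stability."""
    log_ev = np.asarray(log_evidences, dtype=float)
    log_ev = log_ev - log_ev.max()   # shifting leaves the posterior unchanged
    post = np.exp(log_ev)
    return post / post.sum()

# A log-evidence difference of 3 already concentrates the posterior:
probs = posterior_model_probabilities([-100.0, -103.0])
```

Only differences in log evidence matter, which is why a fixed difference of about 3 is conventionally read as strong evidence regardless of the absolute values.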
</sec>
<sec id="s4d">
<title>Conclusions</title>
<p>This work is a proof-of-principle that important aspects of evoked responses in general – and the MMN in particular – can be explained by formal (Bayesian) models of the predictive coding mechanism
<xref rid="pcbi.1003288-Garrido3" ref-type="bibr">[19]</xref>
. Our model explains the dynamics of the MMN in continuous time, and some of its phenomenology, at a level of precision not attempted before. By placing normative models of computation within the framework of dynamic causal models, one has the opportunity to use Bayesian model comparison to adjudicate between competing computational theories. Future studies might compare predictive coding to competing accounts such as the fresh-afferent theory
<xref rid="pcbi.1003288-Penny1" ref-type="bibr">[60]</xref>
<xref rid="pcbi.1003288-Stephan1" ref-type="bibr">[62]</xref>
. In addition, the approach presented here could be extended to a range of potentials evoked by sensory stimuli, including the N1 and the P300, in order to generalise the explanatory scope of predictive coding or free energy formulations.</p>
<p>This sort of modelling approach might be used to infer how perceptual inference changes with learning, attention, and context. This is an attractive prospect, given that the MMN is elicited not only in simple oddball paradigms, but also in more complex paradigms involving the processing of speech, language, music, and abstract features
<xref rid="pcbi.1003288-Friston5" ref-type="bibr">[7]</xref>
,
<xref rid="pcbi.1003288-Wacongne1" ref-type="bibr">[53]</xref>
,
<xref rid="pcbi.1003288-Winkler1" ref-type="bibr">[63]</xref>
. Furthermore, a computational anatomy of the MMN might be useful for probing disturbances of perceptual inference and learning in psychiatric conditions, such as schizophrenia
<xref rid="pcbi.1003288-May1" ref-type="bibr">[13]</xref>
,
<xref rid="pcbi.1003288-May2" ref-type="bibr">[55]</xref>
. Similarly, extensions of this model could also be used to better understand the effects of drugs, such as ketamine
<xref rid="pcbi.1003288-Ntnen1" ref-type="bibr">[12]</xref>
,
<xref rid="pcbi.1003288-Schrger2" ref-type="bibr">[64]</xref>
<xref rid="pcbi.1003288-Bendixen1" ref-type="bibr">[66]</xref>
, or neuromodulators, such as acetylcholine
<xref rid="pcbi.1003288-Umbricht1" ref-type="bibr">[67]</xref>
<xref rid="pcbi.1003288-Baldeweg2" ref-type="bibr">[69]</xref>
, on the MMN. We hope to pursue this avenue of research in future work.</p>
</sec>
</sec>
<sec sec-type="supplementary-material" id="s5">
<title>Supporting Information</title>
<supplementary-material content-type="local-data" id="pcbi.1003288.s001">
<label>Text S1</label>
<caption>
<p>
<bold>Modelling assumptions about tuning curves in primary auditory cortex and the brain's prior uncertainty.</bold>
The supplementary text details and justifies our model's assumptions about the tuning curves of neurons in primary auditory cortex and the covariance matrices in the perceptual model.</p>
<p>(DOCX)</p>
</caption>
<media xlink:href="pcbi.1003288.s001.docx">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
</sec>
</body>
<back>
<ref-list>
<title>References</title>
<ref id="pcbi.1003288-Friston1">
<label>1</label>
<mixed-citation publication-type="journal">
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Dolan</surname>
<given-names>R</given-names>
</name>
(
<year>2010</year>
)
<article-title>Computational and dynamic models in neuroimaging</article-title>
.
<source>NeuroImage</source>
<volume>52</volume>
:
<fpage>752</fpage>
<lpage>765</lpage>
.
<pub-id pub-id-type="pmid">20036335</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-David1">
<label>2</label>
<mixed-citation publication-type="journal">
<name>
<surname>David</surname>
<given-names>O</given-names>
</name>
,
<name>
<surname>Kiebel</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Harrison</surname>
<given-names>L</given-names>
</name>
,
<name>
<surname>Mattout</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Kilner</surname>
<given-names>J</given-names>
</name>
,
<etal>et al</etal>
(
<year>2006</year>
)
<article-title>Dynamic causal modeling of evoked responses in EEG and MEG</article-title>
.
<source>NeuroImage</source>
<volume>30</volume>
:
<fpage>1255</fpage>
<lpage>1272</lpage>
.
<pub-id pub-id-type="pmid">16473023</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Sun1">
<label>3</label>
<mixed-citation publication-type="other">Sun R, editor (2008) The Cambridge Handbook of Computational Psychology. 1 ed: Cambridge University Press. 766 p.</mixed-citation>
</ref>
<ref id="pcbi.1003288-Friston2">
<label>4</label>
<mixed-citation publication-type="journal">
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
(
<year>2010</year>
)
<article-title>The free-energy principle: a unified brain theory?</article-title>
.
<source>Nature Reviews Neuroscience</source>
<volume>11</volume>
:
<fpage>127</fpage>
<lpage>138</lpage>
.
<pub-id pub-id-type="pmid">20068583</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Friston3">
<label>5</label>
<mixed-citation publication-type="journal">
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
(
<year>2009</year>
)
<article-title>The free-energy principle: a rough guide to the brain?</article-title>
.
<source>Trends in Cognitive Sciences</source>
<volume>13</volume>
:
<fpage>293</fpage>
<lpage>301</lpage>
.
<pub-id pub-id-type="pmid">19559644</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Friston4">
<label>6</label>
<mixed-citation publication-type="journal">
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Kiebel</surname>
<given-names>S</given-names>
</name>
(
<year>2009</year>
)
<article-title>Predictive coding under the free-energy principle</article-title>
.
<source>Philosophical transactions of the Royal Society of London Series B, Biological sciences</source>
<volume>364</volume>
:
<fpage>1211</fpage>
<lpage>1221</lpage>
.
<pub-id pub-id-type="pmid">19528002</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Friston5">
<label>7</label>
<mixed-citation publication-type="journal">
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
(
<year>2005</year>
)
<article-title>A theory of cortical responses</article-title>
.
<source>Philosophical transactions of the Royal Society of London Series B, Biological sciences</source>
<volume>360</volume>
:
<fpage>815</fpage>
<lpage>836</lpage>
.
<pub-id pub-id-type="pmid">15937014</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Garrido1">
<label>8</label>
<mixed-citation publication-type="journal">
<name>
<surname>Garrido</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Kilner</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Stephan</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
(
<year>2009</year>
)
<article-title>The mismatch negativity: a review of underlying mechanisms</article-title>
.
<source>Clinical neurophysiology</source>
<volume>120</volume>
:
<fpage>453</fpage>
<lpage>463</lpage>
.
<pub-id pub-id-type="pmid">19181570</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Tiitinen1">
<label>9</label>
<mixed-citation publication-type="journal">
<name>
<surname>Tiitinen</surname>
<given-names>H</given-names>
</name>
,
<name>
<surname>May</surname>
<given-names>P</given-names>
</name>
,
<name>
<surname>Reinikainen</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Näätänen</surname>
<given-names>R</given-names>
</name>
(
<year>1994</year>
)
<article-title>Attentive novelty detection in humans is governed by pre-attentive sensory memory</article-title>
.
<source>Nature</source>
<volume>372</volume>
:
<fpage>90</fpage>
<lpage>92</lpage>
.
<pub-id pub-id-type="pmid">7969425</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Sinkkonen1">
<label>10</label>
<mixed-citation publication-type="journal">
<name>
<surname>Sinkkonen</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Kaski</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Huotilainen</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Ilmoniemi</surname>
<given-names>RJ</given-names>
</name>
,
<name>
<surname>Näätänen</surname>
<given-names>R</given-names>
</name>
,
<etal>et al</etal>
(
<year>1996</year>
)
<article-title>Optimal resource allocation for novelty detection in a human auditory memory</article-title>
.
<source>Neuroreport</source>
<volume>7</volume>
:
<fpage>2479</fpage>
<lpage>2482</lpage>
.
<pub-id pub-id-type="pmid">8981407</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Javitt1">
<label>11</label>
<mixed-citation publication-type="journal">
<name>
<surname>Javitt</surname>
<given-names>DC</given-names>
</name>
,
<name>
<surname>Grochowski</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Shelley</surname>
<given-names>AM</given-names>
</name>
,
<name>
<surname>Ritter</surname>
<given-names>W</given-names>
</name>
(
<year>1998</year>
)
<article-title>Impaired mismatch negativity (MMN) generation in schizophrenia as a function of stimulus deviance, probability, and interstimulus/interdeviant interval</article-title>
.
<source>Electroencephalography and clinical neurophysiology</source>
<volume>108</volume>
:
<fpage>143</fpage>
<lpage>153</lpage>
.
<pub-id pub-id-type="pmid">9566627</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Ntnen1">
<label>12</label>
<mixed-citation publication-type="journal">
<name>
<surname>Näätänen</surname>
<given-names>R</given-names>
</name>
,
<name>
<surname>Paavilainen</surname>
<given-names>P</given-names>
</name>
,
<name>
<surname>Rinne</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Alho</surname>
<given-names>K</given-names>
</name>
(
<year>2007</year>
)
<article-title>The mismatch negativity (MMN) in basic research of central auditory processing: A review</article-title>
.
<source>Clinical Neurophysiology</source>
<volume>118</volume>
:
<fpage>2544</fpage>
<lpage>2590</lpage>
.
<pub-id pub-id-type="pmid">17931964</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-May1">
<label>13</label>
<mixed-citation publication-type="journal">
<name>
<surname>May</surname>
<given-names>P</given-names>
</name>
,
<name>
<surname>Tiitinen</surname>
<given-names>H</given-names>
</name>
(
<year>2010</year>
)
<article-title>Mismatch negativity (MMN), the deviance-elicited auditory deflection, explained</article-title>
.
<source>Psychophysiology</source>
<volume>47</volume>
:
<fpage>66</fpage>
<lpage>122</lpage>
.
<pub-id pub-id-type="pmid">19686538</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Kujala1">
<label>14</label>
<mixed-citation publication-type="journal">
<name>
<surname>Kujala</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Tervaniemi</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Schröger</surname>
<given-names>E</given-names>
</name>
(
<year>2007</year>
)
<article-title>The mismatch negativity in cognitive and clinical neuroscience: Theoretical and methodological considerations</article-title>
.
<source>Biological Psychology</source>
<volume>74</volume>
:
<fpage>1</fpage>
<lpage>19</lpage>
.
<pub-id pub-id-type="pmid">16844278</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Lieder1">
<label>15</label>
<mixed-citation publication-type="journal">
<name>
<surname>Lieder</surname>
<given-names>F</given-names>
</name>
,
<name>
<surname>Daunizeau</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Garrido</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Stephan</surname>
<given-names>KE</given-names>
</name>
(
<year>2013</year>
)
<article-title>Modeling Trial-by-Trial Changes in Mismatch Negativity Amplitudes</article-title>
.
<source>PLoS Comput Biol</source>
<volume>9</volume>
:
<fpage>e1002911</fpage>
.
<pub-id pub-id-type="pmid">23436989</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Opitz1">
<label>16</label>
<mixed-citation publication-type="journal">
<name>
<surname>Opitz</surname>
<given-names>B</given-names>
</name>
,
<name>
<surname>Rinne</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Mecklinger</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>von Cramon</surname>
<given-names>DY</given-names>
</name>
,
<name>
<surname>Schröger</surname>
<given-names>E</given-names>
</name>
(
<year>2002</year>
)
<article-title>Differential Contribution of Frontal and Temporal Cortices to Auditory Change Detection: fMRI and ERP Results</article-title>
.
<source>NeuroImage</source>
<volume>15</volume>
:
<fpage>167</fpage>
<lpage>174</lpage>
.
<pub-id pub-id-type="pmid">11771985</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Baldeweg1">
<label>17</label>
<mixed-citation publication-type="journal">
<name>
<surname>Baldeweg</surname>
<given-names>T</given-names>
</name>
(
<year>2007</year>
)
<article-title>ERP Repetition Effects and Mismatch Negativity Generation</article-title>
.
<source>Journal of Psychophysiology</source>
<volume>21</volume>
:
<fpage>204</fpage>
<lpage>213</lpage>
.</mixed-citation>
</ref>
<ref id="pcbi.1003288-Garrido2">
<label>18</label>
<mixed-citation publication-type="journal">
<name>
<surname>Garrido</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Kilner</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Kiebel</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Stephan</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Baldeweg</surname>
<given-names>T</given-names>
</name>
,
<etal>et al</etal>
(
<year>2009</year>
)
<article-title>Repetition suppression and plasticity in the human brain</article-title>
.
<source>NeuroImage</source>
<volume>48</volume>
:
<fpage>269</fpage>
<lpage>279</lpage>
.
<pub-id pub-id-type="pmid">19540921</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Garrido3">
<label>19</label>
<mixed-citation publication-type="journal">
<name>
<surname>Garrido</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Kiebel</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Stephan</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Baldeweg</surname>
<given-names>T</given-names>
</name>
,
<etal>et al</etal>
(
<year>2008</year>
)
<article-title>The functional anatomy of the MMN: a DCM study of the roving paradigm</article-title>
.
<source>NeuroImage</source>
<volume>42</volume>
:
<fpage>936</fpage>
<lpage>944</lpage>
.
<pub-id pub-id-type="pmid">18602841</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Daunizeau1">
<label>20</label>
<mixed-citation publication-type="journal">
<name>
<surname>Daunizeau</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>den Ouden</surname>
<given-names>H</given-names>
</name>
,
<name>
<surname>Pessiglione</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Kiebel</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Stephan</surname>
<given-names>K</given-names>
</name>
,
<etal>et al</etal>
(
<year>2010</year>
)
<article-title>Observing the Observer (I): Meta-Bayesian Models of Learning and Decision-Making</article-title>
.
<source>PLoS ONE</source>
<volume>5</volume>
:
<fpage>e15554</fpage>
.
<pub-id pub-id-type="pmid">21179480</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Helmholtz1">
<label>21</label>
<mixed-citation publication-type="other">Helmholtz H (1867) Handbuch der Physiologischen Optik. Leipzig: Leopold Voss.</mixed-citation>
</ref>
<ref id="pcbi.1003288-Knill1">
<label>22</label>
<mixed-citation publication-type="other">Knill D, Richards W, editors (1996) Perception as Bayesian Inference. 1 ed. New York: Cambridge University Press.</mixed-citation>
</ref>
<ref id="pcbi.1003288-Friston6">
<label>23</label>
<mixed-citation publication-type="journal">
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
(
<year>2008</year>
)
<article-title>Hierarchical models in the brain</article-title>
.
<source>PLoS computational biology</source>
<volume>4</volume>
:
<fpage>e1000211</fpage>
.
<pub-id pub-id-type="pmid">18989391</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Friston7">
<label>24</label>
<mixed-citation publication-type="journal">
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Kiebel</surname>
<given-names>S</given-names>
</name>
(
<year>2009</year>
)
<article-title>Attractors in song</article-title>
.
<source>New Mathematics and Natural Computation</source>
<volume>5</volume>
:
<fpage>83</fpage>
.</mixed-citation>
</ref>
<ref id="pcbi.1003288-Friston8">
<label>25</label>
<mixed-citation publication-type="journal">
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Trujillo-Barreto</surname>
<given-names>N</given-names>
</name>
,
<name>
<surname>Daunizeau</surname>
<given-names>J</given-names>
</name>
(
<year>2008</year>
)
<article-title>DEM: A variational treatment of dynamic systems</article-title>
.
<source>NeuroImage</source>
<volume>41</volume>
:
<fpage>849</fpage>
<lpage>885</lpage>
.
<pub-id pub-id-type="pmid">18434205</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Friston9">
<label>26</label>
<mixed-citation publication-type="journal">
<name>
<surname>Friston</surname>
<given-names>KJ</given-names>
</name>
,
<name>
<surname>Stephan</surname>
<given-names>KE</given-names>
</name>
,
<name>
<surname>Li</surname>
<given-names>B</given-names>
</name>
,
<name>
<surname>Daunizeau</surname>
<given-names>J</given-names>
</name>
(
<year>2010</year>
)
<article-title>Generalised Filtering</article-title>
.
<source>Mathematical Problems in Engineering</source>
<volume>2010</volume>
:
<fpage>621670</fpage>
.</mixed-citation>
</ref>
<ref id="pcbi.1003288-Friston10">
<label>27</label>
<mixed-citation publication-type="journal">
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Kiebel</surname>
<given-names>S</given-names>
</name>
(
<year>2009</year>
)
<article-title>Cortical circuits for perceptual inference</article-title>
.
<source>Neural Networks</source>
<volume>22</volume>
:
<fpage>1093</fpage>
<lpage>1104</lpage>
.
<pub-id pub-id-type="pmid">19635656</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Friston11">
<label>28</label>
<mixed-citation publication-type="journal">
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Kilner</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Harrison</surname>
<given-names>L</given-names>
</name>
(
<year>2006</year>
)
<article-title>A free energy principle for the brain</article-title>
.
<source>Journal of Physiology-Paris</source>
<volume>100</volume>
:
<fpage>70</fpage>
<lpage>87</lpage>
.</mixed-citation>
</ref>
<ref id="pcbi.1003288-BalaguerBallester1">
<label>29</label>
<mixed-citation publication-type="journal">
<name>
<surname>Balaguer-Ballester</surname>
<given-names>E</given-names>
</name>
,
<name>
<surname>Clark</surname>
<given-names>NR</given-names>
</name>
,
<name>
<surname>Coath</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Krumbholz</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Denham</surname>
<given-names>SL</given-names>
</name>
(
<year>2009</year>
)
<article-title>Understanding pitch perception as a hierarchical process with top-down modulation</article-title>
.
<source>PLoS computational biology</source>
<volume>5</volume>
:
<fpage>e1000301</fpage>
.
<pub-id pub-id-type="pmid">19266015</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Kiebel1">
<label>30</label>
<mixed-citation publication-type="journal">
<name>
<surname>Kiebel</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Daunizeau</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
(
<year>2008</year>
)
<article-title>A Hierarchy of Time-Scales and the Brain</article-title>
.
<source>PLoS Comput Biol</source>
<volume>4</volume>
:
<fpage>e1000209</fpage>
.
<pub-id pub-id-type="pmid">19008936</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Rauschecker1">
<label>31</label>
<mixed-citation publication-type="journal">
<name>
<surname>Rauschecker</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Scott</surname>
<given-names>S</given-names>
</name>
(
<year>2009</year>
)
<article-title>Maps and streams in the auditory cortex: nonhuman primates illuminate human speech processing</article-title>
.
<source>Nature neuroscience</source>
<volume>12</volume>
:
<fpage>718</fpage>
<lpage>724</lpage>
.
<pub-id pub-id-type="pmid">19471271</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Formisano1">
<label>32</label>
<mixed-citation publication-type="journal">
<name>
<surname>Formisano</surname>
<given-names>E</given-names>
</name>
,
<name>
<surname>Kim</surname>
<given-names>D-S</given-names>
</name>
,
<name>
<surname>Di Salle</surname>
<given-names>F</given-names>
</name>
,
<name>
<surname>van de Moortele</surname>
<given-names>P-F</given-names>
</name>
,
<name>
<surname>Ugurbil</surname>
<given-names>K</given-names>
</name>
,
<etal>et al</etal>
(
<year>2003</year>
)
<article-title>Mirror-Symmetric Tonotopic Maps in Human Primary Auditory Cortex</article-title>
.
<source>Neuron</source>
<volume>40</volume>
:
<fpage>859</fpage>
<lpage>869</lpage>
.
<pub-id pub-id-type="pmid">14622588</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Schnupp1">
<label>33</label>
<mixed-citation publication-type="other">Schnupp J, Nelken I, King A (2010) Auditory Neuroscience: Making Sense of Sound. Cambridge, MA: The MIT Press.</mixed-citation>
</ref>
<ref id="pcbi.1003288-Bendor1">
<label>34</label>
<mixed-citation publication-type="journal">
<name>
<surname>Bendor</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Wang</surname>
<given-names>X</given-names>
</name>
(
<year>2005</year>
)
<article-title>The neuronal representation of pitch in primate auditory cortex</article-title>
.
<source>Nature</source>
<volume>436</volume>
:
<fpage>1161</fpage>
<lpage>1165</lpage>
.
<pub-id pub-id-type="pmid">16121182</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Patterson1">
<label>35</label>
<mixed-citation publication-type="journal">
<name>
<surname>Patterson</surname>
<given-names>R</given-names>
</name>
,
<name>
<surname>Uppenkamp</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Johnsrude</surname>
<given-names>I</given-names>
</name>
,
<name>
<surname>Griffiths</surname>
<given-names>T</given-names>
</name>
(
<year>2002</year>
)
<article-title>The Processing of Temporal Pitch and Melody Information in Auditory Cortex</article-title>
.
<source>Neuron</source>
<volume>36</volume>
:
<fpage>767</fpage>
<lpage>776</lpage>
.
<pub-id pub-id-type="pmid">12441063</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Hall1">
<label>36</label>
<mixed-citation publication-type="journal">
<name>
<surname>Hall</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Plack</surname>
<given-names>C</given-names>
</name>
(
<year>2009</year>
)
<article-title>Pitch Processing Sites in the Human Auditory Brain</article-title>
.
<source>Cerebral Cortex</source>
<volume>19</volume>
:
<fpage>576</fpage>
<lpage>585</lpage>
.
<pub-id pub-id-type="pmid">18603609</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Schnwiesner1">
<label>37</label>
<mixed-citation publication-type="journal">
<name>
<surname>Schönwiesner</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Zatorre</surname>
<given-names>R</given-names>
</name>
(
<year>2008</year>
)
<article-title>Depth electrode recordings show double dissociation between pitch processing in lateral Heschl's gyrus and sound onset processing in medial Heschl's gyrus</article-title>
.
<source>Experimental brain research</source>
<volume>187</volume>
:
<fpage>97</fpage>
<lpage>105</lpage>
.
<pub-id pub-id-type="pmid">18236034</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Koulakov1">
<label>38</label>
<mixed-citation publication-type="journal">
<name>
<surname>Koulakov</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Raghavachari</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Kepecs</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Lisman</surname>
<given-names>J</given-names>
</name>
(
<year>2002</year>
)
<article-title>Model for a robust neural integrator</article-title>
.
<source>Nature Neuroscience</source>
<volume>5</volume>
:
<fpage>775</fpage>
<lpage>782</lpage>
.
<pub-id pub-id-type="pmid">12134153</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Kiebel2">
<label>39</label>
<mixed-citation publication-type="journal">
<name>
<surname>Kiebel</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Daunizeau</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
(
<year>2009</year>
)
<article-title>Perception and hierarchical dynamics</article-title>
.
<source>Frontiers in neuroinformatics</source>
<volume>3</volume>
:
<fpage>20</fpage>
.
<pub-id pub-id-type="pmid">19649171</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Bastos1">
<label>40</label>
<mixed-citation publication-type="journal">
<name>
<surname>Bastos</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Usrey</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Adams</surname>
<given-names>R</given-names>
</name>
,
<name>
<surname>Mangun</surname>
<given-names>G</given-names>
</name>
,
<name>
<surname>Fries</surname>
<given-names>P</given-names>
</name>
,
<etal>et al</etal>
(
<year>2012</year>
)
<article-title>Canonical Microcircuits for Predictive Coding</article-title>
.
<source>Neuron</source>
<volume>76</volume>
:
<fpage>695</fpage>
<lpage>711</lpage>
.
<pub-id pub-id-type="pmid">23177956</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Peter1">
<label>41</label>
<mixed-citation publication-type="journal">
<name>
<surname>Peter</surname>
<given-names>V</given-names>
</name>
,
<name>
<surname>McArthur</surname>
<given-names>G</given-names>
</name>
,
<name>
<surname>Thompson</surname>
<given-names>WF</given-names>
</name>
(
<year>2010</year>
)
<article-title>Effect of deviance direction and calculation method on duration and frequency mismatch negativity (MMN)</article-title>
.
<source>Neuroscience Letters</source>
<volume>482</volume>
:
<fpage>71</fpage>
<lpage>75</lpage>
.
<pub-id pub-id-type="pmid">20630487</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Ugray1">
<label>42</label>
<mixed-citation publication-type="journal">
<name>
<surname>Ugray</surname>
<given-names>Z</given-names>
</name>
,
<name>
<surname>Lasdon</surname>
<given-names>L</given-names>
</name>
,
<name>
<surname>Plummer</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Glover</surname>
<given-names>F</given-names>
</name>
,
<name>
<surname>Kelly</surname>
<given-names>J</given-names>
</name>
,
<etal>et al</etal>
(
<year>2007</year>
)
<article-title>Scatter Search and Local NLP Solvers: A Multistart Framework for Global Optimization</article-title>
.
<source>INFORMS Journal on Computing</source>
<volume>19</volume>
:
<fpage>328</fpage>
<lpage>340</lpage>
.</mixed-citation>
</ref>
<ref id="pcbi.1003288-Luck1">
<label>43</label>
<mixed-citation publication-type="other">Luck S (2005) An Introduction to the Event-Related Potential Technique. Cambridge, MA: The MIT Press.</mixed-citation>
</ref>
<ref id="pcbi.1003288-Woodman1">
<label>44</label>
<mixed-citation publication-type="journal">
<name>
<surname>Woodman</surname>
<given-names>G</given-names>
</name>
(
<year>2010</year>
)
<article-title>A brief introduction to the use of event-related potentials in studies of perception and attention</article-title>
.
<source>Attention, Perception & Psychophysics</source>
<volume>72</volume>
:
<fpage>2031</fpage>
<lpage>2046</lpage>
.</mixed-citation>
</ref>
<ref id="pcbi.1003288-Kiesel1">
<label>45</label>
<mixed-citation publication-type="journal">
<name>
<surname>Kiesel</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Miller</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Jolicoeur</surname>
<given-names>P</given-names>
</name>
,
<name>
<surname>Brisson</surname>
<given-names>B</given-names>
</name>
(
<year>2008</year>
)
<article-title>Measurement of ERP latency differences: a comparison of single-participant and jackknife-based scoring methods</article-title>
.
<source>Psychophysiology</source>
<volume>45</volume>
:
<fpage>250</fpage>
<lpage>274</lpage>
.
<pub-id pub-id-type="pmid">17995913</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Schrger1">
<label>46</label>
<mixed-citation publication-type="journal">
<name>
<surname>Schröger</surname>
<given-names>E</given-names>
</name>
,
<name>
<surname>Wolff</surname>
<given-names>C</given-names>
</name>
(
<year>1996</year>
)
<article-title>Mismatch response of the human brain to changes in sound location</article-title>
.
<source>Neuroreport</source>
<volume>7</volume>
:
<fpage>3005</fpage>
<lpage>3008</lpage>
.
<pub-id pub-id-type="pmid">9116228</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Jacobsen1">
<label>47</label>
<mixed-citation publication-type="journal">
<name>
<surname>Jacobsen</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Schröger</surname>
<given-names>E</given-names>
</name>
(
<year>2001</year>
)
<article-title>Is there pre-attentive memory-based comparison of pitch</article-title>
?
<source>Psychophysiology</source>
<volume>38</volume>
:
<fpage>723</fpage>
<lpage>727</lpage>
.
<pub-id pub-id-type="pmid">11446587</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Jacobsen2">
<label>48</label>
<mixed-citation publication-type="journal">
<name>
<surname>Jacobsen</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Horenkamp</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Schröger</surname>
<given-names>E</given-names>
</name>
(
<year>2003</year>
)
<article-title>Preattentive memory-based comparison of sound intensity</article-title>
.
<source>Audiology & neuro-otology</source>
<volume>8</volume>
:
<fpage>338</fpage>
<lpage>346</lpage>
.
<pub-id pub-id-type="pmid">14566104</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Jacobsen3">
<label>49</label>
<mixed-citation publication-type="journal">
<name>
<surname>Jacobsen</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Schröger</surname>
<given-names>E</given-names>
</name>
(
<year>2003</year>
)
<article-title>Measuring duration mismatch negativity</article-title>
.
<source>Clinical Neurophysiology</source>
<volume>114</volume>
:
<fpage>1133</fpage>
<lpage>1143</lpage>
.
<pub-id pub-id-type="pmid">12804682</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Horvth1">
<label>50</label>
<mixed-citation publication-type="journal">
<name>
<surname>Horváth</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Czigler</surname>
<given-names>I</given-names>
</name>
,
<name>
<surname>Jacobsen</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Maess</surname>
<given-names>B</given-names>
</name>
,
<name>
<surname>Schröger</surname>
<given-names>E</given-names>
</name>
,
<etal>et al</etal>
(
<year>2008</year>
)
<article-title>MMN or no MMN: no magnitude of deviance effect on the MMN amplitude</article-title>
.
<source>Psychophysiology</source>
<volume>45</volume>
:
<fpage>60</fpage>
<lpage>69</lpage>
.
<pub-id pub-id-type="pmid">17868262</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Taaseh1">
<label>51</label>
<mixed-citation publication-type="journal">
<name>
<surname>Taaseh</surname>
<given-names>N</given-names>
</name>
,
<name>
<surname>Yaron</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Nelken</surname>
<given-names>I</given-names>
</name>
(
<year>2011</year>
)
<article-title>Stimulus-specific adaptation and deviance detection in the rat auditory cortex</article-title>
.
<source>PLoS ONE</source>
<volume>6</volume>
:
<fpage>e23369</fpage>
.
<pub-id pub-id-type="pmid">21853120</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Mill1">
<label>52</label>
<mixed-citation publication-type="journal">
<name>
<surname>Mill</surname>
<given-names>R</given-names>
</name>
,
<name>
<surname>Coath</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Wennekers</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Denham</surname>
<given-names>S</given-names>
</name>
(
<year>2011</year>
)
<article-title>A Neurocomputational Model of Stimulus-Specific Adaptation to Oddball and Markov Sequences</article-title>
.
<source>PLoS Comput Biol</source>
<volume>7</volume>
:
<fpage>e1002117</fpage>
.
<pub-id pub-id-type="pmid">21876661</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Wacongne1">
<label>53</label>
<mixed-citation publication-type="journal">
<name>
<surname>Wacongne</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Changeux</surname>
<given-names>J-P</given-names>
</name>
,
<name>
<surname>Dehaene</surname>
<given-names>S</given-names>
</name>
(
<year>2012</year>
)
<article-title>A Neuronal Model of Predictive Coding Accounting for the Mismatch Negativity</article-title>
.
<source>The Journal of Neuroscience</source>
<volume>32</volume>
:
<fpage>3665</fpage>
<lpage>3678</lpage>
.
<pub-id pub-id-type="pmid">22423089</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Garrido4">
<label>54</label>
<mixed-citation publication-type="journal">
<name>
<surname>Garrido</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Kilner</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Kiebel</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
(
<year>2009</year>
)
<article-title>Dynamic causal modelling of the response to frequency deviants</article-title>
.
<source>J Neurophysiol</source>
<volume>101</volume>
:
<fpage>2620</fpage>
<lpage>2631</lpage>
.</mixed-citation>
</ref>
<ref id="pcbi.1003288-May2">
<label>55</label>
<mixed-citation publication-type="journal">
<name>
<surname>May</surname>
<given-names>P</given-names>
</name>
,
<name>
<surname>Tiitinen</surname>
<given-names>H</given-names>
</name>
,
<name>
<surname>Ilmoniemi</surname>
<given-names>RJ</given-names>
</name>
,
<name>
<surname>Nyman</surname>
<given-names>G</given-names>
</name>
,
<name>
<surname>Taylor</surname>
<given-names>JG</given-names>
</name>
,
<etal>et al</etal>
(
<year>1999</year>
)
<article-title>Frequency change detection in human auditory cortex</article-title>
.
<source>Journal of computational neuroscience</source>
<volume>6</volume>
:
<fpage>99</fpage>
<lpage>120</lpage>
.
<pub-id pub-id-type="pmid">10333158</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Marr1">
<label>56</label>
<mixed-citation publication-type="other">Marr D (1982) Vision: A Computational Investigation into the Human Representation and Processing of Visual Information. New York: W. H. Freeman.</mixed-citation>
</ref>
<ref id="pcbi.1003288-Slabu1">
<label>57</label>
<mixed-citation publication-type="journal">
<name>
<surname>Slabu</surname>
<given-names>L</given-names>
</name>
,
<name>
<surname>Grimm</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Escera</surname>
<given-names>C</given-names>
</name>
(
<year>2012</year>
)
<article-title>Novelty Detection in the Human Auditory Brainstem</article-title>
.
<source>The Journal of Neuroscience</source>
<volume>32</volume>
:
<fpage>1447</fpage>
<lpage>1452</lpage>
.
<pub-id pub-id-type="pmid">22279229</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Slabu2">
<label>58</label>
<mixed-citation publication-type="journal">
<name>
<surname>Slabu</surname>
<given-names>L</given-names>
</name>
,
<name>
<surname>Escera</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Grimm</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Costa-Faidella</surname>
<given-names>J</given-names>
</name>
(
<year>2010</year>
)
<article-title>Early change detection in humans as revealed by auditory brainstem and middle-latency evoked potentials</article-title>
.
<source>The European journal of neuroscience</source>
<volume>32</volume>
:
<fpage>859</fpage>
<lpage>865</lpage>
.
<pub-id pub-id-type="pmid">20626459</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Grimm1">
<label>59</label>
<mixed-citation publication-type="journal">
<name>
<surname>Grimm</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Escera</surname>
<given-names>C</given-names>
</name>
(
<year>2012</year>
)
<article-title>Auditory deviance detection revisited: evidence for a hierarchical novelty system</article-title>
.
<source>International Journal of Psychophysiology</source>
<volume>85</volume>
:
<fpage>88</fpage>
<lpage>92</lpage>
.
<pub-id pub-id-type="pmid">21669238</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Penny1">
<label>60</label>
<mixed-citation publication-type="journal">
<name>
<surname>Penny</surname>
<given-names>W</given-names>
</name>
,
<name>
<surname>Stephan</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Mechelli</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
(
<year>2004</year>
)
<article-title>Comparing dynamic causal models</article-title>
.
<source>NeuroImage</source>
<volume>22</volume>
:
<fpage>1157</fpage>
<lpage>1172</lpage>
.
<pub-id pub-id-type="pmid">15219588</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Penny2">
<label>61</label>
<mixed-citation publication-type="journal">
<name>
<surname>Penny</surname>
<given-names>W</given-names>
</name>
,
<name>
<surname>Stephan</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Daunizeau</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Rosa</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
,
<etal>et al</etal>
(
<year>2010</year>
)
<article-title>Comparing Families of Dynamic Causal Models</article-title>
.
<source>PLoS Comput Biol</source>
<volume>6</volume>
:
<fpage>e1000709</fpage>
.
<pub-id pub-id-type="pmid">20300649</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Stephan1">
<label>62</label>
<mixed-citation publication-type="journal">
<name>
<surname>Stephan</surname>
<given-names>KE</given-names>
</name>
,
<name>
<surname>Penny</surname>
<given-names>W</given-names>
</name>
,
<name>
<surname>Daunizeau</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Moran</surname>
<given-names>R</given-names>
</name>
,
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
(
<year>2009</year>
)
<article-title>Bayesian model selection for group studies</article-title>
.
<source>NeuroImage</source>
<volume>46</volume>
:
<fpage>1004</fpage>
<lpage>1017</lpage>
.
<pub-id pub-id-type="pmid">19306932</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Winkler1">
<label>63</label>
<mixed-citation publication-type="journal">
<name>
<surname>Winkler</surname>
<given-names>I</given-names>
</name>
,
<name>
<surname>Czigler</surname>
<given-names>I</given-names>
</name>
(
<year>2012</year>
)
<article-title>Evidence from auditory and visual event-related potential (ERP) studies of deviance detection (MMN and vMMN) linking predictive coding theories and perceptual object representations</article-title>
.
<source>International Journal of Psychophysiology</source>
<volume>83</volume>
(
<issue>2</issue>
)
:
<fpage>132</fpage>
<lpage>143</lpage>
.
<pub-id pub-id-type="pmid">22047947</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Schrger2">
<label>64</label>
<mixed-citation publication-type="journal">
<name>
<surname>Schröger</surname>
<given-names>E</given-names>
</name>
,
<name>
<surname>Bendixen</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Trujillo-Barreto</surname>
<given-names>N</given-names>
</name>
,
<name>
<surname>Roeber</surname>
<given-names>U</given-names>
</name>
(
<year>2007</year>
)
<article-title>Processing of abstract rule violations in audition</article-title>
.
<source>PLoS ONE</source>
<volume>2</volume>
:
<fpage>e1131</fpage>
.
<pub-id pub-id-type="pmid">17987118</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Ntnen2">
<label>65</label>
<mixed-citation publication-type="journal">
<name>
<surname>Näätänen</surname>
<given-names>R</given-names>
</name>
(
<year>2001</year>
)
<article-title>‘Primitive intelligence’ in the auditory cortex</article-title>
.
<source>Trends in Neurosciences</source>
<volume>24</volume>
:
<fpage>283</fpage>
<lpage>288</lpage>
.
<pub-id pub-id-type="pmid">11311381</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Bendixen1">
<label>66</label>
<mixed-citation publication-type="journal">
<name>
<surname>Bendixen</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Schröger</surname>
<given-names>E</given-names>
</name>
(
<year>2008</year>
)
<article-title>Memory trace formation for abstract auditory features and its consequences in different attentional contexts</article-title>
.
<source>Biological Psychology</source>
<volume>78</volume>
:
<fpage>231</fpage>
<lpage>241</lpage>
.
<pub-id pub-id-type="pmid">18439740</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Umbricht1">
<label>67</label>
<mixed-citation publication-type="journal">
<name>
<surname>Umbricht</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Krljes</surname>
<given-names>S</given-names>
</name>
(
<year>2005</year>
)
<article-title>Mismatch negativity in schizophrenia: a meta-analysis</article-title>
.
<source>Schizophrenia Research</source>
<volume>76</volume>
:
<fpage>1</fpage>
<lpage>23</lpage>
.
<pub-id pub-id-type="pmid">15927795</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Stephan2">
<label>68</label>
<mixed-citation publication-type="journal">
<name>
<surname>Stephan</surname>
<given-names>KE</given-names>
</name>
,
<name>
<surname>Baldeweg</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Friston</surname>
<given-names>KJ</given-names>
</name>
(
<year>2006</year>
)
<article-title>Synaptic plasticity and dysconnection in schizophrenia</article-title>
.
<source>Biological psychiatry</source>
<volume>59</volume>
:
<fpage>929</fpage>
<lpage>939</lpage>
.
<pub-id pub-id-type="pmid">16427028</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Baldeweg2">
<label>69</label>
<mixed-citation publication-type="journal">
<name>
<surname>Baldeweg</surname>
<given-names>T</given-names>
</name>
(
<year>2004</year>
)
<article-title>Mismatch negativity potentials and cognitive impairment in schizophrenia</article-title>
.
<source>Schizophrenia Research</source>
<volume>69</volume>
:
<fpage>203</fpage>
<lpage>217</lpage>
.
<pub-id pub-id-type="pmid">15469194</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Umbricht2">
<label>70</label>
<mixed-citation publication-type="journal">
<name>
<surname>Umbricht</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Schmid</surname>
<given-names>L</given-names>
</name>
,
<name>
<surname>Koller</surname>
<given-names>R</given-names>
</name>
,
<name>
<surname>Vollenweider</surname>
<given-names>FX</given-names>
</name>
,
<name>
<surname>Hell</surname>
<given-names>D</given-names>
</name>
,
<etal>et al</etal>
(
<year>2000</year>
)
<article-title>Ketamine-induced deficits in auditory and visual context-dependent processing in healthy volunteers: implications for models of cognitive deficits in schizophrenia</article-title>
.
<source>Archives of general psychiatry</source>
<volume>57</volume>
:
<fpage>1139</fpage>
<lpage>1147</lpage>
.
<pub-id pub-id-type="pmid">11115327</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Schmidt1">
<label>71</label>
<mixed-citation publication-type="journal">
<name>
<surname>Schmidt</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Diaconescu</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Kometer</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Friston</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Stephan</surname>
<given-names>K</given-names>
</name>
,
<etal>et al</etal>
(
<year>2013</year>
)
<article-title>Modeling Ketamine Effects on Synaptic Plasticity During the Mismatch Negativity</article-title>
.
<source>Cerebral Cortex</source>
<volume>23</volume>
:
<fpage>2394</fpage>
<lpage>2406</lpage>
.
<pub-id pub-id-type="pmid">22875863</pub-id>
</mixed-citation>
</ref>
<ref id="pcbi.1003288-Baldeweg3">
<label>72</label>
<mixed-citation publication-type="journal">
<name>
<surname>Baldeweg</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Wong</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Stephan</surname>
<given-names>K</given-names>
</name>
(
<year>2006</year>
)
<article-title>Nicotinic modulation of human auditory sensory memory: Evidence from mismatch negativity potentials</article-title>
.
<source>International Journal of Psychophysiology</source>
<volume>59</volume>
:
<fpage>49</fpage>
<lpage>58</lpage>
.
<pub-id pub-id-type="pmid">16313986</pub-id>
</mixed-citation>
</ref>
</ref-list>
</back>
</pmc>
</record>
