Exploration server on relations between France and Australia

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

EEG Frequency-Tagging and Input-Output Comparison in Rhythm Perception.

Internal identifier: 005200 (Ncbi/Merge); previous: 005199; next: 005201


Authors: Sylvie Nozaradan [Australia]; Peter E. Keller [Australia]; Bruno Rossion [Belgium]; André Mouraux [Belgium]

Source:

RBID : pubmed:29127530

Abstract

The combination of frequency-tagging with electroencephalography (EEG) has recently proved fruitful for understanding the perception of beat and meter in musical rhythm, a common behavior shared by humans of all cultures. EEG frequency-tagging allows the objective measurement of input-output transforms to investigate beat perception, its modulation by exogenous and endogenous factors, development, and neural basis. Recent doubt has been raised about the validity of comparing frequency-domain representations of auditory rhythmic stimuli and corresponding EEG responses, assuming that it implies a one-to-one mapping between the envelope of the rhythmic input and the neural output, and that it neglects the sensitivity of frequency-domain representations to acoustic features making up the rhythms. Here we argue that these elements actually reinforce the strengths of the approach. The obvious fact that acoustic features influence the frequency spectrum of the sound envelope precisely justifies taking into consideration the sounds used to generate a beat percept for interpreting neural responses to auditory rhythms. Most importantly, the many-to-one relationship between rhythmic input and perceived beat actually validates an approach that objectively measures the input-output transforms underlying the perceptual categorization of rhythmic inputs. Hence, provided that a number of potential pitfalls and fallacies are avoided, EEG frequency-tagging to study input-output relationships appears valuable for understanding rhythm perception.
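The core of the frequency-tagging approach described in the abstract, locating beat-related frequencies in a signal's spectrum, can be illustrated with a minimal, self-contained sketch. Everything here is illustrative: a synthetic pulse-train envelope stands in for the stimulus, the sampling rate and frequencies are arbitrary, and no EEG data or analysis code from the paper is involved.

```python
import math

def dft_magnitude(signal, freq_hz, fs):
    """Magnitude of the DFT of `signal` at a single frequency (direct sum)."""
    n = len(signal)
    re = sum(x * math.cos(2 * math.pi * freq_hz * i / fs) for i, x in enumerate(signal))
    im = sum(x * math.sin(2 * math.pi * freq_hz * i / fs) for i, x in enumerate(signal))
    return math.hypot(re, im) / n

fs = 100       # sampling rate in Hz (illustrative)
dur = 10       # duration in seconds
beat_hz = 2.0  # one pulse every 0.5 s

# Idealized stimulus envelope: brief unit pulses at the beat rate.
env = [1.0 if (i % int(fs / beat_hz)) == 0 else 0.0 for i in range(fs * dur)]

peak = dft_magnitude(env, beat_hz, fs)  # energy at the beat frequency
off = dft_magnitude(env, 1.33, fs)      # a frequency unrelated to the rhythm
print(f"at beat: {peak:.4f}, off-beat: {off:.4f}")
```

In an actual frequency-tagging study the same comparison is made between the spectrum of the stimulus envelope and the spectrum of the recorded EEG; differences between the two spectra are where the input-output transform discussed in the abstract becomes measurable.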

DOI: 10.1007/s10548-017-0605-8
PubMed: 29127530

Links to previous steps (curation, corpus...)


Links to Exploration step

pubmed:29127530

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">EEG Frequency-Tagging and Input-Output Comparison in Rhythm Perception.</title>
<author>
<name sortKey="Nozaradan, Sylvie" sort="Nozaradan, Sylvie" uniqKey="Nozaradan S" first="Sylvie" last="Nozaradan">Sylvie Nozaradan</name>
<affiliation wicri:level="1">
<nlm:affiliation>The MARCS Institute for Brain, Behaviour and Development (WSU), Sydney, NSW, Australia. sylvie.nozaradan@uclouvain.be.</nlm:affiliation>
<country xml:lang="fr">Australie</country>
<wicri:regionArea>The MARCS Institute for Brain, Behaviour and Development (WSU), Sydney, NSW</wicri:regionArea>
<wicri:noRegion>NSW</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Keller, Peter E" sort="Keller, Peter E" uniqKey="Keller P" first="Peter E" last="Keller">Peter E. Keller</name>
<affiliation wicri:level="1">
<nlm:affiliation>The MARCS Institute for Brain, Behaviour and Development (WSU), Sydney, NSW, Australia.</nlm:affiliation>
<country xml:lang="fr">Australie</country>
<wicri:regionArea>The MARCS Institute for Brain, Behaviour and Development (WSU), Sydney, NSW</wicri:regionArea>
<wicri:noRegion>NSW</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Rossion, Bruno" sort="Rossion, Bruno" uniqKey="Rossion B" first="Bruno" last="Rossion">Bruno Rossion</name>
<affiliation wicri:level="4">
<nlm:affiliation>Institute of Neuroscience (Ions), Université catholique de Louvain (UCL), Brussels, Belgium.</nlm:affiliation>
<country xml:lang="fr">Belgique</country>
<wicri:regionArea>Institute of Neuroscience (Ions), Université catholique de Louvain (UCL), Brussels</wicri:regionArea>
<placeName>
<settlement type="city">Bruxelles</settlement>
<region nuts="2">Région de Bruxelles-Capitale</region>
<settlement type="city">Louvain-la-Neuve</settlement>
</placeName>
<orgName type="university">Université catholique de Louvain</orgName>
</affiliation>
</author>
<author>
<name sortKey="Mouraux, Andre" sort="Mouraux, Andre" uniqKey="Mouraux A" first="André" last="Mouraux">André Mouraux</name>
<affiliation wicri:level="4">
<nlm:affiliation>Institute of Neuroscience (Ions), Université catholique de Louvain (UCL), Brussels, Belgium.</nlm:affiliation>
<country xml:lang="fr">Belgique</country>
<wicri:regionArea>Institute of Neuroscience (Ions), Université catholique de Louvain (UCL), Brussels</wicri:regionArea>
<placeName>
<settlement type="city">Bruxelles</settlement>
<region nuts="2">Région de Bruxelles-Capitale</region>
<settlement type="city">Louvain-la-Neuve</settlement>
</placeName>
<orgName type="university">Université catholique de Louvain</orgName>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="2017">2017</date>
<idno type="RBID">pubmed:29127530</idno>
<idno type="pmid">29127530</idno>
<idno type="doi">10.1007/s10548-017-0605-8</idno>
<idno type="wicri:Area/PubMed/Corpus">000134</idno>
<idno type="wicri:explorRef" wicri:stream="PubMed" wicri:step="Corpus" wicri:corpus="PubMed">000134</idno>
<idno type="wicri:Area/PubMed/Curation">000134</idno>
<idno type="wicri:explorRef" wicri:stream="PubMed" wicri:step="Curation">000134</idno>
<idno type="wicri:Area/PubMed/Checkpoint">000134</idno>
<idno type="wicri:explorRef" wicri:stream="Checkpoint" wicri:step="PubMed">000134</idno>
<idno type="wicri:Area/Ncbi/Merge">005200</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">EEG Frequency-Tagging and Input-Output Comparison in Rhythm Perception.</title>
<author>
<name sortKey="Nozaradan, Sylvie" sort="Nozaradan, Sylvie" uniqKey="Nozaradan S" first="Sylvie" last="Nozaradan">Sylvie Nozaradan</name>
<affiliation wicri:level="1">
<nlm:affiliation>The MARCS Institute for Brain, Behaviour and Development (WSU), Sydney, NSW, Australia. sylvie.nozaradan@uclouvain.be.</nlm:affiliation>
<country xml:lang="fr">Australie</country>
<wicri:regionArea>The MARCS Institute for Brain, Behaviour and Development (WSU), Sydney, NSW</wicri:regionArea>
<wicri:noRegion>NSW</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Keller, Peter E" sort="Keller, Peter E" uniqKey="Keller P" first="Peter E" last="Keller">Peter E. Keller</name>
<affiliation wicri:level="1">
<nlm:affiliation>The MARCS Institute for Brain, Behaviour and Development (WSU), Sydney, NSW, Australia.</nlm:affiliation>
<country xml:lang="fr">Australie</country>
<wicri:regionArea>The MARCS Institute for Brain, Behaviour and Development (WSU), Sydney, NSW</wicri:regionArea>
<wicri:noRegion>NSW</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Rossion, Bruno" sort="Rossion, Bruno" uniqKey="Rossion B" first="Bruno" last="Rossion">Bruno Rossion</name>
<affiliation wicri:level="4">
<nlm:affiliation>Institute of Neuroscience (Ions), Université catholique de Louvain (UCL), Brussels, Belgium.</nlm:affiliation>
<country xml:lang="fr">Belgique</country>
<wicri:regionArea>Institute of Neuroscience (Ions), Université catholique de Louvain (UCL), Brussels</wicri:regionArea>
<placeName>
<settlement type="city">Bruxelles</settlement>
<region nuts="2">Région de Bruxelles-Capitale</region>
<settlement type="city">Louvain-la-Neuve</settlement>
</placeName>
<orgName type="university">Université catholique de Louvain</orgName>
</affiliation>
</author>
<author>
<name sortKey="Mouraux, Andre" sort="Mouraux, Andre" uniqKey="Mouraux A" first="André" last="Mouraux">André Mouraux</name>
<affiliation wicri:level="4">
<nlm:affiliation>Institute of Neuroscience (Ions), Université catholique de Louvain (UCL), Brussels, Belgium.</nlm:affiliation>
<country xml:lang="fr">Belgique</country>
<wicri:regionArea>Institute of Neuroscience (Ions), Université catholique de Louvain (UCL), Brussels</wicri:regionArea>
<placeName>
<settlement type="city">Bruxelles</settlement>
<region nuts="2">Région de Bruxelles-Capitale</region>
<settlement type="city">Louvain-la-Neuve</settlement>
</placeName>
<orgName type="university">Université catholique de Louvain</orgName>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Brain topography</title>
<idno type="eISSN">1573-6792</idno>
<imprint>
<date when="2017" type="published">2017</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">The combination of frequency-tagging with electroencephalography (EEG) has recently proved fruitful for understanding the perception of beat and meter in musical rhythm, a common behavior shared by humans of all cultures. EEG frequency-tagging allows the objective measurement of input-output transforms to investigate beat perception, its modulation by exogenous and endogenous factors, development, and neural basis. Recent doubt has been raised about the validity of comparing frequency-domain representations of auditory rhythmic stimuli and corresponding EEG responses, assuming that it implies a one-to-one mapping between the envelope of the rhythmic input and the neural output, and that it neglects the sensitivity of frequency-domain representations to acoustic features making up the rhythms. Here we argue that these elements actually reinforce the strengths of the approach. The obvious fact that acoustic features influence the frequency spectrum of the sound envelope precisely justifies taking into consideration the sounds used to generate a beat percept for interpreting neural responses to auditory rhythms. Most importantly, the many-to-one relationship between rhythmic input and perceived beat actually validates an approach that objectively measures the input-output transforms underlying the perceptual categorization of rhythmic inputs. Hence, provided that a number of potential pitfalls and fallacies are avoided, EEG frequency-tagging to study input-output relationships appears valuable for understanding rhythm perception.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Status="Publisher" Owner="NLM">
<PMID Version="1">29127530</PMID>
<DateCreated>
<Year>2017</Year>
<Month>11</Month>
<Day>11</Day>
</DateCreated>
<DateRevised>
<Year>2017</Year>
<Month>11</Month>
<Day>11</Day>
</DateRevised>
<Article PubModel="Print-Electronic">
<Journal>
<ISSN IssnType="Electronic">1573-6792</ISSN>
<JournalIssue CitedMedium="Internet">
<PubDate>
<Year>2017</Year>
<Month>Nov</Month>
<Day>10</Day>
</PubDate>
</JournalIssue>
<Title>Brain topography</Title>
<ISOAbbreviation>Brain Topogr</ISOAbbreviation>
</Journal>
<ArticleTitle>EEG Frequency-Tagging and Input-Output Comparison in Rhythm Perception.</ArticleTitle>
<ELocationID EIdType="doi" ValidYN="Y">10.1007/s10548-017-0605-8</ELocationID>
<Abstract>
<AbstractText>The combination of frequency-tagging with electroencephalography (EEG) has recently proved fruitful for understanding the perception of beat and meter in musical rhythm, a common behavior shared by humans of all cultures. EEG frequency-tagging allows the objective measurement of input-output transforms to investigate beat perception, its modulation by exogenous and endogenous factors, development, and neural basis. Recent doubt has been raised about the validity of comparing frequency-domain representations of auditory rhythmic stimuli and corresponding EEG responses, assuming that it implies a one-to-one mapping between the envelope of the rhythmic input and the neural output, and that it neglects the sensitivity of frequency-domain representations to acoustic features making up the rhythms. Here we argue that these elements actually reinforce the strengths of the approach. The obvious fact that acoustic features influence the frequency spectrum of the sound envelope precisely justifies taking into consideration the sounds used to generate a beat percept for interpreting neural responses to auditory rhythms. Most importantly, the many-to-one relationship between rhythmic input and perceived beat actually validates an approach that objectively measures the input-output transforms underlying the perceptual categorization of rhythmic inputs. Hence, provided that a number of potential pitfalls and fallacies are avoided, EEG frequency-tagging to study input-output relationships appears valuable for understanding rhythm perception.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Nozaradan</LastName>
<ForeName>Sylvie</ForeName>
<Initials>S</Initials>
<Identifier Source="ORCID">http://orcid.org/0000-0002-5662-3173</Identifier>
<AffiliationInfo>
<Affiliation>The MARCS Institute for Brain, Behaviour and Development (WSU), Sydney, NSW, Australia. sylvie.nozaradan@uclouvain.be.</Affiliation>
</AffiliationInfo>
<AffiliationInfo>
<Affiliation>Institute of Neuroscience (Ions), Université catholique de Louvain (UCL), Brussels, Belgium. sylvie.nozaradan@uclouvain.be.</Affiliation>
</AffiliationInfo>
<AffiliationInfo>
<Affiliation>International Laboratory for Brain, Music and Sound Research (Brams), Montreal, QC, Canada. sylvie.nozaradan@uclouvain.be.</Affiliation>
</AffiliationInfo>
<AffiliationInfo>
<Affiliation>MARCS Institute for Brain, Behaviour and Development, Western Sydney University, Locked Bag 1797, Penrith, NSW, 2751, Australia. sylvie.nozaradan@uclouvain.be.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Keller</LastName>
<ForeName>Peter E</ForeName>
<Initials>PE</Initials>
<AffiliationInfo>
<Affiliation>The MARCS Institute for Brain, Behaviour and Development (WSU), Sydney, NSW, Australia.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Rossion</LastName>
<ForeName>Bruno</ForeName>
<Initials>B</Initials>
<AffiliationInfo>
<Affiliation>Institute of Neuroscience (Ions), Université catholique de Louvain (UCL), Brussels, Belgium.</Affiliation>
</AffiliationInfo>
<AffiliationInfo>
<Affiliation>Neurology Unit, Centre Hospitalier Régional Universitaire (CHRU) de Nancy, Nancy, France.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Mouraux</LastName>
<ForeName>André</ForeName>
<Initials>A</Initials>
<AffiliationInfo>
<Affiliation>Institute of Neuroscience (Ions), Université catholique de Louvain (UCL), Brussels, Belgium.</Affiliation>
</AffiliationInfo>
</Author>
</AuthorList>
<Language>eng</Language>
<GrantList CompleteYN="Y">
<Grant>
<GrantID>DE160101064</GrantID>
<Agency>Australian Research Council</Agency>
<Country>United States</Country>
</Grant>
<Grant>
<GrantID>FT140101162</GrantID>
<Agency>Australian Research Council</Agency>
<Country>United States</Country>
</Grant>
</GrantList>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
<PublicationType UI="D016454">Review</PublicationType>
</PublicationTypeList>
<ArticleDate DateType="Electronic">
<Year>2017</Year>
<Month>11</Month>
<Day>10</Day>
</ArticleDate>
</Article>
<MedlineJournalInfo>
<Country>United States</Country>
<MedlineTA>Brain Topogr</MedlineTA>
<NlmUniqueID>8903034</NlmUniqueID>
<ISSNLinking>0896-0267</ISSNLinking>
</MedlineJournalInfo>
<KeywordList Owner="NOTNLM">
<Keyword MajorTopicYN="N">Auditory system</Keyword>
<Keyword MajorTopicYN="N">EEG</Keyword>
<Keyword MajorTopicYN="N">Frequency-tagging</Keyword>
<Keyword MajorTopicYN="N">Neural transform</Keyword>
<Keyword MajorTopicYN="N">Perceptual categorization</Keyword>
<Keyword MajorTopicYN="N">Rhythm and beat perception</Keyword>
</KeywordList>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="received">
<Year>2017</Year>
<Month>05</Month>
<Day>30</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="accepted">
<Year>2017</Year>
<Month>10</Month>
<Day>27</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez">
<Year>2017</Year>
<Month>11</Month>
<Day>12</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed">
<Year>2017</Year>
<Month>11</Month>
<Day>12</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2017</Year>
<Month>11</Month>
<Day>12</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>aheadofprint</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="pubmed">29127530</ArticleId>
<ArticleId IdType="doi">10.1007/s10548-017-0605-8</ArticleId>
<ArticleId IdType="pii">10.1007/s10548-017-0605-8</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
<affiliations>
<list>
<country>
<li>Australie</li>
<li>Belgique</li>
</country>
<region>
<li>Région de Bruxelles-Capitale</li>
</region>
<settlement>
<li>Bruxelles</li>
<li>Louvain-la-Neuve</li>
</settlement>
<orgName>
<li>Université catholique de Louvain</li>
</orgName>
</list>
<tree>
<country name="Australie">
<noRegion>
<name sortKey="Nozaradan, Sylvie" sort="Nozaradan, Sylvie" uniqKey="Nozaradan S" first="Sylvie" last="Nozaradan">Sylvie Nozaradan</name>
</noRegion>
<name sortKey="Keller, Peter E" sort="Keller, Peter E" uniqKey="Keller P" first="Peter E" last="Keller">Peter E. Keller</name>
</country>
<country name="Belgique">
<region name="Région de Bruxelles-Capitale">
<name sortKey="Rossion, Bruno" sort="Rossion, Bruno" uniqKey="Rossion B" first="Bruno" last="Rossion">Bruno Rossion</name>
</region>
<name sortKey="Mouraux, Andre" sort="Mouraux, Andre" uniqKey="Mouraux A" first="André" last="Mouraux">André Mouraux</name>
</country>
</tree>
</affiliations>
</record>
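For readers who prefer a general-purpose language to the Dilib tools below, the PubMed portion of a record like the one above can be parsed with Python's standard library. This sketch embeds a trimmed copy of the record's <AuthorList> so it is self-contained; with a full record file you would parse that file instead (note that the TEI portion uses undeclared wicri:/nlm: prefixes, which a strict XML parser will reject unless those namespaces are declared).

```python
import xml.etree.ElementTree as ET

# Trimmed, self-contained copy of the record's <AuthorList> (the PubMed
# portion uses no namespace prefixes, so it parses cleanly as-is).
AUTHOR_LIST = """
<AuthorList CompleteYN="Y">
  <Author ValidYN="Y"><LastName>Nozaradan</LastName><ForeName>Sylvie</ForeName></Author>
  <Author ValidYN="Y"><LastName>Keller</LastName><ForeName>Peter E</ForeName></Author>
  <Author ValidYN="Y"><LastName>Rossion</LastName><ForeName>Bruno</ForeName></Author>
  <Author ValidYN="Y"><LastName>Mouraux</LastName><ForeName>André</ForeName></Author>
</AuthorList>
"""

root = ET.fromstring(AUTHOR_LIST)
# Collect "ForeName LastName" for each <Author> element.
authors = [
    f"{a.findtext('ForeName')} {a.findtext('LastName')}"
    for a in root.findall("Author")
]
print(authors)
```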

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Wicri/Asie/explor/AustralieFrV1/Data/Ncbi/Merge
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 005200 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd -nk 005200 | SxmlIndent | more

To link to this page within the Wicri network

{{Explor lien
   |wiki=    Wicri/Asie
   |area=    AustralieFrV1
   |flux=    Ncbi
   |étape=   Merge
   |type=    RBID
   |clé=     pubmed:29127530
   |texte=   EEG Frequency-Tagging and Input-Output Comparison in Rhythm Perception.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/RBID.i   -Sk "pubmed:29127530" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd   \
       | NlmPubMed2Wicri -a AustralieFrV1 

Wicri

This area was generated with Dilib version V0.6.33.
Data generation: Tue Dec 5 10:43:12 2017. Site generation: Tue Mar 5 14:07:20 2024