Exploration server on haptic devices

Warning: this site is under development.
Warning: this site was generated automatically from raw corpora.
The information has therefore not been validated.

Early, but not late visual distractors affect movement synchronization to a temporal-spatial visual cue

Internal identifier: 000614 (Pmc/Checkpoint); previous: 000613; next: 000615


Authors: Ashley J. Booth [United Kingdom]; Mark T. Elliott [United Kingdom]

Source :

RBID : PMC:4478893

Abstract

The ease of synchronizing movements to a rhythmic cue is dependent on the modality of the cue presentation: timing accuracy is much higher when synchronizing with discrete auditory rhythms than an equivalent visual stimulus presented through flashes. However, timing accuracy is improved if the visual cue presents spatial as well as temporal information (e.g., a dot following an oscillatory trajectory). Similarly, when synchronizing with an auditory target metronome in the presence of a second visual distracting metronome, the distraction is stronger when the visual cue contains spatial-temporal information rather than temporal only. The present study investigates individuals’ ability to synchronize movements to a temporal-spatial visual cue in the presence of same-modality temporal-spatial distractors. Moreover, we investigated how increasing the number of distractor stimuli impacted on maintaining synchrony with the target cue. Participants made oscillatory vertical arm movements in time with a vertically oscillating white target dot centered on a large projection screen. The target dot was surrounded by 2, 8, or 14 distractor dots, which had an identical trajectory to the target but at a phase lead or lag of 0, 100, or 200 ms. We found participants’ timing performance was only affected in the phase-lead conditions and when there were large numbers of distractors present (8 and 14). This asymmetry suggests participants still rely on salient events in the stimulus trajectory to synchronize movements. Subsequently, distractions occurring in the window of attention surrounding those events have the maximum impact on timing performance.


Url: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4478893
DOI: 10.3389/fpsyg.2015.00866
PubMed: 26157412
PubMed Central: 4478893


Affiliations:




Links to Exploration step

PMC:4478893

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Early, but not late visual distractors affect movement synchronization to a temporal-spatial visual cue</title>
<author>
<name sortKey="Booth, Ashley J" sort="Booth, Ashley J" uniqKey="Booth A" first="Ashley J." last="Booth">Ashley J. Booth</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<institution>School of Psychology, University of Birmingham</institution>
,
<country>Edgbaston, UK</country>
</nlm:aff>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea></wicri:regionArea>
<wicri:regionArea># see nlm:aff region in country</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Elliott, Mark T" sort="Elliott, Mark T" uniqKey="Elliott M" first="Mark T." last="Elliott">Mark T. Elliott</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<institution>School of Psychology, University of Birmingham</institution>
,
<country>Edgbaston, UK</country>
</nlm:aff>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea></wicri:regionArea>
<wicri:regionArea># see nlm:aff region in country</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="aff2">
<institution>Institute of Digital Healthcare, Warwick Manufacturing Group, University of Warwick</institution>
,
<country>Coventry, UK</country>
</nlm:aff>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea></wicri:regionArea>
<wicri:regionArea># see nlm:aff region in country</wicri:regionArea>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">26157412</idno>
<idno type="pmc">4478893</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4478893</idno>
<idno type="RBID">PMC:4478893</idno>
<idno type="doi">10.3389/fpsyg.2015.00866</idno>
<date when="2015">2015</date>
<idno type="wicri:Area/Pmc/Corpus">000199</idno>
<idno type="wicri:Area/Pmc/Curation">000199</idno>
<idno type="wicri:Area/Pmc/Checkpoint">000614</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Early, but not late visual distractors affect movement synchronization to a temporal-spatial visual cue</title>
<author>
<name sortKey="Booth, Ashley J" sort="Booth, Ashley J" uniqKey="Booth A" first="Ashley J." last="Booth">Ashley J. Booth</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<institution>School of Psychology, University of Birmingham</institution>
,
<country>Edgbaston, UK</country>
</nlm:aff>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea></wicri:regionArea>
<wicri:regionArea># see nlm:aff region in country</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Elliott, Mark T" sort="Elliott, Mark T" uniqKey="Elliott M" first="Mark T." last="Elliott">Mark T. Elliott</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<institution>School of Psychology, University of Birmingham</institution>
,
<country>Edgbaston, UK</country>
</nlm:aff>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea></wicri:regionArea>
<wicri:regionArea># see nlm:aff region in country</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="aff2">
<institution>Institute of Digital Healthcare, Warwick Manufacturing Group, University of Warwick</institution>
,
<country>Coventry, UK</country>
</nlm:aff>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea></wicri:regionArea>
<wicri:regionArea># see nlm:aff region in country</wicri:regionArea>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Frontiers in Psychology</title>
<idno type="eISSN">1664-1078</idno>
<imprint>
<date when="2015">2015</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>The ease of synchronizing movements to a rhythmic cue is dependent on the modality of the cue presentation: timing accuracy is much higher when synchronizing with discrete auditory rhythms than an equivalent visual stimulus presented through flashes. However, timing accuracy is improved if the visual cue presents spatial as well as temporal information (e.g., a dot following an oscillatory trajectory). Similarly, when synchronizing with an auditory target metronome in the presence of a second visual distracting metronome, the distraction is stronger when the visual cue contains spatial-temporal information rather than temporal only. The present study investigates individuals’ ability to synchronize movements to a temporal-spatial visual cue in the presence of same-modality temporal-spatial distractors. Moreover, we investigated how increasing the number of distractor stimuli impacted on maintaining synchrony with the target cue. Participants made oscillatory vertical arm movements in time with a vertically oscillating white target dot centered on a large projection screen. The target dot was surrounded by 2, 8, or 14 distractor dots, which had an identical trajectory to the target but at a phase lead or lag of 0, 100, or 200 ms. We found participants’ timing performance was only affected in the phase-lead conditions and when there were large numbers of distractors present (8 and 14). This asymmetry suggests participants still rely on salient events in the stimulus trajectory to synchronize movements. Subsequently, distractions occurring in the window of attention surrounding those events have the maximum impact on timing performance.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Armstrong, A" uniqKey="Armstrong A">A. Armstrong</name>
</author>
<author>
<name sortKey="Issartel, J" uniqKey="Issartel J">J. Issartel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Armstrong, A" uniqKey="Armstrong A">A. Armstrong</name>
</author>
<author>
<name sortKey="Issartel, J" uniqKey="Issartel J">J. Issartel</name>
</author>
<author>
<name sortKey="Varlet, M" uniqKey="Varlet M">M. Varlet</name>
</author>
<author>
<name sortKey="Marin, L" uniqKey="Marin L">L. Marin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Aschersleben, G" uniqKey="Aschersleben G">G. Aschersleben</name>
</author>
<author>
<name sortKey="Prinz, W" uniqKey="Prinz W">W. Prinz</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bartram, L" uniqKey="Bartram L">L. Bartram</name>
</author>
<author>
<name sortKey="Ware, C" uniqKey="Ware C">C. Ware</name>
</author>
<author>
<name sortKey="Calvert, T" uniqKey="Calvert T">T. Calvert</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Brainard, D H" uniqKey="Brainard D">D. H. Brainard</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Debats, N B" uniqKey="Debats N">N. B. Debats</name>
</author>
<author>
<name sortKey="Ridderikhoff, A" uniqKey="Ridderikhoff A">A. Ridderikhoff</name>
</author>
<author>
<name sortKey="De Boer, B J" uniqKey="De Boer B">B. J. de Boer</name>
</author>
<author>
<name sortKey="Peper, C L" uniqKey="Peper C">C. L. Peper</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Elliott, M T" uniqKey="Elliott M">M. T. Elliott</name>
</author>
<author>
<name sortKey="Welchman, A E" uniqKey="Welchman A">A. E. Welchman</name>
</author>
<author>
<name sortKey="Wing, A M" uniqKey="Wing A">A. M. Wing</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Elliott, M T" uniqKey="Elliott M">M. T. Elliott</name>
</author>
<author>
<name sortKey="Welchman, A E" uniqKey="Welchman A">A. E. Welchman</name>
</author>
<author>
<name sortKey="Wing, A M" uniqKey="Wing A">A. M. Wing</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Elliott, M T" uniqKey="Elliott M">M. T. Elliott</name>
</author>
<author>
<name sortKey="Wing, A M" uniqKey="Wing A">A. M. Wing</name>
</author>
<author>
<name sortKey="Welchman, A E" uniqKey="Welchman A">A. E. Welchman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Elliott, M T" uniqKey="Elliott M">M. T. Elliott</name>
</author>
<author>
<name sortKey="Wing, A M" uniqKey="Wing A">A. M. Wing</name>
</author>
<author>
<name sortKey="Welchman, A E" uniqKey="Welchman A">A. E. Welchman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ernst, M O" uniqKey="Ernst M">M. O. Ernst</name>
</author>
<author>
<name sortKey="Banks, M S" uniqKey="Banks M">M. S. Banks</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hajnal, A" uniqKey="Hajnal A">A. Hajnal</name>
</author>
<author>
<name sortKey="Richardson, M J" uniqKey="Richardson M">M. J. Richardson</name>
</author>
<author>
<name sortKey="Harrison, S J" uniqKey="Harrison S">S. J. Harrison</name>
</author>
<author>
<name sortKey="Schmidt, R C" uniqKey="Schmidt R">R. C. Schmidt</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Helmuth, L L" uniqKey="Helmuth L">L. L. Helmuth</name>
</author>
<author>
<name sortKey="Ivry, R B" uniqKey="Ivry R">R. B. Ivry</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hove, M J" uniqKey="Hove M">M. J. Hove</name>
</author>
<author>
<name sortKey="Iversen, J R" uniqKey="Iversen J">J. R. Iversen</name>
</author>
<author>
<name sortKey="Zhang, A" uniqKey="Zhang A">A. Zhang</name>
</author>
<author>
<name sortKey="Repp, B H" uniqKey="Repp B">B. H. Repp</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kurgansky, A V" uniqKey="Kurgansky A">A. V. Kurgansky</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Luck, G" uniqKey="Luck G">G. Luck</name>
</author>
<author>
<name sortKey="Sloboda, J" uniqKey="Sloboda J">J. Sloboda</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Miura, A" uniqKey="Miura A">A. Miura</name>
</author>
<author>
<name sortKey="Kudo, K" uniqKey="Kudo K">K. Kudo</name>
</author>
<author>
<name sortKey="Ohtsuki, T" uniqKey="Ohtsuki T">T. Ohtsuki</name>
</author>
<author>
<name sortKey="Kanehisa, H" uniqKey="Kanehisa H">H. Kanehisa</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Naccache, L" uniqKey="Naccache L">L. Naccache</name>
</author>
<author>
<name sortKey="Blandin, E" uniqKey="Blandin E">E. Blandin</name>
</author>
<author>
<name sortKey="Dehaene, S" uniqKey="Dehaene S">S. Dehaene</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Noormohammadi, N" uniqKey="Noormohammadi N">N. Noormohammadi</name>
</author>
<author>
<name sortKey="Brownjohn, J" uniqKey="Brownjohn J">J. Brownjohn</name>
</author>
<author>
<name sortKey="Wing, A M" uniqKey="Wing A">A. M. Wing</name>
</author>
<author>
<name sortKey="Racic, V" uniqKey="Racic V">V. Racic</name>
</author>
<author>
<name sortKey="Johannsen, L" uniqKey="Johannsen L">L. Johannsen</name>
</author>
<author>
<name sortKey="Elliott, M T" uniqKey="Elliott M">M. T. Elliott</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Repp, B H" uniqKey="Repp B">B. H. Repp</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Repp, B H" uniqKey="Repp B">B. H. Repp</name>
</author>
<author>
<name sortKey="Penel, A" uniqKey="Penel A">A. Penel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Repp, B H" uniqKey="Repp B">B. H. Repp</name>
</author>
<author>
<name sortKey="Penel, A" uniqKey="Penel A">A. Penel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Repp, B H" uniqKey="Repp B">B. H. Repp</name>
</author>
<author>
<name sortKey="Su, Y H" uniqKey="Su Y">Y.-H. Su</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Richardson, M J" uniqKey="Richardson M">M. J. Richardson</name>
</author>
<author>
<name sortKey="Marsh, K L" uniqKey="Marsh K">K. L. Marsh</name>
</author>
<author>
<name sortKey="Isenhower, R W" uniqKey="Isenhower R">R. W. Isenhower</name>
</author>
<author>
<name sortKey="Goodman, J R L" uniqKey="Goodman J">J. R. L. Goodman</name>
</author>
<author>
<name sortKey="Schmidt, R C" uniqKey="Schmidt R">R. C. Schmidt</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Spencer, R M C" uniqKey="Spencer R">R. M. C. Spencer</name>
</author>
<author>
<name sortKey="Zelaznik, H N" uniqKey="Zelaznik H">H. N. Zelaznik</name>
</author>
<author>
<name sortKey="Diedrichsen, J" uniqKey="Diedrichsen J">J. Diedrichsen</name>
</author>
<author>
<name sortKey="Ivry, R B" uniqKey="Ivry R">R. B. Ivry</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sunny, M M" uniqKey="Sunny M">M. M. Sunny</name>
</author>
<author>
<name sortKey="Von Muhlenen, A" uniqKey="Von Muhlenen A">A. von Mühlenen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Su, Y H" uniqKey="Su Y">Y.-H. Su</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Su, Y H" uniqKey="Su Y">Y.-H. Su</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Su, Y H" uniqKey="Su Y">Y.-H. Su</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Theeuwes, J" uniqKey="Theeuwes J">J. Theeuwes</name>
</author>
<author>
<name sortKey="Atchley, P" uniqKey="Atchley P">P. Atchley</name>
</author>
<author>
<name sortKey="Kramer, A F" uniqKey="Kramer A">A. F. Kramer</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Varlet, M" uniqKey="Varlet M">M. Varlet</name>
</author>
<author>
<name sortKey="Coey, C A" uniqKey="Coey C">C. A. Coey</name>
</author>
<author>
<name sortKey="Schmidt, R C" uniqKey="Schmidt R">R. C. Schmidt</name>
</author>
<author>
<name sortKey="Marin, L" uniqKey="Marin L">L. Marin</name>
</author>
<author>
<name sortKey="Bardy, B G" uniqKey="Bardy B">B. G. Bardy</name>
</author>
<author>
<name sortKey="Richardson, M J" uniqKey="Richardson M">M. J. Richardson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Varlet, M" uniqKey="Varlet M">M. Varlet</name>
</author>
<author>
<name sortKey="Marin, L" uniqKey="Marin L">L. Marin</name>
</author>
<author>
<name sortKey="Issartel, J" uniqKey="Issartel J">J. Issartel</name>
</author>
<author>
<name sortKey="Schmidt, R C" uniqKey="Schmidt R">R. C. Schmidt</name>
</author>
<author>
<name sortKey="Bardy, B G" uniqKey="Bardy B">B. G. Bardy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wright, R L" uniqKey="Wright R">R. L. Wright</name>
</author>
<author>
<name sortKey="Elliott, M T" uniqKey="Elliott M">M. T. Elliott</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Zivotofsky, A Z" uniqKey="Zivotofsky A">A. Z. Zivotofsky</name>
</author>
<author>
<name sortKey="Gruendlinger, L" uniqKey="Gruendlinger L">L. Gruendlinger</name>
</author>
<author>
<name sortKey="Hausdorff, J M" uniqKey="Hausdorff J">J. M. Hausdorff</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">Front Psychol</journal-id>
<journal-id journal-id-type="iso-abbrev">Front Psychol</journal-id>
<journal-id journal-id-type="publisher-id">Front. Psychol.</journal-id>
<journal-title-group>
<journal-title>Frontiers in Psychology</journal-title>
</journal-title-group>
<issn pub-type="epub">1664-1078</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">26157412</article-id>
<article-id pub-id-type="pmc">4478893</article-id>
<article-id pub-id-type="doi">10.3389/fpsyg.2015.00866</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychology</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Early, but not late visual distractors affect movement synchronization to a temporal-spatial visual cue</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Booth</surname>
<given-names>Ashley J.</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<uri xlink:type="simple" xlink:href="http://loop.frontiersin.org/people/245991"></uri>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Elliott</surname>
<given-names>Mark T.</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
<xref ref-type="author-notes" rid="fn001">
<sup>*</sup>
</xref>
<uri xlink:type="simple" xlink:href="http://loop.frontiersin.org/people/97862"></uri>
</contrib>
</contrib-group>
<aff id="aff1">
<sup>1</sup>
<institution>School of Psychology, University of Birmingham</institution>
,
<country>Edgbaston, UK</country>
</aff>
<aff id="aff2">
<sup>2</sup>
<institution>Institute of Digital Healthcare, Warwick Manufacturing Group, University of Warwick</institution>
,
<country>Coventry, UK</country>
</aff>
<author-notes>
<fn fn-type="edited-by">
<p>Edited by:
<italic>Lihan Chen, Peking University, China</italic>
</p>
</fn>
<fn fn-type="edited-by">
<p>Reviewed by:
<italic>Yoshimori Sugano, Kyushu Sangyo University, Japan; Yi-Huang Su, Technical University of Munich, Germany</italic>
</p>
</fn>
<corresp id="fn001">*Correspondence:
<italic>Mark T. Elliott, Institute of Digital Healthcare, Warwick Manufacturing Group, University of Warwick, University Road, Coventry CV4 7AL, UK,
<email xlink:type="simple">m.t.elliott@warwick.ac.uk</email>
</italic>
</corresp>
<fn fn-type="other" id="fn002">
<p>This article was submitted to Perception Science, a section of the journal Frontiers in Psychology.</p>
</fn>
</author-notes>
<pub-date pub-type="epub">
<day>24</day>
<month>6</month>
<year>2015</year>
</pub-date>
<pub-date pub-type="collection">
<year>2015</year>
</pub-date>
<volume>6</volume>
<elocation-id>866</elocation-id>
<history>
<date date-type="received">
<day>20</day>
<month>4</month>
<year>2015</year>
</date>
<date date-type="accepted">
<day>12</day>
<month>6</month>
<year>2015</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright © 2015 Booth and Elliott.</copyright-statement>
<copyright-year>2015</copyright-year>
<copyright-holder>Booth and Elliott</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<license-p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</license-p>
</license>
</permissions>
<abstract>
<p>The ease of synchronizing movements to a rhythmic cue is dependent on the modality of the cue presentation: timing accuracy is much higher when synchronizing with discrete auditory rhythms than an equivalent visual stimulus presented through flashes. However, timing accuracy is improved if the visual cue presents spatial as well as temporal information (e.g., a dot following an oscillatory trajectory). Similarly, when synchronizing with an auditory target metronome in the presence of a second visual distracting metronome, the distraction is stronger when the visual cue contains spatial-temporal information rather than temporal only. The present study investigates individuals’ ability to synchronize movements to a temporal-spatial visual cue in the presence of same-modality temporal-spatial distractors. Moreover, we investigated how increasing the number of distractor stimuli impacted on maintaining synchrony with the target cue. Participants made oscillatory vertical arm movements in time with a vertically oscillating white target dot centered on a large projection screen. The target dot was surrounded by 2, 8, or 14 distractor dots, which had an identical trajectory to the target but at a phase lead or lag of 0, 100, or 200 ms. We found participants’ timing performance was only affected in the phase-lead conditions and when there were large numbers of distractors present (8 and 14). This asymmetry suggests participants still rely on salient events in the stimulus trajectory to synchronize movements. Subsequently, distractions occurring in the window of attention surrounding those events have the maximum impact on timing performance.</p>
</abstract>
<kwd-group>
<kwd>sensorimotor synchronization</kwd>
<kwd>visual cues</kwd>
<kwd>movement timing</kwd>
<kwd>distractor cues</kwd>
</kwd-group>
<funding-group>
<award-group>
<funding-source id="cn001">Engineering and Physical Sciences Research Council</funding-source>
<award-id rid="cn001">EP/I031030/1</award-id>
</award-group>
</funding-group>
<counts>
<fig-count count="3"></fig-count>
<table-count count="0"></table-count>
<equation-count count="0"></equation-count>
<ref-count count="34"></ref-count>
<page-count count="8"></page-count>
<word-count count="5350"></word-count>
</counts>
</article-meta>
</front>
<body>
<sec>
<title>Introduction</title>
<p>Nodding or tapping along to a favorite song is often something we do with little conscious thought. This demonstrates the automaticity of being able to move in time to a rhythmic stimulus, an ability that forms the basis of sensorimotor synchronization (SMS) research (
<xref rid="B23" ref-type="bibr">Repp and Su, 2013</xref>
). The majority of SMS research has focussed on the timing of movements to an auditory rhythmic cue and indeed it appears this is the sensory modality that facilitates the most accurate timing of movements (
<xref rid="B22" ref-type="bibr">Repp and Penel, 2004</xref>
;
<xref rid="B9" ref-type="bibr">Elliott et al., 2010</xref>
). However, movement synchrony can also occur outside the context of music. In social situations, groups of individuals can spontaneously coordinate the timing of their movements, for example, two people falling into step when walking together (
<xref rid="B34" ref-type="bibr">Zivotofsky et al., 2012</xref>
), or an excited crowd bouncing up and down together in a sports stadium (
<xref rid="B19" ref-type="bibr">Noormohammadi et al., 2011</xref>
). In these group scenarios, visual cues are likely to provide a strong timing stimulus that results in implicit synchrony emerging within the group. However, with each person in a group exhibiting slightly different timing properties, it is currently unclear how synchrony occurs in the face of conflicting visual cues. Here, we have developed an experimental paradigm that investigates how individuals synchronize movements to a target visual cue in the presence of conflicting visual stimuli.</p>
<p>Timing accuracy in SMS studies is often quantified by the
<italic>asynchronies</italic>
, which represent the time difference between the target and the executed movement. The mean and variability of the asynchronies are taken into account. A negative mean asynchrony (NMA) is usually observed in SMS research where the movement typically precedes the target by 30–50 ms (
<xref rid="B3" ref-type="bibr">Aschersleben and Prinz, 1995</xref>
). While auditory cues dominate SMS research, other modalities have been investigated. In particular, SMS to a discrete flashing visual stimulus results in reduced timing accuracy in terms of asynchrony variability (
<xref rid="B22" ref-type="bibr">Repp and Penel, 2004</xref>
;
<xref rid="B15" ref-type="bibr">Kurgansky, 2008</xref>
;
<xref rid="B9" ref-type="bibr">Elliott et al., 2010</xref>
;
<xref rid="B33" ref-type="bibr">Wright and Elliott, 2014</xref>
) compared to an auditory metronome. Hence, discrete auditory stimuli provide a more reliable, salient cue compared to a discrete rhythmic visual cue (
<xref rid="B22" ref-type="bibr">Repp and Penel, 2004</xref>
). However, more recent studies found that synchronizing movement to continuous visual cues, i.e., those exhibiting temporal and spatial information, yielded strong SMS that was comparable to studies using auditory cues (
<xref rid="B14" ref-type="bibr">Hove et al., 2012</xref>
;
<xref rid="B32" ref-type="bibr">Varlet et al., 2012</xref>
;
<xref rid="B2" ref-type="bibr">Armstrong et al., 2013</xref>
). Moreover, visual trajectories representing biologically compatible movements further facilitate rhythm perception (
<xref rid="B27" ref-type="bibr">Su, 2014a</xref>
) and movement synchronization (
<xref rid="B29" ref-type="bibr">Su, 2014c</xref>
). This latter finding indicates how the temporal-spatial visual information provided by surrounding members of a group could influence the implicit synchrony of movements within the group.</p>
<p>A number of studies have implemented a distractor paradigm to observe how irrelevant cues presented in auditory or auditory versus visual modalities can affect an individual’s ability to synchronize their movements to a target cue. As might be expected, an auditory distractor in the presence of a discrete visual target leads to a strong distraction effect, due to the strong saliency of the auditory modality (
<xref rid="B21" ref-type="bibr">Repp and Penel, 2002</xref>
,
<xref rid="B22" ref-type="bibr">2004</xref>
). These distraction effects are quantified through a change in NMA, i.e., asynchronies becoming more negative in the presence of early distractors or more positive for late distractors, and through asynchrony variability, with strong distractor effects reducing the stability of the asynchronies. In general, discrete distractor cues (be they auditory–auditory or auditory–visual) exhibit an asymmetric NMA effect, where a strong attraction is observed when the distractor precedes the target but little change is seen for late distractors (
<xref rid="B20" ref-type="bibr">Repp, 2003</xref>
;
<xref rid="B22" ref-type="bibr">Repp and Penel, 2004</xref>
).</p>
<p>What is currently unclear is how an individual’s ability to synchronize movements to temporal-spatial visual cues is affected by similar conflicting visual distractors. In this study, we investigated participants’ ability to synchronize oscillatory arm movements in time to a temporal-spatial oscillating visual target, in the presence of identical visual distractors offset in phase to the target. As well as varying phase to influence the temporal relation between target and distractor, we also varied the visual impact of the distraction effect by varying the number of distractor stimuli present. Increasing the number of distraction stimuli should correspondingly increase visual attention to the distractors (
<xref rid="B4" ref-type="bibr">Bartram et al., 2003</xref>
). Hence we predicted that the strength of the distraction effect would be a function of both the temporal separation and the number of distractors present. As observed in previous studies, we further expected that the temporal distraction would be at its greatest when the phase offset was around a quarter of the oscillation period (
<xref rid="B20" ref-type="bibr">Repp, 2003</xref>
;
<xref rid="B22" ref-type="bibr">Repp and Penel, 2004</xref>
). However, due to the continuous nature of both the movements and the stimuli, we did not expect to see an asymmetry in the distraction effect as observed with discrete stimuli paradigms.</p>
</sec>
<sec sec-type="materials|methods">
<title>Materials and Methods</title>
<sec>
<title>Participants</title>
<p>Eleven University of Birmingham undergraduate Psychology students (six female;
<italic>M</italic>
<sub>age</sub>
= 18.4, range = 18–20, SD 0.67 years) gave written informed consent to take part in the study. All participants reported themselves free of any neurological disease, head trauma, musculoskeletal impairment, visual impairment, or hearing impairment. Ethical approval was granted by the University of Birmingham Science, Technology, Engineering, and Mathematics Ethical Review Committee. Of the 11 participants, nine were right-handed. Data from one participant was removed due to difficulty with following instructions and completing the task correctly.</p>
</sec>
<sec>
<title>Experimental Setup</title>
<p>Participants stood on a marked point 1.85 m from a projection screen (1.6 m wide × 1.2 m tall; Figure
<xref ref-type="fig" rid="F1">1A</xref>
). Arm movement trajectories were captured using a 12-camera Qualisys Oqus motion capture system (Qualisys AB, Gothenburg, Sweden), with adhesive reflective markers attached to the shoulders, elbows, wrists, and index fingers of both arms. The camera system operated with a sampling rate of 200 Hz.</p>
<fig id="F1" position="float">
<label>FIGURE 1</label>
<caption>
<p>
<bold>(A)</bold>
Representation of experimental set up. Participants faced a large projection screen which presented the visual stimuli. The stimuli (100 pixel diameter dots) moved vertically up and down, following a sinusoidal trajectory. The target stimulus was always the center dot. Distractor dots moved out of phase with the target by 0, ±100, ±200 ms. Participants made bimanual arm movements in synchrony with the target stimulus, flexing and extending the forearm from the elbow.
<bold>(B)</bold>
Formation of target and distractor stimuli. We investigated if the distraction effect was a function of the number of distractor stimuli. The number of distractor stimuli was varied across trials such that there were no distractors (top left), two distractors (bottom left), eight distractors (top right), or 14 distractors (bottom right).
<bold>(C)</bold>
Measurements of timing accuracy. Representative trajectories of the target stimulus (bottom trace, dashed pink) and the corresponding participant’s dominant arm movement (top, solid green) are shown. We extracted the times of the minimum positions for each movement oscillation along with the times of the minimum stimulus positions. Asynchrony was calculated by subtracting the time of the corresponding stimulus event from the time of the movement event, so that negative values indicate the movement preceded the stimulus.</p>
</caption>
<graphic xlink:href="fpsyg-06-00866-g0001"></graphic>
</fig>
</sec>
<sec>
<title>Stimuli</title>
<p>Visual stimuli were generated in Matlab (2013a; The MathWorks, MA, USA) using the Psychophysics Toolbox (
<xref rid="B5" ref-type="bibr">Brainard, 1997</xref>
). The stimuli consisted of a series of white circular dots (100 pixels diameter) moving vertically against a black background with a sinusoidal trajectory (period: 800 ms, 60 frames per second). The peak–peak range of movement for the dots was 200 pixels. Participants were instructed to synchronize movements with the “target”—a centrally positioned dot that was present in all conditions. In addition, a number of distractor dots were positioned symmetrically to the sides, above and below the target. There were four distractor conditions, which consisted of 0, 2, 8, or 14 distractor dots in the formations shown in Figure
<xref ref-type="fig" rid="F1">1B</xref>
. Dots were separated from one another, center to center, by 125 pixels horizontally and 200 pixels vertically. In addition to the different numbers of distractors, there were five “phase-offset” conditions where the timing of the distractor dots was offset such that there was a constant phase lead (negative) or lag (positive) of 0, ±100, or ±200 ms relative to the central target trajectory. The spacing of the dots was designed such that none of the phase-offset conditions resulted in occlusion of the target dot on the screen. A digital high (+5V) signal pulse was output via a data acquisition card (USB-6343, National Instruments, TX, USA) to the Qualisys motion capture system each time the target dot reached its minimum position in the trajectory. This was used to align screen output with the participant’s movements (see Data Processing).</p>
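The stimulus trajectory described above can be illustrated with a short sketch. This is not the authors' Matlab/Psychophysics Toolbox code; the function and parameter names, the screen-relative origin, and the sign convention for the phase offset are illustrative assumptions.

```python
# Illustrative sketch of the sinusoidal dot trajectories described above
# (800 ms period, 60 fps, 200-pixel peak-to-peak range). Not the original
# Matlab/Psychtoolbox code; names and the sign convention are assumptions.
import numpy as np

PERIOD_S = 0.8       # oscillation period (800 ms)
FPS = 60             # frames per second
AMPLITUDE_PX = 100   # half of the 200-pixel peak-to-peak range

def dot_y(frame_idx, phase_offset_s=0.0):
    """Vertical position (pixels) relative to the dot's centre of oscillation.

    phase_offset_s: temporal offset of a distractor relative to the target;
    negative = phase lead, positive = phase lag (e.g. -0.1 for -100 ms).
    """
    t = frame_idx / FPS
    # A leading distractor (negative offset) shows the target's future position.
    return AMPLITUDE_PX * np.sin(2 * np.pi * (t - phase_offset_s) / PERIOD_S)

# Example: target vs. a 100 ms phase-leading distractor on frame 30
print(dot_y(30), dot_y(30, phase_offset_s=-0.1))
```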
</sec>
<sec>
<title>Experimental Design and Procedure</title>
<p>Participants completed the study individually. They were instructed to move both forearms up and down in synchrony with the central target dot, flexing and extending at the elbows with only their index fingers extended. We instructed the use of bimanual movements to improve timing stability (
<xref rid="B13" ref-type="bibr">Helmuth and Ivry, 1996</xref>
). In addition, during pilot tests participants reported bimanual movements to be more comfortable and natural for the task. Participants were further required to keep their wrists firm, such that a straight line could be imagined between the fingertip and elbow during the movement. They were told to ignore the movements of the non-target dots to the best of their ability. A practice trial was carried out to ensure that the requirements were fully understood and that participants were ready to continue.</p>
<p>There were three trials for each condition (3 Distractor conditions: 2, 8, 14 × 5 Phase-offset conditions: –200, –100, 0, 100, 200 ms; plus a no-distractor condition), totalling 48 trials. The order of the trials was randomized for each participant to avoid order effects. Each trial lasted 40 s, which resulted in 50 dot oscillations per trial.</p>
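The 48-trial design can be checked with a minimal sketch; this is not the original experiment code, and the condition encoding is an assumption.

```python
# Illustrative sketch of the trial structure: 3 distractor counts x 5 phase
# offsets plus a no-distractor baseline, 3 trials each, order randomized per
# participant. Not the original experiment code.
import itertools
import random

distractor_counts = [2, 8, 14]
phase_offsets_ms = [-200, -100, 0, 100, 200]

conditions = list(itertools.product(distractor_counts, phase_offsets_ms))
conditions.append((0, 0))          # no-distractor baseline condition
trials = conditions * 3            # three trials per condition
random.shuffle(trials)             # randomize order for each participant

assert len(trials) == 48
```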
</sec>
<sec>
<title>Data Processing</title>
<p>Only the vertical (z-axis) data from the reflective marker attached to the index finger on the dominant hand was used for analysis. Using a peak detection algorithm from the MatTAP toolbox (
<xref rid="B8" ref-type="bibr">Elliott et al., 2009b</xref>
), the “event times” of the lowest vertical points of the executed oscillatory arm movements were extracted (Figure
<xref ref-type="fig" rid="F1">1C</xref>
). Lowest points were chosen as evidence suggests synchronization is more stable on the downward movement (
<xref rid="B17" ref-type="bibr">Miura et al., 2011</xref>
). Similarly, the event times of the lowest positions of the target stimulus were recorded as the times at which the digital signal from the data acquisition card was set high (see Stimuli). The first five event times from each trial were discarded from the analysis to allow participants to initially synchronize with the target. The event times of the stimulus and the participant’s movements were then aligned (
<xref rid="B8" ref-type="bibr">Elliott et al., 2009b</xref>
) by finding the movement onset time closest to each stimulus onset time (on average <1% of all stimulus onsets could not be aligned to a participant’s corresponding movement, indicating participants were able to perform the task). Subsequently, the asynchronies were calculated as the time difference between the stimulus event and the corresponding movement event. A negative asynchrony indicated that the movement event occurred before the stimulus (Figure
<xref ref-type="fig" rid="F1">1C</xref>
).</p>
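A minimal sketch of the alignment and asynchrony step described above is given below. The study itself used the MatTAP MATLAB toolbox; the array names, the nearest-event pairing, and applying the five-event discard to the stimulus events only are simplifying assumptions.

```python
# Minimal sketch of the asynchrony calculation (negative = movement precedes
# stimulus). Not MatTAP; input arrays of per-trial minima times are assumed.
import numpy as np

def asynchronies_ms(movement_minima_s, stimulus_minima_s, discard_first=5):
    """Pair each stimulus minimum with the nearest movement minimum and
    return asynchronies in ms (negative = movement precedes stimulus)."""
    stim = np.asarray(stimulus_minima_s, dtype=float)[discard_first:]
    moves = np.asarray(movement_minima_s, dtype=float)
    paired = [moves[np.argmin(np.abs(moves - s))] for s in stim]
    return (np.array(paired) - stim) * 1000.0

# Per-trial summary measures used in the analysis:
# asyncs = asynchronies_ms(movement_minima_s, stimulus_minima_s)
# mean_async, sd_async = asyncs.mean(), asyncs.std(ddof=1)
```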
<p>The standard deviation and mean asynchrony were calculated for each trial and the average taken across trials for each participant. We initially analyzed the effects of the number of distractors and phase offset (reported in sections “Mean Asynchrony” and “Standard Deviation”) using a 3 (Distractors: 2, 8, 14) × 5 (Phase Offset: –200, –100, 0, 100, 200 ms) repeated measures design. We further analyzed just the effect of number of Distractors using data from the 0 phase-offset conditions in addition to the baseline “no distractor” condition [4 (Distractors: 0, 2, 8, 14) × 1 (Phase Offset: 0 ms) repeated measures; reported in section “Comparison of No-Distractor with Distractor Conditions”]. Statistical analysis was completed using Repeated Measures ANOVAs in SPSS (version 21, IBM Corp., NY, USA). Significance levels were set to
<italic>p</italic>
< 0.05. Greenhouse–Geisser adjustments were made for results that violated sphericity assumptions.
<italic>Post hoc</italic>
analyses were adjusted for multiple comparisons using the Bonferroni method.</p>
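For readers working outside SPSS, the 3 × 5 repeated-measures ANOVA on the per-participant means could be sketched as follows. This does not reproduce the Greenhouse–Geisser correction or the Bonferroni-adjusted post hoc tests, and the data file and column names are assumptions.

```python
# Hedged sketch of the 3 (distractors) x 5 (phase offset) repeated-measures
# ANOVA on mean asynchrony using statsmodels. The study used SPSS; the
# Greenhouse-Geisser and Bonferroni adjustments are not reproduced here,
# and the long-format CSV and its column names are assumptions.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Expected columns: participant, n_distractors, phase_offset_ms, mean_asynchrony
df = pd.read_csv("asynchrony_summary.csv")

result = AnovaRM(
    data=df,
    depvar="mean_asynchrony",
    subject="participant",
    within=["n_distractors", "phase_offset_ms"],
).fit()
print(result)
```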
</sec>
</sec>
<sec>
<title>Results</title>
<sec>
<title>Mean Asynchrony</title>
<p>A repeated measures within-participants ANOVA revealed that there was a significant effect of phase-offset on mean asynchrony [
<italic>F</italic>
(4,36) = 25.17,
<italic>p</italic>
< 0.001]. That is, changes to the phase-offset significantly affected synchronization to the target (Figure
<xref ref-type="fig" rid="F2">2A</xref>
).
<italic>Post hoc</italic>
analysis identified significant differences only between the 0 ms phase-offset condition and the –200 ms condition (
<italic>M</italic>
= –62.8 ms,
<italic>p</italic>
< 0.001) and the –100 ms condition (
<italic>M</italic>
= –59.8 ms,
<italic>p</italic>
= 0.001). However, there were no significant differences between the –200 ms and –100 ms phase-offset conditions, so performance did not continue to decline linearly as the phase-offset increased. Moreover, the positive phase offsets did not significantly alter the mean asynchrony compared to the 0 ms phase-offset. These findings show an asymmetrical effect of phase-offset: the negative phase-offset conditions made the mean asynchrony more negative, so arm movements were drawn to the phase-leading distractor trajectories. In contrast, movements were not drawn to phase-lagging distractor trajectories.</p>
<fig id="F2" position="float">
<label>FIGURE 2</label>
<caption>
<p>
<bold>Mean asynchrony as a function of distractor phase-offset to target stimulus.</bold>
Asynchronies were measured between the participant’s movements and the target stimulus.
<bold>(A)</bold>
Overall effect of distractor phase-offset on mean asynchrony, collapsed across number of distractors. Error bars represent standard errors. Dashed horizontal black line indicates baseline mean asynchrony in the no-distractor condition.
<bold>(B)</bold>
Effect of distractor phase-offset on mean asynchrony, with two distractors present (circles), eight distractors present (diamonds), and 14 distractors present (triangles). Error bars represent standard errors.</p>
</caption>
<graphic xlink:href="fpsyg-06-00866-g0002"></graphic>
</fig>
<p>There was no significant main effect of the number of distractors on the mean asynchrony; however, the analysis yielded a significant interaction between the number of distractors and phase-offset [
<italic>F</italic>
(2.7,24.4) = 13.36,
<italic>p</italic>
< 0.001; Figure
<xref ref-type="fig" rid="F2">2B</xref>
]. Analyzing each Distractor condition separately highlighted that when only two distractors were present, there was no effect of phase-offset on the mean asynchrony [
<italic>F</italic>
(1.58,14.26) = 1.83,
<italic>p</italic>
= 0.199]. In contrast, for the 8 dot [
<italic>F</italic>
(4,36) = 32.79,
<italic>p</italic>
< 0.001] and 14 dot [
<italic>F</italic>
(4,36) = 36.48,
<italic>p</italic>
< 0.001] distractor conditions, the previously described phase attraction for leading distractors was present (Figure
<xref ref-type="fig" rid="F2">2B</xref>
).</p>
</sec>
<sec>
<title>Standard Deviation</title>
<p>We further investigated how the distractors impacted on the variability (standard deviation) of the asynchronies over a trial. Again, we observed a significant main effect of phase-offset [
<italic>F</italic>
(4,36) = 5.14,
<italic>p</italic>
= 0.002; Figure
<xref ref-type="fig" rid="F3">3</xref>
].
<italic>Post hoc</italic>
analyses identified the –100 ms phase-offset as the only condition that significantly differed from the 0 ms phase-offset condition (
<italic>M</italic>
difference = 10.5 ms,
<italic>p</italic>
= 0.014). We found that in this condition, the variability of asynchronies significantly increased, indicating that the strongest distraction occurred when the distractor stimuli were moving earlier in phase by around 100 ms.</p>
<fig id="F3" position="float">
<label>FIGURE 3</label>
<caption>
<p>
<bold>Asynchrony standard deviation (SD) as a function of distractor phase-offset to target stimulus.</bold>
There was no significant effect of the number of distractors on the asynchrony SD, so results are collapsed across this factor. Error bars represent standard errors. Dashed horizontal black line indicates the baseline asynchrony SD in the no-distractor condition.</p>
</caption>
<graphic xlink:href="fpsyg-06-00866-g0003"></graphic>
</fig>
</sec>
<sec>
<title>Comparison of No-Distractor with Distractor Conditions</title>
<p>Two further analyses were carried out to compare a no-distractor condition (i.e., only the target stimulus present) with the other multiple dot conditions where there was no phase-offset between the target and distractor stimuli. As expected, we found no significant effect of the number of distractors on the mean asynchrony (
<italic>p</italic>
= 0.089) or the standard deviation (
<italic>p</italic>
= 0.765). Hence we can conclude that the number of distractors alone does not significantly affect synchronization to a target visual cue where there is no phase-offset applied.</p>
</sec>
</sec>
<sec>
<title>Discussion</title>
<p>In this study, we investigated how we synchronize our movements in time with a visually oscillating target cue in the presence of same-modality distractor cues. Participants were instructed to synchronize oscillatory arm movements in time with the target cue, while distractors varied in phase (either lagging or leading the target cue) and number. We found that, as predicted, the degree of phase-offset between distractor and target stimuli significantly affected the asynchrony of the participants’ movements to the target cue. However, contrary to expectations, an asymmetry in the distraction effect was observed, with only phase-leading distractors (–100, –200 ms) influencing the asynchrony; lagging distractors did not show any significant effect on performance. In particular, a phase offset of –100 ms appeared to have a substantial impact on performance, both in terms of greater negative asynchrony and higher asynchrony variability. We further found the distraction only occurred with larger numbers of distractor stimuli surrounding the target; we saw no effect when there were only two distractor stimuli present.</p>
<p>The effect of distractor cues on sensorimotor synchronization performance has been investigated for combinations of auditory–auditory (
<xref rid="B20" ref-type="bibr">Repp, 2003</xref>
), auditory-visual (
<xref rid="B22" ref-type="bibr">Repp and Penel, 2004</xref>
;
<xref rid="B14" ref-type="bibr">Hove et al., 2012</xref>
;
<xref rid="B6" ref-type="bibr">Debats et al., 2013</xref>
) and auditory-proprioceptive cues (
<xref rid="B6" ref-type="bibr">Debats et al., 2013</xref>
). An asymmetry in the strength of the distraction has been observed in auditory–auditory and auditory-visual conditions (
<xref rid="B20" ref-type="bibr">Repp, 2003</xref>
;
<xref rid="B22" ref-type="bibr">Repp and Penel, 2004</xref>
). In both cases, a strong influence of the auditory distractors on the asynchronies was observed when the distractors occurred earlier than the target cue, but not later. With discrete cues, this is expected: the participant’s attention is captured by the early distraction events and hence draws the motor responses away from the target cue. Later distraction cues are less likely to capture attention as they occur after the motor action has been planned and executed (
<xref rid="B20" ref-type="bibr">Repp, 2003</xref>
). With a continuously present visual cue and continuous motor action however, we expected there to be no difference between a distractor being late or early in phase. We considered that the continuous signal would be a constant distraction and hence would show a symmetrical effect on the asynchronies regardless of them leading or lagging the target. The fact that we saw an asymmetry indicates that participants were still utilizing salient points in the sensory stimuli and aligning them to similarly salient anchor points in their own movement trajectories. For the visual cues, the salient points could have been, for example, the change in direction of the moving dot at the top or bottom of the sinusoidal trajectory. Indeed, it makes sense to have discrete points of reference for synchronization. On the one hand it has been shown that synchronization to continuous temporal-spatial visual cues is much easier and results in enhanced synchrony performance (
<xref rid="B32" ref-type="bibr">Varlet et al., 2012</xref>
;
<xref rid="B2" ref-type="bibr">Armstrong et al., 2013</xref>
;
<xref rid="B1" ref-type="bibr">Armstrong and Issartel, 2014</xref>
) compared to the task of timing movements to discrete visual cues (
<xref rid="B22" ref-type="bibr">Repp and Penel, 2004</xref>
;
<xref rid="B9" ref-type="bibr">Elliott et al., 2010</xref>
;
<xref rid="B33" ref-type="bibr">Wright and Elliott, 2014</xref>
). However, while the dynamic spatial element of visual information is clearly important for anticipatory timing, it would be inefficient to continuously align and correct movements at arbitrary points in the cue trajectory simply because the sensory information is available. Evidence from this experiment and other studies (
<xref rid="B16" ref-type="bibr">Luck and Sloboda, 2008</xref>
;
<xref rid="B12" ref-type="bibr">Hajnal et al., 2009</xref>
;
<xref rid="B28" ref-type="bibr">Su, 2014b</xref>
;
<xref rid="B31" ref-type="bibr">Varlet et al., 2014</xref>
) suggests that if we are timing movements to an external cue, we pick out discrete salient points for temporal alignment, allowing us to efficiently correct movements through each repetition of the cycle. These do not have to be explicit observable events within the trajectory but can be related to derivatives of the movement such as velocity (
<xref rid="B28" ref-type="bibr">Su, 2014b</xref>
;
<xref rid="B31" ref-type="bibr">Varlet et al., 2014</xref>
) or peak acceleration (
<xref rid="B16" ref-type="bibr">Luck and Sloboda, 2008</xref>
). Similar strategies arise in the movements themselves. Producing smooth continuous movements results in the timing emerging from the movement itself [referred to as emergent, or implicit timing (
<xref rid="B25" ref-type="bibr">Spencer et al., 2003</xref>
)]. This smooth movement reduces the ability to make accurate corrections necessary for maintaining synchrony (
<xref rid="B7" ref-type="bibr">Elliott et al., 2009a</xref>
). Hence it is beneficial to timing performance to have relatively discrete features in the movement (identified by a high level of jerk) that allow event-based, or explicit, timing (
<xref rid="B7" ref-type="bibr">Elliott et al., 2009a</xref>
). Again, in this case it is likely that proprioceptive feedback from the change of direction at the lowest point of the movement was sufficient to allow participants to synchronize their actions. These strategies of extracting discrete timing events from continuous cues and movements explain why we see an asymmetrical distraction effect in this task similar to that observed in previous experiments that used discrete cues (e.g., an auditory metronome) and movements (finger tapping).</p>
<p>To understand the effect of the distractors further, we must consider the underlying attentional processes. Visual stimuli moving in the periphery attract attention far better than static stimuli (
<xref rid="B4" ref-type="bibr">Bartram et al., 2003</xref>
). In addition, jerky motion captures attention more than smooth motion (
<xref rid="B26" ref-type="bibr">Sunny and von Mühlenen, 2011</xref>
). Our study shows that even if visual stimuli are not being attended to, the salient features of the distractor cue trajectory attract coordinated movements away from a target stimulus. It appears, however, that the number of distractors, and possibly their spatial location, is also important. We only observed the strong distraction effect when there were 8 or 14 distractors present. This is likely to be due to the increased salience of the distractor cues, with the large number of stimuli moving at the same phase making them increasingly difficult to ignore (
<xref rid="B4" ref-type="bibr">Bartram et al., 2003</xref>
). Equally, the salience could have been increased by the larger number of distractors completely surrounding the target dot, rather than just flanking it on either side, as in the two-distractor condition. Our results therefore suggest a bottom-up, stimulus-driven attentional process is in place (
<xref rid="B30" ref-type="bibr">Theeuwes et al., 2000</xref>
) where the saliency of the distractor relative to the target is what draws the attention of the individual. The temporal distance of the distractors from the target is a further important factor in the strength of the distraction. With the peak distraction effect occurring when the distractors are –100 ms earlier than the target cue, it is likely a temporal window of attention (
<xref rid="B18" ref-type="bibr">Naccache et al., 2002</xref>
) around the salient event in the target cue is present. If the distractor cue event falls into this window, then it maximizes attentional capture away from the target (due to the multiple distractors providing a stronger stimulus than the target). This is somewhat different to the well-documented “window of integration.” Sensory integration of temporal cues occurs when two stimuli are deemed relevant to one another and occur close together in time (
<xref rid="B10" ref-type="bibr">Elliott et al., 2014</xref>
). In this scenario, the stimuli are integrated in a fashion that can be described under a Bayesian framework, such that the resulting combined cue becomes more reliable than either of the individual stimuli (
<xref rid="B11" ref-type="bibr">Ernst and Banks, 2002</xref>
). In a synchronization task this results in a reduced variability of the timed movements (
<xref rid="B9" ref-type="bibr">Elliott et al., 2010</xref>
). Here, we explicitly informed participants to ignore the distractor stimuli, so they were aware that these stimuli were not relevant to the target. Subsequently, we observe a high level of variability at the –100 ms offset, which is likely to be a result of the conflict between the top-down goal of synchronizing with the target cue and the bottom-up, stimulus-driven effect of being attracted to the more salient distractor stimuli.</p>
<p>Finally, we consider these results in the context of interpersonal synchrony. Spontaneous synchrony can emerge between two individuals, often due to the strong visual cues from the partner (
<xref rid="B24" ref-type="bibr">Richardson et al., 2007</xref>
;
<xref rid="B34" ref-type="bibr">Zivotofsky et al., 2012</xref>
). Considering larger groups (e.g., crowds jumping up and down in a sports stadium), there is potentially a contextual effect on how synchrony may emerge within a group. On the one hand, an individual may be focussed on timing their movements with a known partner, in which case the movements of the remaining crowd act as distractors and hence, based on our results, are likely to weaken the coupling within the dyad. Alternatively, an individual may be moving as part of the larger crowd, in which case it would be advantageous to combine the cues from all surrounding individuals. Through sensory integration, this latter scenario should result in greater stability of synchrony within the group. In reality, a combination of these processes is likely to be present, such that within a crowd we observe an overall weak coupling across all individuals, but with strong synchrony couplings between small numbers of individuals within the crowd.</p>
<sec>
<title>Conflict of Interest Statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</sec>
</body>
<back>
<ack>
<title>Acknowledgments</title>
<p>We thank Dagmar Fraser for his assistance in the coding of the visual stimuli and Sonam Malhi for assistance with data collection. This research was funded by the Engineering and Physical Sciences Research Council [EP/I031030/1].</p>
</ack>
<ref-list>
<title>References</title>
<ref id="B1">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Armstrong</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Issartel</surname>
<given-names>J.</given-names>
</name>
</person-group>
(
<year>2014</year>
).
<article-title>Sensorimotor synchronization with audio-visual stimuli: limited multisensory integration</article-title>
.
<source>Exp. Brain Res.</source>
<volume>232</volume>
,
<fpage>3453</fpage>
<lpage>3463</lpage>
.
<pub-id pub-id-type="doi">10.1007/s00221-014-4031-9</pub-id>
<pub-id pub-id-type="pmid">25027792</pub-id>
</mixed-citation>
</ref>
<ref id="B2">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Armstrong</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Issartel</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Varlet</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Marin</surname>
<given-names>L.</given-names>
</name>
</person-group>
(
<year>2013</year>
).
<article-title>The supplementation of spatial information improves coordination</article-title>
.
<source>Neurosci. Lett.</source>
<volume>548</volume>
,
<fpage>212</fpage>
<lpage>216</lpage>
.
<pub-id pub-id-type="doi">10.1016/j.neulet.2013.05.013</pub-id>
<pub-id pub-id-type="pmid">23701861</pub-id>
</mixed-citation>
</ref>
<ref id="B3">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Aschersleben</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Prinz</surname>
<given-names>W.</given-names>
</name>
</person-group>
(
<year>1995</year>
).
<article-title>Synchronizing actions with events: the role of sensory information</article-title>
.
<source>Percept. Psychophys.</source>
<volume>57</volume>
,
<fpage>305</fpage>
<lpage>317</lpage>
.
<pub-id pub-id-type="doi">10.3758/BF03213056</pub-id>
<pub-id pub-id-type="pmid">7770322</pub-id>
</mixed-citation>
</ref>
<ref id="B4">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bartram</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Ware</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Calvert</surname>
<given-names>T.</given-names>
</name>
</person-group>
(
<year>2003</year>
).
<article-title>Moticons: detection, distraction and task</article-title>
.
<source>Int. J. Hum. Comput. Stud.</source>
<volume>58</volume>
,
<fpage>515</fpage>
<lpage>545</lpage>
.
<pub-id pub-id-type="doi">10.1016/S1071-5819(03)00021-1</pub-id>
</mixed-citation>
</ref>
<ref id="B5">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Brainard</surname>
<given-names>D. H.</given-names>
</name>
</person-group>
(
<year>1997</year>
).
<article-title>The Psychophysics Toolbox</article-title>
.
<source>Spat. Vis.</source>
<volume>10</volume>
,
<fpage>433</fpage>
<lpage>436</lpage>
.
<pub-id pub-id-type="doi">10.1163/156856897X00357</pub-id>
<pub-id pub-id-type="pmid">9176952</pub-id>
</mixed-citation>
</ref>
<ref id="B6">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Debats</surname>
<given-names>N. B.</given-names>
</name>
<name>
<surname>Ridderikhoff</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>de Boer</surname>
<given-names>B. J.</given-names>
</name>
<name>
<surname>Peper</surname>
<given-names>C. L.</given-names>
</name>
</person-group>
(
<year>2013</year>
).
<article-title>Biases in rhythmic sensorimotor coordination: effects of modality and intentionality</article-title>
.
<source>Behav. Brain Res.</source>
<volume>250</volume>
,
<fpage>334</fpage>
<lpage>342</lpage>
.
<pub-id pub-id-type="doi">10.1016/j.bbr.2013.05.005</pub-id>
<pub-id pub-id-type="pmid">23680163</pub-id>
</mixed-citation>
</ref>
<ref id="B7">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Elliott</surname>
<given-names>M. T.</given-names>
</name>
<name>
<surname>Welchman</surname>
<given-names>A. E.</given-names>
</name>
<name>
<surname>Wing</surname>
<given-names>A. M.</given-names>
</name>
</person-group>
(
<year>2009a</year>
).
<article-title>Being discrete helps keep to the beat</article-title>
.
<source>Exp. Brain Res.</source>
<volume>192</volume>
,
<fpage>731</fpage>
<lpage>737</lpage>
.
<pub-id pub-id-type="doi">10.1007/s00221-008-1646-8</pub-id>
<pub-id pub-id-type="pmid">19048241</pub-id>
</mixed-citation>
</ref>
<ref id="B8">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Elliott</surname>
<given-names>M. T.</given-names>
</name>
<name>
<surname>Welchman</surname>
<given-names>A. E.</given-names>
</name>
<name>
<surname>Wing</surname>
<given-names>A. M.</given-names>
</name>
</person-group>
(
<year>2009b</year>
).
<article-title>MatTAP: a MATLAB toolbox for the control and analysis of movement synchronisation experiments</article-title>
.
<source>J. Neurosci. Methods</source>
<volume>177</volume>
,
<fpage>250</fpage>
<lpage>257</lpage>
.
<pub-id pub-id-type="doi">10.1016/j.jneumeth.2008.10.002</pub-id>
<pub-id pub-id-type="pmid">18977388</pub-id>
</mixed-citation>
</ref>
<ref id="B9">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Elliott</surname>
<given-names>M. T.</given-names>
</name>
<name>
<surname>Wing</surname>
<given-names>A. M.</given-names>
</name>
<name>
<surname>Welchman</surname>
<given-names>A. E.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>Multisensory cues improve sensorimotor synchronisation</article-title>
.
<source>Eur. J. Neurosci.</source>
<volume>31</volume>
,
<fpage>1828</fpage>
<lpage>1835</lpage>
.
<pub-id pub-id-type="doi">10.1111/j.1460-9568.2010.07205.x</pub-id>
<pub-id pub-id-type="pmid">20584187</pub-id>
</mixed-citation>
</ref>
<ref id="B10">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Elliott</surname>
<given-names>M. T.</given-names>
</name>
<name>
<surname>Wing</surname>
<given-names>A. M.</given-names>
</name>
<name>
<surname>Welchman</surname>
<given-names>A. E.</given-names>
</name>
</person-group>
(
<year>2014</year>
).
<article-title>Moving in time: Bayesian causal inference explains movement coordination to auditory beats</article-title>
.
<source>Proc. R. Soc. B Biol. Sci.</source>
<volume>281</volume>
,
<fpage>20140751</fpage>
.
<pub-id pub-id-type="doi">10.1098/rspb.2014.0751</pub-id>
<pub-id pub-id-type="pmid">24850915</pub-id>
</mixed-citation>
</ref>
<ref id="B11">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ernst</surname>
<given-names>M. O.</given-names>
</name>
<name>
<surname>Banks</surname>
<given-names>M. S.</given-names>
</name>
</person-group>
(
<year>2002</year>
).
<article-title>Humans integrate visual and haptic information in a statistically optimal fashion</article-title>
.
<source>Nature</source>
<volume>415</volume>
,
<fpage>429</fpage>
<lpage>433</lpage>
.
<pub-id pub-id-type="doi">10.1038/415429a</pub-id>
<pub-id pub-id-type="pmid">11807554</pub-id>
</mixed-citation>
</ref>
<ref id="B12">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hajnal</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Richardson</surname>
<given-names>M. J.</given-names>
</name>
<name>
<surname>Harrison</surname>
<given-names>S. J.</given-names>
</name>
<name>
<surname>Schmidt</surname>
<given-names>R. C.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>Location but not amount of stimulus occlusion influences the stability of visuo-motor coordination</article-title>
.
<source>Exp. Brain Res.</source>
<volume>199</volume>
,
<fpage>89</fpage>
<lpage>93</lpage>
.
<pub-id pub-id-type="doi">10.1007/s00221-009-1958-3</pub-id>
<pub-id pub-id-type="pmid">19657633</pub-id>
</mixed-citation>
</ref>
<ref id="B13">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Helmuth</surname>
<given-names>L. L.</given-names>
</name>
<name>
<surname>Ivry</surname>
<given-names>R. B.</given-names>
</name>
</person-group>
(
<year>1996</year>
).
<article-title>When two hands are better than one: reduced timing variability during bimanual movements</article-title>
.
<source>J. Exp. Psychol. Hum. Percept. Perform.</source>
<volume>22</volume>
,
<fpage>278</fpage>
<lpage>293</lpage>
.
<pub-id pub-id-type="doi">10.1037/0096-1523.22.2.278</pub-id>
<pub-id pub-id-type="pmid">8934844</pub-id>
</mixed-citation>
</ref>
<ref id="B14">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hove</surname>
<given-names>M. J.</given-names>
</name>
<name>
<surname>Iversen</surname>
<given-names>J. R.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Repp</surname>
<given-names>B. H.</given-names>
</name>
</person-group>
(
<year>2012</year>
).
<article-title>Synchronization with competing visual and auditory rhythms: bouncing ball meets metronome</article-title>
.
<source>Psychol. Res.</source>
<volume>77</volume>
,
<fpage>388</fpage>
<lpage>398</lpage>
.
<pub-id pub-id-type="doi">10.1007/s00426-012-0441-0</pub-id>
<pub-id pub-id-type="pmid">22638726</pub-id>
</mixed-citation>
</ref>
<ref id="B15">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kurgansky</surname>
<given-names>A. V.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>Visuomotor synchronization: analysis of the initiation and stable synchronization phases</article-title>
.
<source>Hum. Physiol.</source>
<volume>34</volume>
,
<fpage>289</fpage>
<lpage>298</lpage>
.
<pub-id pub-id-type="doi">10.1134/S0362119708030043</pub-id>
<pub-id pub-id-type="pmid">18677945</pub-id>
</mixed-citation>
</ref>
<ref id="B16">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Luck</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Sloboda</surname>
<given-names>J.</given-names>
</name>
</person-group>
(
<year>2008</year>
).
<article-title>Exploring the spatio-temporal properties of simple conducting gestures using a synchronization task</article-title>
.
<source>Music Percept.</source>
<volume>25</volume>
,
<fpage>225</fpage>
<lpage>239</lpage>
.
<pub-id pub-id-type="doi">10.1525/mp.2008.25.3.225</pub-id>
</mixed-citation>
</ref>
<ref id="B17">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Miura</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Kudo</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Ohtsuki</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Kanehisa</surname>
<given-names>H.</given-names>
</name>
</person-group>
(
<year>2011</year>
).
<article-title>Coordination modes in sensorimotor synchronization of whole-body movement: a study of street dancers and non-dancers</article-title>
.
<source>Hum. Mov. Sci.</source>
<volume>30</volume>
,
<fpage>1260</fpage>
<lpage>1271</lpage>
.
<pub-id pub-id-type="doi">10.1016/j.humov.2010.08.006</pub-id>
<pub-id pub-id-type="pmid">21802159</pub-id>
</mixed-citation>
</ref>
<ref id="B18">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Naccache</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Blandin</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Dehaene</surname>
<given-names>S.</given-names>
</name>
</person-group>
(
<year>2002</year>
).
<article-title>Unconscious masked priming depends on temporal attention</article-title>
.
<source>Psychol. Sci.</source>
<volume>13</volume>
,
<fpage>416</fpage>
<lpage>424</lpage>
.
<pub-id pub-id-type="doi">10.1111/1467-9280.00474</pub-id>
<pub-id pub-id-type="pmid">12219807</pub-id>
</mixed-citation>
</ref>
<ref id="B19">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Noormohammadi</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Brownjohn</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Wing</surname>
<given-names>A. M.</given-names>
</name>
<name>
<surname>Racic</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Johannsen</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Elliott</surname>
<given-names>M. T.</given-names>
</name>
</person-group>
(
<year>2011</year>
).
<article-title>“Effect of different cues on spectators’ synchronisation, a vibration engineering approach,”</article-title>
in
<source>Proceedings of the 8th International Conference on Structural Dynamics</source>
,
<publisher-loc>Leuven</publisher-loc>
.</mixed-citation>
</ref>
<ref id="B20">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Repp</surname>
<given-names>B. H.</given-names>
</name>
</person-group>
(
<year>2003</year>
).
<article-title>Phase attraction in sensorimotor synchronization with auditory sequences: effects of single and periodic distractors on synchronization accuracy</article-title>
.
<source>J. Exp. Psychol. Hum. Percept. Perform.</source>
<volume>29</volume>
,
<fpage>290</fpage>
<lpage>309</lpage>
.
<pub-id pub-id-type="doi">10.1037/0096-1523.29.2.290</pub-id>
<pub-id pub-id-type="pmid">12760616</pub-id>
</mixed-citation>
</ref>
<ref id="B21">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Repp</surname>
<given-names>B. H.</given-names>
</name>
<name>
<surname>Penel</surname>
<given-names>A.</given-names>
</name>
</person-group>
(
<year>2002</year>
).
<article-title>Auditory dominance in temporal processing: new evidence from synchronization with simultaneous visual and auditory sequences</article-title>
.
<source>J. Exp. Psychol. Hum. Percept. Perform.</source>
<volume>28</volume>
,
<fpage>1085</fpage>
<lpage>1099</lpage>
.
<pub-id pub-id-type="doi">10.1037/0096-1523.28.5.1085</pub-id>
<pub-id pub-id-type="pmid">12421057</pub-id>
</mixed-citation>
</ref>
<ref id="B22">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Repp</surname>
<given-names>B. H.</given-names>
</name>
<name>
<surname>Penel</surname>
<given-names>A.</given-names>
</name>
</person-group>
(
<year>2004</year>
).
<article-title>Rhythmic movement is attracted more strongly to auditory than to visual rhythms</article-title>
.
<source>Psychol. Res.</source>
<volume>68</volume>
,
<fpage>252</fpage>
<lpage>270</lpage>
.
<pub-id pub-id-type="doi">10.1007/s00426-003-0143-8</pub-id>
<pub-id pub-id-type="pmid">12955504</pub-id>
</mixed-citation>
</ref>
<ref id="B23">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Repp</surname>
<given-names>B. H.</given-names>
</name>
<name>
<surname>Su</surname>
<given-names>Y.-H.</given-names>
</name>
</person-group>
(
<year>2013</year>
).
<article-title>Sensorimotor synchronization: a review of recent research (2006–2012)</article-title>
.
<source>Psychon. Bull. Rev.</source>
<volume>20</volume>
,
<fpage>403</fpage>
<lpage>452</lpage>
.
<pub-id pub-id-type="doi">10.3758/s13423-012-0371-2</pub-id>
<pub-id pub-id-type="pmid">23397235</pub-id>
</mixed-citation>
</ref>
<ref id="B24">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Richardson</surname>
<given-names>M. J.</given-names>
</name>
<name>
<surname>Marsh</surname>
<given-names>K. L.</given-names>
</name>
<name>
<surname>Isenhower</surname>
<given-names>R. W.</given-names>
</name>
<name>
<surname>Goodman</surname>
<given-names>J. R. L.</given-names>
</name>
<name>
<surname>Schmidt</surname>
<given-names>R. C.</given-names>
</name>
</person-group>
(
<year>2007</year>
).
<article-title>Rocking together: dynamics of intentional and unintentional interpersonal coordination</article-title>
.
<source>Hum. Mov. Sci.</source>
<volume>26</volume>
,
<fpage>867</fpage>
<lpage>891</lpage>
.
<pub-id pub-id-type="doi">10.1016/j.humov.2007.07.002</pub-id>
<pub-id pub-id-type="pmid">17765345</pub-id>
</mixed-citation>
</ref>
<ref id="B25">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Spencer</surname>
<given-names>R. M. C.</given-names>
</name>
<name>
<surname>Zelaznik</surname>
<given-names>H. N.</given-names>
</name>
<name>
<surname>Diedrichsen</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Ivry</surname>
<given-names>R. B.</given-names>
</name>
</person-group>
(
<year>2003</year>
).
<article-title>Disrupted timing of discontinuous but not continuous movements by cerebellar lesions</article-title>
.
<source>Science</source>
<volume>300</volume>
,
<fpage>1437</fpage>
<lpage>1439</lpage>
.
<pub-id pub-id-type="doi">10.1126/science.1083661</pub-id>
<pub-id pub-id-type="pmid">12775842</pub-id>
</mixed-citation>
</ref>
<ref id="B26">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sunny</surname>
<given-names>M. M.</given-names>
</name>
<name>
<surname>von Mühlenen</surname>
<given-names>A.</given-names>
</name>
</person-group>
(
<year>2011</year>
).
<article-title>Motion onset does not capture attention when subsequent motion is “smooth”</article-title>
.
<source>Psychon. Bull. Rev.</source>
<volume>18</volume>
,
<fpage>1050</fpage>
<lpage>1056</lpage>
.
<pub-id pub-id-type="doi">10.3758/s13423-011-0152-3</pub-id>
<pub-id pub-id-type="pmid">21901513</pub-id>
</mixed-citation>
</ref>
<ref id="B27">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Su</surname>
<given-names>Y.-H.</given-names>
</name>
</person-group>
(
<year>2014a</year>
).
<article-title>Audiovisual beat induction in complex auditory rhythms: point-light figure movement as an effective visual beat</article-title>
.
<source>Acta Psychol. (Amst.)</source>
<volume>151</volume>
,
<fpage>40</fpage>
<lpage>50</lpage>
.
<pub-id pub-id-type="doi">10.1016/j.actpsy.2014.05.016</pub-id>
<pub-id pub-id-type="pmid">24932996</pub-id>
</mixed-citation>
</ref>
<ref id="B28">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Su</surname>
<given-names>Y.-H.</given-names>
</name>
</person-group>
(
<year>2014b</year>
).
<article-title>Peak velocity as a cue in audiovisual synchrony perception of rhythmic stimuli</article-title>
.
<source>Cognition</source>
<volume>131</volume>
,
<fpage>330</fpage>
<lpage>344</lpage>
.
<pub-id pub-id-type="doi">10.1016/j.cognition.2014.02.004</pub-id>
<pub-id pub-id-type="pmid">24632323</pub-id>
</mixed-citation>
</ref>
<ref id="B29">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Su</surname>
<given-names>Y.-H.</given-names>
</name>
</person-group>
(
<year>2014c</year>
).
<article-title>Visual enhancement of auditory beat perception across auditory interference levels</article-title>
.
<source>Brain Cogn.</source>
<volume>90</volume>
,
<fpage>19</fpage>
<lpage>31</lpage>
.
<pub-id pub-id-type="doi">10.1016/j.bandc.2014.05.003</pub-id>
<pub-id pub-id-type="pmid">24907465</pub-id>
</mixed-citation>
</ref>
<ref id="B30">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Theeuwes</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Atchley</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Kramer</surname>
<given-names>A. F.</given-names>
</name>
</person-group>
(
<year>2000</year>
).
<article-title>“On the time course of top-down and bottom-up control of visual attention,”</article-title>
in
<source>Control of Cognitive Processes: Attention and Performance XVIII</source>
, eds
<person-group person-group-type="editor">
<name>
<surname>Monsell</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Driver</surname>
<given-names>J.</given-names>
</name>
</person-group>
(
<publisher-loc>Cambridge, MA</publisher-loc>
:
<publisher-name>MIT Press</publisher-name>
),
<fpage>104</fpage>
<lpage>124</lpage>
.</mixed-citation>
</ref>
<ref id="B31">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Varlet</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Coey</surname>
<given-names>C. A.</given-names>
</name>
<name>
<surname>Schmidt</surname>
<given-names>R. C.</given-names>
</name>
<name>
<surname>Marin</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Bardy</surname>
<given-names>B. G.</given-names>
</name>
<name>
<surname>Richardson</surname>
<given-names>M. J.</given-names>
</name>
</person-group>
(
<year>2014</year>
).
<article-title>Influence of stimulus velocity profile on rhythmic visuomotor coordination</article-title>
.
<source>J. Exp. Psychol. Hum. Percept. Perform.</source>
<volume>40</volume>
,
<fpage>1849</fpage>
<lpage>1860</lpage>
.
<pub-id pub-id-type="doi">10.1037/a0037417</pub-id>
<pub-id pub-id-type="pmid">25019498</pub-id>
</mixed-citation>
</ref>
<ref id="B32">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Varlet</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Marin</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Issartel</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Schmidt</surname>
<given-names>R. C.</given-names>
</name>
<name>
<surname>Bardy</surname>
<given-names>B. G.</given-names>
</name>
</person-group>
(
<year>2012</year>
).
<article-title>Continuity of visual and auditory rhythms influences sensorimotor coordination</article-title>
.
<source>PLoS ONE</source>
<volume>7</volume>
:
<fpage>e44082</fpage>
.
<pub-id pub-id-type="doi">10.1371/journal.pone.0044082</pub-id>
<pub-id pub-id-type="pmid">23028488</pub-id>
</mixed-citation>
</ref>
<ref id="B33">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wright</surname>
<given-names>R. L.</given-names>
</name>
<name>
<surname>Elliott</surname>
<given-names>M. T.</given-names>
</name>
</person-group>
(
<year>2014</year>
).
<article-title>Stepping to phase-perturbed metronome cues: multisensory advantage in movement synchrony but not correction</article-title>
.
<source>Front. Hum. Neurosci.</source>
<volume>8</volume>
:
<fpage>724</fpage>
.
<pub-id pub-id-type="doi">10.3389/fnhum.2014.00724</pub-id>
<pub-id pub-id-type="pmid">25309397</pub-id>
</mixed-citation>
</ref>
<ref id="B34">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zivotofsky</surname>
<given-names>A. Z.</given-names>
</name>
<name>
<surname>Gruendlinger</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Hausdorff</surname>
<given-names>J. M.</given-names>
</name>
</person-group>
(
<year>2012</year>
).
<article-title>Modality-specific communication enabling gait synchronization during over-ground side-by-side walking</article-title>
.
<source>Hum. Mov. Sci.</source>
<volume>31</volume>
,
<fpage>1268</fpage>
<lpage>1285</lpage>
.
<pub-id pub-id-type="doi">10.1016/j.humov.2012.01.003</pub-id>
<pub-id pub-id-type="pmid">22727358</pub-id>
</mixed-citation>
</ref>
</ref-list>
</back>
</pmc>
<affiliations>
<list>
<country>
<li>Royaume-Uni</li>
</country>
</list>
<tree>
<country name="Royaume-Uni">
<noRegion>
<name sortKey="Booth, Ashley J" sort="Booth, Ashley J" uniqKey="Booth A" first="Ashley J." last="Booth">Ashley J. Booth</name>
</noRegion>
<name sortKey="Elliott, Mark T" sort="Elliott, Mark T" uniqKey="Elliott M" first="Mark T." last="Elliott">Mark T. Elliott</name>
<name sortKey="Elliott, Mark T" sort="Elliott, Mark T" uniqKey="Elliott M" first="Mark T." last="Elliott">Mark T. Elliott</name>
</country>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Pmc/Checkpoint
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000614 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Checkpoint/biblio.hfd -nk 000614 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Pmc
   |étape=   Checkpoint
   |type=    RBID
   |clé=     PMC:4478893
   |texte=   Early, but not late visual distractors affect movement synchronization to a temporal-spatial visual cue
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Checkpoint/RBID.i   -Sk "pubmed:26157412" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Checkpoint/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024