Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated by automated means from raw corpora.
The information is therefore not validated.

Evaluating Interpersonal Synchrony: Wavelet Transform Toward an Unstructured Conversation

Internal identifier: 004215 (Ncbi/Merge); previous: 004214; next: 004216


Authors: Ken Fujiwara [Japan]; Ikuo Daibo [Japan]

Source:

RBID: PMC:4828427

Abstract

This study examined whether interpersonal synchrony could be extracted using spectrum analysis (i.e., wavelet transform) in an unstructured conversation. Sixty-two female undergraduates were randomly paired and they engaged in a 6-min unstructured conversation. Interpersonal synchrony was evaluated by calculating the cross-wavelet coherence of the time-series movement data, extracted using a video-image analysis software. The existence of synchrony was tested using a pseudo-synchrony paradigm. In addition, the frequency at which the synchrony occurred and the distribution of the relative phase was explored. The results showed that the value of cross-wavelet coherence was higher in the experimental participant pairs than in the pseudo pairs. Further, the coherence value was higher in the frequency band under 0.5 Hz. These results support the validity of evaluating interpersonal synchrony by using wavelet transform even in an unstructured conversation. However, the role of relative phase was not clear; there was no significant difference between each relative-phase region. The theoretical contribution of these findings to the area of interpersonal coordination is discussed.


Url:
DOI: 10.3389/fpsyg.2016.00516
PubMed: 27148125
PubMed Central: 4828427

Links toward previous steps (curation, corpus...)


Links to Exploration step

PMC:4828427

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Evaluating Interpersonal Synchrony: Wavelet Transform Toward an Unstructured Conversation</title>
<author>
<name sortKey="Fujiwara, Ken" sort="Fujiwara, Ken" uniqKey="Fujiwara K" first="Ken" last="Fujiwara">Ken Fujiwara</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<institution>Faculty of Human Sciences, Osaka University of Economics</institution>
<country>Osaka, Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea></wicri:regionArea>
<wicri:regionArea># see nlm:aff region in country</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Daibo, Ikuo" sort="Daibo, Ikuo" uniqKey="Daibo I" first="Ikuo" last="Daibo">Ikuo Daibo</name>
<affiliation wicri:level="1">
<nlm:aff id="aff2">
<institution>School of Motivation and Behavioral Sciences, Tokyo Future University</institution>
<country>Tokyo, Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea></wicri:regionArea>
<wicri:regionArea># see nlm:aff region in country</wicri:regionArea>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">27148125</idno>
<idno type="pmc">4828427</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4828427</idno>
<idno type="RBID">PMC:4828427</idno>
<idno type="doi">10.3389/fpsyg.2016.00516</idno>
<date when="2016">2016</date>
<idno type="wicri:Area/Pmc/Corpus">000495</idno>
<idno type="wicri:Area/Pmc/Curation">000495</idno>
<idno type="wicri:Area/Pmc/Checkpoint">000120</idno>
<idno type="wicri:Area/Ncbi/Merge">004215</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Evaluating Interpersonal Synchrony: Wavelet Transform Toward an Unstructured Conversation</title>
<author>
<name sortKey="Fujiwara, Ken" sort="Fujiwara, Ken" uniqKey="Fujiwara K" first="Ken" last="Fujiwara">Ken Fujiwara</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<institution>Faculty of Human Sciences, Osaka University of Economics</institution>
<country>Osaka, Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea></wicri:regionArea>
<wicri:regionArea># see nlm:aff region in country</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Daibo, Ikuo" sort="Daibo, Ikuo" uniqKey="Daibo I" first="Ikuo" last="Daibo">Ikuo Daibo</name>
<affiliation wicri:level="1">
<nlm:aff id="aff2">
<institution>School of Motivation and Behavioral Sciences, Tokyo Future University</institution>
<country>Tokyo, Japan</country>
</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea></wicri:regionArea>
<wicri:regionArea># see nlm:aff region in country</wicri:regionArea>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Frontiers in Psychology</title>
<idno type="eISSN">1664-1078</idno>
<imprint>
<date when="2016">2016</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>This study examined whether interpersonal synchrony could be extracted using spectrum analysis (i.e., wavelet transform) in an unstructured conversation. Sixty-two female undergraduates were randomly paired and they engaged in a 6-min unstructured conversation. Interpersonal synchrony was evaluated by calculating the cross-wavelet coherence of the time-series movement data, extracted using a video-image analysis software. The existence of synchrony was tested using a pseudo-synchrony paradigm. In addition, the frequency at which the synchrony occurred and the distribution of the relative phase was explored. The results showed that the value of cross-wavelet coherence was higher in the experimental participant pairs than in the pseudo pairs. Further, the coherence value was higher in the frequency band under 0.5 Hz. These results support the validity of evaluating interpersonal synchrony by using wavelet transform even in an unstructured conversation. However, the role of relative phase was not clear; there was no significant difference between each relative-phase region. The theoretical contribution of these findings to the area of interpersonal coordination is discussed.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Bernieri, F J" uniqKey="Bernieri F">F. J. Bernieri</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bernieri, F J" uniqKey="Bernieri F">F. J. Bernieri</name>
</author>
<author>
<name sortKey="Reznick, J S" uniqKey="Reznick J">J. S. Reznick</name>
</author>
<author>
<name sortKey="Rosenthal, R" uniqKey="Rosenthal R">R. Rosenthal</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bernieri, F J" uniqKey="Bernieri F">F. J. Bernieri</name>
</author>
<author>
<name sortKey="Rosenthal, R" uniqKey="Rosenthal R">R. Rosenthal</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Burgoon, J K" uniqKey="Burgoon J">J. K. Burgoon</name>
</author>
<author>
<name sortKey="Stern, L A" uniqKey="Stern L">L. A. Stern</name>
</author>
<author>
<name sortKey="Dillman, L" uniqKey="Dillman L">L. Dillman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cappella, J N" uniqKey="Cappella J">J. N. Cappella</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cappella, J N" uniqKey="Cappella J">J. N. Cappella</name>
</author>
<author>
<name sortKey="Planalp, S" uniqKey="Planalp S">S. Planalp</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Chartrand, T L" uniqKey="Chartrand T">T. L. Chartrand</name>
</author>
<author>
<name sortKey="Bargh, J A" uniqKey="Bargh J">J. A. Bargh</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Chartrand, T L" uniqKey="Chartrand T">T. L. Chartrand</name>
</author>
<author>
<name sortKey="Lakin, J L" uniqKey="Lakin J">J. L. Lakin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Condon, W S" uniqKey="Condon W">W. S. Condon</name>
</author>
<author>
<name sortKey="Ogston, W D" uniqKey="Ogston W">W. D. Ogston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Condon, W S" uniqKey="Condon W">W. S. Condon</name>
</author>
<author>
<name sortKey="Ogston, W D" uniqKey="Ogston W">W. D. Ogston</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ekman, P" uniqKey="Ekman P">P. Ekman</name>
</author>
<author>
<name sortKey="Friesen, W V" uniqKey="Friesen W">W. V. Friesen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ekman, P" uniqKey="Ekman P">P. Ekman</name>
</author>
<author>
<name sortKey="Friesen, W V" uniqKey="Friesen W">W. V. Friesen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Frauendorfer, D" uniqKey="Frauendorfer D">D. Frauendorfer</name>
</author>
<author>
<name sortKey="Mast, M S" uniqKey="Mast M">M. S. Mast</name>
</author>
<author>
<name sortKey="Nguyen, L" uniqKey="Nguyen L">L. Nguyen</name>
</author>
<author>
<name sortKey="Gatica Perez, D" uniqKey="Gatica Perez D">D. Gatica-Perez</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fujiwara, K" uniqKey="Fujiwara K">K. Fujiwara</name>
</author>
<author>
<name sortKey="Daibo, I" uniqKey="Daibo I">I. Daibo</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Giles, H" uniqKey="Giles H">H. Giles</name>
</author>
<author>
<name sortKey="Coupland, N" uniqKey="Coupland N">N. Coupland</name>
</author>
<author>
<name sortKey="Coupland, J" uniqKey="Coupland J">J. Coupland</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Grinsted, A" uniqKey="Grinsted A">A. Grinsted</name>
</author>
<author>
<name sortKey="Moore, J C" uniqKey="Moore J">J. C. Moore</name>
</author>
<author>
<name sortKey="Jevrejeva, S" uniqKey="Jevrejeva S">S. Jevrejeva</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hadar, U" uniqKey="Hadar U">U. Hadar</name>
</author>
<author>
<name sortKey="Steiner, T J" uniqKey="Steiner T">T. J. Steiner</name>
</author>
<author>
<name sortKey="Clifford Rose, F" uniqKey="Clifford Rose F">F. Clifford Rose</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Holler, J" uniqKey="Holler J">J. Holler</name>
</author>
<author>
<name sortKey="Wilkin, K" uniqKey="Wilkin K">K. Wilkin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hove, M J" uniqKey="Hove M">M. J. Hove</name>
</author>
<author>
<name sortKey="Risen, J L" uniqKey="Risen J">J. L. Risen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Issartel, J" uniqKey="Issartel J">J. Issartel</name>
</author>
<author>
<name sortKey="Barainne, T" uniqKey="Barainne T">T. Barainne</name>
</author>
<author>
<name sortKey="Gaillot, P" uniqKey="Gaillot P">P. Gaillot</name>
</author>
<author>
<name sortKey="Marin, L" uniqKey="Marin L">L. Marin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Issartel, J" uniqKey="Issartel J">J. Issartel</name>
</author>
<author>
<name sortKey="Marin, L" uniqKey="Marin L">L. Marin</name>
</author>
<author>
<name sortKey="Gaillotb, P" uniqKey="Gaillotb P">P. Gaillotb</name>
</author>
<author>
<name sortKey="Bardainnec, T" uniqKey="Bardainnec T">T. Bardainnec</name>
</author>
<author>
<name sortKey="Cadopia, M" uniqKey="Cadopia M">M. Cadopia</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kupper, Z" uniqKey="Kupper Z">Z. Kupper</name>
</author>
<author>
<name sortKey="Ramseyer, F" uniqKey="Ramseyer F">F. Ramseyer</name>
</author>
<author>
<name sortKey="Hoffmann, H" uniqKey="Hoffmann H">H. Hoffmann</name>
</author>
<author>
<name sortKey="Kalbermatten, S" uniqKey="Kalbermatten S">S. Kalbermatten</name>
</author>
<author>
<name sortKey="Tschacher, W" uniqKey="Tschacher W">W. Tschacher</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lafrance, M" uniqKey="Lafrance M">M. LaFrance</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lafrance, M" uniqKey="Lafrance M">M. LaFrance</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lakin, J L" uniqKey="Lakin J">J. L. Lakin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lakin, J L" uniqKey="Lakin J">J. L. Lakin</name>
</author>
<author>
<name sortKey="Chartrand, T L" uniqKey="Chartrand T">T. L. Chartrand</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lumsden, J" uniqKey="Lumsden J">J. Lumsden</name>
</author>
<author>
<name sortKey="Miles, L K" uniqKey="Miles L">L. K. Miles</name>
</author>
<author>
<name sortKey="Macrae, C N" uniqKey="Macrae C">C. N. Macrae</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lumsden, J" uniqKey="Lumsden J">J. Lumsden</name>
</author>
<author>
<name sortKey="Miles, L K" uniqKey="Miles L">L. K. Miles</name>
</author>
<author>
<name sortKey="Richardson, M J" uniqKey="Richardson M">M. J. Richardson</name>
</author>
<author>
<name sortKey="Smith, C A" uniqKey="Smith C">C. A. Smith</name>
</author>
<author>
<name sortKey="Macrae, C N" uniqKey="Macrae C">C. N. Macrae</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Miles, L K" uniqKey="Miles L">L. K. Miles</name>
</author>
<author>
<name sortKey="Nind, L K" uniqKey="Nind L">L. K. Nind</name>
</author>
<author>
<name sortKey="Macrae, C N" uniqKey="Macrae C">C. N. Macrae</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Natale, M" uniqKey="Natale M">M. Natale</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Paxton, A" uniqKey="Paxton A">A. Paxton</name>
</author>
<author>
<name sortKey="Dale, R" uniqKey="Dale R">R. Dale</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pikovsky, A" uniqKey="Pikovsky A">A. Pikovsky</name>
</author>
<author>
<name sortKey="Rosenblum, M" uniqKey="Rosenblum M">M. Rosenblum</name>
</author>
<author>
<name sortKey="Kurths, J" uniqKey="Kurths J">J. Kurths</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Poppe, R" uniqKey="Poppe R">R. Poppe</name>
</author>
<author>
<name sortKey="Van Der Zee, S" uniqKey="Van Der Zee S">S. Van Der Zee</name>
</author>
<author>
<name sortKey="Heylen, D K J" uniqKey="Heylen D">D. K. J. Heylen</name>
</author>
<author>
<name sortKey="Taylor, P J" uniqKey="Taylor P">P. J. Taylor</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ramseyer, F" uniqKey="Ramseyer F">F. Ramseyer</name>
</author>
<author>
<name sortKey="Tschacher, W" uniqKey="Tschacher W">W. Tschacher</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Richardson, D C" uniqKey="Richardson D">D. C. Richardson</name>
</author>
<author>
<name sortKey="Dale, R" uniqKey="Dale R">R. Dale</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Richardson, M J" uniqKey="Richardson M">M. J. Richardson</name>
</author>
<author>
<name sortKey="Marsh, K L" uniqKey="Marsh K">K. L. Marsh</name>
</author>
<author>
<name sortKey="Isenhower, R" uniqKey="Isenhower R">R. Isenhower</name>
</author>
<author>
<name sortKey="Goodman, J" uniqKey="Goodman J">J. Goodman</name>
</author>
<author>
<name sortKey="Schmidt, R C" uniqKey="Schmidt R">R. C. Schmidt</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Scheflen, A E" uniqKey="Scheflen A">A. E. Scheflen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schmidt, R C" uniqKey="Schmidt R">R. C. Schmidt</name>
</author>
<author>
<name sortKey="Morr, S" uniqKey="Morr S">S. Morr</name>
</author>
<author>
<name sortKey="Fitzpatrick, P" uniqKey="Fitzpatrick P">P. Fitzpatrick</name>
</author>
<author>
<name sortKey="Richardson, M J" uniqKey="Richardson M">M. J. Richardson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schmidt, R C" uniqKey="Schmidt R">R. C. Schmidt</name>
</author>
<author>
<name sortKey="Nie, L" uniqKey="Nie L">L. Nie</name>
</author>
<author>
<name sortKey="Franco, A" uniqKey="Franco A">A. Franco</name>
</author>
<author>
<name sortKey="Richardson, M J" uniqKey="Richardson M">M. J. Richardson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schmidt, R C" uniqKey="Schmidt R">R. C. Schmidt</name>
</author>
<author>
<name sortKey="O Rien, B" uniqKey="O Rien B">B. O’Brien</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Shockley, K" uniqKey="Shockley K">K. Shockley</name>
</author>
<author>
<name sortKey="Santana, M V" uniqKey="Santana M">M. V. Santana</name>
</author>
<author>
<name sortKey="Fowler, C A" uniqKey="Fowler C">C. A. Fowler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sofianidis, G" uniqKey="Sofianidis G">G. Sofianidis</name>
</author>
<author>
<name sortKey="Hatzitaki, V" uniqKey="Hatzitaki V">V. Hatzitaki</name>
</author>
<author>
<name sortKey="Grouios, G" uniqKey="Grouios G">G. Grouios</name>
</author>
<author>
<name sortKey="Johannsen, L" uniqKey="Johannsen L">L. Johannsen</name>
</author>
<author>
<name sortKey="Wing, A" uniqKey="Wing A">A. Wing</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Stel, M" uniqKey="Stel M">M. Stel</name>
</author>
<author>
<name sortKey="Harinck, F" uniqKey="Harinck F">F. Harinck</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Street, R L" uniqKey="Street R">R. L. Street</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Tickle Degnen, L" uniqKey="Tickle Degnen L">L. Tickle-Degnen</name>
</author>
<author>
<name sortKey="Rosenthal, R" uniqKey="Rosenthal R">R. Rosenthal</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Tschacher, W" uniqKey="Tschacher W">W. Tschacher</name>
</author>
<author>
<name sortKey="Rees, G M" uniqKey="Rees G">G. M. Rees</name>
</author>
<author>
<name sortKey="Ramseyer, F" uniqKey="Ramseyer F">F. Ramseyer</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Varlet, M" uniqKey="Varlet M">M. Varlet</name>
</author>
<author>
<name sortKey="Marin, L" uniqKey="Marin L">L. Marin</name>
</author>
<author>
<name sortKey="Lagarde, J" uniqKey="Lagarde J">J. Lagarde</name>
</author>
<author>
<name sortKey="Bardy, B G" uniqKey="Bardy B">B. G. Bardy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Walton, A E" uniqKey="Walton A">A. E. Walton</name>
</author>
<author>
<name sortKey="Richardson, M J" uniqKey="Richardson M">M. J. Richardson</name>
</author>
<author>
<name sortKey="Langland Hassan, P" uniqKey="Langland Hassan P">P. Langland-Hassan</name>
</author>
<author>
<name sortKey="Chemero, A" uniqKey="Chemero A">A. Chemero</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Washburn, A" uniqKey="Washburn A">A. Washburn</name>
</author>
<author>
<name sortKey="Demarco, M" uniqKey="Demarco M">M. DeMarco</name>
</author>
<author>
<name sortKey="De Vries, S" uniqKey="De Vries S">S. de Vries</name>
</author>
<author>
<name sortKey="Ariyabuddhiphongs, K" uniqKey="Ariyabuddhiphongs K">K. Ariyabuddhiphongs</name>
</author>
<author>
<name sortKey="Schmidt, R C" uniqKey="Schmidt R">R. C. Schmidt</name>
</author>
<author>
<name sortKey="Richardson, M J" uniqKey="Richardson M">M. J. Richardson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wiltermuth, S S" uniqKey="Wiltermuth S">S. S. Wiltermuth</name>
</author>
<author>
<name sortKey="Heath, C" uniqKey="Heath C">C. Heath</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Won, A S" uniqKey="Won A">A. S. Won</name>
</author>
<author>
<name sortKey="Bailenson, J N" uniqKey="Bailenson J">J. N. Bailenson</name>
</author>
<author>
<name sortKey="Stathatos, S C" uniqKey="Stathatos S">S. C. Stathatos</name>
</author>
<author>
<name sortKey="Dai, W" uniqKey="Dai W">W. Dai</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">Front Psychol</journal-id>
<journal-id journal-id-type="iso-abbrev">Front Psychol</journal-id>
<journal-id journal-id-type="publisher-id">Front. Psychol.</journal-id>
<journal-title-group>
<journal-title>Frontiers in Psychology</journal-title>
</journal-title-group>
<issn pub-type="epub">1664-1078</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">27148125</article-id>
<article-id pub-id-type="pmc">4828427</article-id>
<article-id pub-id-type="doi">10.3389/fpsyg.2016.00516</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychology</subject>
<subj-group>
<subject>Methods</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Evaluating Interpersonal Synchrony: Wavelet Transform Toward an Unstructured Conversation</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Fujiwara</surname>
<given-names>Ken</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="author-notes" rid="fn001">
<sup>*</sup>
</xref>
<uri xlink:type="simple" xlink:href="http://loop.frontiersin.org/people/320815/overview"></uri>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Daibo</surname>
<given-names>Ikuo</given-names>
</name>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
<uri xlink:type="simple" xlink:href="http://loop.frontiersin.org/people/321513/overview"></uri>
</contrib>
</contrib-group>
<aff id="aff1">
<sup>1</sup>
<institution>Faculty of Human Sciences, Osaka University of Economics</institution>
<country>Osaka, Japan</country>
</aff>
<aff id="aff2">
<sup>2</sup>
<institution>School of Motivation and Behavioral Sciences, Tokyo Future University</institution>
<country>Tokyo, Japan</country>
</aff>
<author-notes>
<fn fn-type="edited-by">
<p>Edited by:
<italic>Duarte Araújo, University of Lisbon, Portugal</italic>
</p>
</fn>
<fn fn-type="edited-by">
<p>Reviewed by:
<italic>Michael J. Richardson, University of Cincinnati, USA; Wolfgang Tschacher, University of Bern, Switzerland</italic>
</p>
</fn>
<corresp id="fn001">*Correspondence:
<italic>Ken Fujiwara,
<email xlink:type="simple">ken.fuji@osaka-ue.ac.jp</email>
</italic>
</corresp>
<fn fn-type="other" id="fn002">
<p>This article was submitted to Movement Science and Sport Psychology, a section of the journal Frontiers in Psychology</p>
</fn>
</author-notes>
<pub-date pub-type="epub">
<day>12</day>
<month>4</month>
<year>2016</year>
</pub-date>
<pub-date pub-type="collection">
<year>2016</year>
</pub-date>
<volume>7</volume>
<elocation-id>516</elocation-id>
<history>
<date date-type="received">
<day>19</day>
<month>2</month>
<year>2016</year>
</date>
<date date-type="accepted">
<day>29</day>
<month>3</month>
<year>2016</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright © 2016 Fujiwara and Daibo.</copyright-statement>
<copyright-year>2016</copyright-year>
<copyright-holder>Fujiwara and Daibo</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<license-p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</license-p>
</license>
</permissions>
<abstract>
<p>This study examined whether interpersonal synchrony could be extracted using spectrum analysis (i.e., wavelet transform) in an unstructured conversation. Sixty-two female undergraduates were randomly paired and they engaged in a 6-min unstructured conversation. Interpersonal synchrony was evaluated by calculating the cross-wavelet coherence of the time-series movement data, extracted using a video-image analysis software. The existence of synchrony was tested using a pseudo-synchrony paradigm. In addition, the frequency at which the synchrony occurred and the distribution of the relative phase was explored. The results showed that the value of cross-wavelet coherence was higher in the experimental participant pairs than in the pseudo pairs. Further, the coherence value was higher in the frequency band under 0.5 Hz. These results support the validity of evaluating interpersonal synchrony by using wavelet transform even in an unstructured conversation. However, the role of relative phase was not clear; there was no significant difference between each relative-phase region. The theoretical contribution of these findings to the area of interpersonal coordination is discussed.</p>
</abstract>
<kwd-group>
<kwd>non-verbal behavior</kwd>
<kwd>interpersonal coordination</kwd>
<kwd>synchrony</kwd>
<kwd>spectrum analysis</kwd>
<kwd>wavelet transform</kwd>
<kwd>an automated method</kwd>
</kwd-group>
<funding-group>
<award-group>
<funding-source id="cn001">Japan Society for the Promotion of Science
<named-content content-type="fundref-id">10.13039/501100001691</named-content>
</funding-source>
<award-id rid="cn001">Grant-in-Aid for JSPS Fellows; 11J03150</award-id>
</award-group>
</funding-group>
<counts>
<fig-count count="3"></fig-count>
<table-count count="1"></table-count>
<equation-count count="0"></equation-count>
<ref-count count="51"></ref-count>
<page-count count="9"></page-count>
<word-count count="0"></word-count>
</counts>
</article-meta>
</front>
<body>
<sec>
<title>Introduction</title>
<p>Interpersonal coordination has attracted the attention of social psychology and communication researchers. Past work has revealed synchronization or unsynchronization at various levels of communication behavior; for instance, vocal intensity (
<xref rid="B30" ref-type="bibr">Natale, 1975</xref>
), vocalization duration (
<xref rid="B6" ref-type="bibr">Cappella and Planalp, 1981</xref>
), speech rate and response latency (
<xref rid="B44" ref-type="bibr">Street, 1984</xref>
), eye movements (
<xref rid="B35" ref-type="bibr">Richardson and Dale, 2005</xref>
), body posture (
<xref rid="B37" ref-type="bibr">Scheflen, 1964</xref>
), and body posture sway (
<xref rid="B41" ref-type="bibr">Shockley et al., 2003</xref>
) are coordinated between interactants. Communication Accommodation Theory (
<xref rid="B15" ref-type="bibr">Giles et al., 1991</xref>
) and Interpersonal Adaptation Theory (
<xref rid="B4" ref-type="bibr">Burgoon et al., 1995</xref>
) provide theoretical frameworks to explain how conversational features, including vocal patterns or gestures, become synchronized or unsynchronized between conversation partners.</p>
<p>Previous studies have suggested that there is a link between coordination and pro-sociality. Coordination leads to rapport (
<xref rid="B45" ref-type="bibr">Tickle-Degnen and Rosenthal, 1990</xref>
), affiliation (
<xref rid="B19" ref-type="bibr">Hove and Risen, 2009</xref>
), and cooperation (
<xref rid="B50" ref-type="bibr">Wiltermuth and Heath, 2009</xref>
) between interactants. One study found that the experience of coordination resulted in voting for left-wing parties, which is considered a prosocial behavior (
<xref rid="B43" ref-type="bibr">Stel and Harinck, 2011</xref>
). Conversely, coordination also occurred as a result of pro-sociality. The goal to affiliate or create rapport increases coordination (
<xref rid="B26" ref-type="bibr">Lakin and Chartrand, 2003</xref>
). Similarly, pro-social orientation increases the propensity to coordinate with others (
<xref rid="B28" ref-type="bibr">Lumsden et al., 2012b</xref>
). Many studies show consistent findings regarding the relationship between coordination and pro-sociality or positive interpersonal relationships; however, the definition of coordination is a bit more complex. There are large variations in how to study and analyze coordination.</p>
<sec>
<title>Interpersonal Coordination in Time- or Frequency-Domain</title>
<p>Interpersonal coordination, by definition, occurs when two or more individuals coordinate their behavior in a time series. The relationship between each time series can be analyzed in either time- or frequency-domains (
<xref rid="B21" ref-type="bibr">Issartel et al., 2006</xref>
). Therefore, coordination can be considered as a time- and/or frequency-domain phenomenon. In the time-domain, the amount of movement or the occurrence of a specific behavior is plotted on the
<italic>y</italic>
-axis while the
<italic>x</italic>
-axis represents the timeline. Coordination is interpreted as the extent to which behaviors co-occur or the amount of behavior that is similar between the interactants within a predetermined time window (
<bold>Figure
<xref ref-type="fig" rid="F1">1</xref>
</bold>
). In comparison, in the frequency-domain, the extent of spectrum power is plotted on the
<italic>y</italic>
-axis while the
<italic>x</italic>
-axis represents frequency components. Coordination, in this case, is represented as the amount of similarity at each frequency component (i.e., cross-spectral coherence). Each instance of time- or frequency-domain coordination seems to correspond approximately to the work of
<xref rid="B3" ref-type="bibr">Bernieri and Rosenthal (1991)</xref>
who differentiated interpersonal coordination into two facets: behavior matching and synchrony.</p>
<fig id="F1" position="float">
<label>FIGURE 1</label>
<caption>
<p>
<bold>Example of interpersonal coordination in time- and frequency- domains.</bold>
This example illustrates the coordination of the amount of movement between two interactants in the time-
<bold>(A)</bold>
and frequency-
<bold>(B)</bold>
domains.</p>
</caption>
<graphic xlink:href="fpsyg-07-00516-g001"></graphic>
</fig>
<p>Behavior matching, in the early stage of research, was defined as the similarity of body postures between interactants. Researchers focused on whether their posture was congruent in a predetermined time window throughout the time series. Postural congruence has been observed in interactants who share a common viewpoint (
<xref rid="B37" ref-type="bibr">Scheflen, 1964</xref>
) and found to lead to rapport (
<xref rid="B23" ref-type="bibr">LaFrance, 1976</xref>
,
<xref rid="B24" ref-type="bibr">1979</xref>
). Although behavioral ratings have been employed to assess postural congruence between interactants (e.g.,
<xref rid="B1" ref-type="bibr">Bernieri, 1988</xref>
), some studies have suggested possible bias in the rating of raters (
<xref rid="B5" ref-type="bibr">Cappella, 1990</xref>
;
<xref rid="B27" ref-type="bibr">Lumsden et al., 2012a</xref>
). Therefore, recent studies have directly assessed postural congruence using various sensors or motion-capture technologies (
<xref rid="B33" ref-type="bibr">Poppe et al., 2014</xref>
;
<xref rid="B51" ref-type="bibr">Won et al., 2014</xref>
), which enable objective measurements of the similarity of postures throughout the time series.</p>
<p>In the recent literature, behavioral matching is known and described as behavioral mimicry (e.g.,
<xref rid="B7" ref-type="bibr">Chartrand and Bargh, 1999</xref>
;
<xref rid="B19" ref-type="bibr">Hove and Risen, 2009</xref>
;
<xref rid="B25" ref-type="bibr">Lakin, 2013</xref>
). Behavioral mimicry is an automatic tendency to imitate another’s behavior at a particular moment in time. The target of behavioral mimicry is broad; it includes posture as well as gestures, mannerisms, and other motor movements (for a review see
<xref rid="B8" ref-type="bibr">Chartrand and Lakin, 2013</xref>
;
<xref rid="B25" ref-type="bibr">Lakin, 2013</xref>
). Behavioral mimicry is typically assessed by examining whether the same or a similar behavior occurs at a given point in time or whether the presented behavior is mimicked by an interactional partner within a short window of time. For instance,
<xref rid="B26" ref-type="bibr">Lakin and Chartrand (2003)</xref>
measured the frequency of similar or identical behaviors (i.e., face touching) between participants and the confederates in experimental tasks.
<xref rid="B46" ref-type="bibr">Tschacher et al. (2014)</xref>
measured the similarity of motor movement between interactants in a dyadic face-to-face situation. The extent of the similarity was assessed by employing time-lagged cross-correlations using two different time windows: a 10-s window for the computation of cross-correlations, and a 30-s window used to account for non-stationarity. Although the window size and target behaviors differed across studies and their purposes, behavioral mimicry research has shed light on coordination in the time domain.</p>
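As an illustration of this kind of windowed, time-lagged cross-correlation, a minimal Python/numpy sketch is given below. The window length, lag range, and function name are illustrative assumptions, not the exact settings of Tschacher et al. (2014).

import numpy as np

def windowed_lagged_xcorr(x, y, fs=10.0, win_s=10.0, max_lag_s=3.0):
    # Peak absolute lagged correlation between two movement series, computed
    # within successive non-overlapping windows (all values are illustrative).
    win, max_lag = int(win_s * fs), int(max_lag_s * fs)
    peaks = []
    for start in range(0, len(x) - win + 1, win):
        xs, ys = x[start:start + win], y[start:start + win]
        xs = (xs - xs.mean()) / (xs.std() + 1e-12)  # z-score within the window
        ys = (ys - ys.mean()) / (ys.std() + 1e-12)
        best = 0.0
        for lag in range(-max_lag, max_lag + 1):
            if lag >= 0:
                r = np.corrcoef(xs[lag:], ys[:win - lag])[0, 1]
            else:
                r = np.corrcoef(xs[:win + lag], ys[-lag:])[0, 1]
            best = max(best, abs(r))
        peaks.append(best)
    return np.array(peaks)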
<p>Over time, synchrony research has focused on the similarity of rhythm and timing, which can be interpreted as a frequency-domain phenomenon. Synchrony research employing an analysis method from physics (e.g.,
<xref rid="B32" ref-type="bibr">Pikovsky et al., 2003</xref>
) has revealed that temporal coordination occurs between interactants (e.g.,
<xref rid="B40" ref-type="bibr">Schmidt and O’Brien, 1997</xref>
), and that it can increase affiliation (
<xref rid="B19" ref-type="bibr">Hove and Risen, 2009</xref>
). Some synchrony research has focused on similar movements between interactants such as swinging pendulums and rocking in rocking chairs (e.g.,
<xref rid="B40" ref-type="bibr">Schmidt and O’Brien, 1997</xref>
;
<xref rid="B36" ref-type="bibr">Richardson et al., 2007</xref>
); however, synchrony can be achieved even with different behaviors between interactants.
<xref rid="B3" ref-type="bibr">Bernieri and Rosenthal (1991)</xref>
, for instance, suggested jazz as an example of synchrony. Jazz players each have an instrument and play it in a different manner from other players as the mechanism of playing a guitar differs from that of a saxophone. The essence of synchrony is rhythm and timing. In order to evaluate synchrony in the frequency domain (i.e., the similarity of rhythm and timing), a simple correlation applied to assess time-domain coordination cannot be employed. Spectrum analysis is used in such situations.</p>
</sec>
<sec>
<title>Evaluating Synchrony: Spectrum Analysis</title>
<p>To evaluate synchrony in the frequency domain, a spectrum analysis, which deconstructs a complex time series into its rhythmic components, was employed (e.g.,
<xref rid="B40" ref-type="bibr">Schmidt and O’Brien, 1997</xref>
;
<xref rid="B38" ref-type="bibr">Schmidt et al., 2012</xref>
). Spectrum analysis is applied to time-series continuous data that refer to scales in which the interval between observations (i.e., sampling rate) is constant; nominal and ordinal data cannot be used for spectrum analysis (
<xref rid="B20" ref-type="bibr">Issartel et al., 2014</xref>
).</p>
<p>In the early years of non-verbal research, microanalysis (e.g.,
<xref rid="B9" ref-type="bibr">Condon and Ogston, 1966</xref>
,
<xref rid="B10" ref-type="bibr">1967</xref>
) was used to analyze films of social interactions frame by frame to generate time-series movement data. This measuring process, unfortunately, tends to be resource intensive. Coding behaviors is time-consuming and painstaking in itself and requires establishing reliability among the coders. It is not unusual to spend twice the length of a film coding the behavior of interest. To address this problem, some recent studies have utilized automatic techniques to generate time-series movement data; they use the depth sensor, Kinect (Microsoft;
<xref rid="B13" ref-type="bibr">Frauendorfer et al., 2014</xref>
;
<xref rid="B51" ref-type="bibr">Won et al., 2014</xref>
), or employ video-tracking techniques (
<xref rid="B22" ref-type="bibr">Kupper et al., 2010</xref>
;
<xref rid="B38" ref-type="bibr">Schmidt et al., 2012</xref>
;
<xref rid="B31" ref-type="bibr">Paxton and Dale, 2013</xref>
;
<xref rid="B14" ref-type="bibr">Fujiwara and Daibo, 2014</xref>
;
<xref rid="B46" ref-type="bibr">Tschacher et al., 2014</xref>
). Behavioral data acquired using these techniques can be less costly and highly reliable.</p>
<p>After obtaining time-series data, a spectrum analysis can be conducted. The Fourier transform is one of the well-known types of spectrum analysis. It calculates a spectral power that indicates the magnitude at each component frequency. If there are two time-series, cross-spectrum analysis can be applied to them and coherence can be calculated. Coherence, which ranges on a scale of 0–1, is a measure of similarity between the two time-series at each component frequency. A coherence of 1 reflects a perfect correlation between the two movements, and 0 reflects no correlation (
<xref rid="B40" ref-type="bibr">Schmidt and O’Brien, 1997</xref>
;
<xref rid="B36" ref-type="bibr">Richardson et al., 2007</xref>
).
<xref rid="B38" ref-type="bibr">Schmidt et al. (2012)</xref>
conducted a periodic interaction task (i.e., telling knock–knock jokes) and applied the Fourier transform to the time-series movement data of the interactants. The mean value of coherence at the dominant rhythms, identified as large spectral peaks at specific frequencies (i.e., 0.125 and 0.5 Hz), was calculated to evaluate rhythmic similarity.</p>
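A minimal sketch of Fourier-based cross-spectral coherence is shown below, using SciPy's Welch-type estimator on two toy movement series. The sampling rate, segment length, and 0.125-Hz test rhythm are illustrative assumptions, not values taken from the cited studies.

import numpy as np
from scipy.signal import coherence

fs = 10.0                                  # 0.1-s time resolution, as used later in this study
t = np.arange(0, 360, 1 / fs)              # a 6-min record
rng = np.random.default_rng(0)
a = np.sin(2 * np.pi * 0.125 * t) + 0.5 * rng.standard_normal(t.size)       # toy movement series
b = np.sin(2 * np.pi * 0.125 * t + 0.3) + 0.5 * rng.standard_normal(t.size)

f, cxy = coherence(a, b, fs=fs, nperseg=256)   # magnitude-squared coherence, 0..1 per frequency
print(f"peak coherence {cxy.max():.2f} near {f[np.argmax(cxy)]:.3f} Hz")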
<p>Spectrum analysis, including the Fourier transform, also provides phase information. In synchrony research, relative phase angle, which indicates a time lag at the frequency between interactants, has been used (e.g.,
<xref rid="B38" ref-type="bibr">Schmidt et al., 2012</xref>
). More precisely,
<xref rid="B38" ref-type="bibr">Schmidt et al. (2012)</xref>
used the Hilbert transform, a filter that simply shifts the phases of all frequency components of its input by -π/2 radians, to conduct the relative phase analysis. A 0° relative phase indicates movements in the same part of their cycles at a given time, which is called
<italic>in-phase</italic>
patterning. On the opposite end of the range, a 180° relative phase indicates movements in the opposite parts of their cycles at a given time, which is called
<italic>anti-phase</italic>
patterning. In the periodic interaction task,
<xref rid="B38" ref-type="bibr">Schmidt et al. (2012)</xref>
demonstrated the robustness of relative phase analysis across multiple measures and identified significant periods of coordination. Relative phase information, combined with coherence value, is used as a tool for analysis to explore the dynamic synchronization process of the interactants.</p>
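A minimal SciPy sketch of relative phase obtained via the Hilbert transform follows. It assumes the two series have already been band-pass filtered around the rhythm of interest, and the function name is illustrative.

import numpy as np
from scipy.signal import hilbert

def relative_phase_deg(x, y):
    # Instantaneous relative phase between two narrow-band movement series.
    # 0 deg corresponds to in-phase patterning, +/-180 deg to anti-phase patterning.
    phase_x = np.angle(hilbert(x - np.mean(x)))
    phase_y = np.angle(hilbert(y - np.mean(y)))
    return np.degrees(np.angle(np.exp(1j * (phase_x - phase_y))))  # wrapped to (-180, 180]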
<p>However, the Fourier transform that has been used in previous studies has a serious practical limitation: it assumes a stable frequency or repetitive pattern during the entire interaction (
<xref rid="B21" ref-type="bibr">Issartel et al., 2006</xref>
). The use of periodic or rhythmic tasks makes it easier to control turn-taking between the interactants and enables the capture of the dominant rhythm. If a rhythmic task requires a specific behavior once every 8 s, a large spectral peak will be found at approximately 0.125 Hz, which means that the dominant rhythm in the situation is 0.125 Hz. Interactants engaging in the same rhythmic task usually have a dominant rhythm with each other, and researchers can evaluate the extent of rhythmic synchronization by analyzing the degree of similarity at the dominant rhythm. On the contrary, in our daily conversations (i.e., unstructured conversations), it is difficult to assume a stable frequency and a repetitive pattern of movements, which means that the dominant rhythm cannot be set prior to a conversation. Even during one’s turn, some parts may become faster and others slower. For interactants, their movements may be synchronized in a faster rhythm at some points and in a slower rhythm at others. Therefore, the Fourier transform does not appear to be the best way to evaluate synchrony in the frequency domain in an unstructured conversation.</p>
</sec>
<sec>
<title>Wavelet Transform: Coordination in Time–Frequency Plane</title>
<p>The wavelet transform can be a potent alternative to the Fourier transform. It does not require stationarity in each time series. By employing the wavelet transform, time-series movement data is plotted onto a time–frequency plane (
<bold>Figure
<xref ref-type="fig" rid="F2">2</xref>
</bold>
). In the time–frequency plane, the frequency components are illustrated on the
<italic>y</italic>
-axis while the
<italic>x</italic>
-axis represents the time line, and the spectrum power is represented by the gray value, which changes throughout the time line. As with the Fourier transform, cross-spectrum analysis can be conducted using the wavelet transform, and cross-wavelet coherence represents the similarity between the two time series at each component frequency throughout the time line.</p>
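The sketch below plots a single toy series onto the time-frequency plane with a continuous Morlet-family wavelet, assuming the PyWavelets (pywt) package; the scale range and wavelet name are illustrative. Cross-wavelet coherence between two such series additionally requires a cross-wavelet routine, such as the toolbox used later in the Methods.

import numpy as np
import pywt

fs = 10.0
t = np.arange(0, 360, 1 / fs)
# Toy series whose rhythm changes halfway through, the kind of non-stationarity
# that motivates a time-frequency representation.
x = np.where(t < 180, np.sin(2 * np.pi * 0.125 * t), np.sin(2 * np.pi * 0.4 * t))

scales = np.geomspace(2, 512, num=64)                          # coarse-to-fine scales (illustrative)
coefs, freqs = pywt.cwt(x, scales, 'cmor1.5-1.0', sampling_period=1 / fs)
power = np.abs(coefs) ** 2                                     # spectrum power on the time-frequency plane
print(power.shape, f"{freqs.min():.3f}-{freqs.max():.2f} Hz")  # (n_scales, n_samples)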
<fig id="F2" position="float">
<label>FIGURE 2</label>
<caption>
<p>
<bold>Example of interpersonal coordination in time-frequency plane.</bold>
Two images of wavelet power
<bold>(A1,A2)</bold>
and one of the cross-wavelet coherence
<bold>(B)</bold>
of A1 and A2 are shown. The time line is represented on the
<italic>x</italic>
-axis (360 s) and each frequency component is represented on the
<italic>y</italic>
-axis, as an inverted period (e.g., 0.25 period is 4 Hz). The magnitude of wavelet power and wavelet coherence is represented by color.
<bold>(B)</bold>
, The relative phase at a given frequency and time point is denoted by the orientation of the arrow: the right arrow indicates
<italic>in-phase</italic>
synchronization, the left arrow indicates
<italic>anti-phase</italic>
synchronization, and the downward arrow indicates no synchronization. The average value of these variables at a given frequency was extracted from these plots to analyze the coherence and relative phase. However, the cone of influence (COI) area that is shown as a lighter shade is not included in the analysis.</p>
</caption>
<graphic xlink:href="fpsyg-07-00516-g002"></graphic>
</fig>
<p>Several studies of interpersonal movement coordination have evaluated coordination by using cross-wavelet analysis.
<xref rid="B48" ref-type="bibr">Walton et al. (2015)</xref>
demonstrated that the cross-wavelet approach could illustrate the dynamics of movement coordination between improvising musicians.
<xref rid="B47" ref-type="bibr">Varlet et al. (2011)</xref>
reported that a pair of participants engaged in a visual tracking task influenced one another, and produced spontaneous postural coordination.
<xref rid="B47" ref-type="bibr">Varlet et al. (2011)</xref>
also used phase information via cross-wavelet transform to evaluate the occurrence of postural coordination. In a study of cross-wavelet coherence,
<xref rid="B49" ref-type="bibr">Washburn et al. (2014)</xref>
collected body movement data in dance settings, and found that the cross-wavelet coherence of the trained dancers was significantly higher than that of the non-dancers, indicating that the dancers achieved a higher level of coordination with their confederate.
<xref rid="B42" ref-type="bibr">Sofianidis et al. (2012)</xref>
performed a rhythmical sway task in the sagittal plane, and found that a light fingertip contact, i.e., haptic contact, increased coherence. In settings with more socialization,
<xref rid="B39" ref-type="bibr">Schmidt et al. (2014)</xref>
, as well as
<xref rid="B38" ref-type="bibr">Schmidt et al. (2012)</xref>
, used the knock–knock joking task, and calculated wavelet coherence to evaluate rhythmic similarity between two interactants.
<xref rid="B39" ref-type="bibr">Schmidt et al. (2014)</xref>
revealed that the bodily synchronization in the joke-telling task occurred at the dominant rhythms as well as across different nested temporal scales. Relative phase information also indicated that in-phase synchronization rather than anti-phase synchronization was observed between the interactants, which supported and validated the findings of
<xref rid="B38" ref-type="bibr">Schmidt et al. (2012)</xref>
.</p>
</sec>
<sec>
<title>Current Study</title>
<p>Many previous studies employing wavelet transform were conducted in situations that were less social (e.g.,
<xref rid="B47" ref-type="bibr">Varlet et al., 2011</xref>
) or involved a specific task (
<xref rid="B39" ref-type="bibr">Schmidt et al., 2014</xref>
). This study did not employ a specific rhythmic task but rather focused on an unstructured conversation. In this type of situation, the dominant rhythm could not be determined prior to the conversation because conversation speed is not predictable. Additionally, turn-taking between the interactants was not controlled. We examined whether the coordination represented in the time-frequency plane would be observed even in an unstructured conversation.</p>
<p>The study’s setting was based on the previous studies in social interaction research (e.g.,
<xref rid="B1" ref-type="bibr">Bernieri, 1988</xref>
;
<xref rid="B2" ref-type="bibr">Bernieri et al., 1988</xref>
;
<xref rid="B7" ref-type="bibr">Chartrand and Bargh, 1999</xref>
;
<xref rid="B26" ref-type="bibr">Lakin and Chartrand, 2003</xref>
;
<xref rid="B46" ref-type="bibr">Tschacher et al., 2014</xref>
); thus, our participants were seated to engage in their conversation. Compared to interpersonal movement coordination studies (e.g.,
<xref rid="B47" ref-type="bibr">Varlet et al., 2011</xref>
;
<xref rid="B49" ref-type="bibr">Washburn et al., 2014</xref>
;
<xref rid="B48" ref-type="bibr">Walton et al., 2015</xref>
), these settings minimized participants’ movements, which required us to conduct a conservative test. However, we addressed this issue by focusing on typical kinesics indicators: hand and head movements. Hand movements, including gestures (
<xref rid="B11" ref-type="bibr">Ekman and Friesen, 1969</xref>
,
<xref rid="B12" ref-type="bibr">1972</xref>
) and/or head movements, including nodding (
<xref rid="B17" ref-type="bibr">Hadar et al., 1985</xref>
;
<xref rid="B13" ref-type="bibr">Frauendorfer et al., 2014</xref>
), could be seen even if participants were seated. Moreover, previous studies revealed that hand and/or head movements are coordinated in face-to-face interactions (
<xref rid="B18" ref-type="bibr">Holler and Wilkin, 2011</xref>
). Even if the interactants were seated, it would be possible to examine whether their movements were synchronized by extracting the head and hand movements.</p>
<p>To test the existence of synchrony,
<xref rid="B3" ref-type="bibr">Bernieri and Rosenthal (1991)</xref>
proposed the pseudo-synchrony experimental paradigm. In this paradigm, video clips of dyadic interaction partners (i.e., a genuine pair) are isolated and re-combined in a random order. The synchrony scores of these virtual data (i.e., pseudo pairs) are compared to those of the genuine pairs. Employing the wavelet transform, we hypothesized that the extent of synchrony, as represented by the cross-wavelet coherence coefficient, would be higher in the genuine pairs than in the pseudo pairs. In addition, we explored whether there is a difference in the relative phase (i.e., in-phase and anti-phase) in the genuine pairs.</p>
</sec>
</sec>
<sec sec-type="materials|methods" id="s1">
<title>Materials and Methods</title>
<sec>
<title>Participants</title>
<p>Seventy-four Japanese female undergraduates participated in exchange for extra course credit. Each participant was randomly paired with a stranger. The familiarity between the participants was expected to have a potent influence on the strength of synchrony; therefore, after their conversation, the participants were asked to complete questionnaires regarding their familiarity with one another. Five pairs who knew each other were removed from the subsequent analysis. In one case, the conversation was not recorded due to a malfunction of the video equipment. Therefore, a total of 31 dyads from 62 participants (Mean age = 18.47,
<italic>SD</italic>
= 0.59) were analyzed.</p>
</sec>
<sec>
<title>Procedures</title>
<p>First, participants were seated back-to-back and completed a consent form; then they moved to another seat where they were positioned opposite one another, 80 cm apart. They were instructed to engage in a 6-min conversation and become acquainted. The conversation topics were not specified, and the participants did not know the conversation would be analyzed from the perspective of synchrony. Their conversation was video-recorded using a camera (HDR-SR12; SONY) placed at a distance of 250 cm, to the right side of the participants.</p>
</sec>
<sec>
<title>Ethics Statement</title>
<p>All procedures performed in studies involving human participants were in accordance with the ethical standards of the Department of Human Sciences in Osaka University.</p>
</sec>
<sec>
<title>Generating Time-Series Movement Data</title>
<p>There are several methods of generating time-series movement data; some use a depth sensor (
<xref rid="B13" ref-type="bibr">Frauendorfer et al., 2014</xref>
;
<xref rid="B51" ref-type="bibr">Won et al., 2014</xref>
), and others employ video-tracking techniques (
<xref rid="B22" ref-type="bibr">Kupper et al., 2010</xref>
;
<xref rid="B38" ref-type="bibr">Schmidt et al., 2012</xref>
;
<xref rid="B31" ref-type="bibr">Paxton and Dale, 2013</xref>
;
<xref rid="B46" ref-type="bibr">Tschacher et al., 2014</xref>
). In this study, time-series movement data was extracted using video-image analysis software (Dipp-MotionPRO Ver. 2.24c). By using three attributes of color, this software automatically tracks and captures two-dimensional body movements in chronological order. Previous research (
<xref rid="B14" ref-type="bibr">Fujiwara and Daibo, 2014</xref>
), using a former version of this software (Dipp-Motion XD Ver. 3.20-2), demonstrated that gestures categorized by information on a coordinate point corresponded closely with a third person’s judgment (Spearman rank correlations: rs = 0.78). This finding indicates that this software can track and capture body movements with high resolution, even if the movement is not very large.</p>
<p>For each participant, coordinate points for the fingertips and nose were captured in chronological order. Time resolution was set to 0.1 s. After down-sampling, the number of coordinate point changes between the adjacent video frames was calculated for the movement of each body part (i.e., fingertips and nose). The movement data from each video was calibrated using the size of an outlet cover (7 cm × 12 cm). After that, the movements of the fingertips and nose were added together.</p>
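A minimal numpy sketch of this step, turning tracked coordinate points into calibrated per-frame movement amounts, is given below. The function and variable names are illustrative; the tracking itself was done with the Dipp-MotionPRO software.

import numpy as np

def movement_amount(xy, pixels_per_cm):
    # xy: (n_frames, 2) tracked coordinates of one body part (fingertip or nose)
    # at 0.1-s resolution; returns displacement between adjacent frames in cm.
    return np.linalg.norm(np.diff(xy, axis=0), axis=1) / pixels_per_cm

# Illustrative usage: pixels_per_cm would come from calibrating against the
# 7 cm x 12 cm outlet cover visible in each video; the fingertip and nose
# series are then added together, as described above.
# movement = movement_amount(fingertip_xy, k) + movement_amount(nose_xy, k)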
</sec>
<sec>
<title>Generating Virtual Data</title>
<p>To evaluate the significance of the extent of synchrony in the genuine pair, a baseline was needed. Based on the pseudo-synchrony experimental paradigm (
<xref rid="B3" ref-type="bibr">Bernieri and Rosenthal, 1991</xref>
), a virtual dataset was generated. Data from two time-series from the genuine pair were isolated and re-combined in random order. This shuffling procedure can keep the structure of the original movement intact, and thereby yield a statistically more conservative test (
<xref rid="B34" ref-type="bibr">Ramseyer and Tschacher, 2010</xref>
). The extent of synchrony of the pseudo pairs was assessed in the same manner.</p>
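One way to implement this shuffling is sketched below in numpy. The derangement-style re-pairing (no participant keeps her genuine partner) is an assumption about the exact re-combination scheme, which the text specifies only as random.

import numpy as np

def make_pseudo_pairs(genuine_pairs, seed=None):
    # genuine_pairs: list of (a, b) tuples of movement time series, one per real dyad.
    # Each series keeps its original internal structure; only the pairing is shuffled.
    rng = np.random.default_rng(seed)
    firsts = [a for a, _ in genuine_pairs]
    seconds = [b for _, b in genuine_pairs]
    n = len(genuine_pairs)
    while True:
        perm = rng.permutation(n)
        if not np.any(perm == np.arange(n)):    # exclude the genuine partner
            break
    return [(firsts[i], seconds[perm[i]]) for i in range(n)]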
</sec>
<sec>
<title>Evaluating Synchrony with Wavelet Coherence</title>
<p>By using Matlab 2014a (Mathworks) and the wavelet toolbox (
<xref rid="B16" ref-type="bibr">Grinsted et al., 2004</xref>
), we conducted a wavelet transform for each time series. The default parameters of
<xref rid="B16" ref-type="bibr">Grinsted et al. (2004)</xref>
were employed except for the number of the order; following
<xref rid="B21" ref-type="bibr">Issartel et al. (2006)</xref>
, the order was set to eight. Morlet was used as the mother wavelet. To evaluate the rhythmic similarity between two individuals, cross wavelet coherence was calculated. The cone of influence (COI) area was not included in subsequent analyses (
<xref rid="B16" ref-type="bibr">Grinsted et al., 2004</xref>
). We used a coherence value under 4 Hz because our participants’ unstructured conversation with a stranger was not active or fast. The average coherence under 4 Hz across the time line was standardized by using a Fisher-Z transformation before the statistical analyses. In addition, to determine which frequency band was sensitive enough to illustrate synchrony in the genuine pair, the average coherence of each frequency band (under 0.025, 0.025–0.05, 0.05–0.1, 0.1–0.2, 0.2–0.5, 0.5–1, 1–2, 2–3, and 3–4 Hz, respectively) was calculated and compared to the others.</p>
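A sketch of the band-averaging and Fisher-Z step is given below. It assumes a cross-wavelet coherence matrix (frequencies by time points) has already been computed, e.g., with the Grinsted et al. (2004) toolbox used here or a comparable Python port, and that the cone-of-influence area is supplied as a mask.

import numpy as np

BANDS = [(0.0, 0.025), (0.025, 0.05), (0.05, 0.1), (0.1, 0.2),
         (0.2, 0.5), (0.5, 1.0), (1.0, 2.0), (2.0, 3.0), (3.0, 4.0)]

def band_averaged_coherence(wct, freqs, coi_mask=None):
    # wct  : (n_freqs, n_times) cross-wavelet coherence values (assumed precomputed)
    # freqs: frequency in Hz for each row of wct
    # coi_mask: optional boolean array of the same shape, True inside the cone of influence
    w = np.ma.masked_array(wct, mask=coi_mask)
    per_band = {band: float(w[(freqs > band[0]) & (freqs <= band[1])].mean())
                for band in BANDS}
    overall = float(w[freqs <= 4.0].mean())       # average coherence under 4 Hz
    return per_band, np.arctanh(overall)          # Fisher-Z transform before the statistics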
<p>Additionally, in the genuine pairs, the relative phase in nine 20° regions from 0 to 180° was extracted. Because the dominant rhythm cannot be predetermined in an unstructured conversation, it was not clear which frequency to focus on in order to extract the relative phase. Therefore, in the area where wavelet coherence was significant, the number of occurrences in each 20° region was counted and the percentage distribution was calculated for each pair. The proportion of each region was transformed via arcsine transformation, which was used in the subsequent analysis.</p>
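A numpy sketch of this counting step follows. It assumes the relative phase and a significance mask over the time-frequency plane are available from the cross-wavelet output, and it uses the common arcsin(sqrt(p)) form of the arcsine transformation, which the text does not spell out.

import numpy as np

def phase_region_proportions(rel_phase_deg, significant):
    # rel_phase_deg: relative phase in degrees at each time-frequency point
    # significant  : boolean array of the same shape, True where coherence is significant
    phases = np.abs(rel_phase_deg[significant])                    # fold onto 0..180 degrees
    counts, _ = np.histogram(phases, bins=np.arange(0, 181, 20))   # nine 20-degree regions
    props = counts / max(counts.sum(), 1)                          # percentage distribution per pair
    return np.arcsin(np.sqrt(props))                               # arcsine transform (assumed arcsin(sqrt(p)))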
</sec>
</sec>
<sec>
<title>Results</title>
<p>We compared the coherence values between the genuine pairs and the pseudo pairs. As anticipated, the result of separate
<italic>t</italic>
-tests indicated that the average coherence under 4 Hz throughout the time line was higher in the genuine pairs (
<italic>M</italic>
= 0.26,
<italic>SD</italic>
= 0.02) than in the pseudo pairs (
<italic>M</italic>
= 0.24,
<italic>SD</italic>
= 0.02), and this difference was significant [
<italic>t</italic>
(59.52) = 2.22,
<italic>p</italic>
= 0.030,
<italic>d</italic>
= 0.56].</p>
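The fractional degrees of freedom reported here (59.52) indicate a Welch-type test. A minimal SciPy sketch of such a comparison is shown below, with the pooled-SD form of Cohen's d as an assumption.

import numpy as np
from scipy import stats

def compare_genuine_vs_pseudo(genuine_z, pseudo_z):
    # genuine_z, pseudo_z: Fisher-Z transformed mean coherence, one value per pair.
    t, p = stats.ttest_ind(genuine_z, pseudo_z, equal_var=False)   # Welch's t-test
    pooled_sd = np.sqrt((np.var(genuine_z, ddof=1) + np.var(pseudo_z, ddof=1)) / 2)
    d = (np.mean(genuine_z) - np.mean(pseudo_z)) / pooled_sd       # Cohen's d (assumed form)
    return t, p, d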
<p>In addition, the coherence value in the genuine pairs was submitted to a one-way ANOVA with a within-subjects variable of frequency band (under 0.025, 0.025–0.05, 0.05–0.1, 0.1–0.2, 0.2–0.5, 0.5–1, 1–2, 2–3, and 3–4 Hz). The result indicated that the main effect of frequency band was significant [
<italic>F</italic>
(8,240) = 11.73,
<italic>p</italic>
< 0.001,
<inline-formula>
<mml:math id="M1">
<mml:msubsup>
<mml:mi mathvariant="normal" mathcolor="black">η</mml:mi>
<mml:mi mathvariant="normal" mathcolor="black">p</mml:mi>
<mml:mn mathvariant="normal" mathcolor="black">2</mml:mn>
</mml:msubsup>
</mml:math>
</inline-formula>
= 0.28]. Holm’s multiple comparison revealed that 0.5 Hz was the boundary of the extent of synchrony. The coherence value decreased as the frequency increased, and there was a significant difference between coherence at 0.2–0.5 and 0.5–1 Hz (
<bold>Table
<xref ref-type="table" rid="T1">1</xref>
</bold>
).</p>
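The Holm-corrected pairwise step can be sketched as follows; the omnibus within-subjects ANOVA itself is omitted. Paired t-tests between frequency bands with statsmodels' Holm adjustment are an illustrative reconstruction, not the authors' exact analysis script.

import numpy as np
from itertools import combinations
from scipy import stats
from statsmodels.stats.multitest import multipletests

def holm_pairwise(coh_by_band):
    # coh_by_band: (n_pairs, n_bands) array of Fisher-Z coherence, one column per band.
    idx = list(combinations(range(coh_by_band.shape[1]), 2))
    pvals = [stats.ttest_rel(coh_by_band[:, i], coh_by_band[:, j]).pvalue for i, j in idx]
    reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method='holm')
    return list(zip(idx, p_adj, reject))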
<table-wrap id="T1" position="float">
<label>Table 1</label>
<caption>
<p>Means and standard deviations of wavelet coherence for each frequency component.</p>
</caption>
<table frame="hsides" rules="groups" cellspacing="5" cellpadding="5">
<thead>
<tr>
<th valign="top" align="left" rowspan="1" colspan="1"><0.025 Hz</th>
<th valign="top" align="center" rowspan="1" colspan="1">0.025–0.05 Hz</th>
<th valign="top" align="center" rowspan="1" colspan="1">0.05–0.1 Hz</th>
<th valign="top" align="center" rowspan="1" colspan="1">0.1–0.2 Hz</th>
<th valign="top" align="center" rowspan="1" colspan="1">0.2–0.5 Hz</th>
<th valign="top" align="center" rowspan="1" colspan="1">0.5–1 Hz</th>
<th valign="top" align="center" rowspan="1" colspan="1">1–2 Hz</th>
<th valign="top" align="center" rowspan="1" colspan="1">2–3 Hz</th>
<th valign="top" align="center" rowspan="1" colspan="1">3–4 Hz</th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left" rowspan="1" colspan="1">0.330
<sub>a</sub>
</td>
<td valign="top" align="center" rowspan="1" colspan="1">0.302
<sub>a</sub>
</td>
<td valign="top" align="center" rowspan="1" colspan="1">0.288
<sub>a</sub>
</td>
<td valign="top" align="center" rowspan="1" colspan="1">0.270
<sub>ab</sub>
</td>
<td valign="top" align="center" rowspan="1" colspan="1">0.248
<sub>b</sub>
</td>
<td valign="top" align="center" rowspan="1" colspan="1">0.232
<sub>c</sub>
</td>
<td valign="top" align="center" rowspan="1" colspan="1">0.231
<sub>c</sub>
</td>
<td valign="top" align="center" rowspan="1" colspan="1">0.228
<sub>c</sub>
</td>
<td valign="top" align="center" rowspan="1" colspan="1">0.226
<sub>c</sub>
</td>
</tr>
<tr>
<td valign="top" align="left" rowspan="1" colspan="1">(0.135)</td>
<td valign="top" align="center" rowspan="1" colspan="1">(0.099)</td>
<td valign="top" align="center" rowspan="1" colspan="1">(0.068)</td>
<td valign="top" align="center" rowspan="1" colspan="1">(0.044)</td>
<td valign="top" align="center" rowspan="1" colspan="1">(0.022)</td>
<td valign="top" align="center" rowspan="1" colspan="1">(0.023)</td>
<td valign="top" align="center" rowspan="1" colspan="1">(0.014)</td>
<td valign="top" align="center" rowspan="1" colspan="1">(0.013)</td>
<td valign="top" align="center" rowspan="1" colspan="1">(0.013)</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<attrib>
<italic>The standard deviations are in parentheses below the means; the means with different subscripts are significantly different based on Holm’s multiple comparison (adjusted p-value < 0.05).</italic>
</attrib>
</table-wrap-foot>
</table-wrap>
<p>The proportion of relative phase in the genuine pairs was submitted to a one-way ANOVA with a within-subjects variable of relative phase angle (each 20° region from 0 to 180°). The result indicated that the main effect of relative phase angle was not significant [
<italic>F</italic>
(8,240) = 0.57,
<italic>p</italic>
= 0.802,
<inline-formula>
<mml:math id="M2">
<mml:msubsup>
<mml:mi mathvariant="normal" mathcolor="black">η</mml:mi>
<mml:mi mathvariant="normal" mathcolor="black">p</mml:mi>
<mml:mn mathvariant="normal" mathcolor="black">2</mml:mn>
</mml:msubsup>
</mml:math>
</inline-formula>
= 0.02;
<bold>Figure
<xref ref-type="fig" rid="F3">3</xref>
</bold>
].</p>
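<p>The corresponding one-way repeated-measures ANOVA can be sketched as follows in Python with statsmodels; the long-format data frame of arcsine-transformed proportions is synthetic and only illustrates the layout (one row per pair and 20° region), not the study’s data.</p>

    import numpy as np
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    rng = np.random.default_rng(2)
    n_pairs, n_regions = 31, 9
    prop = rng.dirichlet(np.ones(n_regions), size=n_pairs)    # proportions sum to 1 per pair

    df = pd.DataFrame({
        "pair": np.repeat(np.arange(n_pairs), n_regions),
        "region": np.tile(np.arange(n_regions), n_pairs),
        "asin_prop": np.arcsin(np.sqrt(prop.ravel())),        # arcsine transformation
    })

    # One-way within-subjects ANOVA with relative-phase region as the factor.
    res = AnovaRM(df, depvar="asin_prop", subject="pair", within=["region"]).fit()
    print(res.anova_table)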
<fig id="F3" position="float">
<label>FIGURE 3</label>
<caption>
<p>
<bold>Percentage distribution of relative phase occurrences.</bold>
The mean value is indicated above the error bar, which represents the standard error. The relative phase is distributed in nine 20° regions from 0 to 180°. The distribution is spread almost equally over the nine regions of the relative phase angle.</p>
</caption>
<graphic xlink:href="fpsyg-07-00516-g003"></graphic>
</fig>
</sec>
<sec>
<title>Discussion</title>
<p>This study examined whether coordination represented in the time–frequency plane could be observed in an unstructured conversation. Employing the wavelet transform and calculating the wavelet coherence showed that the genuine pairs, who actually conversed with each other, were more synchronized than the pseudo pairs constructed from virtual (non-interacting) data, supporting our hypothesis as well as the validity and possible utility of the cross-wavelet approach.</p>
<p>The findings of the current study extend the field of synchrony research. Interpersonal coordination has been observed at various levels of communication behavior (
<xref rid="B15" ref-type="bibr">Giles et al., 1991</xref>
;
<xref rid="B4" ref-type="bibr">Burgoon et al., 1995</xref>
, for a review).
<xref rid="B3" ref-type="bibr">Bernieri and Rosenthal (1991)</xref>
illustrated two aspects of interpersonal coordination, behavioral matching and synchrony, which can be considered time- and frequency-domain phenomena, respectively. Behavioral matching, or behavioral mimicry, has been examined in a variety of situations (e.g.,
<xref rid="B37" ref-type="bibr">Scheflen, 1964</xref>
;
<xref rid="B23" ref-type="bibr">LaFrance, 1976</xref>
,
<xref rid="B24" ref-type="bibr">1979</xref>
;
<xref rid="B7" ref-type="bibr">Chartrand and Bargh, 1999</xref>
;
<xref rid="B26" ref-type="bibr">Lakin and Chartrand, 2003</xref>
). However, compared to behavioral mimicry research, findings on synchrony are limited. Many previous studies were conducted under specific task situations (e.g.,
<xref rid="B38" ref-type="bibr">Schmidt et al., 2012</xref>
) or in less social situations (e.g.,
<xref rid="B47" ref-type="bibr">Varlet et al., 2011</xref>
). This study employed a spectrum analysis using the wavelet transform and found that synchrony, in this case rhythmic similarity, was observed even in an unstructured conversation in which there were no rhythmic tasks or restrictions on turn-taking. Moreover, because the data were tested in a conservative fashion, the effects are not very strong, but their significance demonstrates the robustness of the cross-wavelet analysis. Our study contributes to the literature by extending the use of the cross-wavelet approach to a social interaction situation, and adds new insight regarding rhythm to communication research focusing on daily conversations.</p>
<p>In our daily conversations, the dominant rhythm is not predetermined. Nevertheless, our findings suggest that interactants achieved synchrony even though they were not engaged in a specific rhythmic task, and that this synchrony can be captured using the wavelet transform. These findings raise new questions about the frequency at which people actually synchronize, and about which tempo of synchronization people perceive as comfortable and smooth. The findings of this study provide a clue to the first question. In the genuine pairs, the value of wavelet coherence was significantly higher under 0.5 Hz. This frequency range is consistent with a previous study that employed a periodic rhythmic task (e.g.,
<xref rid="B38" ref-type="bibr">Schmidt et al., 2012</xref>
). Although employing a specific task makes it possible to achieve synchronization at a faster rhythm (about 1.33 Hz;
<xref rid="B39" ref-type="bibr">Schmidt et al., 2014</xref>
), our daily conversations might not move so quickly. Our findings alone are not sufficient to characterize the temporal properties of face-to-face interaction, and more empirical data should be collected. Future research should explore the antecedents of the tempo at which people achieve synchronization. Whether a specific (e.g., fast or slow) tempo of synchronization is perceived as comfortable and/or smooth by the interactants also remains to be explored.</p>
<p>In addition, the role of relative phase should be pursued. In the current study, the characteristics of the relative phase were not clear; there was no significant difference between in-phase and anti-phase synchronization. Although previous studies revealed that interactants synchronized in-phase (e.g.,
<xref rid="B38" ref-type="bibr">Schmidt et al., 2012</xref>
) and that in-phase synchronization had a positive relationship with affiliation (
<xref rid="B19" ref-type="bibr">Hove and Risen, 2009</xref>
), these findings were from rhythmic task situations. If in-phase and/or anti-phase synchronization is achieved even in an unstructured conversation, which factors make this possible? Behavioral mimicry research has indicated that a positive attitude or an intention of affiliation toward one’s partner causes behavioral mimicry (e.g.,
<xref rid="B26" ref-type="bibr">Lakin and Chartrand, 2003</xref>
). Similarly, positive attitudes or intentions of affiliation toward one’s partner might influence in-phase and/or anti-phase synchronization. In addition to these antecedents, the consequences of in-phase and/or anti-phase synchronization need to be examined. In-phase synchronization is a perfect match in both rhythm and timing, whereas anti-phase synchronization is a match in rhythm but not in timing. Although a previous study showed that both types of synchronization had a positive influence on third-party judgments of rapport (
<xref rid="B29" ref-type="bibr">Miles et al., 2009</xref>
), it is not yet known whether this difference in timing has a different influence on the actual interactants, which remains a question for future research.</p>
<sec>
<title>Limitations and Directions for Future Research</title>
<p>The limitations of this study include the lack of a measure of rapport. We corroborated the existence of synchrony by using the wavelet transform and the pseudo-synchrony paradigm (
<xref rid="B3" ref-type="bibr">Bernieri and Rosenthal, 1991</xref>
); the genuine pairs of interactants were more synchronized than the pseudo pairs. Interpersonal coordination studies, including behavioral mimicry and synchrony research, have revealed a positive relationship between coordination and rapport (e.g.,
<xref rid="B23" ref-type="bibr">LaFrance, 1976</xref>
,
<xref rid="B24" ref-type="bibr">1979</xref>
;
<xref rid="B45" ref-type="bibr">Tickle-Degnen and Rosenthal, 1990</xref>
;
<xref rid="B19" ref-type="bibr">Hove and Risen, 2009</xref>
). If the wavelet transform is employed, several questions arise regarding this relationship with rapport: for instance, whether, and at which frequency, wavelet coherence is related to rapport, and whether in-phase and anti-phase synchronization are related to rapport. Our findings support the validity and possible utility of the wavelet transform for evaluating the extent of synchrony in daily conversations; thus, further examination using the wavelet transform is needed to advance synchrony research.</p>
<p>Although the current study has limitations, the wavelet transform appears to have the potential to contribute to the theoretical development of interpersonal coordination.
<xref rid="B3" ref-type="bibr">Bernieri and Rosenthal (1991)</xref>
differentiated two facets of interpersonal coordination: behavior matching (or behavioral mimicry) and synchrony. Although the difference between the two facets is still debated (
<xref rid="B8" ref-type="bibr">Chartrand and Lakin, 2013</xref>
;
<xref rid="B25" ref-type="bibr">Lakin, 2013</xref>
), researchers have not yet reached a clear conclusion, partly because of the lack of a methodology that can both differentiate and integrate behavioral matching (or mimicry) and synchrony. To this end, the wavelet transform can be a powerful analytic method. Coordination assessed in the time–frequency plane (i.e., via the wavelet transform) indicates the extent of rhythmic similarity located along the time line. Viewed along the time line, the boundary between behavioral mimicry and synchrony becomes blurred, and the difference between them can be regarded as a difference in perspective on coordination rather than as different phenomena. Synchrony in the time–frequency plane represents how similar the rhythm or velocity of the interactants is across the time line. In contrast, behavioral mimicry in the time domain represents the extent to which behaviors co-occur, or how similar the amount of movement is, across the time line. From this perspective, synchrony and behavioral mimicry are distinguished by their focus, the former on velocity and the latter on amount. Furthermore, their similarity can be examined because spectrum analysis and cross-correlation analysis are conducted on the same data (i.e., two movement time series), as illustrated in the sketch below. Differentiating and integrating synchrony and behavioral mimicry in this way should facilitate the development of coordination theory.</p>
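<p>The contrast drawn here can be made concrete with a few lines of analysis code. The Python sketch below, using two synthetic movement series (the 0.3 Hz rhythm, sampling rate, and lag range are arbitrary choices for illustration), computes a lagged cross-correlation, the time-domain view associated with behavioral mimicry, and a spectral coherence, the frequency-domain view associated with synchrony; in the present study the latter view was obtained with the cross-wavelet transform rather than Welch-style coherence.</p>

    import numpy as np
    from scipy import signal

    # Two synthetic movement time series (amount of movement per frame) for a dyad.
    fs = 30.0                                   # frames per second (assumed)
    t = np.arange(0, 360, 1 / fs)               # a 6-min interaction
    a = np.sin(2 * np.pi * 0.3 * t) + 0.3 * np.random.randn(t.size)
    b = np.sin(2 * np.pi * 0.3 * t + np.pi / 4) + 0.3 * np.random.randn(t.size)

    def lagged_corr(x, y, lag):
        """Pearson correlation between x shifted by `lag` frames and y (mimicry view)."""
        if lag >= 0:
            return np.corrcoef(x[lag:], y[:y.size - lag])[0, 1]
        return np.corrcoef(x[:x.size + lag], y[-lag:])[0, 1]

    xcorr = [lagged_corr(a, b, k) for k in range(-150, 151)]    # +/- 5 s of lag

    # Frequency-domain view (synchrony): similarity of rhythm via spectral coherence.
    f, coh = signal.coherence(a, b, fs=fs, nperseg=1024)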
</sec>
</sec>
<sec>
<title>Author Contributions</title>
<p>All authors contributed to the study design. KF performed the data collection, analysis, and interpretation. KF drafted the manuscript, and ID provided critical revisions. All authors approved the final version of the manuscript for submission.</p>
</sec>
<sec>
<title>Conflict of Interest Statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</body>
<back>
<fn-group>
<fn fn-type="financial-disclosure">
<p>
<bold>Funding.</bold>
This research was supported by Japan Society for the Promotion of Science (Grant-in-Aid for JSPS Fellows; 11J03150, and Grant-in-Aid for Young Scientists (B); 15K17259).</p>
</fn>
</fn-group>
<ref-list>
<title>References</title>
<ref id="B1">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bernieri</surname>
<given-names>F. J.</given-names>
</name>
</person-group>
(
<year>1988</year>
).
<article-title>Coordinated movement and rapport in teacher-student interactions.</article-title>
<source>
<italic>J. Nonverbal Behav.</italic>
</source>
<volume>12</volume>
<fpage>120</fpage>
<lpage>138</lpage>
.
<pub-id pub-id-type="doi">10.1007/BF00986930</pub-id>
</mixed-citation>
</ref>
<ref id="B2">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bernieri</surname>
<given-names>F. J.</given-names>
</name>
<name>
<surname>Reznick</surname>
<given-names>J. S.</given-names>
</name>
<name>
<surname>Rosenthal</surname>
<given-names>R.</given-names>
</name>
</person-group>
(
<year>1988</year>
).
<article-title>Synchrony, pseudo synchrony, and dissynchrony: measuring the entrainment process in mother-infant interactions.</article-title>
<source>
<italic>J. Pers. Soc. Psychol.</italic>
</source>
<volume>54</volume>
<fpage>243</fpage>
<lpage>253</lpage>
.
<pub-id pub-id-type="doi">10.1037/0022-3514.54.2.243</pub-id>
</mixed-citation>
</ref>
<ref id="B3">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Bernieri</surname>
<given-names>F. J.</given-names>
</name>
<name>
<surname>Rosenthal</surname>
<given-names>R.</given-names>
</name>
</person-group>
(
<year>1991</year>
). “
<article-title>Interpersonal coordination: Behavior matching and interactional synchrony</article-title>
,” in
<source>
<italic>Fundamentals of Nonverbal Behavior</italic>
</source>
,
<role>eds</role>
<person-group person-group-type="editor">
<name>
<surname>Feldman</surname>
<given-names>R. S.</given-names>
</name>
<name>
<surname>Rime</surname>
<given-names>B.</given-names>
</name>
</person-group>
(
<publisher-loc>Cambridge</publisher-loc>
:
<publisher-name>Cambridge University Press</publisher-name>
),
<fpage>401</fpage>
<lpage>432</lpage>
.</mixed-citation>
</ref>
<ref id="B4">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Burgoon</surname>
<given-names>J. K.</given-names>
</name>
<name>
<surname>Stern</surname>
<given-names>L. A.</given-names>
</name>
<name>
<surname>Dillman</surname>
<given-names>L.</given-names>
</name>
</person-group>
(
<year>1995</year>
).
<source>
<italic>Interpersonal Adaptation: Dyadic Interaction Patterns.</italic>
</source>
<publisher-loc>Cambridge</publisher-loc>
:
<publisher-name>Cambridge University Press</publisher-name>
<pub-id pub-id-type="doi">10.1017/CBO9780511720314</pub-id>
</mixed-citation>
</ref>
<ref id="B5">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cappella</surname>
<given-names>J. N.</given-names>
</name>
</person-group>
(
<year>1990</year>
).
<article-title>On defining conversational coordination and rapport.</article-title>
<source>
<italic>Psychol. Inq.</italic>
</source>
<volume>1</volume>
<fpage>303</fpage>
<lpage>305</lpage>
.
<pub-id pub-id-type="doi">10.1207/s15327965pli0104_5</pub-id>
</mixed-citation>
</ref>
<ref id="B6">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cappella</surname>
<given-names>J. N.</given-names>
</name>
<name>
<surname>Planalp</surname>
<given-names>S.</given-names>
</name>
</person-group>
(
<year>1981</year>
).
<article-title>Talk and silence sequences in informal conversations III: interspeaker influence.</article-title>
<source>
<italic>Hum. Commun. Res.</italic>
</source>
<volume>7</volume>
<fpage>117</fpage>
<lpage>132</lpage>
.
<pub-id pub-id-type="doi">10.1111/j.1468-2958.1981.tb00564.x</pub-id>
</mixed-citation>
</ref>
<ref id="B7">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Chartrand</surname>
<given-names>T. L.</given-names>
</name>
<name>
<surname>Bargh</surname>
<given-names>J. A.</given-names>
</name>
</person-group>
(
<year>1999</year>
).
<article-title>The chameleon effect: the perception-behavior link and social interaction.</article-title>
<source>
<italic>J. Pers. Soc. Psychol.</italic>
</source>
<volume>76</volume>
<fpage>893</fpage>
<lpage>910</lpage>
.
<pub-id pub-id-type="doi">10.1037/0022-3514.76.6.893</pub-id>
<pub-id pub-id-type="pmid">10402679</pub-id>
</mixed-citation>
</ref>
<ref id="B8">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Chartrand</surname>
<given-names>T. L.</given-names>
</name>
<name>
<surname>Lakin</surname>
<given-names>J. L.</given-names>
</name>
</person-group>
(
<year>2013</year>
).
<article-title>The antecedents and consequences of human behavioral mimicry.</article-title>
<source>
<italic>Annu. Rev. Psychol.</italic>
</source>
<volume>64</volume>
<fpage>285</fpage>
<lpage>308</lpage>
.
<pub-id pub-id-type="doi">10.1146/annurev-psych-113011-143754</pub-id>
<pub-id pub-id-type="pmid">23020640</pub-id>
</mixed-citation>
</ref>
<ref id="B9">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Condon</surname>
<given-names>W. S.</given-names>
</name>
<name>
<surname>Ogston</surname>
<given-names>W. D.</given-names>
</name>
</person-group>
(
<year>1966</year>
).
<article-title>Sound film analysis of normal and pathological behavior patterns.</article-title>
<source>
<italic>J. Nerv. Ment. Dis.</italic>
</source>
<volume>143</volume>
<fpage>338</fpage>
<lpage>347</lpage>
.
<pub-id pub-id-type="doi">10.1097/00005053-196610000-00005</pub-id>
<pub-id pub-id-type="pmid">5958766</pub-id>
</mixed-citation>
</ref>
<ref id="B10">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Condon</surname>
<given-names>W. S.</given-names>
</name>
<name>
<surname>Ogston</surname>
<given-names>W. D.</given-names>
</name>
</person-group>
(
<year>1967</year>
).
<article-title>A segmentation of behavior.</article-title>
<source>
<italic>J. Psychiat. Res.</italic>
</source>
<volume>5</volume>
<fpage>221</fpage>
<lpage>235</lpage>
.
<pub-id pub-id-type="doi">10.1016/0022-3956(67)90004-0</pub-id>
</mixed-citation>
</ref>
<ref id="B11">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ekman</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Friesen</surname>
<given-names>W. V.</given-names>
</name>
</person-group>
(
<year>1969</year>
).
<article-title>The repertoire of nonverbal behavior: categories, origins, usage, and coding.</article-title>
<source>
<italic>Semiotica</italic>
</source>
<volume>1</volume>
<fpage>49</fpage>
<lpage>98</lpage>
.
<pub-id pub-id-type="doi">10.1515/semi.1969.1.1.49</pub-id>
</mixed-citation>
</ref>
<ref id="B12">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ekman</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Friesen</surname>
<given-names>W. V.</given-names>
</name>
</person-group>
(
<year>1972</year>
).
<article-title>Hand movements.</article-title>
<source>
<italic>J. Commun.</italic>
</source>
<volume>22</volume>
<fpage>353</fpage>
<lpage>374</lpage>
.
<pub-id pub-id-type="doi">10.1111/j.1460-2466.1972.tb00163.x</pub-id>
</mixed-citation>
</ref>
<ref id="B13">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Frauendorfer</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Mast</surname>
<given-names>M. S.</given-names>
</name>
<name>
<surname>Nguyen</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Gatica-Perez</surname>
<given-names>D.</given-names>
</name>
</person-group>
(
<year>2014</year>
).
<article-title>Nonverbal social sensing in action: unobtrusive recording and extracting of nonverbal behavior in social interactions illustrated with a research example.</article-title>
<source>
<italic>J. Nonverbal Behav.</italic>
</source>
<volume>38</volume>
<fpage>231</fpage>
<lpage>245</lpage>
.
<pub-id pub-id-type="doi">10.1007/s10919-014-0173-5</pub-id>
</mixed-citation>
</ref>
<ref id="B14">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fujiwara</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Daibo</surname>
<given-names>I.</given-names>
</name>
</person-group>
(
<year>2014</year>
).
<article-title>The extraction of nonverbal behaviors: using video images and speech-signal analysis in dyadic conversation.</article-title>
<source>
<italic>J. Nonverbal Behav.</italic>
</source>
<volume>38</volume>
<fpage>377</fpage>
<lpage>388</lpage>
.
<pub-id pub-id-type="doi">10.1007/s10919-014-0183-3</pub-id>
</mixed-citation>
</ref>
<ref id="B15">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Giles</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Coupland</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Coupland</surname>
<given-names>J.</given-names>
</name>
</person-group>
(
<year>1991</year>
). “
<article-title>Accommodation theory: Communication, context and consequences</article-title>
,” in
<source>
<italic>Contexts of Accommodation: Developments in Applied Sociolinguistics</italic>
</source>
,
<role>eds</role>
<person-group person-group-type="editor">
<name>
<surname>Giles</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Coupland</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Coupland</surname>
<given-names>N.</given-names>
</name>
</person-group>
(
<publisher-loc>Cambridge</publisher-loc>
:
<publisher-name>Cambridge University Press</publisher-name>
),
<fpage>1</fpage>
<lpage>68</lpage>
.</mixed-citation>
</ref>
<ref id="B16">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Grinsted</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Moore</surname>
<given-names>J. C.</given-names>
</name>
<name>
<surname>Jevrejeva</surname>
<given-names>S.</given-names>
</name>
</person-group>
(
<year>2004</year>
).
<article-title>Application of the cross wavelet transform and wavelet coherence to geophysical time series.</article-title>
<source>
<italic>Nonlinear Proc. Geoph.</italic>
</source>
<volume>11</volume>
<fpage>561</fpage>
<lpage>566</lpage>
.
<pub-id pub-id-type="doi">10.5194/npg-11-561-2004</pub-id>
</mixed-citation>
</ref>
<ref id="B17">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hadar</surname>
<given-names>U.</given-names>
</name>
<name>
<surname>Steiner</surname>
<given-names>T. J.</given-names>
</name>
<name>
<surname>Clifford Rose</surname>
<given-names>F.</given-names>
</name>
</person-group>
(
<year>1985</year>
).
<article-title>Head movement during listening turns in conversation.</article-title>
<source>
<italic>J. Nonverbal Behav.</italic>
</source>
<volume>9</volume>
<fpage>214</fpage>
<lpage>228</lpage>
.
<pub-id pub-id-type="doi">10.1007/BF00986881</pub-id>
</mixed-citation>
</ref>
<ref id="B18">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Holler</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Wilkin</surname>
<given-names>K.</given-names>
</name>
</person-group>
(
<year>2011</year>
).
<article-title>Co-speech gesture mimicry in the process of collaborative referring during face-to-face dialogue.</article-title>
<source>
<italic>J. Nonverbal Behav.</italic>
</source>
<volume>35</volume>
<fpage>133</fpage>
<lpage>153</lpage>
.
<pub-id pub-id-type="doi">10.1007/s10919-011-0105-6</pub-id>
</mixed-citation>
</ref>
<ref id="B19">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hove</surname>
<given-names>M. J.</given-names>
</name>
<name>
<surname>Risen</surname>
<given-names>J. L.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>It’s all in the timing: interpersonal synchrony increases affiliation.</article-title>
<source>
<italic>Soc. Cognition</italic>
</source>
<volume>27</volume>
<fpage>949</fpage>
<lpage>960</lpage>
.
<pub-id pub-id-type="doi">10.1521/soco.2009.27.6.949</pub-id>
</mixed-citation>
</ref>
<ref id="B20">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Issartel</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Bardainne</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Gaillot</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Marin</surname>
<given-names>L.</given-names>
</name>
</person-group>
(
<year>2014</year>
).
<article-title>The relevance of the cross-wavelet transform in the analysis of human interaction – A tutorial.</article-title>
<source>
<italic>Front. Psychol.</italic>
</source>
<volume>5</volume>
:
<issue>1566</issue>
<pub-id pub-id-type="doi">10.3389/fpsyg.2014.01566</pub-id>
</mixed-citation>
</ref>
<ref id="B21">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Issartel</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Marin</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Gaillot</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Bardainne</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Cadopi</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>2006</year>
).
<article-title>A practical guide to time—frequency analysis in the study of human motor behavior: the contribution of wavelet transform.</article-title>
<source>
<italic>J. Motor Behav.</italic>
</source>
<volume>38</volume>
<fpage>139</fpage>
<lpage>159</lpage>
.
<pub-id pub-id-type="doi">10.3200/JMBR.38.2.139-159</pub-id>
</mixed-citation>
</ref>
<ref id="B22">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kupper</surname>
<given-names>Z.</given-names>
</name>
<name>
<surname>Ramseyer</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Hoffmann</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Kalbermatten</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Tschacher</surname>
<given-names>W.</given-names>
</name>
</person-group>
(
<year>2010</year>
).
<article-title>Video-based quantification of body movement during social interaction indicates the severity of negative symptoms in patients with schizophrenia.</article-title>
<source>
<italic>Schizophr. Res.</italic>
</source>
<volume>121</volume>
<fpage>90</fpage>
<lpage>100</lpage>
.
<pub-id pub-id-type="pmid">20434313</pub-id>
</mixed-citation>
</ref>
<ref id="B23">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>LaFrance</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>1976</year>
).
<article-title>Group rapport: posture sharing as a nonverbal indicator.</article-title>
<source>
<italic>Group Organ. Manage.</italic>
</source>
<volume>1</volume>
<fpage>328</fpage>
<lpage>333</lpage>
.
<pub-id pub-id-type="doi">10.1177/105960117600100307</pub-id>
</mixed-citation>
</ref>
<ref id="B24">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>LaFrance</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>1979</year>
).
<article-title>Nonverbal synchrony and rapport: analysis by the cross-lag panel technique.</article-title>
<source>
<italic>Soc. Psychol. Q.</italic>
</source>
<volume>42</volume>
<fpage>66</fpage>
<lpage>70</lpage>
.</mixed-citation>
</ref>
<ref id="B25">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Lakin</surname>
<given-names>J. L.</given-names>
</name>
</person-group>
(
<year>2013</year>
). “
<article-title>Behavioral mimicry and interpersonal synchrony</article-title>
,” in
<source>
<italic>Nonverbal Communication</italic>
</source>
<role>eds</role>
<person-group person-group-type="editor">
<name>
<surname>Hall</surname>
<given-names>J. A.</given-names>
</name>
<name>
<surname>Knapp</surname>
<given-names>M. L.</given-names>
</name>
</person-group>
<publisher-loc>Berlin</publisher-loc>
:
<publisher-name>De Gruyter Mouton</publisher-name>
,
<fpage>539</fpage>
<lpage>576</lpage>
.</mixed-citation>
</ref>
<ref id="B26">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lakin</surname>
<given-names>J. L.</given-names>
</name>
<name>
<surname>Chartrand</surname>
<given-names>T. L.</given-names>
</name>
</person-group>
(
<year>2003</year>
).
<article-title>Using nonconscious behavioral mimicry to create affiliation and rapport.</article-title>
<source>
<italic>Psychol. Sci.</italic>
</source>
<volume>14</volume>
<fpage>334</fpage>
<lpage>339</lpage>
.
<pub-id pub-id-type="doi">10.1111/1467-9280.14481</pub-id>
<pub-id pub-id-type="pmid">12807406</pub-id>
</mixed-citation>
</ref>
<ref id="B27">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lumsden</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Miles</surname>
<given-names>L. K.</given-names>
</name>
<name>
<surname>Macrae</surname>
<given-names>C. N.</given-names>
</name>
</person-group>
(
<year>2012a</year>
).
<article-title>Perceptions of synchrony: different strokes for different folks?</article-title>
<source>
<italic>Perception</italic>
</source>
<volume>41</volume>
<fpage>1529</fpage>
<lpage>1531</lpage>
.
<pub-id pub-id-type="doi">10.1068/p7360</pub-id>
<pub-id pub-id-type="pmid">23586290</pub-id>
</mixed-citation>
</ref>
<ref id="B28">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lumsden</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Miles</surname>
<given-names>L. K.</given-names>
</name>
<name>
<surname>Richardson</surname>
<given-names>M. J.</given-names>
</name>
<name>
<surname>Smith</surname>
<given-names>C. A.</given-names>
</name>
<name>
<surname>Macrae</surname>
<given-names>C. N.</given-names>
</name>
</person-group>
(
<year>2012b</year>
).
<article-title>Who syncs? Social motives and interpersonal coordination.</article-title>
<source>
<italic>J. Exp. Soc. Psychol.</italic>
</source>
<volume>48</volume>
<fpage>746</fpage>
<lpage>751</lpage>
.
<pub-id pub-id-type="doi">10.1016/j.jesp.2011.12.007</pub-id>
</mixed-citation>
</ref>
<ref id="B29">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Miles</surname>
<given-names>L. K.</given-names>
</name>
<name>
<surname>Nind</surname>
<given-names>L. K.</given-names>
</name>
<name>
<surname>Macrae</surname>
<given-names>C. N.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>The rhythm of rapport: interpersonal synchrony and social perception.</article-title>
<source>
<italic>J. Exp. Soc. Psychol.</italic>
</source>
<volume>45</volume>
<fpage>585</fpage>
<lpage>589</lpage>
.
<pub-id pub-id-type="doi">10.1016/j.jesp.2009.02.002</pub-id>
</mixed-citation>
</ref>
<ref id="B30">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Natale</surname>
<given-names>M.</given-names>
</name>
</person-group>
(
<year>1975</year>
).
<article-title>Convergence of mean vocal intensity in dyadic communication as a function of social desirability.</article-title>
<source>
<italic>J. Pers. Soc. Psychol.</italic>
</source>
<volume>32</volume>
<fpage>790</fpage>
<lpage>804</lpage>
.</mixed-citation>
</ref>
<ref id="B31">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Paxton</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Dale</surname>
<given-names>R.</given-names>
</name>
</person-group>
(
<year>2013</year>
).
<article-title>Frame-differencing methods for measuring bodily synchrony in conversation.</article-title>
<source>
<italic>Behav. Res. Methods</italic>
</source>
<volume>45</volume>
<fpage>329</fpage>
<lpage>343</lpage>
.
<pub-id pub-id-type="doi">10.3758/s13428-012-0249-2</pub-id>
<pub-id pub-id-type="pmid">23055158</pub-id>
</mixed-citation>
</ref>
<ref id="B32">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Pikovsky</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Rosenblum</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Kurths</surname>
<given-names>J.</given-names>
</name>
</person-group>
(
<year>2003</year>
).
<source>
<italic>Synchronization: A Universal Concept in Nonlinear Sciences.</italic>
</source>
<publisher-loc>Cambridge</publisher-loc>
:
<publisher-name>Cambridge University Press</publisher-name>
<pub-id pub-id-type="doi">10.1017/CBO9780511755743</pub-id>
</mixed-citation>
</ref>
<ref id="B33">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Poppe</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Van Der Zee</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Heylen</surname>
<given-names>D. K. J.</given-names>
</name>
<name>
<surname>Taylor</surname>
<given-names>P. J.</given-names>
</name>
</person-group>
(
<year>2014</year>
).
<article-title>AMAB: Automated measurement and analysis of body motion.</article-title>
<source>
<italic>Behav. Res. Methods</italic>
</source>
<volume>46</volume>
<fpage>625</fpage>
<lpage>633</lpage>
.
<pub-id pub-id-type="doi">10.3758/s13428-013-0398-y</pub-id>
<pub-id pub-id-type="pmid">24142835</pub-id>
</mixed-citation>
</ref>
<ref id="B34">
<mixed-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Ramseyer</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Tschacher</surname>
<given-names>W.</given-names>
</name>
</person-group>
(
<year>2010</year>
). “
<article-title>Nonverbal synchrony or random coincidence? How to tell the difference</article-title>
,” in
<source>
<italic>Development of Multimodal Interfaces: Active Listening and Synchrony</italic>
</source>
,
<role>eds</role>
<person-group person-group-type="editor">
<name>
<surname>Esposito</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Campbell</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Vogel</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Hussain</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Nijholt</surname>
<given-names>A.</given-names>
</name>
</person-group>
(
<publisher-loc>Berlin</publisher-loc>
:
<publisher-name>Springer</publisher-name>
),
<fpage>182</fpage>
<lpage>196</lpage>
.</mixed-citation>
</ref>
<ref id="B35">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Richardson</surname>
<given-names>D. C.</given-names>
</name>
<name>
<surname>Dale</surname>
<given-names>R.</given-names>
</name>
</person-group>
(
<year>2005</year>
).
<article-title>Looking to understand: the coupling between speakers’ and listeners’ eye movements and its relationship to discourse comprehension.</article-title>
<source>
<italic>Cogn. Sci.</italic>
</source>
<volume>29</volume>
<fpage>1046</fpage>
<lpage>1060</lpage>
.
<pub-id pub-id-type="doi">10.1207/s15516709cog0000_29</pub-id>
</mixed-citation>
</ref>
<ref id="B36">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Richardson</surname>
<given-names>M. J.</given-names>
</name>
<name>
<surname>Marsh</surname>
<given-names>K. L.</given-names>
</name>
<name>
<surname>Isenhower</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Goodman</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Schmidt</surname>
<given-names>R. C.</given-names>
</name>
</person-group>
(
<year>2007</year>
).
<article-title>Rocking together: dynamics of intentional and unintentional interpersonal coordination.</article-title>
<source>
<italic>Hum. Mov. Sci.</italic>
</source>
<volume>26</volume>
<fpage>867</fpage>
<lpage>891</lpage>
.
<pub-id pub-id-type="doi">10.1016/j.humov.2007.07.002</pub-id>
<pub-id pub-id-type="pmid">17765345</pub-id>
</mixed-citation>
</ref>
<ref id="B37">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Scheflen</surname>
<given-names>A. E.</given-names>
</name>
</person-group>
(
<year>1964</year>
).
<article-title>The significance of posture in communication systems.</article-title>
<source>
<italic>Psychiatry</italic>
</source>
<volume>27</volume>
<fpage>316</fpage>
<lpage>331</lpage>
.
<pub-id pub-id-type="doi">10.1521/00332747.1964.11023403</pub-id>
<pub-id pub-id-type="pmid">14216879</pub-id>
</mixed-citation>
</ref>
<ref id="B38">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Schmidt</surname>
<given-names>R. C.</given-names>
</name>
<name>
<surname>Morr</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Fitzpatrick</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Richardson</surname>
<given-names>M. J.</given-names>
</name>
</person-group>
(
<year>2012</year>
).
<article-title>Measuring the dynamics of interactional synchrony.</article-title>
<source>
<italic>J. Nonverbal Behav.</italic>
</source>
<volume>36</volume>
<fpage>263</fpage>
<lpage>279</lpage>
.
<pub-id pub-id-type="doi">10.1007/s10919-012-0138-5</pub-id>
</mixed-citation>
</ref>
<ref id="B39">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Schmidt</surname>
<given-names>R. C.</given-names>
</name>
<name>
<surname>Nie</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Franco</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Richardson</surname>
<given-names>M. J.</given-names>
</name>
</person-group>
(
<year>2014</year>
).
<article-title>Bodily synchronization underlying joke telling.</article-title>
<source>
<italic>Front. Hum. Neurosci.</italic>
</source>
<volume>8</volume>
:
<issue>633</issue>
<pub-id pub-id-type="doi">10.3389/fnhum.2014.00633</pub-id>
</mixed-citation>
</ref>
<ref id="B40">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Schmidt</surname>
<given-names>R. C.</given-names>
</name>
<name>
<surname>O’Brien</surname>
<given-names>B.</given-names>
</name>
</person-group>
(
<year>1997</year>
).
<article-title>Evaluating the dynamics of unintended interpersonal coordination.</article-title>
<source>
<italic>Ecol. Psychol.</italic>
</source>
<volume>9</volume>
<fpage>189</fpage>
<lpage>206</lpage>
.
<pub-id pub-id-type="doi">10.1207/s15326969eco0903_2</pub-id>
</mixed-citation>
</ref>
<ref id="B41">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Shockley</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Santana</surname>
<given-names>M. V.</given-names>
</name>
<name>
<surname>Fowler</surname>
<given-names>C. A.</given-names>
</name>
</person-group>
(
<year>2003</year>
).
<article-title>Mutual interpersonal postural constraints are involved in cooperative conversation.</article-title>
<source>
<italic>J. Exp. Psychol. Human</italic>
</source>
<volume>29</volume>
<fpage>326</fpage>
<lpage>332</lpage>
.
<pub-id pub-id-type="doi">10.1037/0096-1523.33.1.201</pub-id>
</mixed-citation>
</ref>
<ref id="B42">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sofianidis</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Hatzitaki</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Grouios</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Johannsen</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Wing</surname>
<given-names>A.</given-names>
</name>
</person-group>
(
<year>2012</year>
).
<article-title>Somatosensory driven interpersonal synchrony during rhythmic sway.</article-title>
<source>
<italic>Hum. Movement Sci.</italic>
</source>
<volume>31</volume>
<fpage>553</fpage>
<lpage>566</lpage>
.
<pub-id pub-id-type="doi">10.1016/j.humov.2011.07.007</pub-id>
</mixed-citation>
</ref>
<ref id="B43">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Stel</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Harinck</surname>
<given-names>F.</given-names>
</name>
</person-group>
(
<year>2011</year>
).
<article-title>Being mimicked makes you a prosocial voter.</article-title>
<source>
<italic>Exp. Psychol.</italic>
</source>
<volume>58</volume>
<fpage>79</fpage>
<lpage>84</lpage>
.
<pub-id pub-id-type="doi">10.1027/1618-3169/a000070</pub-id>
<pub-id pub-id-type="pmid">20494865</pub-id>
</mixed-citation>
</ref>
<ref id="B44">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Street</surname>
<given-names>R. L.</given-names>
</name>
</person-group>
(
<year>1984</year>
).
<article-title>Speech convergence and speech evaluation in fact-finding interview.</article-title>
<source>
<italic>Hum. Commun. Res.</italic>
</source>
<volume>11</volume>
<fpage>139</fpage>
<lpage>169</lpage>
.
<pub-id pub-id-type="doi">10.1111/j.1468-2958.1984.tb00043.x</pub-id>
</mixed-citation>
</ref>
<ref id="B45">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Tickle-Degnen</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Rosenthal</surname>
<given-names>R.</given-names>
</name>
</person-group>
(
<year>1990</year>
).
<article-title>The nature of rapport and its nonverbal correlates.</article-title>
<source>
<italic>Psychol. Inq.</italic>
</source>
<volume>1</volume>
<fpage>285</fpage>
<lpage>293</lpage>
<pub-id pub-id-type="doi">10.1207/s15327965pli0104-1</pub-id>
</mixed-citation>
</ref>
<ref id="B46">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Tschacher</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Rees</surname>
<given-names>G. M.</given-names>
</name>
<name>
<surname>Ramseyer</surname>
<given-names>F.</given-names>
</name>
</person-group>
(
<year>2014</year>
).
<article-title>Nonverbal synchrony and affect in dyadic interactions.</article-title>
<source>
<italic>Front. Psychol.</italic>
</source>
<volume>5</volume>
:
<issue>1323</issue>
<pub-id pub-id-type="doi">10.3389/fpsyg.2014.01323</pub-id>
</mixed-citation>
</ref>
<ref id="B47">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Varlet</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Marin</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Lagarde</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Bardy</surname>
<given-names>B. G.</given-names>
</name>
</person-group>
(
<year>2011</year>
).
<article-title>Social postural coordination.</article-title>
<source>
<italic>J. Exp. Psychol. Hum.</italic>
</source>
<volume>37</volume>
<fpage>473</fpage>
<lpage>483</lpage>
.
<pub-id pub-id-type="doi">10.1037/a0020552</pub-id>
</mixed-citation>
</ref>
<ref id="B48">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Walton</surname>
<given-names>A. E.</given-names>
</name>
<name>
<surname>Richardson</surname>
<given-names>M. J.</given-names>
</name>
<name>
<surname>Langland-Hassan</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Chemero</surname>
<given-names>A.</given-names>
</name>
</person-group>
(
<year>2015</year>
).
<article-title>Improvisation and the self-organization of multiple musical bodies.</article-title>
<source>
<italic>Front. Psychol.</italic>
</source>
<volume>6</volume>
:
<issue>313</issue>
<pub-id pub-id-type="doi">10.3389/fpsyg.2015.00313</pub-id>
</mixed-citation>
</ref>
<ref id="B49">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Washburn</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>DeMarco</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>de Vries</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Ariyabuddhiphongs</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Schmidt</surname>
<given-names>R. C.</given-names>
</name>
<name>
<surname>Richardson</surname>
<given-names>M. J.</given-names>
</name>
<etal></etal>
</person-group>
(
<year>2014</year>
).
<article-title>Dancers entrain more effectively than non-dancers to another actor’s movements.</article-title>
<source>
<italic>Front. Hum. Neurosci.</italic>
</source>
<volume>8</volume>
:
<issue>800</issue>
<pub-id pub-id-type="doi">10.3389/fnhum.2014.00800</pub-id>
</mixed-citation>
</ref>
<ref id="B50">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wiltermuth</surname>
<given-names>S. S.</given-names>
</name>
<name>
<surname>Heath</surname>
<given-names>C.</given-names>
</name>
</person-group>
(
<year>2009</year>
).
<article-title>Synchrony and cooperation.</article-title>
<source>
<italic>Psychol. Sci.</italic>
</source>
<volume>20</volume>
<fpage>1</fpage>
<lpage>5</lpage>
.
<pub-id pub-id-type="doi">10.1111/j.1467-9280.2008.02253.x</pub-id>
<pub-id pub-id-type="pmid">19152536</pub-id>
</mixed-citation>
</ref>
<ref id="B51">
<mixed-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Won</surname>
<given-names>A. S.</given-names>
</name>
<name>
<surname>Bailenson</surname>
<given-names>J. N.</given-names>
</name>
<name>
<surname>Stathatos</surname>
<given-names>S. C.</given-names>
</name>
<name>
<surname>Dai</surname>
<given-names>W.</given-names>
</name>
</person-group>
(
<year>2014</year>
).
<article-title>Automatically detected nonverbal behavior predicts creativity in collaborating dyads.</article-title>
<source>
<italic>J. Nonverbal Behav.</italic>
</source>
<volume>38</volume>
<fpage>389</fpage>
<lpage>408</lpage>
.
<pub-id pub-id-type="doi">10.1007/s10919-014-0186-0</pub-id>
</mixed-citation>
</ref>
</ref-list>
</back>
</pmc>
<affiliations>
<list>
<country>
<li>Japon</li>
</country>
</list>
<tree>
<country name="Japon">
<noRegion>
<name sortKey="Fujiwara, Ken" sort="Fujiwara, Ken" uniqKey="Fujiwara K" first="Ken" last="Fujiwara">Ken Fujiwara</name>
</noRegion>
<name sortKey="Daibo, Ikuo" sort="Daibo, Ikuo" uniqKey="Daibo I" first="Ikuo" last="Daibo">Ikuo Daibo</name>
</country>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Ncbi/Merge
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 004215 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd -nk 004215 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Ncbi
   |étape=   Merge
   |type=    RBID
   |clé=     PMC:4828427
   |texte=   Evaluating Interpersonal Synchrony: Wavelet Transform Toward an Unstructured Conversation
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/RBID.i   -Sk "pubmed:27148125" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024