Exploration server on haptic devices


Real-time modulation of visual feedback on human full-body movements in a virtual mirror: development and proof-of-concept

Internal identifier: 000393 (Pmc/Checkpoint); previous: 000392; next: 000394


Authors: Meyke Roosink; Nicolas Robitaille; Bradford J. McFadyen; Luc J. Hébert; Philip L. Jackson; Laurent J. Bouyer; Catherine Mercier

Source:

RBID: PMC:4326499

Abstract

Background

Virtual reality (VR) provides interactive multimodal sensory stimuli and biofeedback, and can be a powerful tool for physical and cognitive rehabilitation. However, existing systems have generally not implemented realistic full-body avatars and/or a scaling of visual movement feedback. We developed a “virtual mirror” that displays a realistic full-body avatar that responds to full-body movements in all movement planes in real-time, and that allows for the scaling of visual feedback on movements in real-time. The primary objective of this proof-of-concept study was to assess the ability of healthy subjects to detect scaled feedback on trunk flexion movements.

Methods

The “virtual mirror” was developed by integrating motion capture, virtual reality and projection systems. A protocol was developed to provide both augmented and reduced feedback on trunk flexion movements while sitting and standing. The task required reliance on both visual and proprioceptive feedback. The ability to detect scaled feedback was assessed in healthy subjects (n = 10) using a two-alternative forced choice paradigm. Additionally, immersion in the VR environment and task adherence (flexion angles, velocity, and fluency) were assessed.

Results

The ability to detect scaled feedback could be modelled using a sigmoid curve with a high goodness of fit (R² range 89-98%). The point of subjective equivalence was not significantly different from 0 (i.e. not shifted), indicating an unbiased perception. The just noticeable difference was 0.035 ± 0.007, indicating that subjects were able to discriminate different scaling levels consistently. VR immersion was reported to be good, despite some perceived delays between movements and VR projections. Movement kinematic analysis confirmed task adherence.

Conclusions

The new “virtual mirror” extends existing VR systems for motor and pain rehabilitation by enabling the use of realistic full-body avatars and scaled feedback. Proof-of-concept was demonstrated for the assessment of body perception during active movement in healthy controls. The next step will be to apply this system to assessment of body perception disturbances in patients with chronic pain.


URL:
DOI: 10.1186/1743-0003-12-2
PubMed: 25558785
PubMed Central: 4326499


Affiliations:



The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Real-time modulation of visual feedback on human full-body movements in a virtual mirror: development and proof-of-concept</title>
<author>
<name sortKey="Roosink, Meyke" sort="Roosink, Meyke" uniqKey="Roosink M" first="Meyke" last="Roosink">Meyke Roosink</name>
<affiliation>
<nlm:aff id="Aff1">Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (CIRRIS), 525 Boul Hamel, Québec, QC G1M 2S8 Canada</nlm:aff>
<wicri:noCountry code="subfield">QC G1M 2S8 Canada</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Robitaille, Nicolas" sort="Robitaille, Nicolas" uniqKey="Robitaille N" first="Nicolas" last="Robitaille">Nicolas Robitaille</name>
<affiliation>
<nlm:aff id="Aff1">Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (CIRRIS), 525 Boul Hamel, Québec, QC G1M 2S8 Canada</nlm:aff>
<wicri:noCountry code="subfield">QC G1M 2S8 Canada</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Mcfadyen, Bradford J" sort="Mcfadyen, Bradford J" uniqKey="Mcfadyen B" first="Bradford J" last="Mcfadyen">Bradford J. Mcfadyen</name>
<affiliation>
<nlm:aff id="Aff1">Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (CIRRIS), 525 Boul Hamel, Québec, QC G1M 2S8 Canada</nlm:aff>
<wicri:noCountry code="subfield">QC G1M 2S8 Canada</wicri:noCountry>
</affiliation>
<affiliation>
<nlm:aff id="Aff2">Department of Rehabilitation, Faculty of Medicine, Laval University, Québec, QC Canada</nlm:aff>
<wicri:noCountry code="subfield">QC Canada</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Hebert, Luc J" sort="Hebert, Luc J" uniqKey="Hebert L" first="Luc J" last="Hébert">Luc J. Hébert</name>
<affiliation>
<nlm:aff id="Aff1">Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (CIRRIS), 525 Boul Hamel, Québec, QC G1M 2S8 Canada</nlm:aff>
<wicri:noCountry code="subfield">QC G1M 2S8 Canada</wicri:noCountry>
</affiliation>
<affiliation>
<nlm:aff id="Aff2">Department of Rehabilitation, Faculty of Medicine, Laval University, Québec, QC Canada</nlm:aff>
<wicri:noCountry code="subfield">QC Canada</wicri:noCountry>
</affiliation>
<affiliation>
<nlm:aff id="Aff3">Canadian Forces Health Services Headquarters, Directorate of Medical Policy (Physiotherapy), Valcartier Garrison, Québec, QC Canada</nlm:aff>
<wicri:noCountry code="subfield">QC Canada</wicri:noCountry>
</affiliation>
<affiliation>
<nlm:aff id="Aff4">Department of Radiology, Faculty of Medicine, Laval University, Québec, QC Canada</nlm:aff>
<wicri:noCountry code="subfield">QC Canada</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Jackson, Philip L" sort="Jackson, Philip L" uniqKey="Jackson P" first="Philip L" last="Jackson">Philip L. Jackson</name>
<affiliation>
<nlm:aff id="Aff1">Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (CIRRIS), 525 Boul Hamel, Québec, QC G1M 2S8 Canada</nlm:aff>
<wicri:noCountry code="subfield">QC G1M 2S8 Canada</wicri:noCountry>
</affiliation>
<affiliation>
<nlm:aff id="Aff5">School of Psychology, Laval University, Québec, QC Canada</nlm:aff>
<wicri:noCountry code="subfield">QC Canada</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Bouyer, Laurent J" sort="Bouyer, Laurent J" uniqKey="Bouyer L" first="Laurent J" last="Bouyer">Laurent J. Bouyer</name>
<affiliation>
<nlm:aff id="Aff1">Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (CIRRIS), 525 Boul Hamel, Québec, QC G1M 2S8 Canada</nlm:aff>
<wicri:noCountry code="subfield">QC G1M 2S8 Canada</wicri:noCountry>
</affiliation>
<affiliation>
<nlm:aff id="Aff2">Department of Rehabilitation, Faculty of Medicine, Laval University, Québec, QC Canada</nlm:aff>
<wicri:noCountry code="subfield">QC Canada</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Mercier, Catherine" sort="Mercier, Catherine" uniqKey="Mercier C" first="Catherine" last="Mercier">Catherine Mercier</name>
<affiliation>
<nlm:aff id="Aff1">Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (CIRRIS), 525 Boul Hamel, Québec, QC G1M 2S8 Canada</nlm:aff>
<wicri:noCountry code="subfield">QC G1M 2S8 Canada</wicri:noCountry>
</affiliation>
<affiliation>
<nlm:aff id="Aff2">Department of Rehabilitation, Faculty of Medicine, Laval University, Québec, QC Canada</nlm:aff>
<wicri:noCountry code="subfield">QC Canada</wicri:noCountry>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">25558785</idno>
<idno type="pmc">4326499</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4326499</idno>
<idno type="RBID">PMC:4326499</idno>
<idno type="doi">10.1186/1743-0003-12-2</idno>
<date when="2015">2015</date>
<idno type="wicri:Area/Pmc/Corpus">000836</idno>
<idno type="wicri:Area/Pmc/Curation">000836</idno>
<idno type="wicri:Area/Pmc/Checkpoint">000393</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Real-time modulation of visual feedback on human full-body movements in a virtual mirror: development and proof-of-concept</title>
<author>
<name sortKey="Roosink, Meyke" sort="Roosink, Meyke" uniqKey="Roosink M" first="Meyke" last="Roosink">Meyke Roosink</name>
<affiliation>
<nlm:aff id="Aff1">Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (CIRRIS), 525 Boul Hamel, Québec, QC G1M 2S8 Canada</nlm:aff>
<wicri:noCountry code="subfield">QC G1M 2S8 Canada</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Robitaille, Nicolas" sort="Robitaille, Nicolas" uniqKey="Robitaille N" first="Nicolas" last="Robitaille">Nicolas Robitaille</name>
<affiliation>
<nlm:aff id="Aff1">Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (CIRRIS), 525 Boul Hamel, Québec, QC G1M 2S8 Canada</nlm:aff>
<wicri:noCountry code="subfield">QC G1M 2S8 Canada</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Mcfadyen, Bradford J" sort="Mcfadyen, Bradford J" uniqKey="Mcfadyen B" first="Bradford J" last="Mcfadyen">Bradford J. Mcfadyen</name>
<affiliation>
<nlm:aff id="Aff1">Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (CIRRIS), 525 Boul Hamel, Québec, QC G1M 2S8 Canada</nlm:aff>
<wicri:noCountry code="subfield">QC G1M 2S8 Canada</wicri:noCountry>
</affiliation>
<affiliation>
<nlm:aff id="Aff2">Department of Rehabilitation, Faculty of Medicine, Laval University, Québec, QC Canada</nlm:aff>
<wicri:noCountry code="subfield">QC Canada</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Hebert, Luc J" sort="Hebert, Luc J" uniqKey="Hebert L" first="Luc J" last="Hébert">Luc J. Hébert</name>
<affiliation>
<nlm:aff id="Aff1">Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (CIRRIS), 525 Boul Hamel, Québec, QC G1M 2S8 Canada</nlm:aff>
<wicri:noCountry code="subfield">QC G1M 2S8 Canada</wicri:noCountry>
</affiliation>
<affiliation>
<nlm:aff id="Aff2">Department of Rehabilitation, Faculty of Medicine, Laval University, Québec, QC Canada</nlm:aff>
<wicri:noCountry code="subfield">QC Canada</wicri:noCountry>
</affiliation>
<affiliation>
<nlm:aff id="Aff3">Canadian Forces Health Services Headquarters, Directorate of Medical Policy (Physiotherapy), Valcartier Garrison, Québec, QC Canada</nlm:aff>
<wicri:noCountry code="subfield">QC Canada</wicri:noCountry>
</affiliation>
<affiliation>
<nlm:aff id="Aff4">Department of Radiology, Faculty of Medicine, Laval University, Québec, QC Canada</nlm:aff>
<wicri:noCountry code="subfield">QC Canada</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Jackson, Philip L" sort="Jackson, Philip L" uniqKey="Jackson P" first="Philip L" last="Jackson">Philip L. Jackson</name>
<affiliation>
<nlm:aff id="Aff1">Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (CIRRIS), 525 Boul Hamel, Québec, QC G1M 2S8 Canada</nlm:aff>
<wicri:noCountry code="subfield">QC G1M 2S8 Canada</wicri:noCountry>
</affiliation>
<affiliation>
<nlm:aff id="Aff5">School of Psychology, Laval University, Québec, QC Canada</nlm:aff>
<wicri:noCountry code="subfield">QC Canada</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Bouyer, Laurent J" sort="Bouyer, Laurent J" uniqKey="Bouyer L" first="Laurent J" last="Bouyer">Laurent J. Bouyer</name>
<affiliation>
<nlm:aff id="Aff1">Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (CIRRIS), 525 Boul Hamel, Québec, QC G1M 2S8 Canada</nlm:aff>
<wicri:noCountry code="subfield">QC G1M 2S8 Canada</wicri:noCountry>
</affiliation>
<affiliation>
<nlm:aff id="Aff2">Department of Rehabilitation, Faculty of Medicine, Laval University, Québec, QC Canada</nlm:aff>
<wicri:noCountry code="subfield">QC Canada</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Mercier, Catherine" sort="Mercier, Catherine" uniqKey="Mercier C" first="Catherine" last="Mercier">Catherine Mercier</name>
<affiliation>
<nlm:aff id="Aff1">Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (CIRRIS), 525 Boul Hamel, Québec, QC G1M 2S8 Canada</nlm:aff>
<wicri:noCountry code="subfield">QC G1M 2S8 Canada</wicri:noCountry>
</affiliation>
<affiliation>
<nlm:aff id="Aff2">Department of Rehabilitation, Faculty of Medicine, Laval University, Québec, QC Canada</nlm:aff>
<wicri:noCountry code="subfield">QC Canada</wicri:noCountry>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Journal of NeuroEngineering and Rehabilitation</title>
<idno type="eISSN">1743-0003</idno>
<imprint>
<date when="2015">2015</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<sec>
<title>Background</title>
<p>Virtual reality (VR) provides interactive multimodal sensory stimuli and biofeedback, and can be a powerful tool for physical and cognitive rehabilitation. However, existing systems have generally not implemented realistic full-body avatars and/or a scaling of visual movement feedback. We developed a “virtual mirror” that displays a realistic full-body avatar that responds to full-body movements in all movement planes in real-time, and that allows for the scaling of visual feedback on movements in real-time. The primary objective of this proof-of-concept study was to assess the ability of healthy subjects to detect scaled feedback on trunk flexion movements.</p>
</sec>
<sec>
<title>Methods</title>
<p>The “virtual mirror” was developed by integrating motion capture, virtual reality and projection systems. A protocol was developed to provide both augmented and reduced feedback on trunk flexion movements while sitting and standing. The task required reliance on both visual and proprioceptive feedback. The ability to detect scaled feedback was assessed in healthy subjects (n = 10) using a two-alternative forced choice paradigm. Additionally, immersion in the VR environment and task adherence (flexion angles, velocity, and fluency) were assessed.</p>
</sec>
<sec>
<title>Results</title>
<p>The ability to detect scaled feedback could be modelled using a sigmoid curve with a high goodness of fit (R
<sup>2</sup>
range 89-98%). The point of subjective equivalence was not significantly different from 0 (i.e. not shifted), indicating an unbiased perception. The just noticeable difference was 0.035 ± 0.007, indicating that subjects were able to discriminate different scaling levels consistently. VR immersion was reported to be good, despite some perceived delays between movements and VR projections. Movement kinematic analysis confirmed task adherence.</p>
</sec>
<sec>
<title>Conclusions</title>
<p>The new “virtual mirror” extends existing VR systems for motor and pain rehabilitation by enabling the use of realistic full-body avatars and scaled feedback. Proof-of-concept was demonstrated for the assessment of body perception during active movement in healthy controls. The next step will be to apply this system to assessment of body perception disturbances in patients with chronic pain.</p>
</sec>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Sumitani, M" uniqKey="Sumitani M">M Sumitani</name>
</author>
<author>
<name sortKey="Miyauchi, S" uniqKey="Miyauchi S">S Miyauchi</name>
</author>
<author>
<name sortKey="Mccabe, Cs" uniqKey="Mccabe C">CS McCabe</name>
</author>
<author>
<name sortKey="Shibata, M" uniqKey="Shibata M">M Shibata</name>
</author>
<author>
<name sortKey="Maeda, L" uniqKey="Maeda L">L Maeda</name>
</author>
<author>
<name sortKey="Saitoh, Y" uniqKey="Saitoh Y">Y Saitoh</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jeannerod, M" uniqKey="Jeannerod M">M Jeannerod</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kizony, R" uniqKey="Kizony R">R Kizony</name>
</author>
<author>
<name sortKey="Katz, N" uniqKey="Katz N">N Katz</name>
</author>
<author>
<name sortKey="Weiss, Pl" uniqKey="Weiss P">PL Weiss</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kizony, R" uniqKey="Kizony R">R Kizony</name>
</author>
<author>
<name sortKey="Raz, L" uniqKey="Raz L">L Raz</name>
</author>
<author>
<name sortKey="Katz, N" uniqKey="Katz N">N Katz</name>
</author>
<author>
<name sortKey="Weingarden, H" uniqKey="Weingarden H">H Weingarden</name>
</author>
<author>
<name sortKey="Tamar Weiss, Pl" uniqKey="Tamar Weiss P">PL Tamar Weiss</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Zimmerli, L" uniqKey="Zimmerli L">L Zimmerli</name>
</author>
<author>
<name sortKey="Jacky, M" uniqKey="Jacky M">M Jacky</name>
</author>
<author>
<name sortKey="Lunenburger, L" uniqKey="Lunenburger L">L Lünenburger</name>
</author>
<author>
<name sortKey="Riener, R" uniqKey="Riener R">R Riener</name>
</author>
<author>
<name sortKey="Bolliger, M" uniqKey="Bolliger M">M Bolliger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Malloy, Km" uniqKey="Malloy K">KM Malloy</name>
</author>
<author>
<name sortKey="Milling, Ls" uniqKey="Milling L">LS Milling</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fung, J" uniqKey="Fung J">J Fung</name>
</author>
<author>
<name sortKey="Richards, Cl" uniqKey="Richards C">CL Richards</name>
</author>
<author>
<name sortKey="Malouin, F" uniqKey="Malouin F">F Malouin</name>
</author>
<author>
<name sortKey="Mcfadyen, Bj" uniqKey="Mcfadyen B">BJ McFadyen</name>
</author>
<author>
<name sortKey="Lamontagne, A" uniqKey="Lamontagne A">A Lamontagne</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Goble, Dj" uniqKey="Goble D">DJ Goble</name>
</author>
<author>
<name sortKey="Cone, Bl" uniqKey="Cone B">BL Cone</name>
</author>
<author>
<name sortKey="Fling, Bw" uniqKey="Fling B">BW Fling</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hoffman, Hg" uniqKey="Hoffman H">HG Hoffman</name>
</author>
<author>
<name sortKey="Patterson, Dr" uniqKey="Patterson D">DR Patterson</name>
</author>
<author>
<name sortKey="Carrougher, Gj" uniqKey="Carrougher G">GJ Carrougher</name>
</author>
<author>
<name sortKey="Sharar, Sr" uniqKey="Sharar S">SR Sharar</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cole, J" uniqKey="Cole J">J Cole</name>
</author>
<author>
<name sortKey="Crowle, S" uniqKey="Crowle S">S Crowle</name>
</author>
<author>
<name sortKey="Austwick, G" uniqKey="Austwick G">G Austwick</name>
</author>
<author>
<name sortKey="Henderson Slater, D" uniqKey="Henderson Slater D">D Henderson Slater</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Murray, Cd" uniqKey="Murray C">CD Murray</name>
</author>
<author>
<name sortKey="Pettifer, S" uniqKey="Pettifer S">S Pettifer</name>
</author>
<author>
<name sortKey="Howard, T" uniqKey="Howard T">T Howard</name>
</author>
<author>
<name sortKey="Patchick, El" uniqKey="Patchick E">EL Patchick</name>
</author>
<author>
<name sortKey="Caillette, F" uniqKey="Caillette F">F Caillette</name>
</author>
<author>
<name sortKey="Kulkarni, J" uniqKey="Kulkarni J">J Kulkarni</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Resnik, L" uniqKey="Resnik L">L Resnik</name>
</author>
<author>
<name sortKey="Etter, K" uniqKey="Etter K">K Etter</name>
</author>
<author>
<name sortKey="Klinger, Sl" uniqKey="Klinger S">SL Klinger</name>
</author>
<author>
<name sortKey="Kambe, C" uniqKey="Kambe C">C Kambe</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Koritnik, T" uniqKey="Koritnik T">T Koritnik</name>
</author>
<author>
<name sortKey="Koenig, A" uniqKey="Koenig A">A Koenig</name>
</author>
<author>
<name sortKey="Bajd, T" uniqKey="Bajd T">T Bajd</name>
</author>
<author>
<name sortKey="Riener, R" uniqKey="Riener R">R Riener</name>
</author>
<author>
<name sortKey="Munih, M" uniqKey="Munih M">M Munih</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Barton, Gj" uniqKey="Barton G">GJ Barton</name>
</author>
<author>
<name sortKey="De Asha, Ar" uniqKey="De Asha A">AR De Asha</name>
</author>
<author>
<name sortKey="Van Loon, Ec" uniqKey="Van Loon E">EC van Loon</name>
</author>
<author>
<name sortKey="Geijtenbeek, T" uniqKey="Geijtenbeek T">T Geijtenbeek</name>
</author>
<author>
<name sortKey="Robinson, Ma" uniqKey="Robinson M">MA Robinson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kim, Jh" uniqKey="Kim J">JH Kim</name>
</author>
<author>
<name sortKey="Jang, Sh" uniqKey="Jang S">SH Jang</name>
</author>
<author>
<name sortKey="Kim, Cs" uniqKey="Kim C">CS Kim</name>
</author>
<author>
<name sortKey="Jung, Jh" uniqKey="Jung J">JH Jung</name>
</author>
<author>
<name sortKey="You, Jh" uniqKey="You J">JH You</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pompeu, Je" uniqKey="Pompeu J">JE Pompeu</name>
</author>
<author>
<name sortKey="Arduini, La" uniqKey="Arduini L">LA Arduini</name>
</author>
<author>
<name sortKey="Botelho, Ar" uniqKey="Botelho A">AR Botelho</name>
</author>
<author>
<name sortKey="Fonseca, Mbf" uniqKey="Fonseca M">MBF Fonseca</name>
</author>
<author>
<name sortKey="Pompeu, Smaa" uniqKey="Pompeu S">SMAA Pompeu</name>
</author>
<author>
<name sortKey="Torriani Pasin, C" uniqKey="Torriani Pasin C">C Torriani-Pasin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sarig Bahat, H" uniqKey="Sarig Bahat H">H Sarig-Bahat</name>
</author>
<author>
<name sortKey="Weiss, Pl" uniqKey="Weiss P">PL Weiss</name>
</author>
<author>
<name sortKey="Laufer, Y" uniqKey="Laufer Y">Y Laufer</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kim, S J" uniqKey="Kim S">S-J Kim</name>
</author>
<author>
<name sortKey="Mugisha, D" uniqKey="Mugisha D">D Mugisha</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lin, Q" uniqKey="Lin Q">Q Lin</name>
</author>
<author>
<name sortKey="Rieser, J" uniqKey="Rieser J">J Rieser</name>
</author>
<author>
<name sortKey="Bodenheimer, B" uniqKey="Bodenheimer B">B Bodenheimer</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bernardi, Nf" uniqKey="Bernardi N">NF Bernardi</name>
</author>
<author>
<name sortKey="Marino, Bf" uniqKey="Marino B">BF Marino</name>
</author>
<author>
<name sortKey="Maravita, A" uniqKey="Maravita A">A Maravita</name>
</author>
<author>
<name sortKey="Castelnuovo, G" uniqKey="Castelnuovo G">G Castelnuovo</name>
</author>
<author>
<name sortKey="Tebano, R" uniqKey="Tebano R">R Tebano</name>
</author>
<author>
<name sortKey="Bricolo, E" uniqKey="Bricolo E">E Bricolo</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ramachandran, Vs" uniqKey="Ramachandran V">VS Ramachandran</name>
</author>
<author>
<name sortKey="Brang, D" uniqKey="Brang D">D Brang</name>
</author>
<author>
<name sortKey="Mcgeoch, Pd" uniqKey="Mcgeoch P">PD McGeoch</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Osumi, M" uniqKey="Osumi M">M Osumi</name>
</author>
<author>
<name sortKey="Imai, R" uniqKey="Imai R">R Imai</name>
</author>
<author>
<name sortKey="Ueta, K" uniqKey="Ueta K">K Ueta</name>
</author>
<author>
<name sortKey="Nakano, H" uniqKey="Nakano H">H Nakano</name>
</author>
<author>
<name sortKey="Nobusako, S" uniqKey="Nobusako S">S Nobusako</name>
</author>
<author>
<name sortKey="Morioka, S" uniqKey="Morioka S">S Morioka</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Moseley, Gl" uniqKey="Moseley G">GL Moseley</name>
</author>
<author>
<name sortKey="Gallace, A" uniqKey="Gallace A">A Gallace</name>
</author>
<author>
<name sortKey="Spence, C" uniqKey="Spence C">C Spence</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Dunn, F" uniqKey="Dunn F">F Dunn</name>
</author>
<author>
<name sortKey="Perberry, I" uniqKey="Perberry I">I Perberry</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Witmer, Bg" uniqKey="Witmer B">BG Witmer</name>
</author>
<author>
<name sortKey="Singer, Mj" uniqKey="Singer M">MJ Singer</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="De Lussanet, Mhe" uniqKey="De Lussanet M">MHE de Lussanet</name>
</author>
<author>
<name sortKey="Behrendt, F" uniqKey="Behrendt F">F Behrendt</name>
</author>
<author>
<name sortKey="Puta, C" uniqKey="Puta C">C Puta</name>
</author>
<author>
<name sortKey="Schulte, Tl" uniqKey="Schulte T">TL Schulte</name>
</author>
<author>
<name sortKey="Lappe, M" uniqKey="Lappe M">M Lappe</name>
</author>
<author>
<name sortKey="Weiss, T" uniqKey="Weiss T">T Weiss</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mann, L" uniqKey="Mann L">L Mann</name>
</author>
<author>
<name sortKey="Kleinpaul, Jf" uniqKey="Kleinpaul J">JF Kleinpaul</name>
</author>
<author>
<name sortKey="Pereira Moro, Ar" uniqKey="Pereira Moro A">AR Pereira Moro</name>
</author>
<author>
<name sortKey="Mota, Cb" uniqKey="Mota C">CB Mota</name>
</author>
<author>
<name sortKey="Carpes, Fp" uniqKey="Carpes F">FP Carpes</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wand, Bm" uniqKey="Wand B">BM Wand</name>
</author>
<author>
<name sortKey="Keeves, J" uniqKey="Keeves J">J Keeves</name>
</author>
<author>
<name sortKey="Bourgoin, C" uniqKey="Bourgoin C">C Bourgoin</name>
</author>
<author>
<name sortKey="George, Pj" uniqKey="George P">PJ George</name>
</author>
<author>
<name sortKey="Smith, Aj" uniqKey="Smith A">AJ Smith</name>
</author>
<author>
<name sortKey="O Onnell, Ne" uniqKey="O Onnell N">NE O’Connell</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Moseley, Gl" uniqKey="Moseley G">GL Moseley</name>
</author>
<author>
<name sortKey="Flor, H" uniqKey="Flor H">H Flor</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mccabe, Cs" uniqKey="Mccabe C">CS McCabe</name>
</author>
<author>
<name sortKey="Haigh, Rc" uniqKey="Haigh R">RC Haigh</name>
</author>
<author>
<name sortKey="Blake, Dr" uniqKey="Blake D">DR Blake</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mccabe, Cs" uniqKey="Mccabe C">CS McCabe</name>
</author>
<author>
<name sortKey="Cohen, H" uniqKey="Cohen H">H Cohen</name>
</author>
<author>
<name sortKey="Hall, J" uniqKey="Hall J">J Hall</name>
</author>
<author>
<name sortKey="Lewis, J" uniqKey="Lewis J">J Lewis</name>
</author>
<author>
<name sortKey="Rodham, K" uniqKey="Rodham K">K Rodham</name>
</author>
<author>
<name sortKey="Harris, N" uniqKey="Harris N">N Harris</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Opris, D" uniqKey="Opris D">D Opris</name>
</author>
<author>
<name sortKey="Pintea, S" uniqKey="Pintea S">S Pintea</name>
</author>
<author>
<name sortKey="Garcia Palacios, A" uniqKey="Garcia Palacios A">A Garcia-Palacios</name>
</author>
<author>
<name sortKey="Botella, C" uniqKey="Botella C">C Botella</name>
</author>
<author>
<name sortKey="Szamoskozi, S" uniqKey="Szamoskozi S">S Szamoskozi</name>
</author>
<author>
<name sortKey="David, D" uniqKey="David D">D David</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Coelho, Cm" uniqKey="Coelho C">CM Coelho</name>
</author>
<author>
<name sortKey="Santos, Ja" uniqKey="Santos J">JA Santos</name>
</author>
<author>
<name sortKey="Silva, C" uniqKey="Silva C">C Silva</name>
</author>
<author>
<name sortKey="Wallis, G" uniqKey="Wallis G">G Wallis</name>
</author>
<author>
<name sortKey="Tichon, J" uniqKey="Tichon J">J Tichon</name>
</author>
<author>
<name sortKey="Hine, Tj" uniqKey="Hine T">TJ Hine</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Vlaeyen, Jw" uniqKey="Vlaeyen J">JW Vlaeyen</name>
</author>
<author>
<name sortKey="De Jong, J" uniqKey="De Jong J">J de Jong</name>
</author>
<author>
<name sortKey="Geilen, M" uniqKey="Geilen M">M Geilen</name>
</author>
<author>
<name sortKey="Heuts, Ph" uniqKey="Heuts P">PH Heuts</name>
</author>
<author>
<name sortKey="Van Breukelen, G" uniqKey="Van Breukelen G">G van Breukelen</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">J Neuroeng Rehabil</journal-id>
<journal-id journal-id-type="iso-abbrev">J Neuroeng Rehabil</journal-id>
<journal-title-group>
<journal-title>Journal of NeuroEngineering and Rehabilitation</journal-title>
</journal-title-group>
<issn pub-type="epub">1743-0003</issn>
<publisher>
<publisher-name>BioMed Central</publisher-name>
<publisher-loc>London</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">25558785</article-id>
<article-id pub-id-type="pmc">4326499</article-id>
<article-id pub-id-type="publisher-id">692</article-id>
<article-id pub-id-type="doi">10.1186/1743-0003-12-2</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Methodology</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Real-time modulation of visual feedback on human full-body movements in a virtual mirror: development and proof-of-concept</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Roosink</surname>
<given-names>Meyke</given-names>
</name>
<address>
<email>meyke.roosink@gmail.com</email>
</address>
<xref ref-type="aff" rid="Aff1"></xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Robitaille</surname>
<given-names>Nicolas</given-names>
</name>
<address>
<email>nicolas.robitaille@cirris.ulaval.ca</email>
</address>
<xref ref-type="aff" rid="Aff1"></xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>McFadyen</surname>
<given-names>Bradford J</given-names>
</name>
<address>
<email>brad.mcfadyen@rea.ulaval.ca</email>
</address>
<xref ref-type="aff" rid="Aff1"></xref>
<xref ref-type="aff" rid="Aff2"></xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Hébert</surname>
<given-names>Luc J</given-names>
</name>
<address>
<email>luc.hebert@forces.gc.ca</email>
</address>
<xref ref-type="aff" rid="Aff1"></xref>
<xref ref-type="aff" rid="Aff2"></xref>
<xref ref-type="aff" rid="Aff3"></xref>
<xref ref-type="aff" rid="Aff4"></xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Jackson</surname>
<given-names>Philip L</given-names>
</name>
<address>
<email>philip.jackson@psy.ulaval.ca</email>
</address>
<xref ref-type="aff" rid="Aff1"></xref>
<xref ref-type="aff" rid="Aff5"></xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Bouyer</surname>
<given-names>Laurent J</given-names>
</name>
<address>
<email>laurent.bouyer@rea.ulaval.ca</email>
</address>
<xref ref-type="aff" rid="Aff1"></xref>
<xref ref-type="aff" rid="Aff2"></xref>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Mercier</surname>
<given-names>Catherine</given-names>
</name>
<address>
<email>catherine.mercier@rea.ulaval.ca</email>
</address>
<xref ref-type="aff" rid="Aff1"></xref>
<xref ref-type="aff" rid="Aff2"></xref>
</contrib>
<aff id="Aff1">
<label></label>
Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale (CIRRIS), 525 Boul Hamel, Québec, QC G1M 2S8 Canada</aff>
<aff id="Aff2">
<label></label>
Department of Rehabilitation, Faculty of Medicine, Laval University, Québec, QC Canada</aff>
<aff id="Aff3">
<label></label>
Canadian Forces Health Services Headquarters, Directorate of Medical Policy (Physiotherapy), Valcartier Garrison, Québec, QC Canada</aff>
<aff id="Aff4">
<label></label>
Department of Radiology, Faculty of Medicine, Laval University, Québec, QC Canada</aff>
<aff id="Aff5">
<label></label>
School of Psychology, Laval University, Québec, QC Canada</aff>
</contrib-group>
<pub-date pub-type="epub">
<day>5</day>
<month>1</month>
<year>2015</year>
</pub-date>
<pub-date pub-type="collection">
<year>2015</year>
</pub-date>
<volume>12</volume>
<issue>1</issue>
<elocation-id>2</elocation-id>
<history>
<date date-type="received">
<day>3</day>
<month>7</month>
<year>2014</year>
</date>
<date date-type="accepted">
<day>22</day>
<month>12</month>
<year>2014</year>
</date>
</history>
<permissions>
<copyright-statement>© Roosink et al.; licensee BioMed Central. 2015</copyright-statement>
<license license-type="open-access">
<license-p>This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by/4.0">http://creativecommons.org/licenses/by/4.0</ext-link>
), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. The Creative Commons Public Domain Dedication waiver (
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/publicdomain/zero/1.0/">http://creativecommons.org/publicdomain/zero/1.0/</ext-link>
) applies to the data made available in this article, unless otherwise stated.</license-p>
</license>
</permissions>
<abstract id="Abs1">
<sec>
<title>Background</title>
<p>Virtual reality (VR) provides interactive multimodal sensory stimuli and biofeedback, and can be a powerful tool for physical and cognitive rehabilitation. However, existing systems have generally not implemented realistic full-body avatars and/or a scaling of visual movement feedback. We developed a “virtual mirror” that displays a realistic full-body avatar that responds to full-body movements in all movement planes in real-time, and that allows for the scaling of visual feedback on movements in real-time. The primary objective of this proof-of-concept study was to assess the ability of healthy subjects to detect scaled feedback on trunk flexion movements.</p>
</sec>
<sec>
<title>Methods</title>
<p>The “virtual mirror” was developed by integrating motion capture, virtual reality and projection systems. A protocol was developed to provide both augmented and reduced feedback on trunk flexion movements while sitting and standing. The task required reliance on both visual and proprioceptive feedback. The ability to detect scaled feedback was assessed in healthy subjects (n = 10) using a two-alternative forced choice paradigm. Additionally, immersion in the VR environment and task adherence (flexion angles, velocity, and fluency) were assessed.</p>
</sec>
<sec>
<title>Results</title>
<p>The ability to detect scaled feedback could be modelled using a sigmoid curve with a high goodness of fit (R
<sup>2</sup>
range 89-98%). The point of subjective equivalence was not significantly different from 0 (i.e. not shifted), indicating an unbiased perception. The just noticeable difference was 0.035 ± 0.007, indicating that subjects were able to discriminate different scaling levels consistently. VR immersion was reported to be good, despite some perceived delays between movements and VR projections. Movement kinematic analysis confirmed task adherence.</p>
</sec>
<sec>
<title>Conclusions</title>
<p>The new “virtual mirror” extends existing VR systems for motor and pain rehabilitation by enabling the use of realistic full-body avatars and scaled feedback. Proof-of-concept was demonstrated for the assessment of body perception during active movement in healthy controls. The next step will be to apply this system to assessment of body perception disturbances in patients with chronic pain.</p>
</sec>
</abstract>
<kwd-group xml:lang="en">
<title>Keywords</title>
<kwd>Motion capture</kwd>
<kwd>Visual feedback</kwd>
<kwd>Proprioception</kwd>
<kwd>Physical rehabilitation</kwd>
<kwd>Virtual reality</kwd>
<kwd>Body perception</kwd>
</kwd-group>
<custom-meta-group>
<custom-meta>
<meta-name>issue-copyright-statement</meta-name>
<meta-value>© The Author(s) 2015</meta-value>
</custom-meta>
</custom-meta-group>
</article-meta>
</front>
<body>
<sec id="Sec1">
<title>Background</title>
<p>The normalization of body perception disturbances and of abnormal movement patterns is an important goal in both physical and pain rehabilitation. This requires an understanding of the complex relationship between body perception and movement kinematics, which can subsequently be used to guide patients towards more optimal movement patterns, i.e. by providing visual, haptic and verbal feedback. Virtual reality (VR) is a tool that can create credible and complex multimodal sensory stimuli and biofeedback [
<xref ref-type="bibr" rid="CR1">1</xref>
,
<xref ref-type="bibr" rid="CR2">2</xref>
], can increase therapy engagement [
<xref ref-type="bibr" rid="CR3">3</xref>
<xref ref-type="bibr" rid="CR5">5</xref>
], and may distract from effort and pain [
<xref ref-type="bibr" rid="CR5">5</xref>
,
<xref ref-type="bibr" rid="CR6">6</xref>
]. Moreover, VR can create visual illusions that “bend the truth”, which could be used to assess or change body perception or to stimulate more optimal movement patterns. Lastly, by combining VR with other technologies such as motion capture, therapies may be better tailored to the individual needs of patients. As such, VR has increasingly been explored in the context of rehabilitation. Common applications of VR in rehabilitation include, for example, self-displacements or object displacements in realistic [
<xref ref-type="bibr" rid="CR7">7</xref>
] or non-realistic [
<xref ref-type="bibr" rid="CR8">8</xref>
,
<xref ref-type="bibr" rid="CR9">9</xref>
] virtual (gaming) environments, or the manipulation of virtual body parts, e.g. to replace a missing limb in amputee patients [
<xref ref-type="bibr" rid="CR10">10</xref>
<xref ref-type="bibr" rid="CR12">12</xref>
].</p>
<p>Although used extensively in gaming and video-animation, the use of full-body avatars is still rare in rehabilitation due to the need for accurate movement representations requiring detailed movement sampling and modelling, which can be complex and time-consuming. One of the few successful examples is a study by Koritnik et al., who created a full-body “virtual mirror” by recording kinematic data to animate a virtual mirror-image (non-realistic avatar) in real-time, while healthy adults were stepping in place [
<xref ref-type="bibr" rid="CR13">13</xref>
]. Another example is a very recent study by Barton and colleagues that implemented a virtual mirror for amputee patients. In their study, movement kinematics of the unimpaired leg were combined with the movement timing of the impaired leg to model a realistic avatar with a symmetric gait pattern [
<xref ref-type="bibr" rid="CR14">14</xref>
]. In addition, some studies have used full-body video-capture to display a full-body mirror-image [
<xref ref-type="bibr" rid="CR3">3</xref>
,
<xref ref-type="bibr" rid="CR15">15</xref>
] or avatar [
<xref ref-type="bibr" rid="CR16">16</xref>
] of the subject onto a virtual reality scene.</p>
<p>Unfortunately, the VR systems commonly available in rehabilitation have some important limitations. The modelling of virtual limbs and avatars has generally been based on specific movements in a limited number of movement planes, whereas rehabilitation may include complex movements in multiple movement planes. In addition, only a few VR systems allow for a scaling of movements (e.g. providing augmented or reduced feedback). Indeed, an altered perception of body movements [
<xref ref-type="bibr" rid="CR17">17</xref>
,
<xref ref-type="bibr" rid="CR18">18</xref>
] or body size [
<xref ref-type="bibr" rid="CR19">19</xref>
,
<xref ref-type="bibr" rid="CR20">20</xref>
] could be used to promote or prevent certain movement patterns and could directly impact on pain perception [
<xref ref-type="bibr" rid="CR21">21</xref>
<xref ref-type="bibr" rid="CR23">23</xref>
]. For example, previous work has shown that a virtual environment in which movements were scaled to attain reduced movement perception increased the range of neck motion in patients with neck pain, compared with a virtual environment without scaling [
<xref ref-type="bibr" rid="CR17">17</xref>
]. Likewise, a gradual modulation of visual feedback of step-length during gait (simple bar graphs) systematically modulated step length away from symmetry, even when subjects were explicitly instructed to maintain a symmetric gait pattern [
<xref ref-type="bibr" rid="CR18">18</xref>
]. However, the required level (low, high) and direction (reduction, augmentation) of scaling is likely to depend on the particular body part and movement involved as well as on the particular type of feedback provided.</p>
<p>As such, and prior to the development of any intervention protocols, it is important to establish normative data regarding body perception during active movement in VR, for example by assessing the ability to detect different levels and directions of scaled feedback in healthy subjects [
<xref ref-type="bibr" rid="CR18">18</xref>
]. To attain this goal we developed a “virtual mirror” that: 1) displays a realistic full-body avatar, 2) responds to full-body movements in all movement planes in real-time, and that 3) allows for the scaling of visual feedback on movements at any given joint in real-time.</p>
<p>The primary objective of this proof-of-concept study was to assess the ability of healthy adults to detect scaled feedback on trunk movements using a two-alternative forced choice paradigm. For each subject, a psychophysical curve was created, and two main variables of interest were derived: the point of subjective equality (PSE) and the just noticeable difference (JND). It was expected that healthy adults would perform consistently with expectations for a two-alternative forced choice paradigm, i.e. that the detection of scaled feedback could be modelled using a sigmoid curve, and that subjects would display unbiased perception (no shift in PSE) and high discriminative ability (small JND). Secondary objectives were to assess virtual reality immersion and task adherence (movement kinematics).</p>
</sec>
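To make these measures concrete, here is a minimal sketch (Python; not the authors' analysis code) of fitting a sigmoid psychometric curve to two-alternative forced choice data and deriving the PSE and JND. Scaling factors are expressed as log s, as in Table 2 below, and the response proportions are hypothetical:

import numpy as np
from scipy.optimize import curve_fit

def sigmoid(x, pse, width):
    # Logistic psychometric function; equals 0.5 at x = pse.
    return 1.0 / (1.0 + np.exp(-(x - pse) / width))

# Hypothetical proportions of "avatar moved more" responses per level.
log_s = np.array([-0.126, -0.101, -0.075, -0.050, -0.025, 0.000,
                  0.025, 0.050, 0.075, 0.101, 0.126])
p_greater = np.array([0.00, 0.05, 0.10, 0.20, 0.35, 0.50,
                      0.70, 0.85, 0.95, 1.00, 1.00])

(pse, width), _ = curve_fit(sigmoid, log_s, p_greater, p0=[0.0, 0.05])

# One common JND definition: half the distance between the points
# where the fitted curve crosses 25% and 75%.
jnd = width * np.log(3.0)
print(f"PSE = {pse:.3f}, JND = {jnd:.3f}")

An unbiased observer yields a PSE near 0 (no systematic over- or under-estimation of the avatar's movements), and a smaller JND indicates finer discrimination of the scaling levels.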
<sec id="Sec2">
<title>Technological development of the virtual mirror</title>
<p>The virtual mirror consists of three main components: 1) a motion capture (MOCAP) system, 2) an interaction and rendering system (IRS), and 3) a projection system, see Figure 
<xref rid="Fig1" ref-type="fig">1</xref>
. The subject’s movements (rotation and position) are first acquired using the MOCAP system. The data is then sent to the IRS, which scales the subject’s movements and applies the scaled data to an avatar in real-time. The IRS finally displays the avatar onto a projection screen. A mirrored projection setup allows the subjects to see their avatar as a mirror-image.
<fig id="Fig1">
<label>Figure 1</label>
<caption>
<p>
<bold>Overview of the different components of the virtual mirror.</bold>
1) Motion capture (MOCAP) system including the positioning of 41 reflective markers on the subject’s body
<bold>(A)</bold>
to create a Vicon skeleton template
<bold>(B)</bold>
; 2) Interaction and rendering system (IRS) that retrieves and scales the MOCAP data online, maps the modified data onto the avatar and renders the avatar on screen; 3) Projection screen displaying the avatar’s movements as being augmented (left, scaling factor
<italic>s</italic>
 > 1) or reduced (right, scaling factor
<italic>s</italic>
 < 1) as opposed to the subject’s actual movements (here displayed as a white skeleton).</p>
</caption>
<graphic xlink:href="12984_2014_692_Fig1_HTML" id="d30e464"></graphic>
</fig>
</p>
<sec id="Sec3">
<title>Motion capture system</title>
<p>The MOCAP system (Vicon Motion Systems Ltd., Oxford, UK) is used to acquire the subject’s movements, which are then mapped to an avatar in real-time by the IRS. The system consists of 12 infrared cameras (Bonita 10) connected to a computer (Intel Xeon E31270, 3.40 GHz; 4 GB RAM; OS: Windows 7, 64 bits; NVIDIA Quadro 2000) running Vicon’s Nexus 1.8.2 acquisition software. Movements are captured with a sampling frequency of 100 Hz using a set of 41 reflective markers (14 mm) placed on the subject’s entire body. To be able to locate a marker in 3D space, the MOCAP system must be calibrated. The calibration consists of removing environment reflections, calibrating the cameras using a wand with a specific marker configuration, and setting the volume origin.</p>
<p>The placement of the markers on the subject’s body is facilitated by using a motion capture suit, and is determined by a skeleton template file based on Vicon’s ‘HumanRTkm’ model. This model additionally defines a hierarchy of 19 segments (or bones). A complete list of segments and their hierarchy is presented in Table 
<xref rid="Tab1" ref-type="table">1</xref>
, and a visual representation is presented in Figure 
<xref rid="Fig1" ref-type="fig">1</xref>
(frame 1). The segments are mapped onto the subject based on another calibration procedure. This procedure consists of 1) acquiring a sequence of predefined body movements of the head, shoulders, arms, trunk and legs; 2) labeling the markers of the acquired sequence according to the skeleton template; and 3) calibrating the position and orientation of the skeleton joints based on the sequence movements. Once the subject is calibrated, real-time segment positions and orientations are transmitted to the IRS through a local network.
<table-wrap id="Tab1">
<label>Table 1</label>
<caption>
<p>
<bold>Motion capture: skeleton segments and hierarchy</bold>
</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th>Root segment</th>
<th>Level 1</th>
<th>Level 2</th>
<th>Level 3</th>
<th>Level 4</th>
<th>Level 5</th>
</tr>
</thead>
<tbody>
<tr>
<td>Pelvis</td>
<td>Thorax</td>
<td>Head</td>
<td></td>
<td></td>
<td></td>
</tr>
<tr>
<td></td>
<td></td>
<td>Clavicle_L</td>
<td>Humerus_L</td>
<td>Radius_L</td>
<td>Hand_L</td>
</tr>
<tr>
<td></td>
<td></td>
<td>Clavicle_R</td>
<td>Humerus_R</td>
<td>Radius_R</td>
<td>Hand_R</td>
</tr>
<tr>
<td></td>
<td>Femur_L</td>
<td>Tibia_L</td>
<td>Foot_L</td>
<td>Toes_L</td>
<td></td>
</tr>
<tr>
<td></td>
<td>Femur_R</td>
<td>Tibia_R</td>
<td>Foot_R</td>
<td>Toes_R</td>
<td></td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p>L: left, R: right.</p>
</table-wrap-foot>
</table-wrap>
</p>
</sec>
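As an illustration (a Python sketch of Table 1, not Vicon's internal representation), the 19 segments can be stored as a parent-to-children mapping, which also gives the root-first traversal order used when rotations are propagated through the skeleton:

HIERARCHY = {
    "Pelvis":     ["Thorax", "Femur_L", "Femur_R"],
    "Thorax":     ["Head", "Clavicle_L", "Clavicle_R"],
    "Clavicle_L": ["Humerus_L"],
    "Humerus_L":  ["Radius_L"],
    "Radius_L":   ["Hand_L"],
    "Clavicle_R": ["Humerus_R"],
    "Humerus_R":  ["Radius_R"],
    "Radius_R":   ["Hand_R"],
    "Femur_L":    ["Tibia_L"],
    "Tibia_L":    ["Foot_L"],
    "Foot_L":     ["Toes_L"],
    "Femur_R":    ["Tibia_R"],
    "Tibia_R":    ["Foot_R"],
    "Foot_R":     ["Toes_R"],
}

def depth_first(segment="Pelvis"):
    # Visit each segment after its parent, starting from the root.
    yield segment
    for child in HIERARCHY.get(segment, []):
        yield from depth_first(child)

assert len(list(depth_first())) == 19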
<sec id="Sec4">
<title>Interaction and rendering system</title>
<p>The IRS consists of a computer (Intel Xeon E31270, 3.40 GHz; 4 GB RAM; Windows 7, 32 bits; NVIDIA Quadro 2000) running D-Flow (Motek Medical, Amsterdam, The Netherlands). The computer receives the MOCAP data, performs the scaling (see paragraph on ‘Movement scaling’ for details) and maps the resulting data onto the avatar rig so that the avatar follows the subject’s movements in real-time at a refresh rate of 60 Hz. A realistic male avatar model was purchased (
<ext-link ext-link-type="uri" xlink:href="http://www.TurboSquid.com">http://www.TurboSquid.com</ext-link>
, Martin T-pose, ID 523309) and rigged for motion capture using Blender (Blender Foundation, Amsterdam, The Netherlands). The avatar model was then converted to the OGRE format (Object-Oriented Graphics Rendering Engine, ogre3d.org) to be used in real-time in the IRS. As such, the size and proportions of the avatar vary based on individual MOCAP data whereas its appearance (e.g. body shape, clothing) remains the same for each subject. By default, the avatar is placed in an empty scene (grey floor, black walls). However, a height-adjustable stool that is present in the laboratory was also modeled in VR and can additionally be presented and adjusted (i.e. height, positioning in the scene) using an interface programmed in D-flow.</p>
</sec>
<sec id="Sec5">
<title>Projection system</title>
<p>The avatar model is projected onto a silver-coated screen (projection surface 3.05 m × 2.06 m) using a single projector (Hitachi, Tokyo, Japan; CP-WX8255A; 1920 × 1080 High Definition) connected to the IRS computer. To produce the mirror effect, the projector is set in rear-projection mode. Notably, there are no technical limitations preventing the avatar from being displayed on other devices, such as a head-mounted display, and the avatar could also be projected in 3D. In the current set-up, the avatar can be viewed at full-body size while the subject remains within an area of about 2.5 by 4 meters; the screen can be approached up to 1 meter. The size of the avatar scales proportionally with the subject’s distance from the screen. At a distance of about 2 meters, the avatar’s height is approximately 1.5 times smaller than the subject’s real height.</p>
</sec>
<sec id="Sec6">
<title>Movement scaling</title>
<p>The movement scaling procedure is summarized in Figure 
<xref rid="Fig1" ref-type="fig">1</xref>
. Movement scaling is programmed directly in D-Flow on the IRS using custom scripts. All rotation manipulations are performed in real-time using quaternions. Starting from the global position and rotation data of the MOCAP system, the data is first transformed into the D-Flow coordinate system. Starting from the root segment (pelvis), the hierarchy of the MOCAP skeleton is used to find the local rotation and position of all other segments. A reference rotation is acquired while the subject assumes a static base position. During movement the scaling is applied in the local space of each segment on the difference between the reference rotation and the current MOCAP rotation (updated during movement) using spherical linear interpolation (SLERP), or quaternion interpolation [
<xref ref-type="bibr" rid="CR24">24</xref>
]. The SLERP operation returns a rotation interpolated between two rotations
<italic>q</italic>
<sub>0</sub>
and
<italic>q</italic>
<sub>1</sub>
according to an interpolation parameter (or scaling factor),
<italic>s</italic>
. For parameters
<italic>s</italic>
 = 0 and
<italic>s</italic>
 = 1 SLERP gives
<italic>q</italic>
<sub>0</sub>
and
<italic>q</italic>
<sub>1</sub>
, respectively. In our case
<italic>q</italic>
<sub>0</sub>
is the reference rotation and
<italic>q</italic>
<sub>1</sub>
is the current MOCAP rotation. When for a given segment
<italic>s</italic>
 < 1, SLERP returns an interpolated rotation that is a reduction of the current MOCAP rotation. For
<italic>s</italic>
 > 1 the interpolated rotation is an augmentation of the current MOCAP rotation and follows the same direction. For
<italic>s</italic>
 = 1 no scaling is applied and SLERP simply returns the current MOCAP rotation. Once the scaled rotation is applied locally on a segment, the positions and rotations of its child segments are updated according to this new scaled rotation. This process is performed upwards in the hierarchy up to the root segment (pelvis), resulting in a set of global rotations and positions that are applied onto the avatar. As such, both rotation amplitudes and velocities are scaled in real-time (total delay between movements and VR projection ranging between 90 and 120 ms). It is important to note that the scaling operation is performed locally on each segment and independently in each axis, so that in principle the scaling could be applied on any chosen segment depending on the required application.</p>
</sec>
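The SLERP operation described above can be sketched as follows (a minimal Python implementation assuming unit quaternions stored as (w, x, y, z) numpy arrays; the authors' actual implementation runs as D-Flow scripts). The same formula interpolates for 0 ≤ s ≤ 1 and extrapolates for s > 1, which is what produces augmented feedback:

import numpy as np

def slerp(q0, q1, s):
    # Interpolate (s < 1) or extrapolate (s > 1) between the reference
    # rotation q0 and the current MOCAP rotation q1.
    q0 = q0 / np.linalg.norm(q0)
    q1 = q1 / np.linalg.norm(q1)
    dot = np.dot(q0, q1)
    if dot < 0.0:                     # flip to take the shorter path
        q1, dot = -q1, -dot
    theta = np.arccos(min(dot, 1.0))  # angle between the two rotations
    if theta < 1e-6:                  # rotations (nearly) identical
        return q1
    return (np.sin((1.0 - s) * theta) * q0
            + np.sin(s * theta) * q1) / np.sin(theta)

# Per segment and per frame: s < 1 reduces the movement, s = 1 leaves
# it unchanged, and s > 1 augments it, e.g.
# scaled = slerp(reference_rotation, current_rotation, s)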
<sec id="Sec7">
<title>Scaling trunk movements</title>
<p>In this study, only the trunk, consisting of two segments (pelvis and thorax), was scaled in the sagittal plane (i.e. flexion-extension movements). Scaling factors ranged from
<italic>s</italic>
 = 0.667 (corresponding to avatar movements being reduced 1.5 times) to
<italic>s</italic>
 = 1.500 (corresponding to avatar movements being augmented 1.5 times). The range was determined empirically based on task performance in a two-alternative forced choice paradigm during pilot-testing in healthy subjects and in patients with chronic low back pain (for future clinical application). The two extremes (
<italic>s</italic>
 = 0.667 and
<italic>s</italic>
 = 1.500) produced movement scaling that could be clearly identified by the subject as being either reduced or augmented, and were used for familiarization and test trials. Two sets of five points equally spaced below and above
<italic>s</italic>
 = 1 were used for analyses. As such, on a log scale, each point in the set below 1 had a corresponding inverse in the set above 1. The final set of scaling factors is listed in Table 
<xref rid="Tab2" ref-type="table">2</xref>
.
<table-wrap id="Tab2">
<label>Table 2</label>
<caption>
<p>
<bold>Scaling factors, and number of trials</bold>
</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th>
<bold>
<italic>s</italic>
</bold>
</th>
<th>Log
<bold>
<italic>s</italic>
</bold>
</th>
<th>Number of trials</th>
</tr>
</thead>
<tbody>
<tr>
<td>0.667</td>
<td align="left">−0.176</td>
<td>3 (test trials)</td>
</tr>
<tr>
<td>0.749</td>
<td align="left">−0.126</td>
<td align="left">4</td>
</tr>
<tr>
<td>0.793</td>
<td align="left">−0.101</td>
<td align="left">5</td>
</tr>
<tr>
<td>0.841</td>
<td align="left">−0.075</td>
<td align="left">6</td>
</tr>
<tr>
<td>0.891</td>
<td align="left">−0.050</td>
<td align="left">6</td>
</tr>
<tr>
<td>0.944</td>
<td align="left">−0.025</td>
<td align="left">7</td>
</tr>
<tr>
<td>1.000</td>
<td align="left">0.000</td>
<td align="left">7</td>
</tr>
<tr>
<td>1.060</td>
<td align="left">0.025</td>
<td align="left">7</td>
</tr>
<tr>
<td>1.123</td>
<td align="left">0.050</td>
<td align="left">6</td>
</tr>
<tr>
<td>1.190</td>
<td align="left">0.075</td>
<td align="left">6</td>
</tr>
<tr>
<td>1.261</td>
<td align="left">0.101</td>
<td align="left">5</td>
</tr>
<tr>
<td>1.336</td>
<td align="left">0.126</td>
<td align="left">4</td>
</tr>
<tr>
<td>1.500</td>
<td align="left">0.176</td>
<td>3 (test trials)</td>
</tr>
</tbody>
</table>
</table-wrap>
</p>
</sec>
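For illustration, the Table 2 factors can be reproduced (an assumed construction, consistent with the reported values) as five equally spaced steps on a log10 scale on each side of s = 1, so that each reduction factor is the exact inverse of an augmentation factor, with s = 1.5 and its inverse reserved for test trials:

import numpy as np

step = np.log10(1.5) / 7.0            # about 0.0252 per step
log_levels = step * np.arange(1, 6)   # 0.025, 0.050, 0.075, 0.101, 0.126
augmented = 10.0 ** log_levels        # 1.060, 1.123, 1.190, 1.261, 1.336
reduced = 1.0 / augmented             # 0.944, 0.891, 0.841, 0.793, 0.749
extremes = (1.0 / 1.5, 1.5)           # 0.667 and 1.500, test trials only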
</sec>
<sec id="Sec8">
<title>Proof of concept: perception of scaled trunk movements</title>
<sec id="Sec9">
<title>Subjects</title>
<p>The project was performed in collaboration with the Canadian Armed Forces. Healthy military subjects (aged 18–55 years; men only, to match the avatar’s gender) were recruited at a regional military base. Exclusion criteria included recurrent low back pain, low back pain that required medical care or that restricted work or recreation during the past 2 years, acute pain (pain score higher than 2/10, 0 = no pain, 10 = worst pain imaginable) at the time of testing, chronic pain (duration ≥ 3 months) during the last 6 months prior to participation, non-corrected visual impairments, repeated fractures, or other medical conditions (inflammatory, neurologic, degenerative, auto-immune, psychiatric) that could interfere with performance during testing. All assessments took place at the Centre Interdisciplinaire de Recherche en Réadaptation et Intégration Sociale of the Institut de réadaptation en déficience physique de Québec. The project was approved by the local institutional review board (#2013-323). All subjects received written and oral information, and signed informed consent prior to participation.</p>
</sec>
<sec id="Sec10">
<title>Experimental procedure</title>
<sec id="Sec11">
<title>Preparation</title>
<p>Demographic and anthropomorphic data (weight, height, trunk height) were registered, after which the subject put on a body size-matched (4 sizes available) motion capture suit (OptiTrack, NaturalPoint, Corvallis, Oregon, USA) on which the markers were placed as described in the paragraph 'Motion capture system'. After motion capture calibration, subjects were placed in front of the projection screen (distance of 2 meters), the lights were dimmed, and the IRS software was activated to display the avatar in front of the subject (mirror mode,
<italic>s</italic>
 = 1 = no modulation). A familiarization period including various pre-defined and spontaneous movements allowed subjects to explore the interaction with the avatar. Afterwards, the subjects remained seated facing the screen, but the avatar was medially rotated 90° so that it was displayed from the side (facing left) to allow for a better view of trunk flexion-extension movements (i.e. side-view mode). A snapshot of the experimental set-up is presented in Figure 
<xref rid="Fig2" ref-type="fig">2</xref>
.
<fig id="Fig2">
<label>Figure 2</label>
<caption>
<p>
<bold>Snapshot of the experimental procedure during a sitting block.</bold>
Each block consisted of 23 trials. The scaling factor was different for each trial. When subjects reached the required flexion angle (15°, 25° or 35°), simultaneous visual (OK) and auditory (bell-sound) feedback were provided. After returning to the base position, subjects had to decide whether the movements of the avatar were greater or smaller than their own movements (two-alternative forced choice paradigm).</p>
</caption>
<graphic xlink:href="12984_2014_692_Fig2_HTML" id="d30e888"></graphic>
</fig>
</p>
<p>Subjects were instructed on static base positions for sitting and standing, which required them to keep their back and neck straight, their head facing the screen, arms falling naturally along the sides of the body, and feet aligned at shoulder width and pointing forward. For the sitting condition, subjects were placed on a stool adjusted to yield 90° of hip and knee flexion. For the standing condition, subjects were instructed to keep the knees partially flexed in order to maintain balance during trunk flexion. In the base position, the reference rotation was acquired and the trunk flexion angle was defined as 0°. Subjects practiced the basics of the trunk flexion task in both positions while observing the simultaneous movements of the avatar on the screen (side-view,
<italic>s =</italic>
 1 = no modulation). The instructions were to move at a slow pace in one fluent movement towards a maximum angle of 35°, and this was demonstrated by the experimenter. Subjects received feedback on adherence to instructions.</p>
</sec>
<sec id="Sec12">
<title>Scaling task</title>
<p>The scaling task was introduced in the sitting position in 2 steps. First, the element of moving towards a predefined angle (unknown to the subject) was introduced (4 trials). The detection of these predefined angles by the IRS is described in detail under ‘Detecting and controlling flexion angles’. Subjects were required to start bending forward and, upon the appearance of the word “OK” on the screen along with a simultaneous bell-sound, to return to the base position. Second, a two-alternative forced choice paradigm was introduced (4 trials). After each trial, subjects had to decide whether the movements of the avatar were greater or smaller than their own movements. Subjects did not receive feedback on performance accuracy. After this brief training period, the experiment was started.</p>
<p>The number of experimental trials was weighted per scaling factor to acquire more data for relatively difficult trials involving small modulations, i.e. trials in which
<italic>s</italic>
was close to 1. The scaling factors were then distributed over 3 blocks of 23 trials each. The first 2 trials of each block were test trials (unknown to the subject), and were not further analyzed. The other scaling factors were distributed pseudo-randomly to ensure that blocks contained a balanced number of relatively easy and relatively difficult trials (a sketch of such a schedule is given below, after Table 3). As the tasks had to be performed while sitting and while standing, the total number of blocks was 6 (3 sitting, 3 standing blocks), and the total number of trials was 138. Sitting and standing blocks were alternated and the starting block (sitting or standing) was randomized across subjects. After each block there was a short break. After finishing all experimental blocks, the perceived interaction with the virtual mirror (immersion, distraction) was evaluated on a 1–7 scale using a selection of questions from the Presence Questionnaire (see Table 
<xref rid="Tab3" ref-type="table">3</xref>
for the complete list of questions) [
<xref ref-type="bibr" rid="CR25">25</xref>
]. The total duration of the experiment (including preparation) was about 2 h.
<table-wrap id="Tab3">
<label>Table 3</label>
<caption>
<p>
<bold>Virtual reality immersion and distraction (based on the Presence Questionnaire</bold>
[
<xref ref-type="bibr" rid="CR25">25</xref>
]
<bold>)</bold>
</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th></th>
<th>Questions</th>
<th>Mean ± SD</th>
</tr>
</thead>
<tbody>
<tr>
<td rowspan="6">Immersion</td>
<td>How much were you able to control the avatar (your virtual image)?</td>
<td align="center">6.0 ± 0.7</td>
</tr>
<tr>
<td>How responsive was the avatar to your movements?</td>
<td align="center">5.8 ± 0.4</td>
</tr>
<tr>
<td>How quickly did you adjust to the virtual environment experience?</td>
<td align="center">6.2 ± 1.0</td>
</tr>
<tr>
<td>How proficient in moving and interacting with the virtual environment did you feel at the end of the experience?</td>
<td align="center">6.2 ± 0.8</td>
</tr>
<tr>
<td>To what extent did the movements of the avatar seem natural to you?</td>
<td align="center">5.1 ± 0.7</td>
</tr>
<tr>
<td>How well could you examine the details of the avatar?</td>
<td align="center">5.1 ± 1.1</td>
</tr>
<tr>
<td rowspan="3">Distraction</td>
<td>How much delay did you experience between your actions and the response of the system?</td>
<td align="center">3.5 ± 2.0</td>
</tr>
<tr>
<td>How much did the visual display quality interfere or distract you from performing assigned tasks or required activities?</td>
<td align="center">1.8 ± 1.0</td>
</tr>
<tr>
<td>How much did the control devices interfere with the performance of assigned tasks or with other activities?</td>
<td align="center">1.3 ± 0.5</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p>Scoring for immersion: 1 = not able/responsive/etc.; 7 = extremely able/responsive/etc. Scoring for distraction: 1 = no delay/interference; 7 = long delay/high interference.</p>
</table-wrap-foot>
</table-wrap>
</p>
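<p>To make the trial bookkeeping concrete, the following minimal sketch (Python; all function and variable names are ours, and a plain shuffle stands in for the original easy/hard balancing, which is not specified in detail) draws the per-factor trial counts of Table 2 and builds 3 blocks of 23 trials for one position, with 2 test trials at the start of each block:</p>
<preformat>
import random

# Trial counts per scaling factor for one position (sitting or standing),
# taken from Table 2: 63 scaled trials plus 6 test trials = 3 blocks x 23.
COUNTS = {
    0.749: 4, 0.793: 5, 0.841: 6, 0.891: 6, 0.944: 7, 1.000: 7,
    1.060: 7, 1.123: 6, 1.190: 6, 1.261: 5, 1.336: 4,
}
TEST_FACTORS = [0.667, 1.500]  # 3 trials each, used as test trials

def make_blocks(seed=0):
    """Return 3 blocks of 23 trials: 2 test trials, then 21 scaled trials."""
    rng = random.Random(seed)
    pool = [s for s, n in COUNTS.items() for _ in range(n)]  # 63 trials
    rng.shuffle(pool)  # the original balanced easy/hard trials per block
    tests = TEST_FACTORS * 3
    rng.shuffle(tests)
    return [tests[2 * b:2 * b + 2] + pool[21 * b:21 * b + 21] for b in range(3)]

for i, block in enumerate(make_blocks(), start=1):
    print("block", i, "has", len(block), "trials")
</preformat>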
</sec>
<sec id="Sec13">
<title>Detecting and controlling flexion angles</title>
<p>Three predefined angles (15°, 25° and 35°) were programmed in the IRS to: 1) keep subjects within a safe range of motion (i.e. to avoid fatigue or pain), and 2) introduce proprioceptive inter-trial variability, so that subjects had to depend on both visual and proprioceptive feedback to perform the task correctly. The detection of flexion angles was based on the sagittal orientation of a vector connecting 2 markers on the back of the subject (C7 and T10). This orientation was defined as 0° in the base position. When subjects reached the predefined angle for that trial, the IRS sent out the OK signal (screen) and simultaneous bell sound (audio), signalling the subject to stop bending forward.</p>
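<p>As an illustration of this angle computation, the sketch below (Python with NumPy; the axis convention and all names are our assumptions, and a synthetic bending movement stands in for the 60 Hz marker stream) derives the sagittal orientation of the C7–T10 vector, zeroes it in the base position, and fires the “OK” cue once the predefined angle is reached:</p>
<preformat>
import numpy as np

def sagittal_angle(c7, t10):
    """Orientation (deg) of the C7-T10 vector in the sagittal plane.
    Markers are [x, y, z]; y = antero-posterior, z = vertical (assumed)."""
    v = np.asarray(c7, float) - np.asarray(t10, float)
    return np.degrees(np.arctan2(v[1], v[2]))

t10 = np.array([0.0, 0.0, 1.10])                         # fixed lower marker
reference = sagittal_angle(t10 + [0.0, 0.0, 0.35], t10)  # 0 deg at base
target = 25.0                                            # predefined: 15, 25 or 35

# Synthetic trial: C7 tilts forward over 2 s, sampled at 60 Hz.
for deg in np.linspace(0.0, 40.0, 120):
    rad = np.radians(deg)
    c7 = t10 + 0.35 * np.array([0.0, np.sin(rad), np.cos(rad)])
    flexion = sagittal_angle(c7, t10) - reference
    if flexion >= target:
        print("OK cue (screen) and bell sound at %.1f deg" % flexion)
        break
</preformat>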
<p>The 3 angles were distributed pseudo-randomly across the different blocks. Importantly, the 3 smallest scaling factors were not combined with a 15° detection angle, and the 3 largest scaling factors were not combined with a 35° detection angle. As such, the resulting avatar’s movements were also restricted to a limited range of motion. This avoided extremes in the visual feedback that would otherwise allow subjects to base their decision on visual feedback only. The important point from a methodological perspective was that subjects varied their flexion angles from trial to trial, and not that they achieved a specific flexion angle.</p>
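<p>The exclusion rule for pairing scaling factors with detection angles can be sketched as follows (Python; the exact pseudo-random assignment used in the study is not published, so only the rule itself is illustrated):</p>
<preformat>
import random

SCALES = [0.749, 0.793, 0.841, 0.891, 0.944, 1.000,
          1.060, 1.123, 1.190, 1.261, 1.336]

def pick_angle(scale, rng=random):
    """Draw a detection angle, excluding pairings that would push the
    avatar's displayed movement to an extreme."""
    angles = [15, 25, 35]
    if scale in SCALES[:3]:    # 3 smallest factors: never paired with 15 deg
        angles.remove(15)
    if scale in SCALES[-3:]:   # 3 largest factors: never paired with 35 deg
        angles.remove(35)
    return rng.choice(angles)

print([pick_angle(s) for s in SCALES])
</preformat>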
</sec>
</sec>
<sec id="Sec14">
<title>Outcome parameters</title>
<p>For each individual subject, the responses to the two-alternative forced choice task were averaged over log-transformed scaling factors (see Table 
<xref rid="Tab2" ref-type="table">2</xref>
) and plotted (X = log-transformed scaling factor [−0.126; 0.126]; Y = proportion of trials for which the subjects responded that the avatar’s movements were greater than their actual movements [0; 1]). Then a sigmoid curve (Equation 
<xref rid="Equ1" ref-type="">1</xref>
), with initial value X
<sub>Y0.50</sub>
 = 0, with constraints Y
<sub>MAX</sub>
 = 1 and Y
<sub>MIN</sub>
 = 0, and with a variable slope (
<italic>m</italic>
), was fitted to the data (Prism 6 for Windows, Graphpad Software Inc., La Jolla, CA, USA). From each curve, 3 data points were interpolated (X
<sub>Y0.25</sub>
, X
<sub>Y0.50</sub>
, X
<sub>Y0.75</sub>
), and used to determine the point of subjective equality (PSE, Equation 
<xref rid="Equ2" ref-type="">2</xref>
) and the just noticeable difference (JND, Equation 
<xref rid="Equ3" ref-type="">3</xref>
). Theoretically, the chance distribution for a two-alternative forced choice paradigm predicts a PSE of 0, i.e. there is a 50% chance of responding “greater” or “smaller” when in fact no scaling has been applied. A PSE higher than 0 indicates that subjects tend to overestimate their own movements and a PSE lower than 0 indicates that subjects tend to underestimate their own movements. The higher the slope and the smaller the JND, the better subjects are able to discriminate between different levels of scaled feedback.
<disp-formula id="Equ1">
<label>1</label>
<graphic xlink:href="12984_2014_692_Equ1_HTML.gif" position="anchor"></graphic>
</disp-formula>
<disp-formula id="Equ2">
<label>2</label>
<graphic xlink:href="12984_2014_692_Equ2_HTML.gif" position="anchor"></graphic>
</disp-formula>
<disp-formula id="Equ3">
<label>3</label>
<graphic xlink:href="12984_2014_692_Equ3_HTML.gif" position="anchor"></graphic>
</disp-formula>
</p>
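<p>Equations 1–3 are provided as images in this record. A plausible reconstruction of Equation 1, consistent with the constraints stated above (Y<sub>MAX</sub> = 1, Y<sub>MIN</sub> = 0, variable slope <italic>m</italic>), is the variable-slope sigmoid used by GraphPad Prism, Y = 1/(1 + 10<sup>(X<sub>Y0.50</sub> − X)·<italic>m</italic></sup>); with this form, JND = log<sub>10</sub>(3)/<italic>m</italic> ≈ 0.034 for <italic>m</italic> = 14.1, matching the reported JND of 0.035. Per the text, PSE = X<sub>Y0.50</sub> (Equation 2) and JND = (X<sub>Y0.75</sub> − X<sub>Y0.25</sub>)/2 (Equation 3). A minimal sketch of the fitting procedure (Python with NumPy/SciPy; the response frequencies are invented for illustration):</p>
<preformat>
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(x, x50, m):
    """Variable-slope sigmoid constrained to Ymin = 0 and Ymax = 1
    (the GraphPad Prism form assumed here for Equation 1)."""
    return 1.0 / (1.0 + 10.0 ** ((x50 - x) * m))

# Log-transformed scaling factors (test trials excluded) and, per factor,
# the fraction of "avatar moved more than me" responses (illustrative data).
x = np.array([-0.126, -0.101, -0.075, -0.050, -0.025, 0.000,
               0.025, 0.050, 0.075, 0.101, 0.126])
y = np.array([0.00, 0.00, 0.10, 0.20, 0.40, 0.50,
              0.65, 0.85, 0.90, 1.00, 1.00])

(x50, m), _ = curve_fit(sigmoid, x, y, p0=[0.0, 10.0])

def x_at(p):
    """X value at which the fitted curve reaches response frequency p."""
    return x50 - np.log10(1.0 / p - 1.0) / m

pse = x_at(0.50)                        # Equation 2
jnd = (x_at(0.75) - x_at(0.25)) / 2.0   # Equation 3
print("PSE = %.3f, JND = %.3f, slope m = %.1f" % (pse, jnd, m))
</preformat>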
<p>Task adherence was assessed by analyzing trunk movements for maximum flexion angle, maximum flexion velocity, and fluency of movement around the maximum flexion angle (the number of zero-crossings in trunk acceleration between the instants of maximum flexion and maximum extension velocity), for each of the predefined flexion angles (15°, 25°, 35°), using in-house scripts written in Matlab (version R2010b, The Mathworks Inc., Natick, MA, USA). Data were filtered using a second-order, dual-pass (zero-lag) Butterworth low-pass filter with a 4 Hz cut-off. Trunk movement analyses were based on 3 markers located on the back of the subject (C7, T10 and scapula), and focused on the sagittal plane only.</p>
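<p>The in-house Matlab scripts are not published; the sketch below (Python with NumPy/SciPy, as an approximation of the described pipeline) applies the dual-pass Butterworth filter and computes the three adherence measures on a synthetic flexion-extension movement:</p>
<preformat>
import numpy as np
from scipy.signal import butter, filtfilt

FS = 60.0  # motion capture sampling rate (Hz)

def smooth(signal, cutoff=4.0, order=2):
    """Second-order Butterworth, applied forward and backward (dual pass)."""
    b, a = butter(order, cutoff / (FS / 2.0))
    return filtfilt(b, a, signal)

def fluency(angle):
    """Zero-crossings of trunk acceleration between the instants of maximum
    flexion velocity and maximum extension velocity (fewer = more fluent)."""
    vel = np.gradient(angle, 1.0 / FS)
    acc = np.gradient(vel, 1.0 / FS)
    lo, hi = sorted((int(np.argmax(vel)), int(np.argmin(vel))))
    seg = acc[lo:hi + 1]
    return int(np.sum(np.signbit(seg[:-1]) != np.signbit(seg[1:])))

# Synthetic noisy 35-degree flexion-extension movement lasting 4 s.
t = np.linspace(0.0, 4.0, int(4 * FS))
raw = 35.0 * np.sin(np.pi * t / 4.0) ** 2 + np.random.normal(0.0, 0.3, t.size)
angle = smooth(raw)
print("max flexion angle (deg):", round(float(angle.max()), 1))
print("max flexion velocity (deg/s):",
      round(float(np.gradient(angle, 1.0 / FS).max()), 1))
print("fluency (zero-crossings):", fluency(angle))
</preformat>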
</sec>
<sec id="Sec15">
<title>Data analysis</title>
<p>For each of the outcome parameters (X
<sub>Y0.25</sub>
, PSE, X
<sub>Y0.75</sub>
, JND,
<italic>m</italic>
) the normality of the data distribution (skewness) and the presence of outliers (data outside 1.5 times the interquartile range) were assessed, and descriptive statistics were calculated (IBM SPSS for Windows, version 22.0.0.0, USA). Movement data were analyzed using multivariate tests with the within-subject factor Angle (15°, 25°, 35°). Data are presented in the text as mean ± standard deviation.</p>
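<p>The outlier rule is simple enough to state in a few lines; for instance (Python with NumPy/SciPy; the JND values below are illustrative, chosen only to span the reported range):</p>
<preformat>
import numpy as np
from scipy.stats import skew

def iqr_outliers(values):
    """Boolean mask of values outside 1.5 x the interquartile range."""
    v = np.asarray(values, float)
    q1, q3 = np.percentile(v, [25, 75])
    lo, hi = q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1)
    return (lo > v) | (v > hi)

jnd = np.array([0.026, 0.031, 0.033, 0.035, 0.036,
                0.037, 0.038, 0.040, 0.041, 0.042])
print("skewness:", round(float(skew(jnd)), 2))  # close to 0 = roughly normal
print("outliers:", jnd[iqr_outliers(jnd)])
</preformat>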
</sec>
</sec>
<sec id="Sec16" sec-type="results">
<title>Results</title>
<p>A total of 11 healthy subjects participated in the experiment. One subject showed poor task adherence and was additionally identified as an outlier based on psychophysical curve metrics and movement data; this subject was therefore excluded from the analyses. The final sample thus consisted of 10 male subjects, with a mean age of 28 ± 5 years (range: 22–37), weight of 88 ± 14 kg (range: 62–108), height of 176 ± 10 cm (range: 165–201), and Body Mass Index (BMI) of 28 ± 4 (range: 23–34).</p>
<sec id="Sec17">
<title>Two-alternative forced choice paradigm and psychophysical curve</title>
<p>The data followed a normal distribution (i.e. skewness values close to 0). Figure 
<xref rid="Fig3" ref-type="fig">3</xref>
presents the data and curve fitting results for a representative subject. In general, the goodness of fit for these individually fitted curves was high (R
<sup>2</sup>
range: 0.89 - 0.98). Group averaged interpolated X
<sub>Y0.25</sub>
, PSE, and X
<sub>Y0.75</sub>
(± SEM) are presented in Figure 
<xref rid="Fig4" ref-type="fig">4</xref>
. The 95% confidence interval for the PSE ranged from −0.003 to 0.028, indicating that the PSE was not significantly different from 0. The average JND was 0.035 ± 0.007 (range 0.026 - 0.042), the average curve slope
<italic>m</italic>
was 14.1 ± 2.7 (range 10.1 - 18.6), and the average percentage of correct responses was 83% ± 4% (range 60% - 100%).
<fig id="Fig3">
<label>Figure 3</label>
<caption>
<p>
<bold>Average response frequencies (black dots) and curve fitting results (black line) for a representative subject.</bold>
For log scaling factors smaller than 0, avatar movements were reduced. For log scaling factors greater than 0, avatar movements were augmented. For a log scaling factor of 0, no modulation was applied. Colored lines indicate the curve metrics derived: X
<sub>Y0.25</sub>
(green), X
<sub>Y0.50</sub>
 = point of subjective equality (PSE) (red), and X
<sub>Y0.75</sub>
(blue). The PSE was close to 0, consistent with expectations for a two-alternative forced choice paradigm.</p>
</caption>
<graphic xlink:href="12984_2014_692_Fig3_HTML" id="d30e1139"></graphic>
</fig>
<fig id="Fig4">
<label>Figure 4</label>
<caption>
<p>
<bold>Curve metrics as derived from individual curves (n = 10), mean ± SEM.</bold>
X
<sub>Y0.25</sub>
: interpolated log scaling factor at a response frequency of 0.25; X
<sub>Y0.75</sub>
: interpolated log scaling factor at a response frequency of 0.75; PSE: point of subjective equality.</p>
</caption>
<graphic xlink:href="12984_2014_692_Fig4_HTML" id="d30e1157"></graphic>
</fig>
</p>
</sec>
<sec id="Sec18">
<title>Virtual reality immersion</title>
<p>Table 
<xref rid="Tab3" ref-type="table">3</xref>
presents the average scores relating to the subjective experience with the virtual mirror. Despite perceived delays, and despite the fact that the “fit” of the avatar (i.e. regarding anthropometric characteristics) was better for some subjects than for others, immersion was found to be relatively good. Distraction due to the visual display quality and/or the control devices (e.g. the motion capture suit and markers) was considered minor, interfering only somewhat, if at all, with task performance.</p>
</sec>
<sec id="Sec19">
<title>Task adherence (movement kinematics)</title>
<p>The data followed a normal distribution (i.e. skewness values close to 0). Movement analyses revealed distinct maximum flexion angles for each of the predefined angles (15°: 32 ± 5°; 25°: 38 ± 4°; 35°: 43 ± 4°) (F<sub>2,8</sub> = 49.603, p < 0.001). As expected, maximum flexion angles exceeded the predefined angles because of the subjects’ reaction times following the appearance of the “OK” sign and simultaneous bell-sound. Importantly, subjects varied their movement angles from trial to trial, as required. Maximum flexion velocity (15°: 31 ± 9°/s; 25°: 31 ± 8°/s; 35°: 31 ± 8°/s) was comparable across angles (F<sub>2,8</sub> = 1.506, p = 0.279). As expected, movement fluency (15°: 0.9 ± 0.5 zero-crossings; 25°: 1.0 ± 0.4 zero-crossings; 35°: 1.3 ± 0.5 zero-crossings) was slightly better for smaller angles (F<sub>2,8</sub> = 5.725, p = 0.029).</p>
</sec>
</sec>
<sec id="Sec20" sec-type="discussion">
<title>Discussion</title>
<p>A virtual mirror allowing for the scaling of visual movement feedback was developed by integrating motion capture, virtual reality and projection systems. In a proof-of-concept study in healthy adults, the ability to detect scaled feedback on trunk flexion movements was consistent with expectations. Performance could be modelled using a sigmoid curve with a high goodness of fit, confirming unbiased perception (PSE not different from 0) and high discriminative ability (small JND) in healthy adults.</p>
<sec id="Sec21">
<title>Virtual mirror components and performance</title>
<p>The virtual mirror developed in this study displayed a realistic full-body avatar that responded to full-body movements in all movement planes in real time, and allowed for the real-time scaling of visual feedback on movements at a given joint. As such, it extends existing VR systems in motor and pain rehabilitation (e.g. [
<xref ref-type="bibr" rid="CR10">10</xref>
<xref ref-type="bibr" rid="CR12">12</xref>
,
<xref ref-type="bibr" rid="CR3">3</xref>
,
<xref ref-type="bibr" rid="CR13">13</xref>
,
<xref ref-type="bibr" rid="CR15">15</xref>
]) and further enables the use of realistic full-body avatars.</p>
<p>Throughout the test phase, the real-time performance of the virtual mirror was stable, and the IRS refresh rate met specifications (60 Hz). Although the total delay between actual and projected movements was measured at 90 to 120 ms, reported perceived delays varied from “no delay” to “long delays”. Upon verification, the computational load associated with the scaling of movements could be ruled out as a source of delay, suggesting that the total delay was mainly caused by the communication between the MOCAP system and the IRS. We are currently working with the IRS supplier to further improve the communication between the two systems.</p>
<p>Due to the exploratory nature of the study, only one realistic male avatar model was initially implemented. As expected, the “fit” of this “one-size-fits-all” avatar was better in some subjects than in others. This might be improved by incorporating additional anthropometric data, such as BMI, into the avatar rigging process. However, regardless of perceived delays and avatar “fit” issues, immersion was reported to be good. As such, we are confident that the virtual mirror worked sufficiently well to apply scaled feedback and to assess the ability to perceive this scaled feedback in healthy subjects. This is further substantiated by the relatively small between-subject variability observed in this proof-of-concept study.</p>
</sec>
<sec id="Sec22">
<title>Detecting scaled feedback</title>
<p>Movement kinematics were consistent with the instructions to move slowly and in one fluent movement towards a set of predefined flexion angles, confirming task adherence. As such, both proprioceptive feedback (real flexion angles) and visual feedback (avatar flexion angles) varied from trial to trial. Together, this suggests that subjects relied on a combination of visual and proprioceptive feedback to perform the task, which is consistent with subjects’ spontaneous reports that the two-alternative forced choice paradigm was difficult.</p>
<p>Using the current virtual mirror set-up, the detection of scaled feedback could be modelled using a sigmoid curve with a high goodness of fit. Importantly, the subjects’ performance was consistent with expectations for a two-alternative forced choice paradigm: the PSE was not significantly different from 0, reflecting unbiased perception, and JNDs were small, reflecting high discriminative ability. Back-transforming the average JND to a linear scale reveals that avatar movements had to be augmented or reduced only about 1.07 times to be accurately detected, i.e. staying within 2 scaling levels below or above 1. In addition, the between-subject variability of the psychophysical outcome parameters appears sufficiently small to allow normal responses to be distinguished from abnormal ones in future studies (e.g. in comparison with clinical populations). Together, these results confirm the validity of our method for assessing body perception during active trunk movement.</p>
<p>Some methodological limitations need to be considered, including the relatively small number of subjects, the relatively low number of trials (divided over sitting and standing blocks), and the application of scaling in a single plane of movement over a relatively small range of motion (the maximum predefined angle being 35°). Additionally, the experimental task required a mental rotation (due to presentation of the avatar in side-view mode), which could have affected task performance. Despite these limitations, however, the between-subject variability was relatively small and immersion was reported to be good. Additional study of protocol parameters (e.g. the optimal number of trials and blocks), and more detailed analyses of curve metrics and movements (e.g. to distinguish different movement strategies), may help to further improve the protocol and the interpretation of results.</p>
</sec>
<sec id="Sec23">
<title>Potential applications</title>
<p>To date, our scaling protocol for the virtual mirror has been implemented for trunk flexion only (involving the pelvis and trunk segments). In principle, scaling could be applied to any body segment to fit the required application. In the near future, we envision two main applications.</p>
<p>First, the virtual mirror might be used as an assessment tool, i.e. to assess body perception during active movement. This would extend currently available assessment tools that commonly assess body perception under static conditions. Likewise, this application would allow for the assessment of body perception disturbances in patients, which may inform clinical management. The present protocol for trunk flexion was developed for the assessment of body perception in patients with chronic low back pain, in whom body perception disturbances and fear of movement are thought to play an important role in disease pathology [
<xref ref-type="bibr" rid="CR26">26</xref>
<xref ref-type="bibr" rid="CR28">28</xref>
]. Other relevant patient populations for which similar pathophysiological mechanisms have been proposed include patients with complex regional pain syndrome, fibromyalgia, or phantom-limb pain [
<xref ref-type="bibr" rid="CR23">23</xref>
,
<xref ref-type="bibr" rid="CR29">29</xref>
<xref ref-type="bibr" rid="CR31">31</xref>
].</p>
<p>Second, the virtual mirror might be used as an intervention tool, alone or in combination with other cognitive or physical interventions. As introduced in the background section, prolonged periods of scaled feedback might be applied to promote or prevent specific movements in patients displaying altered perceptions of body movements [
<xref ref-type="bibr" rid="CR17">17</xref>
,
<xref ref-type="bibr" rid="CR18">18</xref>
]. Additionally, the virtual mirror could be used to overcome fear of movement (kinesiophobia) in patients with chronic pain by providing an extension to, or by replacing, graded in-vivo exposure therapies [
<xref ref-type="bibr" rid="CR32">32</xref>
,
<xref ref-type="bibr" rid="CR33">33</xref>
], such as recommended for the treatment of patients with chronic low back pain displaying high levels of kinesiophobia [
<xref ref-type="bibr" rid="CR34">34</xref>
].</p>
</sec>
</sec>
<sec id="Sec24" sec-type="conclusions">
<title>Conclusions</title>
<p>The new virtual mirror extends existing VR systems for motor and pain rehabilitation by providing scaled feedback, and enables the use of realistic full-body avatars. After having demonstrated proof-of-concept in healthy adults, we are now exploring the current virtual mirror set-up as a tool to assess body perception disturbances in patients with chronic low back pain.</p>
</sec>
</body>
<back>
<glossary>
<title>Abbreviations</title>
<def-list>
<def-list>
<def-item>
<term>BMI</term>
<def>
<p>Body mass index</p>
</def>
</def-item>
<def-item>
<term>IRS</term>
<def>
<p>Interaction and rendering system</p>
</def>
</def-item>
<def-item>
<term>MOCAP</term>
<def>
<p>Motion capture</p>
</def>
</def-item>
<def-item>
<term>SLERP</term>
<def>
<p>Spherical linear interpolation</p>
</def>
</def-item>
<def-item>
<term>VR</term>
<def>
<p>Virtual reality.</p>
</def>
</def-item>
</def-list>
</def-list>
</glossary>
<fn-group>
<fn>
<p>
<bold>Competing interests</bold>
</p>
<p>The authors declare that they have no competing interests.</p>
</fn>
<fn>
<p>
<bold>Authors’ contributions</bold>
</p>
<p>MR was involved in designing the study, performed the data acquisition, analysis and interpretation, and drafted the manuscript. NR developed all technical aspects relating to the virtual mirror, provided technical support throughout the study, was involved in data analysis, and critically revised the manuscript for important intellectual content. BJM was involved in the conception and design of the study, as well as in the technical developments and data analysis, and critically revised the manuscript for important intellectual content. LJH, PLJ and LJB were involved in the conception and design of the study and critically revised the manuscript for important intellectual content. CM was involved in the conception and design of the study, and in data analysis and interpretation, and critically revised the manuscript for important intellectual content. All authors read and approved the final manuscript.</p>
</fn>
</fn-group>
<ack>
<title>Acknowledgements</title>
<p>The authors are thankful to all participants for investing their time and effort, to Chantal Gendron for her assistance in participant recruitment, to Isabelle Lorusso and Martin Gagné for their assistance in the data collection process, and to Jean Larochelle for his initial assistance in the technical development of the virtual scene and avatar. The research was financially supported by the Surgeon General Health Research Program (Canadian Armed Forces). MR is supported by a postdoctoral fellowship from the Fonds de recherche Québec – Santé (FRQS) and the SensoriMotor Rehabilitation Research Team (SMRRT) as part of the Regenerative Medicine, and the Nanomedicine Strategic Initiative of the Canadian Institute for Health Research (CIHR). PLJ and CM are supported by salary grants from CIHR and FRQS.</p>
</ack>
<ref-list id="Bib1">
<title>References</title>
<ref id="CR1">
<label>1.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sumitani</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Miyauchi</surname>
<given-names>S</given-names>
</name>
<name>
<surname>McCabe</surname>
<given-names>CS</given-names>
</name>
<name>
<surname>Shibata</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Maeda</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Saitoh</surname>
<given-names>Y</given-names>
</name>
<etal></etal>
</person-group>
<article-title>Mirror visual feedback alleviates deafferentation pain, depending on qualitative aspects of the pain: A preliminary report</article-title>
<source>Rheumatology</source>
<year>2008</year>
<volume>47</volume>
<fpage>1038</fpage>
<lpage>43</lpage>
<pub-id pub-id-type="doi">10.1093/rheumatology/ken170</pub-id>
<pub-id pub-id-type="pmid">18463143</pub-id>
</element-citation>
</ref>
<ref id="CR2">
<label>2.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Jeannerod</surname>
<given-names>M</given-names>
</name>
</person-group>
<article-title>Neural simulation of action: a unifying mechanism for motor cognition</article-title>
<source>Neuroimage</source>
<year>2001</year>
<volume>14</volume>
<fpage>S103</fpage>
<lpage>9</lpage>
<pub-id pub-id-type="doi">10.1006/nimg.2001.0832</pub-id>
<pub-id pub-id-type="pmid">11373140</pub-id>
</element-citation>
</ref>
<ref id="CR3">
<label>3.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kizony</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Katz</surname>
<given-names>N</given-names>
</name>
<name>
<surname>Weiss</surname>
<given-names>PL</given-names>
</name>
</person-group>
<article-title>Adapting an immersive virtual reality system for rehabilitation</article-title>
<source>J Vis Comput Anim</source>
<year>2003</year>
<volume>14</volume>
<fpage>261</fpage>
<lpage>8</lpage>
<pub-id pub-id-type="doi">10.1002/vis.323</pub-id>
</element-citation>
</ref>
<ref id="CR4">
<label>4.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kizony</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Raz</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Katz</surname>
<given-names>N</given-names>
</name>
<name>
<surname>Weingarden</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Tamar Weiss</surname>
<given-names>PL</given-names>
</name>
</person-group>
<article-title>Video-capture virtual reality system for patients with paraplegic spinal cord injury</article-title>
<source>J Rehabil Res Dev</source>
<year>2005</year>
<volume>42</volume>
<fpage>595</fpage>
<lpage>607</lpage>
<pub-id pub-id-type="doi">10.1682/JRRD.2005.01.0023</pub-id>
<pub-id pub-id-type="pmid">16586185</pub-id>
</element-citation>
</ref>
<ref id="CR5">
<label>5.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zimmerli</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Jacky</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Lünenburger</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Riener</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Bolliger</surname>
<given-names>M</given-names>
</name>
</person-group>
<article-title>Increasing patient engagement during virtual reality-based motor rehabilitation</article-title>
<source>Arch Phys Med Rehabil</source>
<year>2013</year>
<volume>94</volume>
<fpage>1737</fpage>
<lpage>46</lpage>
<pub-id pub-id-type="doi">10.1016/j.apmr.2013.01.029</pub-id>
<pub-id pub-id-type="pmid">23500181</pub-id>
</element-citation>
</ref>
<ref id="CR6">
<label>6.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Malloy</surname>
<given-names>KM</given-names>
</name>
<name>
<surname>Milling</surname>
<given-names>LS</given-names>
</name>
</person-group>
<article-title>The effectiveness of virtual reality distraction for pain reduction: A systematic review</article-title>
<source>Clin Psychol Rev</source>
<year>2010</year>
<volume>30</volume>
<fpage>1011</fpage>
<lpage>18</lpage>
<pub-id pub-id-type="doi">10.1016/j.cpr.2010.07.001</pub-id>
<pub-id pub-id-type="pmid">20691523</pub-id>
</element-citation>
</ref>
<ref id="CR7">
<label>7.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fung</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Richards</surname>
<given-names>CL</given-names>
</name>
<name>
<surname>Malouin</surname>
<given-names>F</given-names>
</name>
<name>
<surname>McFadyen</surname>
<given-names>BJ</given-names>
</name>
<name>
<surname>Lamontagne</surname>
<given-names>A</given-names>
</name>
</person-group>
<article-title>A treadmill and motion coupled virtual reality system for gait training post-stroke</article-title>
<source>Cyberpsychol Behav</source>
<year>2006</year>
<volume>9</volume>
<fpage>157</fpage>
<lpage>62</lpage>
<pub-id pub-id-type="doi">10.1089/cpb.2006.9.157</pub-id>
<pub-id pub-id-type="pmid">16640470</pub-id>
</element-citation>
</ref>
<ref id="CR8">
<label>8.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Goble</surname>
<given-names>DJ</given-names>
</name>
<name>
<surname>Cone</surname>
<given-names>BL</given-names>
</name>
<name>
<surname>Fling</surname>
<given-names>BW</given-names>
</name>
</person-group>
<article-title>Using the Wii Fit as a tool for balance assessment and neurorehabilitation: The first half decade of “wii-search”</article-title>
<source>J Neuroeng Rehabil</source>
<year>2014</year>
<volume>11</volume>
<fpage>12</fpage>
<pub-id pub-id-type="doi">10.1186/1743-0003-11-12</pub-id>
<pub-id pub-id-type="pmid">24507245</pub-id>
</element-citation>
</ref>
<ref id="CR9">
<label>9.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hoffman</surname>
<given-names>HG</given-names>
</name>
<name>
<surname>Patterson</surname>
<given-names>DR</given-names>
</name>
<name>
<surname>Carrougher</surname>
<given-names>GJ</given-names>
</name>
<name>
<surname>Sharar</surname>
<given-names>SR</given-names>
</name>
</person-group>
<article-title>Effectiveness of virtual reality-based pain control with multiple treatments</article-title>
<source>Clin J Pain</source>
<year>2001</year>
<volume>17</volume>
<fpage>229</fpage>
<lpage>35</lpage>
<pub-id pub-id-type="doi">10.1097/00002508-200109000-00007</pub-id>
<pub-id pub-id-type="pmid">11587113</pub-id>
</element-citation>
</ref>
<ref id="CR10">
<label>10.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cole</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Crowle</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Austwick</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Henderson Slater</surname>
<given-names>D</given-names>
</name>
</person-group>
<article-title>Exploratory findings with virtual reality for phantom limb pain; From stump motion to agency and analgesia</article-title>
<source>Disabil Rehabil</source>
<year>2009</year>
<volume>31</volume>
<fpage>846</fpage>
<lpage>54</lpage>
<pub-id pub-id-type="doi">10.1080/09638280802355197</pub-id>
<pub-id pub-id-type="pmid">19191061</pub-id>
</element-citation>
</ref>
<ref id="CR11">
<label>11.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Murray</surname>
<given-names>CD</given-names>
</name>
<name>
<surname>Pettifer</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Howard</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Patchick</surname>
<given-names>EL</given-names>
</name>
<name>
<surname>Caillette</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Kulkarni</surname>
<given-names>J</given-names>
</name>
<etal></etal>
</person-group>
<article-title>The treatment of phantom limb pain using immersive virtual reality: Three case studies</article-title>
<source>Disabil Rehabil</source>
<year>2007</year>
<volume>29</volume>
<fpage>1465</fpage>
<lpage>9</lpage>
<pub-id pub-id-type="doi">10.1080/09638280601107385</pub-id>
<pub-id pub-id-type="pmid">17729094</pub-id>
</element-citation>
</ref>
<ref id="CR12">
<label>12.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Resnik</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Etter</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Klinger</surname>
<given-names>SL</given-names>
</name>
<name>
<surname>Kambe</surname>
<given-names>C</given-names>
</name>
</person-group>
<article-title>Using virtual reality environment to facilitate training with advanced upper-limb prosthesis</article-title>
<source>J Rehabil Res Dev</source>
<year>2011</year>
<volume>48</volume>
<fpage>707</fpage>
<lpage>18</lpage>
<pub-id pub-id-type="doi">10.1682/JRRD.2010.07.0127</pub-id>
<pub-id pub-id-type="pmid">21938657</pub-id>
</element-citation>
</ref>
<ref id="CR13">
<label>13.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Koritnik</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Koenig</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Bajd</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Riener</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Munih</surname>
<given-names>M</given-names>
</name>
</person-group>
<article-title>Comparison of visual and haptic feedback during training of lower extremities</article-title>
<source>Gait Posture</source>
<year>2010</year>
<volume>32</volume>
<fpage>540</fpage>
<lpage>6</lpage>
<pub-id pub-id-type="doi">10.1016/j.gaitpost.2010.07.017</pub-id>
<pub-id pub-id-type="pmid">20727763</pub-id>
</element-citation>
</ref>
<ref id="CR14">
<label>14.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Barton</surname>
<given-names>GJ</given-names>
</name>
<name>
<surname>De Asha</surname>
<given-names>AR</given-names>
</name>
<name>
<surname>van Loon</surname>
<given-names>EC</given-names>
</name>
<name>
<surname>Geijtenbeek</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Robinson</surname>
<given-names>MA</given-names>
</name>
</person-group>
<article-title>Manipulation of visual biofeedback during gait with a time delayed adaptive Virtual Mirror Box</article-title>
<source>J Neuroeng Rehabil</source>
<year>2014</year>
<volume>11</volume>
<fpage>101</fpage>
<pub-id pub-id-type="doi">10.1186/1743-0003-11-101</pub-id>
<pub-id pub-id-type="pmid">24917329</pub-id>
</element-citation>
</ref>
<ref id="CR15">
<label>15.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kim</surname>
<given-names>JH</given-names>
</name>
<name>
<surname>Jang</surname>
<given-names>SH</given-names>
</name>
<name>
<surname>Kim</surname>
<given-names>CS</given-names>
</name>
<name>
<surname>Jung</surname>
<given-names>JH</given-names>
</name>
<name>
<surname>You</surname>
<given-names>JH</given-names>
</name>
</person-group>
<article-title>Use of virtual reality to enhance balance and ambulation in chronic stroke: a double-blind, randomized controlled study</article-title>
<source>Am J Phys Med Rehabil</source>
<year>2009</year>
<volume>88</volume>
<fpage>693</fpage>
<lpage>701</lpage>
<pub-id pub-id-type="doi">10.1097/PHM.0b013e3181b33350</pub-id>
<pub-id pub-id-type="pmid">19692788</pub-id>
</element-citation>
</ref>
<ref id="CR16">
<label>16.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pompeu</surname>
<given-names>JE</given-names>
</name>
<name>
<surname>Arduini</surname>
<given-names>LA</given-names>
</name>
<name>
<surname>Botelho</surname>
<given-names>AR</given-names>
</name>
<name>
<surname>Fonseca</surname>
<given-names>MBF</given-names>
</name>
<name>
<surname>Pompeu</surname>
<given-names>SMAA</given-names>
</name>
<name>
<surname>Torriani-Pasin</surname>
<given-names>C</given-names>
</name>
<etal></etal>
</person-group>
<article-title>Feasibility, safety and outcomes of playing Kinect Adventures!™ for people with Parkinson’s disease: a pilot study</article-title>
<source>Physiotherapy (United Kingdom)</source>
<year>2014</year>
<volume>100</volume>
<fpage>162</fpage>
<lpage>8</lpage>
</element-citation>
</ref>
<ref id="CR17">
<label>17.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sarig-Bahat</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Weiss</surname>
<given-names>PL</given-names>
</name>
<name>
<surname>Laufer</surname>
<given-names>Y</given-names>
</name>
</person-group>
<article-title>Neck pain assessment in a virtual environment</article-title>
<source>Spine</source>
<year>2010</year>
<volume>35</volume>
<fpage>E105</fpage>
<lpage>12</lpage>
<pub-id pub-id-type="doi">10.1097/BRS.0b013e3181b79358</pub-id>
<pub-id pub-id-type="pmid">20110842</pub-id>
</element-citation>
</ref>
<ref id="CR18">
<label>18.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kim</surname>
<given-names>S-J</given-names>
</name>
<name>
<surname>Mugisha</surname>
<given-names>D</given-names>
</name>
</person-group>
<article-title>Effect of explicit visual feedback distortion on human gait</article-title>
<source>J Neuroeng Rehabil</source>
<year>2014</year>
<volume>11</volume>
<fpage>74</fpage>
<pub-id pub-id-type="doi">10.1186/1743-0003-11-74</pub-id>
<pub-id pub-id-type="pmid">24775424</pub-id>
</element-citation>
</ref>
<ref id="CR19">
<label>19.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Lin</surname>
<given-names>Q</given-names>
</name>
<name>
<surname>Rieser</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Bodenheimer</surname>
<given-names>B</given-names>
</name>
</person-group>
<person-group person-group-type="editor">
<name>
<surname>Khooshabeh</surname>
<given-names>P</given-names>
</name>
<name>
<surname>Harders</surname>
<given-names>M</given-names>
</name>
</person-group>
<article-title>Stepping over and ducking under: The influence of an avatar on locomotion in an HMD-based immersive virtual environment</article-title>
<source>Proceedings of the ACM Symposium on Applied Perception; 3–4 August 2012; Santa Monica</source>
<year>2012</year>
<publisher-loc>New York</publisher-loc>
<publisher-name>ACM</publisher-name>
<fpage>7</fpage>
<lpage>10</lpage>
</element-citation>
</ref>
<ref id="CR20">
<label>20.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bernardi</surname>
<given-names>NF</given-names>
</name>
<name>
<surname>Marino</surname>
<given-names>BF</given-names>
</name>
<name>
<surname>Maravita</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Castelnuovo</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Tebano</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Bricolo</surname>
<given-names>E</given-names>
</name>
</person-group>
<article-title>Grasping in wonderland: Altering the visual size of the body recalibrates the body schema</article-title>
<source>Exp Brain Res</source>
<year>2013</year>
<volume>226</volume>
<fpage>585</fpage>
<lpage>94</lpage>
<pub-id pub-id-type="doi">10.1007/s00221-013-3467-7</pub-id>
<pub-id pub-id-type="pmid">23515625</pub-id>
</element-citation>
</ref>
<ref id="CR21">
<label>21.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ramachandran</surname>
<given-names>VS</given-names>
</name>
<name>
<surname>Brang</surname>
<given-names>D</given-names>
</name>
<name>
<surname>McGeoch</surname>
<given-names>PD</given-names>
</name>
</person-group>
<article-title>Size reduction using mirror visual feedback (MVF) reduces phantom pain</article-title>
<source>Neurocase</source>
<year>2009</year>
<volume>15</volume>
<fpage>357</fpage>
<lpage>60</lpage>
<pub-id pub-id-type="doi">10.1080/13554790903081767</pub-id>
<pub-id pub-id-type="pmid">19657972</pub-id>
</element-citation>
</ref>
<ref id="CR22">
<label>22.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Osumi</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Imai</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Ueta</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Nakano</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Nobusako</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Morioka</surname>
<given-names>S</given-names>
</name>
</person-group>
<article-title>Factors associated with the modulation of pain by visual distortion of body size</article-title>
<source>Front Hum Neurosci</source>
<year>2014</year>
<volume>8</volume>
<fpage>137</fpage>
<pub-id pub-id-type="doi">10.3389/fnhum.2014.00137</pub-id>
<pub-id pub-id-type="pmid">24688463</pub-id>
</element-citation>
</ref>
<ref id="CR23">
<label>23.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Moseley</surname>
<given-names>GL</given-names>
</name>
<name>
<surname>Gallace</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Spence</surname>
<given-names>C</given-names>
</name>
</person-group>
<article-title>Bodily illusions in health and disease: physiological and clinical perspectives and the concept of a cortical ‘body matrix’</article-title>
<source>Neurosci Biobehav Rev</source>
<year>2012</year>
<volume>36</volume>
<fpage>34</fpage>
<lpage>46</lpage>
<pub-id pub-id-type="doi">10.1016/j.neubiorev.2011.03.013</pub-id>
<pub-id pub-id-type="pmid">21477616</pub-id>
</element-citation>
</ref>
<ref id="CR24">
<label>24.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Dunn</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Parberry</surname>
<given-names>I</given-names>
</name>
</person-group>
<source>3D math primer for graphics and game development</source>
<year>2002</year>
<publisher-loc>Sudbury, MA, USA</publisher-loc>
<publisher-name>Jones & Bartlett Learning</publisher-name>
</element-citation>
</ref>
<ref id="CR25">
<label>25.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Witmer</surname>
<given-names>BG</given-names>
</name>
<name>
<surname>Singer</surname>
<given-names>MJ</given-names>
</name>
</person-group>
<article-title>Measuring presence in virtual environments: A presence questionnaire</article-title>
<source>Presence-Teleop Virtual</source>
<year>1998</year>
<volume>7</volume>
<fpage>225</fpage>
<lpage>40</lpage>
<pub-id pub-id-type="doi">10.1162/105474698565686</pub-id>
</element-citation>
</ref>
<ref id="CR26">
<label>26.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>de Lussanet</surname>
<given-names>MHE</given-names>
</name>
<name>
<surname>Behrendt</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Puta</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Schulte</surname>
<given-names>TL</given-names>
</name>
<name>
<surname>Lappe</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Weiss</surname>
<given-names>T</given-names>
</name>
<etal></etal>
</person-group>
<article-title>Impaired visual perception of hurtful actions in patients with chronic low back pain</article-title>
<source>Hum Mov Sci</source>
<year>2013</year>
<volume>32</volume>
<fpage>938</fpage>
<lpage>53</lpage>
<pub-id pub-id-type="doi">10.1016/j.humov.2013.05.002</pub-id>
<pub-id pub-id-type="pmid">24120278</pub-id>
</element-citation>
</ref>
<ref id="CR27">
<label>27.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Mann</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Kleinpaul</surname>
<given-names>JF</given-names>
</name>
<name>
<surname>Pereira Moro</surname>
<given-names>AR</given-names>
</name>
<name>
<surname>Mota</surname>
<given-names>CB</given-names>
</name>
<name>
<surname>Carpes</surname>
<given-names>FP</given-names>
</name>
</person-group>
<article-title>Effect of low back pain on postural stability in younger women: Influence of visual deprivation</article-title>
<source>J Bodyw Mov Ther</source>
<year>2010</year>
<volume>14</volume>
<fpage>361</fpage>
<lpage>6</lpage>
<pub-id pub-id-type="doi">10.1016/j.jbmt.2009.06.007</pub-id>
<pub-id pub-id-type="pmid">20850043</pub-id>
</element-citation>
</ref>
<ref id="CR28">
<label>28.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wand</surname>
<given-names>BM</given-names>
</name>
<name>
<surname>Keeves</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Bourgoin</surname>
<given-names>C</given-names>
</name>
<name>
<surname>George</surname>
<given-names>PJ</given-names>
</name>
<name>
<surname>Smith</surname>
<given-names>AJ</given-names>
</name>
<name>
<surname>O’Connell</surname>
<given-names>NE</given-names>
</name>
<etal></etal>
</person-group>
<article-title>Mislocalization of sensory information in people with chronic low back pain: A preliminary investigation</article-title>
<source>Clin J Pain</source>
<year>2013</year>
<volume>29</volume>
<fpage>737</fpage>
<lpage>43</lpage>
<pub-id pub-id-type="doi">10.1097/AJP.0b013e318274b320</pub-id>
<pub-id pub-id-type="pmid">23835768</pub-id>
</element-citation>
</ref>
<ref id="CR29">
<label>29.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Moseley</surname>
<given-names>GL</given-names>
</name>
<name>
<surname>Flor</surname>
<given-names>H</given-names>
</name>
</person-group>
<article-title>Targeting cortical representations in the treatment of chronic pain: a review</article-title>
<source>Neurorehabil Neural Repair</source>
<year>2012</year>
<volume>26</volume>
<fpage>646</fpage>
<lpage>52</lpage>
<pub-id pub-id-type="doi">10.1177/1545968311433209</pub-id>
<pub-id pub-id-type="pmid">22331213</pub-id>
</element-citation>
</ref>
<ref id="CR30">
<label>30.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>McCabe</surname>
<given-names>CS</given-names>
</name>
<name>
<surname>Haigh</surname>
<given-names>RC</given-names>
</name>
<name>
<surname>Blake</surname>
<given-names>DR</given-names>
</name>
</person-group>
<article-title>Mirror visual feedback for the treatment of complex regional pain syndrome (Type 1)</article-title>
<source>Curr Pain Headache Rep</source>
<year>2008</year>
<volume>12</volume>
<fpage>103</fpage>
<lpage>7</lpage>
<pub-id pub-id-type="doi">10.1007/s11916-008-0020-7</pub-id>
<pub-id pub-id-type="pmid">18474189</pub-id>
</element-citation>
</ref>
<ref id="CR31">
<label>31.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>McCabe</surname>
<given-names>CS</given-names>
</name>
<name>
<surname>Cohen</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Hall</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Lewis</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Rodham</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Harris</surname>
<given-names>N</given-names>
</name>
</person-group>
<article-title>Somatosensory conflicts in complex regional pain syndrome type 1 and fibromyalgia syndrome</article-title>
<source>Curr Rheumatol Rep</source>
<year>2009</year>
<volume>11</volume>
<fpage>461</fpage>
<lpage>5</lpage>
<pub-id pub-id-type="doi">10.1007/s11926-009-0067-4</pub-id>
<pub-id pub-id-type="pmid">19922737</pub-id>
</element-citation>
</ref>
<ref id="CR32">
<label>32.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Opris</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Pintea</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Garcia-Palacios</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Botella</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Szamoskozi</surname>
<given-names>S</given-names>
</name>
<name>
<surname>David</surname>
<given-names>D</given-names>
</name>
</person-group>
<article-title>Virtual reality exposure therapy in anxiety disorders: a quantitative meta-analysis</article-title>
<source>Depress Anxiety</source>
<year>2012</year>
<volume>29</volume>
<fpage>85</fpage>
<lpage>93</lpage>
<pub-id pub-id-type="doi">10.1002/da.20910</pub-id>
<pub-id pub-id-type="pmid">22065564</pub-id>
</element-citation>
</ref>
<ref id="CR33">
<label>33.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Coelho</surname>
<given-names>CM</given-names>
</name>
<name>
<surname>Santos</surname>
<given-names>JA</given-names>
</name>
<name>
<surname>Silva</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Wallis</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Tichon</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Hine</surname>
<given-names>TJ</given-names>
</name>
</person-group>
<article-title>The role of self-motion in acrophobia treatment</article-title>
<source>Cyberpsychol Behav</source>
<year>2008</year>
<volume>11</volume>
<fpage>723</fpage>
<lpage>5</lpage>
<pub-id pub-id-type="doi">10.1089/cpb.2008.0023</pub-id>
<pub-id pub-id-type="pmid">18991529</pub-id>
</element-citation>
</ref>
<ref id="CR34">
<label>34.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Vlaeyen</surname>
<given-names>JW</given-names>
</name>
<name>
<surname>de Jong</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Geilen</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Heuts</surname>
<given-names>PH</given-names>
</name>
<name>
<surname>van Breukelen</surname>
<given-names>G</given-names>
</name>
</person-group>
<article-title>Graded exposure in vivo in the treatment of pain-related fear: a replicated single-case experimental design in four patients with chronic low back pain</article-title>
<source>Behav Res Ther</source>
<year>2001</year>
<volume>39</volume>
<fpage>151</fpage>
<lpage>66</lpage>
<pub-id pub-id-type="doi">10.1016/S0005-7967(99)00174-6</pub-id>
<pub-id pub-id-type="pmid">11153970</pub-id>
</element-citation>
</ref>
</ref-list>
</back>
</pmc>
<affiliations>
<list></list>
<tree>
<noCountry>
<name sortKey="Bouyer, Laurent J" sort="Bouyer, Laurent J" uniqKey="Bouyer L" first="Laurent J" last="Bouyer">Laurent J. Bouyer</name>
<name sortKey="Hebert, Luc J" sort="Hebert, Luc J" uniqKey="Hebert L" first="Luc J" last="Hébert">Luc J. Hébert</name>
<name sortKey="Jackson, Philip L" sort="Jackson, Philip L" uniqKey="Jackson P" first="Philip L" last="Jackson">Philip L. Jackson</name>
<name sortKey="Mcfadyen, Bradford J" sort="Mcfadyen, Bradford J" uniqKey="Mcfadyen B" first="Bradford J" last="Mcfadyen">Bradford J. Mcfadyen</name>
<name sortKey="Mercier, Catherine" sort="Mercier, Catherine" uniqKey="Mercier C" first="Catherine" last="Mercier">Catherine Mercier</name>
<name sortKey="Robitaille, Nicolas" sort="Robitaille, Nicolas" uniqKey="Robitaille N" first="Nicolas" last="Robitaille">Nicolas Robitaille</name>
<name sortKey="Roosink, Meyke" sort="Roosink, Meyke" uniqKey="Roosink M" first="Meyke" last="Roosink">Meyke Roosink</name>
</noCountry>
</tree>
</affiliations>
</record>
