Exploration server for haptic devices

Warning: this site is under development!
Warning: this site was generated automatically from raw corpora.
The information has therefore not been validated.

Haptic/Graphic Rehabilitation: Integrating a Robot into a Virtual Environment Library and Applying it to Stroke Therapy

Internal identifier: 000248 (Pmc/Curation); previous: 000247; next: 000249

Haptic/Graphic Rehabilitation: Integrating a Robot into a Virtual Environment Library and Applying it to Stroke Therapy

Authors: Ian Sharp; James Patton; Molly Listenberger; Emily Case

Source:

RBID : PMC:3211130

Abstract

Recent research that tests interactive devices for prolonged therapy practice has revealed new prospects for robotics combined with graphical and other forms of biofeedback. Previous human-robot interactive systems have required different software commands to be implemented for each robot, leading to unnecessary development overhead each time a new system becomes available. For example, when a haptic/graphic virtual reality environment has been coded for one specific robot to provide haptic feedback, that robot cannot be traded for another without recoding the program. However, recent efforts in the open source community have proposed a wrapper class approach that can elicit nearly identical responses regardless of the robot used. As a result, researchers across the globe can perform similar experiments using shared code, and modular "switching out" of one robot for another does not affect development time. In this paper, we outline the successful creation and implementation of a wrapper class for one robot into the open-source H3DAPI, which integrates the software commands most commonly used by all robots.


Url:
DOI: 10.3791/3007
PubMed: 21847086
PubMed Central: 3211130

Links to previous steps (curation, corpus...)


Links to Exploration step

PMC:3211130

Curation

No country items

Ian Sharp
<affiliation>
<nlm:aff id="ID1">Department of Bioengineering, University of Illinois at Chicago and Rehabilitation Institute of Chicago</nlm:aff>
<wicri:noCountry code="subfield">University of Illinois at Chicago and Rehabilitation Institute of Chicago</wicri:noCountry>
</affiliation>
James Patton
<affiliation>
<nlm:aff id="ID1">Department of Bioengineering, University of Illinois at Chicago and Rehabilitation Institute of Chicago</nlm:aff>
<wicri:noCountry code="subfield">University of Illinois at Chicago and Rehabilitation Institute of Chicago</wicri:noCountry>
</affiliation>
Molly Listenberger
<affiliation>
<nlm:aff id="ID2">Sensory Motor Performance Program, Rehabilitation Institute of Chicago</nlm:aff>
<wicri:noCountry code="subfield">Rehabilitation Institute of Chicago</wicri:noCountry>
</affiliation>
Emily Case
<affiliation>
<nlm:aff id="ID2">Sensory Motor Performance Program, Rehabilitation Institute of Chicago</nlm:aff>
<wicri:noCountry code="subfield">Rehabilitation Institute of Chicago</wicri:noCountry>
</affiliation>

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Haptic/Graphic Rehabilitation: Integrating a Robot into a Virtual Environment Library and Applying it to Stroke Therapy</title>
<author>
<name sortKey="Sharp, Ian" sort="Sharp, Ian" uniqKey="Sharp I" first="Ian" last="Sharp">Ian Sharp</name>
<affiliation>
<nlm:aff id="ID1">Department of Bioengineering, University of Illinois at Chicago and Rehabilitation Institute of Chicago</nlm:aff>
<wicri:noCountry code="subfield">University of Illinois at Chicago and Rehabilitation Institute of Chicago</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Patton, James" sort="Patton, James" uniqKey="Patton J" first="James" last="Patton">James Patton</name>
<affiliation>
<nlm:aff id="ID1">Department of Bioengineering, University of Illinois at Chicago and Rehabilitation Institute of Chicago</nlm:aff>
<wicri:noCountry code="subfield">University of Illinois at Chicago and Rehabilitation Institute of Chicago</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Listenberger, Molly" sort="Listenberger, Molly" uniqKey="Listenberger M" first="Molly" last="Listenberger">Molly Listenberger</name>
<affiliation>
<nlm:aff id="ID2">Sensory Motor Performance Program, Rehabilitation Institute of Chicago</nlm:aff>
<wicri:noCountry code="subfield">Rehabilitation Institute of Chicago</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Case, Emily" sort="Case, Emily" uniqKey="Case E" first="Emily" last="Case">Emily Case</name>
<affiliation>
<nlm:aff id="ID2">Sensory Motor Performance Program, Rehabilitation Institute of Chicago</nlm:aff>
<wicri:noCountry code="subfield">Rehabilitation Institute of Chicago</wicri:noCountry>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">21847086</idno>
<idno type="pmc">3211130</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3211130</idno>
<idno type="RBID">PMC:3211130</idno>
<idno type="doi">10.3791/3007</idno>
<date when="2011">2011</date>
<idno type="wicri:Area/Pmc/Corpus">000248</idno>
<idno type="wicri:Area/Pmc/Curation">000248</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Haptic/Graphic Rehabilitation: Integrating a Robot into a Virtual Environment Library and Applying it to Stroke Therapy</title>
<author>
<name sortKey="Sharp, Ian" sort="Sharp, Ian" uniqKey="Sharp I" first="Ian" last="Sharp">Ian Sharp</name>
<affiliation>
<nlm:aff id="ID1">Department of Bioengineering, University of Illinois at Chicago and Rehabilitation Institute of Chicago</nlm:aff>
<wicri:noCountry code="subfield">University of Illinois at Chicago and Rehabilitation Institute of Chicago</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Patton, James" sort="Patton, James" uniqKey="Patton J" first="James" last="Patton">James Patton</name>
<affiliation>
<nlm:aff id="ID1">Department of Bioengineering, University of Illinois at Chicago and Rehabilitation Institute of Chicago</nlm:aff>
<wicri:noCountry code="subfield">University of Illinois at Chicago and Rehabilitation Institute of Chicago</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Listenberger, Molly" sort="Listenberger, Molly" uniqKey="Listenberger M" first="Molly" last="Listenberger">Molly Listenberger</name>
<affiliation>
<nlm:aff id="ID2">Sensory Motor Performance Program, Rehabilitation Institute of Chicago</nlm:aff>
<wicri:noCountry code="subfield">Rehabilitation Institute of Chicago</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Case, Emily" sort="Case, Emily" uniqKey="Case E" first="Emily" last="Case">Emily Case</name>
<affiliation>
<nlm:aff id="ID2">Sensory Motor Performance Program, Rehabilitation Institute of Chicago</nlm:aff>
<wicri:noCountry code="subfield">Rehabilitation Institute of Chicago</wicri:noCountry>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Journal of Visualized Experiments : JoVE</title>
<idno type="eISSN">1940-087X</idno>
<imprint>
<date when="2011">2011</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>Recent research that tests interactive devices for prolonged therapy practice has revealed new prospects for robotics combined with graphical and other forms of biofeedback. Previous human-robot interactive systems have required different software commands to be implemented for each robot, leading to unnecessary development overhead each time a new system becomes available. For example, when a haptic/graphic virtual reality environment has been coded for one specific robot to provide haptic feedback, that robot cannot be traded for another without recoding the program. However, recent efforts in the open source community have proposed a wrapper class approach that can elicit nearly identical responses regardless of the robot used. As a result, researchers across the globe can perform similar experiments using shared code, and modular "switching out" of one robot for another does not affect development time. In this paper, we outline the successful creation and implementation of a wrapper class for one robot into the open-source H3DAPI, which integrates the software commands most commonly used by all robots.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Solis, J" uniqKey="Solis J">J Solis</name>
</author>
<author>
<name sortKey="Takeshi, N" uniqKey="Takeshi N">N Takeshi</name>
</author>
<author>
<name sortKey="Petersen, K" uniqKey="Petersen K">K Petersen</name>
</author>
<author>
<name sortKey="Takeuchi, M" uniqKey="Takeuchi M">M Takeuchi</name>
</author>
<author>
<name sortKey="Takanishi, A" uniqKey="Takanishi A">A Takanishi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Zufferey, Jc" uniqKey="Zufferey J">JC Zufferey</name>
</author>
<author>
<name sortKey="Floreano, D" uniqKey="Floreano D">D Floreano</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Conditt, Ma" uniqKey="Conditt M">MA Conditt</name>
</author>
<author>
<name sortKey="Gandolfo, F" uniqKey="Gandolfo F">F Gandolfo</name>
</author>
<author>
<name sortKey="Mussa Ivaldi, Fa" uniqKey="Mussa Ivaldi F">FA Mussa-Ivaldi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Krebs, Hi" uniqKey="Krebs H">HI Krebs</name>
</author>
<author>
<name sortKey="Palazzolo, Jj" uniqKey="Palazzolo J">JJ Palazzolo</name>
</author>
<author>
<name sortKey="Dipietro, L" uniqKey="Dipietro L">L Dipietro</name>
</author>
<author>
<name sortKey="Ferraro, M" uniqKey="Ferraro M">M Ferraro</name>
</author>
<author>
<name sortKey="Krol, J" uniqKey="Krol J">J Krol</name>
</author>
<author>
<name sortKey="Rannekleiv, K" uniqKey="Rannekleiv K">K Rannekleiv</name>
</author>
<author>
<name sortKey="Volpe, Bt" uniqKey="Volpe B">BT Volpe</name>
</author>
<author>
<name sortKey="Hogan, N" uniqKey="Hogan N">N Hogan</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wei, K" uniqKey="Wei K">K Wei</name>
</author>
<author>
<name sortKey="Kording, K" uniqKey="Kording K">K Kording</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wei, Y" uniqKey="Wei Y">Y Wei</name>
</author>
<author>
<name sortKey="Bajaj, P" uniqKey="Bajaj P">P Bajaj</name>
</author>
<author>
<name sortKey="Scheidt, R" uniqKey="Scheidt R">R Scheidt</name>
</author>
<author>
<name sortKey="Patton, J" uniqKey="Patton J">J Patton</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">J Vis Exp</journal-id>
<journal-id journal-id-type="iso-abbrev">J Vis Exp</journal-id>
<journal-id journal-id-type="publisher-id">JoVE</journal-id>
<journal-title-group>
<journal-title>Journal of Visualized Experiments : JoVE</journal-title>
</journal-title-group>
<issn pub-type="epub">1940-087X</issn>
<publisher>
<publisher-name>MyJove Corporation</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">21847086</article-id>
<article-id pub-id-type="pmc">3211130</article-id>
<article-id pub-id-type="publisher-id">3007</article-id>
<article-id pub-id-type="doi">10.3791/3007</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Bioengineering</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Haptic/Graphic Rehabilitation: Integrating a Robot into a Virtual Environment Library and Applying it to Stroke Therapy</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Sharp</surname>
<given-names>Ian</given-names>
</name>
<xref ref-type="aff" rid="ID1">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Patton</surname>
<given-names>James</given-names>
</name>
<xref ref-type="aff" rid="ID1">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Listenberger</surname>
<given-names>Molly</given-names>
</name>
<xref ref-type="aff" rid="ID2">
<sup>2</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Case</surname>
<given-names>Emily</given-names>
</name>
<xref ref-type="aff" rid="ID2">
<sup>2</sup>
</xref>
</contrib>
</contrib-group>
<aff id="ID1">
<sup>1</sup>
Department of Bioengineering, University of Illinois at Chicago and Rehabilitation Institute of Chicago</aff>
<aff id="ID2">
<sup>2</sup>
Sensory Motor Performance Program, Rehabilitation Institute of Chicago</aff>
<author-notes>
<fn>
<p>Correspondence to: James Patton at
<email>pattonj@uic.edu</email>
</p>
</fn>
</author-notes>
<pub-date pub-type="collection">
<year>2011</year>
</pub-date>
<pub-date pub-type="epub">
<day>8</day>
<month>8</month>
<year>2011</year>
</pub-date>
<pub-date pub-type="pmc-release">
<day>8</day>
<month>8</month>
<year>2011</year>
</pub-date>
<pmc-comment> PMC Release delay is 0 months and 0 days and was based on the . </pmc-comment>
<issue>54</issue>
<elocation-id>3007</elocation-id>
<permissions>
<copyright-statement>Copyright © 2011, Journal of Visualized Experiments</copyright-statement>
<copyright-year>2011</copyright-year>
<license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by-nc-nd/3.0/">
<license-p>This is an open-access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License. To view a copy of this license, visit
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by-nc-nd/3.0/">http://creativecommons.org/licenses/by-nc-nd/3.0/</ext-link>
</license-p>
</license>
</permissions>
<abstract>
<p>Recent research that tests interactive devices for prolonged therapy practice has revealed new prospects for robotics combined with graphical and other forms of biofeedback. Previous human-robot interactive systems have required different software commands to be implemented for each robot, leading to unnecessary development overhead each time a new system becomes available. For example, when a haptic/graphic virtual reality environment has been coded for one specific robot to provide haptic feedback, that robot cannot be traded for another without recoding the program. However, recent efforts in the open source community have proposed a wrapper class approach that can elicit nearly identical responses regardless of the robot used. As a result, researchers across the globe can perform similar experiments using shared code, and modular "switching out" of one robot for another does not affect development time. In this paper, we outline the successful creation and implementation of a wrapper class for one robot into the open-source H3DAPI, which integrates the software commands most commonly used by all robots.</p>
</abstract>
<kwd-group kwd-group-type="author-generated">
<kwd>Bioengineering</kwd>
<kwd>Issue 54</kwd>
<kwd>robotics</kwd>
<kwd>haptics</kwd>
<kwd>virtual reality</kwd>
<kwd>wrapper class</kwd>
<kwd>rehabilitation robotics</kwd>
<kwd>neural engineering</kwd>
<kwd>H3DAPI</kwd>
<kwd>C++</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<fig id="Fig_3007" orientation="portrait" position="anchor">
<alternatives>
<media id="Video_3007" xlink:href="jove-54-3007.mp4" mimetype="video" mime-subtype="mp4" orientation="portrait" xlink:type="simple" position="anchor"></media>
<graphic xlink:href="jove-54-3007-thumb"></graphic>
</alternatives>
</fig>
<sec>
<title>Protocol</title>
<sec>
<title>Introduction</title>
<p>There is a growing need throughout human-machine interaction (HMI) for intuitive and efficient interactive environments. Numerous industries depend increasingly on HMI, including rehabilitation robotics, the automotive industry, metals manufacturing, packaging machinery, pharmaceuticals, food, beverage, and utilities. Technologies employed in these industries include display terminals, personal computers, and HMI software. These technologies may be combined to perform a virtually unlimited range of functions.</p>
<p>Robots may be used to facilitate direct interaction with users, such as serving as a music instructor. For example, researchers at Waseda University have created a robot that plays the saxophone to teach people how to play and to understand the interaction between student and teacher
<sup>1</sup>
. Other robotics researchers have made a vision-based flying robot in order to determine how artificial intelligence may evolve into intelligent interactions with the environment
<sup>2</sup>
. The focus of this paper, however, is rehabilitation robotics.</p>
<p>In both research and industry, new products and user requirements change at an ever quicker pace, imposing growing challenges in scalability. Code design has therefore become integral to meeting these needs in a timely manner. A strong architectural candidate should allow graphic-robot systems, including their driver support, to be easily interchanged. The H3DAPI architecture meets these needs, and thus a wrapper class has been created. Furthermore, H3D is designed for virtual reality environments, such as those needed in rehabilitation robotics.</p>
<p>Neural rehabilitation robotics seeks to use robots to assist rehabilitation professionals. The assistance these robots provide comes in the form of a force-field. Past motor control researchers such as Shadmehr and Mussa-Ivaldi used force-fields to foster motor adaptation and found that 1) adaptation to an externally applied force field occurs with different classes of movements, including but not limited to reaching movements, and 2) adaptation generalizes across different movements that visit the same regions of the external field
<sup>3</sup>
. Research from biomechanical engineers in Performance-Based Progressive Robot-Assisted therapy shows that repetitive, task-specific, goal-directed, robot-assisted therapy is effective in reducing motor impairments in the affected arm after stroke
<sup>4</sup>
, but the exact therapeutic effects and parameters continue to be a field of research.</p>
<p>Sensory feedback affects learning and adaptation. The next logical question, then, is whether artificially increasing the magnitude of such feedback promotes faster or more complete learning/adaptation. Some researchers have found that applying greater sensory feedback forces or visual cues to enhance mistakes can provide an adequate neurological stimulus to promote higher levels of adaptation/learning
<sup>5,6</sup>
. This is known as "error augmentation". This phenomenon may arise because, once the outcome of a motor action deviates from the ideal, our internal model self-adjusts according to the magnitude of the error. Consequently, as our internal model approaches the external environment, error in the execution of a task decreases.</p>
<p>Research continues to support prolonged practice of functionally relevant activities for restoration of function, although many current health care policies limit the amount of time patients can spend with therapists. The compelling question is whether these new applications of technology can go further than simply giving a higher dosage of the current state of care. Human-machine interaction studies have revealed new prospects in the areas of motor learning, and can in some cases offer added value to the therapeutic process. Specialized robotic devices combined with computer displays can augment feedback of error in order to speed up, enhance, or trigger motor relearning. This paper presents a methodology for using the developed system in a clinical intervention as one example of the application of this technology.</p>
</sec>
<sec>
<title>1. Establishing HAPI wrapper class for a robot</title>
<list list-type="order">
<list-item>
<p>Create a wrapper for HAPI, the haptics library, by creating your own .cpp and header files. For example, we will use the names HAPIWAM.cpp and HAPIWAM.h.</p>
</list-item>
<list-item>
<p>Place HAPIWAM.cpp into the source directory: HAPI/src</p>
</list-item>
<list-item>
<p>Place HAPIWAM.h into the header file directory: HAPI/include/HAPI</p>
</list-item>
<list-item>
<p>At the top of HAPIWAM.h, include the main header file(s) of your robot; in the case of the Barrett WAM, that would be:</p>
</list-item>
</list>
<p>
<italic>extern "C" {
#include
}
#include </italic>
</p>
<p>Note: extern "C" is required to prevent C++ name mangling, because the included library is written in C and the H3DAPI is written in C++.</p>
<list list-type="order">
<list-item>
<p>In HAPIWAM.h, create your class and include the following four functions:</p>
</list-item>
</list>
<p>
<italic>bool initHapticsDevice(int);
bool releaseHapticsDevice();
void updateDeviceValues(DeviceValues &dv, HAPITime dt);
void sendOutput(HAPIHapticsDevice::DeviceOutput &d, HAPITime t);</italic>
</p>
<list list-type="order">
<list-item>
<p>Make sure your class inherits publicly from the HAPIHapticsDevice class.</p>
</list-item>
<list-item>
<p>Create a header guard for your class.</p>
</list-item>
<list-item>
<p>Create static DeviceOutput and static HapticsDeviceRegistration attributes under the HAPIWAM class.</p>
</list-item>
<list-item>
<p>Create your static member functions for callbacks.</p>
</list-item>
<list-item>
<p>Define your constructor and destructor in HAPIWAM.cpp.</p>
</list-item>
<list-item>
<p>Register your device in HAPIWAM.cpp.</p>
</list-item>
<list-item>
<p>Define your 4 inherited functions and callbacks in HAPIWAM.cpp.</p>
</list-item>
</list>
</sec>
<sec>
<title>2. HAPI library creation</title>
<list list-type="order">
<list-item>
<p>Now that we have created the HAPI wrapper class, we need to build the wrapper into the HAPI library. The WAM depends on some libraries that the H3DAPI does not depend on in its raw form, so these libraries need to be added to HAPI. Go to HAPI/HAPI/build and edit CMakeLists.txt, adding the dependent libraries after the line that says 'SET(OptionalLibs)'.</p>
</list-item>
<list-item>
<p>Open a command console and navigate to: HAPI/HAPI/build and type the following 3 commands in this order:</p>
</list-item>
</list>
<p>
<italic>cmake .
sudo make
sudo make install</italic>
</p>
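The CMakeLists.txt edit above can be sketched as follows; the library name "btwam" is a hypothetical placeholder for the WAM's actual dependent libraries, which vary by installation:

```cmake
# Sketch only: in HAPI/HAPI/build/CMakeLists.txt, append the robot's
# dependent libraries after this line. "btwam" is an illustrative name,
# not a verified library of the Barrett distribution.
SET(OptionalLibs)
SET(OptionalLibs ${OptionalLibs} btwam)
```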
</sec>
<sec>
<title>3. H3D wrapper class</title>
<list list-type="order">
<list-item>
<p>To create the wrapper class for the H3D library with your HAPIWAM, first create WAMDevice.cpp in the source directory: H3DAPI/src</p>
</list-item>
<list-item>
<p>Place WAMDevice.h into the header file directory: H3DAPI/include/H3D</p>
</list-item>
<list-item>
<p>WAMDevice.h should contain the standard header file for all H3DAPI devices, with the name replaced by one of your choosing.</p>
</list-item>
<list-item>
<p>WAMDevice.cpp should contain the standard source for all H3DAPI devices, with the name replaced by one of your choosing.</p>
</list-item>
<list-item>
<p>Now that the wrapper class has been created, rebuild the H3DAPI library. Do this by editing CMakeLists.txt in the same way as in step 2.1, but under the directory H3DAPI/build.</p>
</list-item>
<list-item>
<p>Rebuild the H3DAPI library under the directory H3DAPI/build</p>
</list-item>
</list>
<p>
<italic>cmake .
sudo make
sudo make install</italic>
</p>
</sec>
<sec>
<title>4. Finite state machine</title>
<list list-type="order">
<list-item>
<p>Every targeted reaching program requires a finite state machine to control the experimental protocol or practice regime. Typical states include: Start of Trial, Launch, Target Contact, and End of Trial. The function of each state and the criteria for transferring between states are described below.</p>
</list-item>
<list-item>
<p>The Start of Trial requires the allocation of a target. Target locations may be set randomly for each trial or read from a file. The Start of Trial state ends once the user has launched toward the target above a velocity threshold, typically 0.06 meters per second.</p>
</list-item>
<list-item>
<p>The Launch state follows the Start of Trial. It ends either once the user touches the target or once the user has stayed inside the target for a period of time. Once the target is touched, the Target Contact state is enabled.</p>
</list-item>
<list-item>
<p>Target Contact begins when the target is touched during the Launch state. It may end as soon as the target is touched or after the subject has resided within the target for a specific period of time. Once this time has elapsed, the End of Trial state is enabled.</p>
</list-item>
<list-item>
<p>The End of Trial state should signal the data collection software to mark the data file, in whatever format the software developer has chosen, to delimit the end of each trial. Unless the final trial has been completed, the end of the End of Trial state enables the Start of Trial state.</p>
</list-item>
</list>
</sec>
<sec>
<title>5. Application: rehabilitation of the stroke patient </title>
<list list-type="order">
<list-item>
<p>The robotic interface was designed to involve therapist expertise while using the robot to enable something that could not otherwise be done. The technology enabled the application of error augmentation (described in more detail below), which magnified the errors perceived by the patient and which, for several well-known reasons, enhances the relearning process (fig 1).</p>
</list-item>
<list-item>
<p>We used a three-dimensional haptics/graphics system called the Virtual Reality Robotic and Optical Operations Machine (VRROOM). This system, presented previously
<sup>6</sup>
, combined a projected stereo, head-tracked rendering on a semi-silvered mirror overlay display with a robotic system that recorded wrist position and generated a force vector (fig 2).</p>
</list-item>
<list-item>
<p>A cinema-quality digital projector (Christie Mirage 3000 DLP) displayed the images that spanned a five-foot-wide 1280x1024 pixel display, resulting in a 110° wide viewing angle. Infrared emitters synchronized separate left and right eye images through Liquid Crystal Display (LCD) shutter glasses (Stereographics, Inc). Ascension Flock of Birds magnetic elements tracked motion of the head so that the visual display was rendered with the appropriate head-centered perspective.</p>
</list-item>
<list-item>
<p>Upon qualifying for the study, each participant's functional ability was evaluated by a blind rater at the start and finish of each treatment paradigm with a one-week follow-up after each and an overall 45-day follow-up evaluation. Each evaluation consisted of a range of motion (ROM) assessment performed in the VRROOM as well as clinical measures including: the Box and Blocks Assessment, Wolf Motor Function Test (WMFT), Arm Motor Section of the Fugl-Meyer (AMFM), and Assessment of Simple Functional Reach (ASFR). </p>
</list-item>
<list-item>
<p>An exotendon glove with wrist splint was utilized to assist in neutral wrist and hand alignment. The center of the robot handle, attached to the forearm, was placed posterior to the radiocarpal joint so that its forces acted at the wrist but allowed motion at the hand.</p>
</list-item>
<list-item>
<p>The patient's arm weight was lessened using a spring-powered Wilmington Robotic Exoskeleton (WREX) gravity-balanced orthosis. The patient's instructed goal was to chase a cursor presented in front of them, moved via a tracking device held in the therapist's hand (therapist teleoperation).</p>
</list-item>
<list-item>
<p>Patients practiced three days per week for approximately 40-60 minutes, with the patient, the therapist, and the robot working together as a trio. Subject and therapist sat side-by-side, and the subject was connected to the robot at the wrist.</p>
</list-item>
<list-item>
<p>Each session began with five minutes of passive range of motion exercises (PROM) with the therapist, followed by approximately ten minutes for situating the patient in the machine. The subject then completed six blocks of movement training lasting five minutes each with two-minute rest periods between each block.</p>
</list-item>
<list-item>
<p>During training, participants viewed two cursors on the stereo display. The treating therapist manipulated one cursor while the participant controlled the other. Patients were instructed to follow the exact path of the therapist’s cursor as it moved throughout the workspace. </p>
</list-item>
<list-item>
<p>Error augmentation was provided both visually and by forces generated by the robot. When participants deviated from the therapist’s cursor, an instantaneous error vector e was established as the difference in position between the therapist’s cursor and the participant’s hand. Error was visually magnified by a factor of 1.5
<italic>e</italic>
(m) as part of the error augmentation. Additionally, an error-augmenting force of 100
<italic>e</italic>
(N/m) was also applied, which was programmed to saturate at a maximum of 4 N for safety reasons.</p>
</list-item>
<list-item>
<p>Every other treatment block consisted of specific, standardized motions that were the same for each session. The other blocks allowed the therapist to customize training at specific areas of weakness based on therapist expertise and their observations. The treatment protocol included the practice of specific movements for all participants; including forward and side reaching, shoulder-elbow coupling, and diagonal reaching across the body.</p>
</list-item>
<list-item>
<p>Day-to-day median error during practice was measured as one outcome. Special attention was given to blocks of standardized motions that were the same for each session. These were compared to previous days to determine whether any incremental improvement could be observed on a day-to-day basis, which could be reported to the patient, the therapist, and the caregivers (fig 3).</p>
</list-item>
<list-item>
<p>Primary outcome measures were taken weekly, 1 week after the end of treatment, and 45 days post-treatment to determine the retention of benefits. Key outcomes were the Fugl-Meyer motor ability score and our customized arm reach test, which measured range of motion.</p>
</list-item>
</list>
</sec>
<sec>
<title>6. Representative Results: </title>
<p>When the protocol is done correctly, once the node is loaded into H3DViewer or H3DLoad, the WAM device should be recognized and initialized. If the WAM were replaced with another robot, the code itself would not need to be changed.</p>
<p>
<graphic xlink:href="jove-54-3007-0.jpg" position="float" orientation="portrait"></graphic>
<bold>Figure 1.</bold>
Subject seated at the haptic/graphic apparatus.</p>
<p>
<graphic xlink:href="jove-54-3007-1.jpg" position="float" orientation="portrait"></graphic>
<bold>Figure 2.</bold>
Subject seated at the haptic/graphic apparatus with physical therapist.</p>
<p>
<graphic xlink:href="jove-54-3007-2.jpg" position="float" orientation="portrait"></graphic>
<bold>Figure 3.</bold>
Configuration for rehabilitation of the stroke patient. A) Subject and therapist working together, seated and using the large-workspace haptic/graphic display to practice movement. The therapist provides a cue for the subject, and can tailor conditioning to the needs of the patient. The robot provides forces that push the limb away from the target, and the visual feedback system enhances the error of the cursor. B) Typical chronic stroke patient improvement from day to day. Each dot represents the median error measured for a 2-minute block of stereotypical functional movement. While the patient shows progress across the 2-week period and overall benefit, this person did not always improve each day.</p>
</sec>
</sec>
<sec sec-type="discussion">
<title>Discussion</title>
<p>This method of wrapper-class implementation allows different robots to be used with the H3DAPI without changing the source code. Specifically, researchers who have written their haptic/graphic environment in H3D and tested their experiment with a Phantom robot would be able to carry out the same or a similar experiment using the Barrett WAM, and vice versa. This type of device-independent cross-communication carries implications for international rehabilitation robotics research: it facilitates rapid haptic/graphic development, international research collaboration, and inter-laboratory communication.</p>
<p>Rehabilitation robotics has yet to map out the numerous parameters involved in motor learning. One of the time-consuming steps in haptic/graphic development is compilation. With numerous rehabilitation parameters, compounded by the compilation time for each program, the development life cycle needed to test all possible permutations grows rapidly. H3D, which requires no compilation, allows numerous virtual reality scenes to be developed quickly. This is an advantage for researchers aspiring to probe the effects of various training scenarios.</p>
<p>One limitation of this 'hard-coded' wrapper-class integration approach is that the procedure must be repeated each time a new distribution of the H3DAPI is released. A possible modification would be to create the wrapper class separately from the H3DAPI and place it in a *.so shared-library file, isolating the class from the original H3DAPI distribution.</p>
</sec>
<sec>
<title>Disclosures</title>
<p>The wrapper classes in this tutorial are under copyright by Ian Sharp.</p>
</sec>
</body>
<back>
<ack>
<p>I would like to acknowledge the technical help of Brian Zenowich, Daniel Evestedt and Winsean Lin.</p>
</ack>
<ref-list>
<ref id="B0">
<element-citation publication-type="journal" publisher-type="journal">
<person-group person-group-type="author">
<name>
<surname>Solis</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Takeshi</surname>
<given-names>N</given-names>
</name>
<name>
<surname>Petersen</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Takeuchi</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Takanishi</surname>
<given-names>A</given-names>
</name>
</person-group>
<article-title>Development of the anthropomorphic saxophonist robot WAS-1: Mechanical design of the simulated organs and implementation of air pressure</article-title>
<source>Advanced Robotics Journal</source>
<year>2010</year>
<volume>24</volume>
<fpage>629</fpage>
<lpage>650</lpage>
</element-citation>
</ref>
<ref id="B1">
<element-citation publication-type="confproc" publisher-type="confproc">
<person-group person-group-type="editor">
<name>
<surname>Zufferey</surname>
<given-names>JC</given-names>
</name>
<name>
<surname>Floreano</surname>
<given-names>D</given-names>
</name>
</person-group>
<article-title>Evolving Vision-Based Flying Robots</article-title>
<year>2002</year>
<conf-name>Proceedings of the 2nd International Workshop on Biologically Motivated Computer Vision</conf-name>
<conf-date>2002 November</conf-date>
<conf-loc>Berlin</conf-loc>
<publisher-name>Springer-Verlag</publisher-name>
<fpage>592</fpage>
<lpage>600</lpage>
</element-citation>
</ref>
<ref id="B2">
<element-citation publication-type="journal" publisher-type="journal">
<person-group person-group-type="author">
<name>
<surname>Conditt</surname>
<given-names>MA</given-names>
</name>
<name>
<surname>Gandolfo</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Mussa-Ivaldi</surname>
<given-names>FA</given-names>
</name>
</person-group>
<article-title>The motor system does not learn the dynamics of the arm by rote memorization of past experience</article-title>
<source>Journal of Neurophysiology</source>
<year>1997</year>
<volume>78</volume>
<fpage>554</fpage>
<lpage>560</lpage>
<pub-id pub-id-type="pmid">9242306</pub-id>
</element-citation>
</ref>
<ref id="B3">
<element-citation publication-type="journal" publisher-type="journal">
<person-group person-group-type="author">
<name>
<surname>Krebs</surname>
<given-names>HI</given-names>
</name>
<name>
<surname>Palazzolo</surname>
<given-names>JJ</given-names>
</name>
<name>
<surname>Dipietro</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Ferraro</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Krol</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Rannekleiv</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Volpe</surname>
<given-names>BT</given-names>
</name>
<name>
<surname>Hogan</surname>
<given-names>N</given-names>
</name>
</person-group>
<article-title>Rehabilitation robotics: Performance-based progressive robot-assisted therapy</article-title>
<source>Autonomous Robots</source>
<year>2003</year>
<volume>15</volume>
<fpage>7</fpage>
<lpage>20</lpage>
</element-citation>
</ref>
<ref id="B4">
<element-citation publication-type="journal" publisher-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wei</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Kording</surname>
<given-names>K</given-names>
</name>
</person-group>
<article-title>Relevance of error: what drives motor adaptation</article-title>
<source>Journal of Neurophysiology</source>
<year>2009</year>
<volume>101</volume>
<fpage>655</fpage>
<lpage>664</lpage>
<pub-id pub-id-type="pmid">19019979</pub-id>
</element-citation>
</ref>
<ref id="B5">
<element-citation publication-type="other" publisher-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wei</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Bajaj</surname>
<given-names>P</given-names>
</name>
<name>
<surname>Scheidt</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Patton</surname>
<given-names>J</given-names>
</name>
</person-group>
<article-title>Visual error augmentation for enhancing motor learning and rehabilitative relearning</article-title>
<source>IEEE International Conference on Rehabilitation Robotics</source>
<year>2005</year>
<fpage>505</fpage>
<lpage>510</lpage>
</element-citation>
</ref>
</ref-list>
</back>
</pmc>
</record>
