Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information is therefore not validated.

Robust Hand Motion Tracking through Data Fusion of 5DT Data Glove and Nimble VR Kinect Camera Measurements

Internal identifier: 000390 (Pmc/Curation); previous: 000389; next: 000391

Robust Hand Motion Tracking through Data Fusion of 5DT Data Glove and Nimble VR Kinect Camera Measurements

Authors: Ewout A. Arkenbout; Joost C. F. De Winter; Paul Breedveld

Source:

RBID: PMC:4721788

Abstract

Vision based interfaces for human computer interaction have gained increasing attention over the past decade. This study presents a data fusion approach of the Nimble VR vision based system, using the Kinect camera, with the contact based 5DT Data Glove. Data fusion was achieved through a Kalman filter. The Nimble VR and filter output were compared using measurements performed on (1) a wooden hand model placed in various static postures and orientations; and (2) three differently sized human hands during active finger flexions. Precision and accuracy of joint angle estimates as a function of hand posture and orientation were determined. Moreover, in light of possible self-occlusions of the fingers in the Kinect camera images, data completeness was assessed. Results showed that the integration of the Data Glove through the Kalman filter provided for the proximal interphalangeal (PIP) joints of the fingers a substantial improvement of 79% in precision, from 2.2 deg to 0.9 deg. Moreover, a moderate improvement of 31% in accuracy (being the mean angular deviation from the true joint angle) was established, from 24 deg to 17 deg. The metacarpophalangeal (MCP) joint was relatively unaffected by the Kalman filter. Moreover, the Data Glove increased data completeness, thus providing a substantial advantage over the sole use of the Nimble VR system.


URL:
DOI: 10.3390/s151229868
PubMed: 26694395
PubMed Central: 4721788

Links toward previous steps (curation, corpus...)


Links to Exploration step

PMC:4721788

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Robust Hand Motion Tracking through Data Fusion of 5DT Data Glove and Nimble VR Kinect Camera Measurements</title>
<author>
<name sortKey="Arkenbout, Ewout A" sort="Arkenbout, Ewout A" uniqKey="Arkenbout E" first="Ewout A." last="Arkenbout">Ewout A. Arkenbout</name>
</author>
<author>
<name sortKey="De Winter, Joost C F" sort="De Winter, Joost C F" uniqKey="De Winter J" first="Joost C. F." last="De Winter">Joost C. F. De Winter</name>
</author>
<author>
<name sortKey="Breedveld, Paul" sort="Breedveld, Paul" uniqKey="Breedveld P" first="Paul" last="Breedveld">Paul Breedveld</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">26694395</idno>
<idno type="pmc">4721788</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4721788</idno>
<idno type="RBID">PMC:4721788</idno>
<idno type="doi">10.3390/s151229868</idno>
<date when="2015">2015</date>
<idno type="wicri:Area/Pmc/Corpus">000390</idno>
<idno type="wicri:Area/Pmc/Curation">000390</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Robust Hand Motion Tracking through Data Fusion of 5DT Data Glove and Nimble VR Kinect Camera Measurements</title>
<author>
<name sortKey="Arkenbout, Ewout A" sort="Arkenbout, Ewout A" uniqKey="Arkenbout E" first="Ewout A." last="Arkenbout">Ewout A. Arkenbout</name>
</author>
<author>
<name sortKey="De Winter, Joost C F" sort="De Winter, Joost C F" uniqKey="De Winter J" first="Joost C. F." last="De Winter">Joost C. F. De Winter</name>
</author>
<author>
<name sortKey="Breedveld, Paul" sort="Breedveld, Paul" uniqKey="Breedveld P" first="Paul" last="Breedveld">Paul Breedveld</name>
</author>
</analytic>
<series>
<title level="j">Sensors (Basel, Switzerland)</title>
<idno type="eISSN">1424-8220</idno>
<imprint>
<date when="2015">2015</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>Vision based interfaces for human computer interaction have gained increasing attention over the past decade. This study presents a data fusion approach of the Nimble VR vision based system, using the Kinect camera, with the contact based 5DT Data Glove. Data fusion was achieved through a Kalman filter. The Nimble VR and filter output were compared using measurements performed on (1) a wooden hand model placed in various static postures and orientations; and (2) three differently sized human hands during active finger flexions. Precision and accuracy of joint angle estimates as a function of hand posture and orientation were determined. Moreover, in light of possible self-occlusions of the fingers in the Kinect camera images, data completeness was assessed. Results showed that the integration of the Data Glove through the Kalman filter provided for the proximal interphalangeal (PIP) joints of the fingers a substantial improvement of 79% in precision, from 2.2 deg to 0.9 deg. Moreover, a moderate improvement of 31% in accuracy (being the mean angular deviation from the true joint angle) was established, from 24 deg to 17 deg. The metacarpophalangeal (MCP) joint was relatively unaffected by the Kalman filter. Moreover, the Data Glove increased data completeness, thus providing a substantial advantage over the sole use of the Nimble VR system.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Rautaray, S" uniqKey="Rautaray S">S. Rautaray</name>
</author>
<author>
<name sortKey="Agrawal, A" uniqKey="Agrawal A">A. Agrawal</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Suarez, J" uniqKey="Suarez J">J. Suarez</name>
</author>
<author>
<name sortKey="Murphy, R R" uniqKey="Murphy R">R.R. Murphy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Erol, A" uniqKey="Erol A">A. Erol</name>
</author>
<author>
<name sortKey="Bebis, G" uniqKey="Bebis G">G. Bebis</name>
</author>
<author>
<name sortKey="Nicolescu, M" uniqKey="Nicolescu M">M. Nicolescu</name>
</author>
<author>
<name sortKey="Boyle, R D" uniqKey="Boyle R">R.D. Boyle</name>
</author>
<author>
<name sortKey="Twombly, X" uniqKey="Twombly X">X. Twombly</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Palacios, J" uniqKey="Palacios J">J. Palacios</name>
</author>
<author>
<name sortKey="Sagues, C" uniqKey="Sagues C">C. Sagüés</name>
</author>
<author>
<name sortKey="Montijano, E" uniqKey="Montijano E">E. Montijano</name>
</author>
<author>
<name sortKey="Llorente, S" uniqKey="Llorente S">S. Llorente</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sturman, D J" uniqKey="Sturman D">D.J. Sturman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pintzos, G" uniqKey="Pintzos G">G. Pintzos</name>
</author>
<author>
<name sortKey="Rentzos, L" uniqKey="Rentzos L">L. Rentzos</name>
</author>
<author>
<name sortKey="Papakostas, N" uniqKey="Papakostas N">N. Papakostas</name>
</author>
<author>
<name sortKey="Chryssolouris, G" uniqKey="Chryssolouris G">G. Chryssolouris</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kalra, P" uniqKey="Kalra P">P. Kalra</name>
</author>
<author>
<name sortKey="Magnenat Thalmann, N" uniqKey="Magnenat Thalmann N">N. Magnenat-Thalmann</name>
</author>
<author>
<name sortKey="Moccozet, L" uniqKey="Moccozet L">L. Moccozet</name>
</author>
<author>
<name sortKey="Sannier, G" uniqKey="Sannier G">G. Sannier</name>
</author>
<author>
<name sortKey="Aubel, A" uniqKey="Aubel A">A. Aubel</name>
</author>
<author>
<name sortKey="Thalmann, D" uniqKey="Thalmann D">D. Thalmann</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Menache, A" uniqKey="Menache A">A. Menache</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ohn Bar, E" uniqKey="Ohn Bar E">E. Ohn-Bar</name>
</author>
<author>
<name sortKey="Trivedi, M M" uniqKey="Trivedi M">M.M. Trivedi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gr Tzel, C" uniqKey="Gr Tzel C">C. Grätzel</name>
</author>
<author>
<name sortKey="Fong, T" uniqKey="Fong T">T. Fong</name>
</author>
<author>
<name sortKey="Grange, S" uniqKey="Grange S">S. Grange</name>
</author>
<author>
<name sortKey="Baur, C" uniqKey="Baur C">C. Baur</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Rosa, G M" uniqKey="Rosa G">G.M. Rosa</name>
</author>
<author>
<name sortKey="Elizondo, M L" uniqKey="Elizondo M">M.L. Elizondo</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Adhikarla, V" uniqKey="Adhikarla V">V. Adhikarla</name>
</author>
<author>
<name sortKey="Sodnik, J" uniqKey="Sodnik J">J. Sodnik</name>
</author>
<author>
<name sortKey="Szolgay, P" uniqKey="Szolgay P">P. Szolgay</name>
</author>
<author>
<name sortKey="Jakus, G" uniqKey="Jakus G">G. Jakus</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bachmann, D" uniqKey="Bachmann D">D. Bachmann</name>
</author>
<author>
<name sortKey="Weichert, F" uniqKey="Weichert F">F. Weichert</name>
</author>
<author>
<name sortKey="Rinkenauer, G" uniqKey="Rinkenauer G">G. Rinkenauer</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Guna, J" uniqKey="Guna J">J. Guna</name>
</author>
<author>
<name sortKey="Jakus, G" uniqKey="Jakus G">G. Jakus</name>
</author>
<author>
<name sortKey="Poga Nik, M" uniqKey="Poga Nik M">M. Pogačnik</name>
</author>
<author>
<name sortKey="Tomazi, S" uniqKey="Tomazi S">S. Tomažič</name>
</author>
<author>
<name sortKey="Sodnik, J" uniqKey="Sodnik J">J. Sodnik</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Dipietro, L" uniqKey="Dipietro L">L. Dipietro</name>
</author>
<author>
<name sortKey="Sabatini, A M" uniqKey="Sabatini A">A.M. Sabatini</name>
</author>
<author>
<name sortKey="Dario, P" uniqKey="Dario P">P. Dario</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pavlovic, V I" uniqKey="Pavlovic V">V.I. Pavlovic</name>
</author>
<author>
<name sortKey="Sharma, R" uniqKey="Sharma R">R. Sharma</name>
</author>
<author>
<name sortKey="Huang, T S" uniqKey="Huang T">T.S. Huang</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wu, Y" uniqKey="Wu Y">Y. Wu</name>
</author>
<author>
<name sortKey="Huang, T" uniqKey="Huang T">T. Huang</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Preil, M" uniqKey="Preil M">M. Preil</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kurzweil, R" uniqKey="Kurzweil R">R. Kurzweil</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Zhao, W" uniqKey="Zhao W">W. Zhao</name>
</author>
<author>
<name sortKey="Chai, J" uniqKey="Chai J">J. Chai</name>
</author>
<author>
<name sortKey="Xu, Y Q" uniqKey="Xu Y">Y.-Q. Xu</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Rogalla, O" uniqKey="Rogalla O">O. Rogalla</name>
</author>
<author>
<name sortKey="Ehrenmann, M" uniqKey="Ehrenmann M">M. Ehrenmann</name>
</author>
<author>
<name sortKey="Dillmann, R" uniqKey="Dillmann R">R. Dillmann</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ehrenmann, M" uniqKey="Ehrenmann M">M. Ehrenmann</name>
</author>
<author>
<name sortKey="Zollner, R" uniqKey="Zollner R">R. Zollner</name>
</author>
<author>
<name sortKey="Knoop, S" uniqKey="Knoop S">S. Knoop</name>
</author>
<author>
<name sortKey="Dillmann, R" uniqKey="Dillmann R">R. Dillmann</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hebert, P" uniqKey="Hebert P">P. Hebert</name>
</author>
<author>
<name sortKey="Hudson, N" uniqKey="Hudson N">N. Hudson</name>
</author>
<author>
<name sortKey="Ma, J" uniqKey="Ma J">J. Ma</name>
</author>
<author>
<name sortKey="Burdick, J" uniqKey="Burdick J">J. Burdick</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Zhou, S" uniqKey="Zhou S">S. Zhou</name>
</author>
<author>
<name sortKey="Fei, F" uniqKey="Fei F">F. Fei</name>
</author>
<author>
<name sortKey="Zhang, G" uniqKey="Zhang G">G. Zhang</name>
</author>
<author>
<name sortKey="Liu, Y" uniqKey="Liu Y">Y. Liu</name>
</author>
<author>
<name sortKey="Li, W" uniqKey="Li W">W. Li</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fan, W" uniqKey="Fan W">W. Fan</name>
</author>
<author>
<name sortKey="Chen, X" uniqKey="Chen X">X. Chen</name>
</author>
<author>
<name sortKey="Wang, W H" uniqKey="Wang W">W.-H. Wang</name>
</author>
<author>
<name sortKey="Zhang, X" uniqKey="Zhang X">X. Zhang</name>
</author>
<author>
<name sortKey="Yang, J H" uniqKey="Yang J">J.-H. Yang</name>
</author>
<author>
<name sortKey="Lantz, V" uniqKey="Lantz V">V. Lantz</name>
</author>
<author>
<name sortKey="Wang, K Q" uniqKey="Wang K">K.-Q. Wang</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Zou, W" uniqKey="Zou W">W. Zou</name>
</author>
<author>
<name sortKey="Yuan, K" uniqKey="Yuan K">K. Yuan</name>
</author>
<author>
<name sortKey="Liu, J" uniqKey="Liu J">J. Liu</name>
</author>
<author>
<name sortKey="Luo, B" uniqKey="Luo B">B. Luo</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Brashear, H" uniqKey="Brashear H">H. Brashear</name>
</author>
<author>
<name sortKey="Starner, T" uniqKey="Starner T">T. Starner</name>
</author>
<author>
<name sortKey="Lukowicz, P" uniqKey="Lukowicz P">P. Lukowicz</name>
</author>
<author>
<name sortKey="Junker, H" uniqKey="Junker H">H. Junker</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Khaleghi, B" uniqKey="Khaleghi B">B. Khaleghi</name>
</author>
<author>
<name sortKey="Khamis, A" uniqKey="Khamis A">A. Khamis</name>
</author>
<author>
<name sortKey="Karray, F O" uniqKey="Karray F">F.O. Karray</name>
</author>
<author>
<name sortKey="Razavi, S N" uniqKey="Razavi S">S.N. Razavi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kunkler, K" uniqKey="Kunkler K">K. Kunkler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Diesen, D L" uniqKey="Diesen D">D.L. Diesen</name>
</author>
<author>
<name sortKey="Erhunmwunsee, L" uniqKey="Erhunmwunsee L">L. Erhunmwunsee</name>
</author>
<author>
<name sortKey="Bennett, K M" uniqKey="Bennett K">K.M. Bennett</name>
</author>
<author>
<name sortKey="Ben David, K" uniqKey="Ben David K">K. Ben-David</name>
</author>
<author>
<name sortKey="Yurcisin, B" uniqKey="Yurcisin B">B. Yurcisin</name>
</author>
<author>
<name sortKey="Ceppa, E P" uniqKey="Ceppa E">E.P. Ceppa</name>
</author>
<author>
<name sortKey="Omotosho, P A" uniqKey="Omotosho P">P.A. Omotosho</name>
</author>
<author>
<name sortKey="Perez, A" uniqKey="Perez A">A. Perez</name>
</author>
<author>
<name sortKey="Pryor, A" uniqKey="Pryor A">A. Pryor</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Newmark, J" uniqKey="Newmark J">J. Newmark</name>
</author>
<author>
<name sortKey="Dandolu, V" uniqKey="Dandolu V">V. Dandolu</name>
</author>
<author>
<name sortKey="Milner, R" uniqKey="Milner R">R. Milner</name>
</author>
<author>
<name sortKey="Grewal, H" uniqKey="Grewal H">H. Grewal</name>
</author>
<author>
<name sortKey="Harbison, S" uniqKey="Harbison S">S. Harbison</name>
</author>
<author>
<name sortKey="Hernandez, E" uniqKey="Hernandez E">E. Hernandez</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Munz, Y" uniqKey="Munz Y">Y. Munz</name>
</author>
<author>
<name sortKey="Kumar, B D" uniqKey="Kumar B">B.D. Kumar</name>
</author>
<author>
<name sortKey="Moorthy, K" uniqKey="Moorthy K">K. Moorthy</name>
</author>
<author>
<name sortKey="Bann, S" uniqKey="Bann S">S. Bann</name>
</author>
<author>
<name sortKey="Darzi, A" uniqKey="Darzi A">A. Darzi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hull, L" uniqKey="Hull L">L. Hull</name>
</author>
<author>
<name sortKey="Kassab, E" uniqKey="Kassab E">E. Kassab</name>
</author>
<author>
<name sortKey="Arora, S" uniqKey="Arora S">S. Arora</name>
</author>
<author>
<name sortKey="Kneebone, R" uniqKey="Kneebone R">R. Kneebone</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bent, J" uniqKey="Bent J">J. Bent</name>
</author>
<author>
<name sortKey="Chan, K" uniqKey="Chan K">K. Chan</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Crothers, I" uniqKey="Crothers I">I. Crothers</name>
</author>
<author>
<name sortKey="Gallagher, A" uniqKey="Gallagher A">A. Gallagher</name>
</author>
<author>
<name sortKey="Mcclure, N" uniqKey="Mcclure N">N. McClure</name>
</author>
<author>
<name sortKey="James, D" uniqKey="James D">D. James</name>
</author>
<author>
<name sortKey="Mcguigan, J" uniqKey="Mcguigan J">J. McGuigan</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gallagher, A" uniqKey="Gallagher A">A. Gallagher</name>
</author>
<author>
<name sortKey="Mcclure, N" uniqKey="Mcclure N">N. McClure</name>
</author>
<author>
<name sortKey="Mcguigan, J" uniqKey="Mcguigan J">J. McGuigan</name>
</author>
<author>
<name sortKey="Ritchie, K" uniqKey="Ritchie K">K. Ritchie</name>
</author>
<author>
<name sortKey="Sheehy, N" uniqKey="Sheehy N">N. Sheehy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Eyal, R" uniqKey="Eyal R">R. Eyal</name>
</author>
<author>
<name sortKey="Tendick, F" uniqKey="Tendick F">F. Tendick</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ahlberg, G" uniqKey="Ahlberg G">G. Ahlberg</name>
</author>
<author>
<name sortKey="Enochsson, L" uniqKey="Enochsson L">L. Enochsson</name>
</author>
<author>
<name sortKey="Gallagher, A G" uniqKey="Gallagher A">A.G. Gallagher</name>
</author>
<author>
<name sortKey="Hedman, L" uniqKey="Hedman L">L. Hedman</name>
</author>
<author>
<name sortKey="Hogman, C" uniqKey="Hogman C">C. Hogman</name>
</author>
<author>
<name sortKey="Mcclusky, D A" uniqKey="Mcclusky D">D.A. McClusky</name>
</author>
<author>
<name sortKey="Ramel, S" uniqKey="Ramel S">S. Ramel</name>
</author>
<author>
<name sortKey="Smith, C D" uniqKey="Smith C">C.D. Smith</name>
</author>
<author>
<name sortKey="Arvidsson, D" uniqKey="Arvidsson D">D. Arvidsson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gallagher, A G" uniqKey="Gallagher A">A.G. Gallagher</name>
</author>
<author>
<name sortKey="Ritter, E M" uniqKey="Ritter E">E.M. Ritter</name>
</author>
<author>
<name sortKey="Champion, H" uniqKey="Champion H">H. Champion</name>
</author>
<author>
<name sortKey="Higgins, G" uniqKey="Higgins G">G. Higgins</name>
</author>
<author>
<name sortKey="Fried, M P" uniqKey="Fried M">M.P. Fried</name>
</author>
<author>
<name sortKey="Moses, G" uniqKey="Moses G">G. Moses</name>
</author>
<author>
<name sortKey="Smith, C D" uniqKey="Smith C">C.D. Smith</name>
</author>
<author>
<name sortKey="Satava, R M" uniqKey="Satava R">R.M. Satava</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Seymour, N E" uniqKey="Seymour N">N.E. Seymour</name>
</author>
<author>
<name sortKey="Gallagher, A G" uniqKey="Gallagher A">A.G. Gallagher</name>
</author>
<author>
<name sortKey="Roman, S A" uniqKey="Roman S">S.A. Roman</name>
</author>
<author>
<name sortKey="O Rien, M K" uniqKey="O Rien M">M.K. O’Brien</name>
</author>
<author>
<name sortKey="Bansal, V K" uniqKey="Bansal V">V.K. Bansal</name>
</author>
<author>
<name sortKey="Andersen, D K" uniqKey="Andersen D">D.K. Andersen</name>
</author>
<author>
<name sortKey="Satava, R M" uniqKey="Satava R">R.M. Satava</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Munz, Y" uniqKey="Munz Y">Y. Munz</name>
</author>
<author>
<name sortKey="Almoudaris, A M" uniqKey="Almoudaris A">A.M. Almoudaris</name>
</author>
<author>
<name sortKey="Moorthy, K" uniqKey="Moorthy K">K. Moorthy</name>
</author>
<author>
<name sortKey="Dosis, A" uniqKey="Dosis A">A. Dosis</name>
</author>
<author>
<name sortKey="Liddle, A D" uniqKey="Liddle A">A.D. Liddle</name>
</author>
<author>
<name sortKey="Darzi, A W" uniqKey="Darzi A">A.W. Darzi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wang, R" uniqKey="Wang R">R. Wang</name>
</author>
<author>
<name sortKey="Paris, S" uniqKey="Paris S">S. Paris</name>
</author>
<author>
<name sortKey="Popovi, J" uniqKey="Popovi J">J. Popović</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="El Laithy, R A" uniqKey="El Laithy R">R.A. El-laithy</name>
</author>
<author>
<name sortKey="Jidong, H" uniqKey="Jidong H">H. Jidong</name>
</author>
<author>
<name sortKey="Yeh, M" uniqKey="Yeh M">M. Yeh</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Yonjae, K" uniqKey="Yonjae K">K. Yonjae</name>
</author>
<author>
<name sortKey="Kim, P C W" uniqKey="Kim P">P.C.W. Kim</name>
</author>
<author>
<name sortKey="Selle, R" uniqKey="Selle R">R. Selle</name>
</author>
<author>
<name sortKey="Shademan, A" uniqKey="Shademan A">A. Shademan</name>
</author>
<author>
<name sortKey="Krieger, A" uniqKey="Krieger A">A. Krieger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Arkenbout, E A" uniqKey="Arkenbout E">E.A. Arkenbout</name>
</author>
<author>
<name sortKey="Winter, J C F D" uniqKey="Winter J">J.C.F.D. Winter</name>
</author>
<author>
<name sortKey="Breedveld, P" uniqKey="Breedveld P">P. Breedveld</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Weichert, F" uniqKey="Weichert F">F. Weichert</name>
</author>
<author>
<name sortKey="Bachmann, D" uniqKey="Bachmann D">D. Bachmann</name>
</author>
<author>
<name sortKey="Rudak, B" uniqKey="Rudak B">B. Rudak</name>
</author>
<author>
<name sortKey="Fisseler, D" uniqKey="Fisseler D">D. Fisseler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kim, Y" uniqKey="Kim Y">Y. Kim</name>
</author>
<author>
<name sortKey="Leonard, S" uniqKey="Leonard S">S. Leonard</name>
</author>
<author>
<name sortKey="Shademan, A" uniqKey="Shademan A">A. Shademan</name>
</author>
<author>
<name sortKey="Krieger, A" uniqKey="Krieger A">A. Krieger</name>
</author>
<author>
<name sortKey="Kim, P W" uniqKey="Kim P">P.W. Kim</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Technologies, F D" uniqKey="Technologies F">F.D. Technologies</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bishop, G" uniqKey="Bishop G">G. Bishop</name>
</author>
<author>
<name sortKey="Welch, G" uniqKey="Welch G">G. Welch</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Harvey, A C" uniqKey="Harvey A">A.C. Harvey</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Haykin, S" uniqKey="Haykin S">S. Haykin</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fukunaga, K" uniqKey="Fukunaga K">K. Fukunaga</name>
</author>
<author>
<name sortKey="Hostetler, L" uniqKey="Hostetler L">L. Hostetler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Dianat, I" uniqKey="Dianat I">I. Dianat</name>
</author>
<author>
<name sortKey="Haslegrave, C M" uniqKey="Haslegrave C">C.M. Haslegrave</name>
</author>
<author>
<name sortKey="Stedmon, A W" uniqKey="Stedmon A">A.W. Stedmon</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Simone, L K" uniqKey="Simone L">L.K. Simone</name>
</author>
<author>
<name sortKey="Elovic, E" uniqKey="Elovic E">E. Elovic</name>
</author>
<author>
<name sortKey="Kalambur, U" uniqKey="Kalambur U">U. Kalambur</name>
</author>
<author>
<name sortKey="Kamper, D" uniqKey="Kamper D">D. Kamper</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kim, D" uniqKey="Kim D">D. Kim</name>
</author>
<author>
<name sortKey="Hilliges, O" uniqKey="Hilliges O">O. Hilliges</name>
</author>
<author>
<name sortKey="Izadi, S" uniqKey="Izadi S">S. Izadi</name>
</author>
<author>
<name sortKey="Butler, A D" uniqKey="Butler A">A.D. Butler</name>
</author>
<author>
<name sortKey="Chen, J" uniqKey="Chen J">J. Chen</name>
</author>
<author>
<name sortKey="Oikonomidis, I" uniqKey="Oikonomidis I">I. Oikonomidis</name>
</author>
<author>
<name sortKey="Olivier, P" uniqKey="Olivier P">P. Olivier</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Nymoen, K" uniqKey="Nymoen K">K. Nymoen</name>
</author>
<author>
<name sortKey="Haugen, M R" uniqKey="Haugen M">M.R. Haugen</name>
</author>
<author>
<name sortKey="Jensenius, A R" uniqKey="Jensenius A">A.R. Jensenius</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lynch, J" uniqKey="Lynch J">J. Lynch</name>
</author>
<author>
<name sortKey="Aughwane, P" uniqKey="Aughwane P">P. Aughwane</name>
</author>
<author>
<name sortKey="Hammond, T M" uniqKey="Hammond T">T.M. Hammond</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Levison, W H" uniqKey="Levison W">W.H. Levison</name>
</author>
<author>
<name sortKey="Lancraft, R" uniqKey="Lancraft R">R. Lancraft</name>
</author>
<author>
<name sortKey="Junker, A" uniqKey="Junker A">A. Junker</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bowersox, J C" uniqKey="Bowersox J">J.C. Bowersox</name>
</author>
<author>
<name sortKey="Cordts, P R" uniqKey="Cordts P">P.R. Cordts</name>
</author>
<author>
<name sortKey="Laporta, A J" uniqKey="Laporta A">A.J. LaPorta</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fabrlzio, M D" uniqKey="Fabrlzio M">M.D. Fabrlzio</name>
</author>
<author>
<name sortKey="Lee, B R" uniqKey="Lee B">B.R. Lee</name>
</author>
<author>
<name sortKey="Chan, D Y" uniqKey="Chan D">D.Y. Chan</name>
</author>
<author>
<name sortKey="Stoianovici, D" uniqKey="Stoianovici D">D. Stoianovici</name>
</author>
<author>
<name sortKey="Jarrett, T W" uniqKey="Jarrett T">T.W. Jarrett</name>
</author>
<author>
<name sortKey="Yang, C" uniqKey="Yang C">C. Yang</name>
</author>
<author>
<name sortKey="Kavoussi, L R" uniqKey="Kavoussi L">L.R. Kavoussi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ottensmeyer, M P" uniqKey="Ottensmeyer M">M.P. Ottensmeyer</name>
</author>
<author>
<name sortKey="Hu, J" uniqKey="Hu J">J. Hu</name>
</author>
<author>
<name sortKey="Thompson, J M" uniqKey="Thompson J">J.M. Thompson</name>
</author>
<author>
<name sortKey="Ren, J" uniqKey="Ren J">J. Ren</name>
</author>
<author>
<name sortKey="Sheridan, T B" uniqKey="Sheridan T">T.B. Sheridan</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Arsenault, R" uniqKey="Arsenault R">R. Arsenault</name>
</author>
<author>
<name sortKey="Ware, C" uniqKey="Ware C">C. Ware</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Reiley, C E" uniqKey="Reiley C">C.E. Reiley</name>
</author>
<author>
<name sortKey="Akinbiyi, T" uniqKey="Akinbiyi T">T. Akinbiyi</name>
</author>
<author>
<name sortKey="Burschka, D" uniqKey="Burschka D">D. Burschka</name>
</author>
<author>
<name sortKey="Chang, D C" uniqKey="Chang D">D.C. Chang</name>
</author>
<author>
<name sortKey="Okamura, A M" uniqKey="Okamura A">A.M. Okamura</name>
</author>
<author>
<name sortKey="Yuh, D D" uniqKey="Yuh D">D.D. Yuh</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lecuyer, A" uniqKey="Lecuyer A">A. Lécuyer</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Buchmann, V" uniqKey="Buchmann V">V. Buchmann</name>
</author>
<author>
<name sortKey="Violich, S" uniqKey="Violich S">S. Violich</name>
</author>
<author>
<name sortKey="Billinghurst, M" uniqKey="Billinghurst M">M. Billinghurst</name>
</author>
<author>
<name sortKey="Cockburn, A" uniqKey="Cockburn A">A. Cockburn</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Judkins, T N" uniqKey="Judkins T">T.N. Judkins</name>
</author>
<author>
<name sortKey="Dimartino, A" uniqKey="Dimartino A">A. DiMartino</name>
</author>
<author>
<name sortKey="Done, K" uniqKey="Done K">K. Doné</name>
</author>
<author>
<name sortKey="Hallbeck, M S" uniqKey="Hallbeck M">M.S. Hallbeck</name>
</author>
<author>
<name sortKey="Oleynikov, D" uniqKey="Oleynikov D">D. Oleynikov</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Saunders, J" uniqKey="Saunders J">J. Saunders</name>
</author>
<author>
<name sortKey="Knill, D" uniqKey="Knill D">D. Knill</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">Sensors (Basel)</journal-id>
<journal-id journal-id-type="iso-abbrev">Sensors (Basel)</journal-id>
<journal-id journal-id-type="publisher-id">sensors</journal-id>
<journal-title-group>
<journal-title>Sensors (Basel, Switzerland)</journal-title>
</journal-title-group>
<issn pub-type="epub">1424-8220</issn>
<publisher>
<publisher-name>MDPI</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">26694395</article-id>
<article-id pub-id-type="pmc">4721788</article-id>
<article-id pub-id-type="doi">10.3390/s151229868</article-id>
<article-id pub-id-type="publisher-id">sensors-15-29868</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Robust Hand Motion Tracking through Data Fusion of 5DT Data Glove and Nimble VR Kinect Camera Measurements</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Arkenbout</surname>
<given-names>Ewout A.</given-names>
</name>
<xref rid="c1-sensors-15-29868" ref-type="corresp">*</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>de Winter</surname>
<given-names>Joost C. F.</given-names>
</name>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Breedveld</surname>
<given-names>Paul</given-names>
</name>
</contrib>
</contrib-group>
<contrib-group>
<contrib contrib-type="editor">
<name>
<surname>Sabatini</surname>
<given-names>Angelo Maria</given-names>
</name>
<role>Academic Editor</role>
</contrib>
</contrib-group>
<aff id="af1-sensors-15-29868">Department of Biomechanical Engineering, Faculty of Mechanical, Maritime and Materials Engineering, Delft University of Technology, Mekelweg 2, 2628 CD Delft, The Netherlands;
<email>J.C.F.deWinter@tudelft.nl</email>
(J.C.F.W.);
<email>P.Breedveld@tudelft.nl</email>
(P.B.)</aff>
<author-notes>
<corresp id="c1-sensors-15-29868">
<label>*</label>
Correspondence:
<email>E.A.Arkenbout@tudelft.nl</email>
; Tel.: +31-15-27-82360</corresp>
</author-notes>
<pub-date pub-type="epub">
<day>15</day>
<month>12</month>
<year>2015</year>
</pub-date>
<pub-date pub-type="collection">
<month>12</month>
<year>2015</year>
</pub-date>
<volume>15</volume>
<issue>12</issue>
<fpage>31644</fpage>
<lpage>31671</lpage>
<history>
<date date-type="received">
<day>27</day>
<month>8</month>
<year>2015</year>
</date>
<date date-type="accepted">
<day>07</day>
<month>12</month>
<year>2015</year>
</date>
</history>
<permissions>
<copyright-statement>© 2015 by the authors; licensee MDPI, Basel, Switzerland.</copyright-statement>
<copyright-year>2015</copyright-year>
<license>
<license-p>
<pmc-comment>CREATIVE COMMONS</pmc-comment>
This article is an open access article distributed under the terms and conditions of the Creative Commons by Attribution (CC-BY) license (
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by/4.0/">http://creativecommons.org/licenses/by/4.0/</ext-link>
).</license-p>
</license>
</permissions>
<abstract>
<p>Vision based interfaces for human computer interaction have gained increasing attention over the past decade. This study presents a data fusion approach of the Nimble VR vision based system, using the Kinect camera, with the contact based 5DT Data Glove. Data fusion was achieved through a Kalman filter. The Nimble VR and filter output were compared using measurements performed on (1) a wooden hand model placed in various static postures and orientations; and (2) three differently sized human hands during active finger flexions. Precision and accuracy of joint angle estimates as a function of hand posture and orientation were determined. Moreover, in light of possible self-occlusions of the fingers in the Kinect camera images, data completeness was assessed. Results showed that the integration of the Data Glove through the Kalman filter provided for the proximal interphalangeal (PIP) joints of the fingers a substantial improvement of 79% in precision, from 2.2 deg to 0.9 deg. Moreover, a moderate improvement of 31% in accuracy (being the mean angular deviation from the true joint angle) was established, from 24 deg to 17 deg. The metacarpophalangeal (MCP) joint was relatively unaffected by the Kalman filter. Moreover, the Data Glove increased data completeness, thus providing a substantial advantage over the sole use of the Nimble VR system.</p>
</abstract>
<kwd-group>
<kwd>Human-computer interaction</kwd>
<kwd>Kalman filter</kwd>
<kwd>data fusion</kwd>
<kwd>gestures</kwd>
<kwd>finger joint angle measurements</kwd>
<kwd>sensor redundancy</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="sec1-sensors-15-29868">
<title>1. Introduction</title>
<p>The use of hand gestures as a control input in Human-Computer Interaction (HCI) is an ongoing topic of research [
<xref rid="B1-sensors-15-29868" ref-type="bibr">1</xref>
,
<xref rid="B2-sensors-15-29868" ref-type="bibr">2</xref>
,
<xref rid="B3-sensors-15-29868" ref-type="bibr">3</xref>
,
<xref rid="B4-sensors-15-29868" ref-type="bibr">4</xref>
]. In human-to-human interaction, hand movements are a means of non-verbal communication, and can take the form of either simple actions (such as pointing to an object) or more complex ones (such as when expressing feelings). Therefore, it stands to reason that using the hands can be an intuitive method for the communication with computers. The hands can be considered an input device with more than 20 degrees of freedom (DOF) [
<xref rid="B3-sensors-15-29868" ref-type="bibr">3</xref>
,
<xref rid="B5-sensors-15-29868" ref-type="bibr">5</xref>
]; as such, it should be possible to use the hands as high-DOF control devices in a wide range of applications.</p>
<p>Two major types of technology for HCI can be distinguished, namely contact based and vision based devices. Contact based devices rely on physical interaction with the user. Vision based devices, on the other hand, analyze one or more video streams for determining hand motions. Examples of contact based devices are mobile touch screens (e.g., for monitoring, communication and guidance on an industrial shop floor [
<xref rid="B6-sensors-15-29868" ref-type="bibr">6</xref>
]) and data gloves (e.g., for tracking of the hands in computer animations [
<xref rid="B7-sensors-15-29868" ref-type="bibr">7</xref>
,
<xref rid="B8-sensors-15-29868" ref-type="bibr">8</xref>
]). Most vision based devices fall into the categories of interactive displays/table tops/whiteboards, robot motion control, and sign language [
<xref rid="B2-sensors-15-29868" ref-type="bibr">2</xref>
]. For example in the automotive domain, the use of hand gestures can be a valuable asset for the control of interfaces that would otherwise require physical interaction with the driver [
<xref rid="B9-sensors-15-29868" ref-type="bibr">9</xref>
]. In the medical domain, vision based devices have been researched as a non-contact substitute for the mouse and keyboard, allowing the surgeon to interact with computers in a sterile environment. For example, Graetzel
<italic>et al.</italic>
[
<xref rid="B10-sensors-15-29868" ref-type="bibr">10</xref>
] enabled the surgeon to perform standard mouse functions through hand gestures, and Rosa and Elizondo [
<xref rid="B11-sensors-15-29868" ref-type="bibr">11</xref>
] used the recently introduced Leap Motion (Leap Motion Inc., San Francisco, CA, USA) [
<xref rid="B12-sensors-15-29868" ref-type="bibr">12</xref>
,
<xref rid="B13-sensors-15-29868" ref-type="bibr">13</xref>
,
<xref rid="B14-sensors-15-29868" ref-type="bibr">14</xref>
] to provide intra-operative touchless control of surgical images.</p>
<p>A large number of
<italic>contact based</italic>
data gloves have been developed over the last 35 years [
<xref rid="B15-sensors-15-29868" ref-type="bibr">15</xref>
], whereas vision based tracking of the hands has been in development for about two decades [
<xref rid="B3-sensors-15-29868" ref-type="bibr">3</xref>
,
<xref rid="B16-sensors-15-29868" ref-type="bibr">16</xref>
,
<xref rid="B17-sensors-15-29868" ref-type="bibr">17</xref>
]. The application of vision based devices is of interest, as cameras are becoming more and more prevalent, featuring continually increasing sampling rates and an exponentially growing number of pixels [
<xref rid="B18-sensors-15-29868" ref-type="bibr">18</xref>
,
<xref rid="B19-sensors-15-29868" ref-type="bibr">19</xref>
]. However, pressing challenges in vision based hand gesture recognition are to cope with a large variety of gestures, hand appearances, silhouette scales (spatial resolution), as well as visual occlusions [
<xref rid="B1-sensors-15-29868" ref-type="bibr">1</xref>
,
<xref rid="B3-sensors-15-29868" ref-type="bibr">3</xref>
]. In comparison, contact based devices are easy to implement, but require calibration because the measurement is relative rather than absolute with respect to the earth.</p>
<p>Gloves and camera systems each have their limitations, but may complement each other. Sensor fusion of a vision based with a contact based device has several advantages, in particular that the contact based device can fill in the data gap that occurs with vision based systems during camera occlusions, and that the vision based device provides an absolute measurement of hand state. Moreover, the fusion of data can result in a higher precision of pose estimates through redundancy gain.</p>
<p>Previous research has integrated two vision based systems for the purpose of high fidelity hand motion data acquisition [
<xref rid="B20-sensors-15-29868" ref-type="bibr">20</xref>
]. Furthermore, various studies have integrated vision and contact based systems with the aim of aiding in the tracking of the location of a grasped object within a hand [
<xref rid="B21-sensors-15-29868" ref-type="bibr">21</xref>
,
<xref rid="B22-sensors-15-29868" ref-type="bibr">22</xref>
,
<xref rid="B23-sensors-15-29868" ref-type="bibr">23</xref>
,
<xref rid="B24-sensors-15-29868" ref-type="bibr">24</xref>
] or for improving the recognition of sign language and hand gestures [
<xref rid="B25-sensors-15-29868" ref-type="bibr">25</xref>
,
<xref rid="B26-sensors-15-29868" ref-type="bibr">26</xref>
,
<xref rid="B27-sensors-15-29868" ref-type="bibr">27</xref>
]. These multi-sensor techniques supplement each other, where the separate sensors measure different aspects of the motions of the arm and hands, after which their combined data is used for higher-level feature extraction for gesture recognition [
<xref rid="B28-sensors-15-29868" ref-type="bibr">28</xref>
]. However, using sensor redundancy and fusion with the primary purpose of increasing precision and robustness of a vision based hand posture approximation is rarely performed.</p>
<p>Because of the inherent issue of visual occlusions associated with cameras, updating the hand posture approximation with local sensors may often be necessary. A recommendation in this regard is to use as few and as minimally obtrusive sensors as possible, thereby not influencing natural hand and finger motions. Accordingly, this research presents a simple method for fusing contact based with vision based hand tracking systems, where the focus is placed on using a camera tracking system that is readily available, and a data glove that uses a small number of sensors.</p>
<sec id="sec1dot1-sensors-15-29868">
<title>1.1. HCI in Laparoscopic Training</title>
<p>A field where hand motions and postures as HCI input may be promising is virtual laparoscopic training. Various medical trainers exist for laparoscopic skills training and assessment, ranging from physical box trainers to high fidelity virtual reality (VR) trainers [
<xref rid="B29-sensors-15-29868" ref-type="bibr">29</xref>
], both of which are effective training devices [
<xref rid="B30-sensors-15-29868" ref-type="bibr">30</xref>
,
<xref rid="B31-sensors-15-29868" ref-type="bibr">31</xref>
,
<xref rid="B32-sensors-15-29868" ref-type="bibr">32</xref>
]. Contemporary virtual simulators need a physical interface, both for purposes of congruence with the actual operating room scenario as well as for reliable tracking of the hand motions. These VR trainers usually aim to simulate the minimally invasive surgical scenario as realistically as possible. Surgeons in training may benefit from practicing with such realistic systems, but due to the considerable cost gap between VR simulators and physical box trainers, the use of VR simulators is currently limited to a relatively small number of training centers [
<xref rid="B33-sensors-15-29868" ref-type="bibr">33</xref>
]. As such, it may be beneficial to have a cheaper VR alternative.</p>
<p>Comparing the medical field to training methods in aviation, one can see that it is standard practice to train pilots in simulators of increasing complexity, where basic tasks are trained in lower fidelity and part-task simulators. For example, Integrated Procedures Trainers (IPTs) allow for the learning of flow patterns, systems, procedures, and checklists [
<xref rid="B34-sensors-15-29868" ref-type="bibr">34</xref>
]. In the same way, one could train surgeons, starting out with a virtual trainer that simulates basic laparoscopic tasks to train hand–eye coordination skills, for example in a scenario where instrument movements are inverted with respect to the hand movements (
<italic>i.e</italic>
., the “fulcrum effect” associated with the entry incision). Training of these basic laparoscopic skills is to a certain extent possible without the need for a physical interface, making visual based tracking devices potentially useful for the early training of surgeons. Depending on the skills that the surgeon aims to learn, a certain level of precision and accuracy of the hand state estimate is required. However, these requirements may be relaxed when learning basic spatial abilities, for example when learning to control an instrument with inverted movement [
<xref rid="B35-sensors-15-29868" ref-type="bibr">35</xref>
,
<xref rid="B36-sensors-15-29868" ref-type="bibr">36</xref>
] or when learning to use an angled laparoscope [
<xref rid="B37-sensors-15-29868" ref-type="bibr">37</xref>
].</p>
<p>Using a vision based device for virtual laparoscopic training may furthermore be interesting in light of the recent surge in low cost consumer market VR headsets (
<italic>i.e.</italic>
, Oculus Rift [
<xref rid="B38-sensors-15-29868" ref-type="bibr">38</xref>
], Sony PlayStation VR [
<xref rid="B39-sensors-15-29868" ref-type="bibr">39</xref>
], HTC Vive [
<xref rid="B40-sensors-15-29868" ref-type="bibr">40</xref>
], and Samsung Gear VR [
<xref rid="B41-sensors-15-29868" ref-type="bibr">41</xref>
]), which are devices that could enhance the fidelity of medical VR simulators. Such high fidelity VR simulations may confer effective skills transfer to the
<italic>in vivo</italic>
surgical situation, whereas less expensive VR trainers may lead to effective skill generalization [
<xref rid="B42-sensors-15-29868" ref-type="bibr">42</xref>
,
<xref rid="B43-sensors-15-29868" ref-type="bibr">43</xref>
,
<xref rid="B44-sensors-15-29868" ref-type="bibr">44</xref>
,
<xref rid="B45-sensors-15-29868" ref-type="bibr">45</xref>
]. Unfortunately, as previously mentioned, the limited accuracy and precision of current vision based devices for tracking of the hand movements, as well as their inherent issue of visual occlusions, makes them not yet suitable for surgical applications. Both issues may be solved through the integration of a contact based device.</p>
</sec>
<sec id="sec1dot2-sensors-15-29868">
<title>1.2. Nimble VR</title>
<p>A relatively new vision based system is the Nimble VR (Nimble VR Inc., San Francisco, CA, USA), previously named 3Gear Systems. It currently relies on the Microsoft Kinect
<sup>TM</sup>
sensor (Microsoft Corporation, Redmond, WA, USA) and obtains the hand pose estimates through queries of a precomputed database that relates the detected hand silhouettes to their 3D configurations [
<xref rid="B46-sensors-15-29868" ref-type="bibr">46</xref>
]. The Microsoft Kinect has a QVGA (320 × 240 px) depth camera and a VGA (640 × 480 px) video camera, both of which can produce image streams up to 30 frames per second [
<xref rid="B47-sensors-15-29868" ref-type="bibr">47</xref>
]. Moreover, the Kinect has a horizontal and vertical field of view of 57 and 43 degrees, respectively, with a depth sensor range of 1.2 m to 3.5 m.</p>
<p>Previous research into the Nimble VR system has shown the measurement errors of the position of the hand to depend on the distance from the camera [
<xref rid="B48-sensors-15-29868" ref-type="bibr">48</xref>
] and the variance of the measurement data to depend on the orientation of the hand [
<xref rid="B49-sensors-15-29868" ref-type="bibr">49</xref>
]. Kim
<italic>et al.</italic>
[
<xref rid="B48-sensors-15-29868" ref-type="bibr">48</xref>
] evaluated the Nimble VR (v.0.9.21) and concluded that it did not provide results of high enough accuracy and robustness over the working range that is required for a medical robotics master. Continued development, however, as well as the addition of data filtering, smoothing, and downscaling of motions, can improve the performance of this vision based system [
<xref rid="B48-sensors-15-29868" ref-type="bibr">48</xref>
,
<xref rid="B50-sensors-15-29868" ref-type="bibr">50</xref>
,
<xref rid="B51-sensors-15-29868" ref-type="bibr">51</xref>
].</p>
<p>The goal of the present research is to implement a Kalman filter algorithm to fuse measurement data of the vision based Nimble VR system with a contact based measurement system, for research into the use of hand and finger motions in medical VR simulators. As previously mentioned, a requirement for the contact based device is that it should be minimally obtrusive to the surgeon, because physical sensors may impede the naturalness of motion and therefore influence surgical VR skills training. We selected the 5DT Data Glove (5th Dimension Technologies, Irvine, CA, USA) [
<xref rid="B52-sensors-15-29868" ref-type="bibr">52</xref>
], providing five basic full finger flexion sensors. Although this data fusion approach negates the contact-free control advantage that characterizes vision based systems, it allows for improved pose estimates at visual occlusions and a higher update frequency due to a higher sampling rate of the Data Glove (200 Hz) as compared with the Nimble VR (currently running at 15 Hz). This study presents the implementation of the filter as well as its validation. The validation was performed through measurements of the finger joint angles of a wooden hand model in various poses and orientations. The pose estimates from the 5DT Data Glove, Nimble VR, and the filter were assessed with respect to the actual finger joint angles of the hand model. Additionally, dynamic finger flexion measurements were performed on three differently sized hands, and the data with and without implementation of the filter were compared.</p>
</sec>
</sec>
<sec id="sec2-sensors-15-29868">
<title>2. Kalman Filter Procedures and Parameter Settings</title>
<p>The Kalman filter is a computationally efficient recursive solution of the least-squares method, supporting estimates of the past, present, and future states of a modeled system [
<xref rid="B53-sensors-15-29868" ref-type="bibr">53</xref>
].</p>
<p>In this research, we used the Kalman filter to combine Nimble VR measurements with local sensor data obtained from the 5DT Data Glove 5 Ultra [
<xref rid="B15-sensors-15-29868" ref-type="bibr">15</xref>
,
<xref rid="B52-sensors-15-29868" ref-type="bibr">52</xref>
]. The Data Glove allows for the measurement of overall flexion of each finger by means of fiber-optics-based bending sensors. Although the Data Glove does not distinguish between the individual finger joints, it does have the advantage of being independent of hand orientation and hand position. Moreover, because the Data Glove uses only five simple sensors, it does not significantly impede hand movements. Fusing the local sensor data with the obtained global camera data has the expected advantage of increasing data completeness during hand occlusion and during hand orientations at which the camera-based tracking system is unable to provide an accurate estimation.</p>
<p>The Kalman filter has already been described extensively in the literature [
<xref rid="B54-sensors-15-29868" ref-type="bibr">54</xref>
,
<xref rid="B55-sensors-15-29868" ref-type="bibr">55</xref>
]. The basic equations are as follows:</p>
<p>Measurement update equations:
<disp-formula id="FD1">
<label>(1)</label>
<mml:math id="mm1">
<mml:mrow>
<mml:msub>
<mml:mi>K</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msubsup>
<mml:mi>P</mml:mi>
<mml:mi>k</mml:mi>
<mml:mo>−</mml:mo>
</mml:msubsup>
<mml:msubsup>
<mml:mi>H</mml:mi>
<mml:mi>k</mml:mi>
<mml:mi>T</mml:mi>
</mml:msubsup>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>H</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:msubsup>
<mml:mi>P</mml:mi>
<mml:mi>k</mml:mi>
<mml:mo>−</mml:mo>
</mml:msubsup>
<mml:msubsup>
<mml:mi>H</mml:mi>
<mml:mi>k</mml:mi>
<mml:mi>T</mml:mi>
</mml:msubsup>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mo>−</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msup>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="FD2">
<label>(2)</label>
<mml:math id="mm2">
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mover accent="true">
<mml:mi>x</mml:mi>
<mml:mo stretchy="true">^</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msubsup>
<mml:mrow>
<mml:mover accent="true">
<mml:mi>x</mml:mi>
<mml:mo stretchy="true">^</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mi>k</mml:mi>
<mml:mo>−</mml:mo>
</mml:msubsup>
<mml:mo>+</mml:mo>
<mml:mi>K</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>−</mml:mo>
<mml:msub>
<mml:mi>H</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:msubsup>
<mml:mrow>
<mml:mover accent="true">
<mml:mi>x</mml:mi>
<mml:mo stretchy="true">^</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mi>k</mml:mi>
<mml:mo>−</mml:mo>
</mml:msubsup>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="FD3">
<label>(3)</label>
<mml:math id="mm3">
<mml:mrow>
<mml:msub>
<mml:mi>P</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>I</mml:mi>
<mml:mo>−</mml:mo>
<mml:msub>
<mml:mi>K</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:msub>
<mml:mi>H</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:msubsup>
<mml:mi>P</mml:mi>
<mml:mi>k</mml:mi>
<mml:mo>−</mml:mo>
</mml:msubsup>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
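To make the update steps concrete, the following is a minimal illustrative sketch in Python/numpy of the measurement update of Equations (1)-(3); it is not code from the article. The symbols follow the definitions given after Equation (5) below, and the measurement matrix H, the noise covariance R, and all numeric values are assumptions chosen for illustration only (in particular, the Data Glove row here simply sums the MCP and PIP angles, whereas the article appears to apply weights).

import numpy as np

def measurement_update(x_prior, P_prior, z, H, R):
    # Equation (1): Kalman gain K_k = P_k^- H_k^T (H_k P_k^- H_k^T + R_k)^-1
    S = H @ P_prior @ H.T + R
    K = P_prior @ H.T @ np.linalg.inv(S)
    # Equation (2): state update x_k = x_k^- + K (z_k - H_k x_k^-)
    x_post = x_prior + K @ (z - H @ x_prior)
    # Equation (3): covariance update P_k = (I - K_k H_k) P_k^-
    P_post = (np.eye(x_prior.size) - K @ H) @ P_prior
    return x_post, P_post

# State for one finger: [MCP angle, PIP angle, MCP velocity, PIP velocity] (deg, deg/s).
x_prior = np.array([30.0, 40.0, 0.0, 0.0])   # a priori state estimate
P_prior = np.eye(4) * 10.0                   # a priori error covariance
# Measurements: Nimble VR MCP, Nimble VR PIP, Data Glove MCP+PIP sum (unweighted here).
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0, 0.0]])
R = np.diag([5.0, 5.0, 2.0]) ** 2            # assumed measurement noise variances (deg^2)
z = np.array([28.0, 45.0, 76.0])             # one example measurement vector
x_post, P_post = measurement_update(x_prior, P_prior, z, H, R)

The time-update Equations (4) and (5) in the next paragraph then propagate x_post and P_post forward to the next sample, before the following measurement is fused.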
<p>Time update equations:
<disp-formula id="FD4">
<label>(4)</label>
<mml:math id="mm4">
<mml:mrow>
<mml:msubsup>
<mml:mi>x</mml:mi>
<mml:mrow>
<mml:mi>k</mml:mi>
<mml:mo>+</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mo>−</mml:mo>
</mml:msubsup>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:msub>
<mml:mrow>
<mml:mover accent="true">
<mml:mi>x</mml:mi>
<mml:mo stretchy="true">^</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mi>B</mml:mi>
<mml:msub>
<mml:mi>u</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="FD5">
<label>(5)</label>
<mml:math id="mm5">
<mml:mrow>
<mml:msubsup>
<mml:mi>P</mml:mi>
<mml:mrow>
<mml:mi>k</mml:mi>
<mml:mo>+</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mo>−</mml:mo>
</mml:msubsup>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:msub>
<mml:mi>P</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:msubsup>
<mml:mi>A</mml:mi>
<mml:mi>k</mml:mi>
<mml:mi>T</mml:mi>
</mml:msubsup>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mi>Q</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
where
<inline-formula>
<mml:math id="mm6">
<mml:mrow>
<mml:msub>
<mml:mi>K</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
is the Kalman gain,
<inline-formula>
<mml:math id="mm7">
<mml:mrow>
<mml:msub>
<mml:mi>P</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
the estimate error covariance matrix,
<inline-formula>
<mml:math id="mm8">
<mml:mrow>
<mml:msub>
<mml:mi>H</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
the matrix describing how the measurement equation relates to the actual measurement
<inline-formula>
<mml:math id="mm9">
<mml:mrow>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
,
<inline-formula>
<mml:math id="mm10">
<mml:mrow>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
contains the model functions describing the relation between the state at time step
<italic>k</italic>
and the state at step
<italic>k + 1</italic>
, and
<inline-formula>
<mml:math id="mm11">
<mml:mi>B</mml:mi>
</mml:math>
</inline-formula>
the matrix relating control input
<inline-formula>
<mml:math id="mm12">
<mml:mrow>
<mml:msub>
<mml:mi>u</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
to the state
<inline-formula>
<mml:math id="mm13">
<mml:mi>x</mml:mi>
</mml:math>
</inline-formula>
. In our case, the state vector
<inline-formula>
<mml:math id="mm14">
<mml:mi>x</mml:mi>
</mml:math>
</inline-formula>
contains the metacarpophalangeal (MCP) and proximal interphalangeal (PIP) joint angles and angular velocities for each finger. The actual measurements
<inline-formula>
<mml:math id="mm15">
<mml:mrow>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
are limited to the MCP and PIP joint angles individually as obtained through the Nimble VR software and the sum of the two as given by the Data Glove. Vectors
<inline-formula>
<mml:math id="mm16">
<mml:mi>x</mml:mi>
</mml:math>
</inline-formula>
and
<inline-formula>
<mml:math id="mm17">
<mml:mrow>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
and matrices
<inline-formula>
<mml:math id="mm18">
<mml:mrow>
<mml:msub>
<mml:mi>H</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
,
<inline-formula>
<mml:math id="mm19">
<mml:mrow>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
and
<inline-formula>
<mml:math id="mm20">
<mml:mi>B</mml:mi>
</mml:math>
</inline-formula>
are given as follows:
<disp-formula id="FD6">
<label>(6)</label>
<mml:math id="mm21">
<mml:mtable columnalign="left">
<mml:mtr>
<mml:mtd>
<mml:mi>x</mml:mi>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mtable>
<mml:mtr></mml:mtr>
</mml:mtable>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>N</mml:mi>
<mml:mi>V</mml:mi>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>N</mml:mi>
<mml:mi>V</mml:mi>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mrow>
<mml:mtable>
<mml:mtr></mml:mtr>
<mml:mtr></mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
<mml:mtext>Nimble VR (NVR)</mml:mtext>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
<mml:mtext>Data Glove (DG) </mml:mtext>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
<mml:mtable>
<mml:mtr></mml:mtr>
</mml:mtable>
<mml:msub>
<mml:mi>H</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mn>1</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>1</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mn>1</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>d</mml:mi>
<mml:mi>t</mml:mi>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>1</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>d</mml:mi>
<mml:mi>t</mml:mi>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>1</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>1</mml:mn>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>B</mml:mi>
<mml:mo>=</mml:mo>
<mml:mstyle mathvariant="bold" mathsize="normal">
<mml:mn>0</mml:mn>
</mml:mstyle>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mtext>null matrix</mml:mtext>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
</p>
<p>Note that these matrices are valid for all fingers, with the exception that the thumb has an interphalangeal (IP) joint instead of a PIP joint. The carpometacarpal (CMC) joint of the thumb is not measured with the Data Glove, and is therefore not present in this model. The distal interphalangeal (DIP) joints are not measured by either of the two systems, because they are linked in motion to the PIP joints and because one cannot easily control one’s own DIP joints. Hence, these joints were left outside the scope of this research. The
<inline-formula>
<mml:math id="mm22">
<mml:mi>B</mml:mi>
</mml:math>
</inline-formula>
matrix is a null matrix because we do not provide a custom input
<inline-formula>
<mml:math id="mm23">
<mml:mi>u</mml:mi>
</mml:math>
</inline-formula>
.</p>
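<p>To illustrate how Equations (4) to (6) combine in practice, the following minimal sketch implements one predict/update cycle of the per-finger filter. It is given purely for illustration; the actual filter was run in MATLAB (Section 3.1), and the sample interval, noise matrices, and unit Data Glove weights used here are assumed placeholder values.</p>
<preformat>
import numpy as np

def kalman_step(x, P, z, A, H, Q, R):
    """One predict/update cycle of the linear Kalman filter.

    x : (4,) state  [phi_MCP, phi_PIP, dphi_MCP, dphi_PIP]
    z : (3,) measurement [Nimble VR MCP, Nimble VR PIP, Data Glove MCP+PIP]
    """
    # Prediction, Equations (4) and (5); B*u is omitted because B is a null matrix
    x_prior = A @ x
    P_prior = A @ P @ A.T + Q
    # Update: Kalman gain, a posteriori state and error covariance
    K = P_prior @ H.T @ np.linalg.inv(H @ P_prior @ H.T + R)
    x_post = x_prior + K @ (z - H @ x_prior)
    P_post = (np.eye(len(x)) - K @ H) @ P_prior
    return x_post, P_post

dt = 1.0 / 15.0                        # assumed mean sample interval (about 15 Hz)
A = np.array([[1.0, 0.0, dt, 0.0],     # Equation (6): constant-velocity model
              [0.0, 1.0, 0.0, dt],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
w_mcp, w_pip = 1.0, 1.0                # Data Glove weights (ideal values; see Appendix C)
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [w_mcp, w_pip, 0.0, 0.0]])

# Example call with placeholder noise matrices and an example measurement (deg)
x0 = np.zeros(4)
P0 = np.eye(4) * 10.0
Q = np.eye(4) * 0.1
R = np.diag([25.0, 100.0, 4.0])
z = np.array([20.0, 35.0, 55.0])
x1, P1 = kalman_step(x0, P0, z, A, H, Q, R)
</preformat>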
<p>Matrix
<inline-formula>
<mml:math id="mm24">
<mml:mrow>
<mml:msub>
<mml:mi>H</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
contains two weights
<inline-formula>
<mml:math id="mm25">
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
and
<inline-formula>
<mml:math id="mm26">
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>PIP</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
which represent the degree to which the respective finger joints contribute to the Data Glove measurement signal. The measurement error covariance matrix
<inline-formula>
<mml:math id="mm27">
<mml:mrow>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
, the process noise covariance matrix
<inline-formula>
<mml:math id="mm28">
<mml:mrow>
<mml:msub>
<mml:mi>Q</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
and the Data Glove weights
<inline-formula>
<mml:math id="mm29">
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
and
<inline-formula>
<mml:math id="mm30">
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>PIP</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
were measured prior to operation of the filter, and are described next.</p>
<sec id="sec2dot1-sensors-15-29868">
<title>2.1. Determining the Kalman Filter Parameters</title>
<p>Research has shown that the mean finger flexion obtained from Nimble VR (v0.9.34) measurements is dependent on the orientation of the hand [
<xref rid="B49-sensors-15-29868" ref-type="bibr">49</xref>
]. The level of variance for each finger joint as a function of both hand orientation and the degree of finger flexion serves as input for the measurement error covariance matrix
<inline-formula>
<mml:math id="mm27000">
<mml:mrow>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
:
<disp-formula id="FD7">
<label>(7)</label>
<mml:math id="mm31">
<mml:mrow>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>α</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>β</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>γ</mml:mi>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mrow>
<mml:mi>N</mml:mi>
<mml:mi>V</mml:mi>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>α</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>β</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>γ</mml:mi>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mrow>
<mml:mi>N</mml:mi>
<mml:mi>V</mml:mi>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>α</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>β</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>γ</mml:mi>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
where
<inline-formula>
<mml:math id="mm32">
<mml:mrow>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mrow>
<mml:mi>N</mml:mi>
<mml:mi>V</mml:mi>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>α</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>β</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>γ</mml:mi>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>
and
<inline-formula>
<mml:math id="mm33">
<mml:mrow>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mrow>
<mml:mi>N</mml:mi>
<mml:mi>V</mml:mi>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>α</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>β</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>γ</mml:mi>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>
are the Nimble VR measured MCP and PIP joint angle variances as a function of pitch, roll, and yaw angles (
<inline-formula>
<mml:math id="mm34">
<mml:mrow>
<mml:mi>α</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>β</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>γ</mml:mi>
</mml:mrow>
</mml:math>
</inline-formula>
, respectively) and Data Glove measurement
<inline-formula>
<mml:math id="mm35">
<mml:mrow>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
.
<inline-formula>
<mml:math id="mm36">
<mml:mrow>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
</mml:mrow>
</mml:math>
</inline-formula>
is the data variance associated with the Data Glove, which is independent of hand orientation. The off-diagonal elements represent the correlations between the various joints. Because a person can actuate the MCP and PIP joints independently of each other (to a certain degree), these elements were set to zero. The correlations between the different fingers were set to zero for the same reason. The variance terms used as input for the Kalman filter were measured as a function of hand orientation and finger flexion. The method by which this was done and the accompanying results are given in
<xref ref-type="app" rid="app1-sensors-15-29868">Appendix A</xref>
.</p>
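<p>As a sketch, this orientation-dependent matrix can be assembled at every time step by looking up the pre-measured Nimble VR variances for the current hand orientation and flexion; the arguments below stand in for that Appendix A lookup, and the diagonal form follows directly from Equation (7).</p>
<preformat>
import numpy as np

def build_R(sigma2_nvr_mcp, sigma2_nvr_pip, sigma2_dg):
    # sigma2_nvr_mcp, sigma2_nvr_pip: Nimble VR joint-angle variances, looked up
    # for the current pitch/roll/yaw and flexion (Appendix A); sigma2_dg is the
    # orientation-independent Data Glove variance. Off-diagonal terms are zero
    # because joints and fingers are treated as uncorrelated.
    return np.diag([sigma2_nvr_mcp, sigma2_nvr_pip, sigma2_dg])

# Example with illustrative variance values (deg^2)
R_k = build_R(25.0, 100.0, 4.0)
</preformat>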
<p>The noise covariance matrix
<inline-formula>
<mml:math id="mm27001">
<mml:mrow>
<mml:msub>
<mml:mi>Q</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
is typically used to represent the uncertainty in the process model [
<xref rid="B53-sensors-15-29868" ref-type="bibr">53</xref>
]. We set this uncertainty to be equal to the squared angular deviation from the state estimation
<inline-formula>
<mml:math id="mm27010">
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mover accent="true">
<mml:mi>x</mml:mi>
<mml:mo stretchy="true">^</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
, as calculated with the peak rotational acceleration of the finger flexions. Because changes in finger flexion cannot be greater than the physical maximum during voluntary free finger movement, this approach provides us with a valid uncertainty range for where a finger can be at a point in time based on its previous location. The process noise covariance matrix
<inline-formula>
<mml:math id="mm27002">
<mml:mrow>
<mml:msub>
<mml:mi>Q</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
then becomes:
<disp-formula id="FD8">
<label>(8)</label>
<mml:math id="mm37">
<mml:mrow>
<mml:msub>
<mml:mi>Q</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mfrac bevelled="true">
<mml:mn>1</mml:mn>
<mml:mn>4</mml:mn>
</mml:mfrac>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:msubsup>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mo>⋅</mml:mo>
<mml:mi>δ</mml:mi>
<mml:msup>
<mml:mi>t</mml:mi>
<mml:mn>4</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mfrac bevelled="true">
<mml:mn>1</mml:mn>
<mml:mn>2</mml:mn>
</mml:mfrac>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:msubsup>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mo>⋅</mml:mo>
<mml:mi>δ</mml:mi>
<mml:msup>
<mml:mi>t</mml:mi>
<mml:mn>3</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mfrac bevelled="true">
<mml:mn>1</mml:mn>
<mml:mn>4</mml:mn>
</mml:mfrac>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:msubsup>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mo>⋅</mml:mo>
<mml:mi>δ</mml:mi>
<mml:msup>
<mml:mi>t</mml:mi>
<mml:mn>4</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mfrac bevelled="true">
<mml:mn>1</mml:mn>
<mml:mn>2</mml:mn>
</mml:mfrac>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:msubsup>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mo>⋅</mml:mo>
<mml:mi>δ</mml:mi>
<mml:msup>
<mml:mi>t</mml:mi>
<mml:mn>3</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mfrac bevelled="true">
<mml:mn>1</mml:mn>
<mml:mn>2</mml:mn>
</mml:mfrac>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:msubsup>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mo>⋅</mml:mo>
<mml:mi>δ</mml:mi>
<mml:msup>
<mml:mi>t</mml:mi>
<mml:mn>3</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:msubsup>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mo>⋅</mml:mo>
<mml:mi>δ</mml:mi>
<mml:msup>
<mml:mi>t</mml:mi>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mfrac bevelled="true">
<mml:mn>1</mml:mn>
<mml:mn>2</mml:mn>
</mml:mfrac>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:msubsup>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mo>⋅</mml:mo>
<mml:mi>δ</mml:mi>
<mml:msup>
<mml:mi>t</mml:mi>
<mml:mn>3</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:msubsup>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mo>⋅</mml:mo>
<mml:mi>δ</mml:mi>
<mml:msup>
<mml:mi>t</mml:mi>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
where
<inline-formula>
<mml:math id="mm38">
<mml:mrow>
<mml:msub>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
and
<inline-formula>
<mml:math id="mm39">
<mml:mrow>
<mml:msub>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
are the maximum joint angular accelerations. The values used as input for this matrix were measured experimentally and are provided in
<xref ref-type="app" rid="app2-sensors-15-29868">Appendix B</xref>
.</p>
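<p>Equation (8) can be assembled directly from the two maximum angular accelerations and the sample interval, as the short sketch below illustrates; the acceleration values shown are placeholders, whereas the actual inputs come from the Appendix B measurements.</p>
<preformat>
import numpy as np

def build_Q(phi_dd_mcp, phi_dd_pip, dt):
    # Process noise covariance Q_k of Equation (8): a constant-velocity model in
    # which the maximum joint angular accelerations act as process noise.
    qm = phi_dd_mcp ** 2
    qp = phi_dd_pip ** 2
    return np.array([
        [0.25 * qm * dt**4, 0.0,               0.5 * qm * dt**3, 0.0             ],
        [0.0,               0.25 * qp * dt**4, 0.0,              0.5 * qp * dt**3],
        [0.5 * qm * dt**3,  0.0,               qm * dt**2,       0.0             ],
        [0.0,               0.5 * qp * dt**3,  0.0,              qp * dt**2      ],
    ])

# Placeholder maximum accelerations (deg/s^2) and the mean sample interval
Q_k = build_Q(4000.0, 5000.0, 1.0 / 15.0)
</preformat>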
<p>Lastly, because the Data Glove measures the sum of MCP and PIP flexion, the following relation holds for the two weights
<inline-formula>
<mml:math id="mm40">
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
and
<inline-formula>
<mml:math id="mm41">
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>PIP</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
:
<disp-formula id="FD9">
<label>(9)</label>
<mml:math id="mm42">
<mml:mrow>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
<mml:mo>⋅</mml:mo>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
<mml:mo>⋅</mml:mo>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
<p>Ideally, the weights have a value of 1.0 each, indicating proper measurement of the individual joint rotations. However, due to shifting of the Data Glove sensors inside the glove with respect to the fingers, the measurement signals may be biased and vary per finger. Hence, these weights were measured using a medium-sized hand prior to operation of the Kalman filter. The measurement procedures and resulting weight values are provided in
<xref ref-type="app" rid="app3-sensors-15-29868">Appendix C</xref>
.</p>
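<p>One plausible way to obtain such weights, sketched below, is an ordinary least-squares fit of Equation (9), assuming that reference MCP and PIP angles (for example from the Marker Tracking camera) are recorded simultaneously with the glove’s summed signal. The data below are synthetic and the procedure is illustrative only; it is not necessarily the calibration described in Appendix C.</p>
<preformat>
import numpy as np

rng = np.random.default_rng(0)
phi_mcp = rng.uniform(0.0, 90.0, 200)    # reference MCP angles (deg), synthetic
phi_pip = rng.uniform(0.0, 100.0, 200)   # reference PIP angles (deg), synthetic
w_true = np.array([0.9, 1.1])            # weights the fit should recover
dg_sum = w_true[0] * phi_mcp + w_true[1] * phi_pip + rng.normal(0.0, 2.0, 200)

# Least-squares fit of Equation (9): dg_sum = w_mcp * phi_mcp + w_pip * phi_pip
X = np.column_stack([phi_mcp, phi_pip])
w_mcp, w_pip = np.linalg.lstsq(X, dg_sum, rcond=None)[0]
print(f"w_DG_MCP = {w_mcp:.2f}, w_DG_PIP = {w_pip:.2f}")
</preformat>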
</sec>
</sec>
<sec id="sec3-sensors-15-29868">
<title>3. Methods</title>
<sec id="sec3dot1-sensors-15-29868">
<title>3.1. Test Setup</title>
<p>A setup was created that implements the Nimble VR camera-based hand tracking software (v0.9.36). This setup made use of a Kinect camera mounted on a rig facing downwards onto a table top (
<xref ref-type="fig" rid="sensors-15-29868-f001">Figure 1</xref>
). Using the infrared depth information obtained from the Kinect camera, the software detected the hands, provided an estimation of the orientation and position of the hands and fingers, and approximated the hand’s skeletal model [
<xref rid="B56-sensors-15-29868" ref-type="bibr">56</xref>
]. Default software settings were used. The 5DT Data Glove was added to this setup, and we wrote a C++ program that exports all measurements to MATLAB. In MATLAB, the Kalman filter function fused the Nimble VR and Data Glove measurements.</p>
<fig id="sensors-15-29868-f001" position="float">
<label>Figure 1</label>
<caption>
<p>Schematic representation of the test setup.</p>
</caption>
<graphic xlink:href="sensors-15-29868-g001"></graphic>
</fig>
<p>After implementing the predetermined filter parameters (
<italic>i.e.</italic>
, the measurement error covariance matrix
<inline-formula>
<mml:math id="mm43">
<mml:mrow>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
, the process noise covariance matrix
<inline-formula>
<mml:math id="mm44">
<mml:mrow>
<mml:msub>
<mml:mi>Q</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
, and the Data Glove weights
<inline-formula>
<mml:math id="mm45">
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
and
<inline-formula>
<mml:math id="mm46">
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>PIP</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
, see
<xref ref-type="app" rid="app1-sensors-15-29868">Appendix A</xref>
,
<xref ref-type="app" rid="app2-sensors-15-29868">Appendix B</xref>
and
<xref ref-type="app" rid="app3-sensors-15-29868">Appendix C</xref>
), the Kalman filter output was compared with the Nimble VR measurements.</p>
<p>Two validation measurements were performed. The first measurements used a wooden model hand to assess the influence of hand orientation on the Nimble VR output, and to assess the degree to which the Kalman filter is able to improve the state estimates by fusing with the Data Glove output. The second measurements involved dynamic hand movements with three human hands of different sizes to assess the robustness of the Kalman filter output. This is important because predetermined Kalman parameters were used in combination with a single one-size-fits-all glove. Moreover, the dynamic measurements provide a measure of the time delay of the current setup.</p>
</sec>
<sec id="sec3dot2-sensors-15-29868">
<title>3.2. Wooden Hand Model Measurements to Validate the Kalman Filter Operation</title>
<p>The wooden model hand, which is widely available for purchase, had a length of 21.35 cm, measured from the wrist to the tip of the middle finger, and a breadth of 8.3 cm (note that the same model was used for determining matrix
<inline-formula>
<mml:math id="mm47">
<mml:mrow>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mrow>
<mml:mi>k</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
, see
<xref ref-type="app" rid="app1-sensors-15-29868">Appendix A</xref>
). Using a real hand for these measurements was not possible, because a human cannot keep a hand in a constant position during such time-consuming measurements. Using a model hand offered good experimental control and, moreover, enables the current study to be reproduced and the results to be compared with later iterations of the Nimble VR software, with different software packages, or with alternative cameras.</p>
<p>The model, mounted on a tripod with a three-way pan/tilt head, was placed in five different postures, while wearing the glove in view of the Nimble VR system. As is standard practice, the Data Glove was calibrated to the full range of motion of the hand [
<xref rid="B52-sensors-15-29868" ref-type="bibr">52</xref>
]. Flat hand, pure MCP flexion, pure PIP flexion, combined MCP and PIP flexion, and pinch grip postures were assessed, and the data from the Nimble VR system were compared with the Kalman filtered results. The orientation of the hand model was varied by placing it at varying pitch, roll, and yaw angles (ranges: [−60, 30] deg, [−120, 60] deg, and [−60, 60] deg, respectively). These three angles were varied at 5 deg intervals, while keeping the other two angles constant at 0 deg. Five measurements were performed for each of the 5 postures and for each of the 3 ranges, where at each orientation angle 200 samples were collected (representing about 13 seconds of data at a mean frequency of 15 Hz). A measurement thus represents a full sweep through a chosen orientation range, and this entire sequence was repeated five times. The total number of samples collected per posture for the pitch range, for example, was therefore 19,000 (= 200 samples * 19 angles * 5 repetitions). In total, 19 pitch angles, 25 yaw angles, and 38 roll angles were assessed. The roll measurements were performed in two separate sessions (ranges [−120, −30] deg and [−30, 60] deg), with the model hand rotated 90 deg in between. As a result, one roll angle was measured twice (angle of −30 deg). The Data Glove was recalibrated at each change of hand posture to account for potential shifting of the sensors inside the glove, caused by the external forces applied to the glove while changing hand postures. Note that during contactless measurements with human hands this recalibration is not needed, as external forces potentially causing sensor shift should be absent. However, in practice the glove can easily be recalibrated in between measurement sessions if sensor drift is observed. The tripod was kept horizontally aligned with the test setup, with the hand rigidly attached to its pan/tilt head.</p>
</sec>
<sec id="sec3dot3-sensors-15-29868">
<title>3.3. Human Hand Measurements to Validate the Kalman Filter Operation</title>
<p>Following the wooden hand model measurements, dynamic finger flexions were conducted on three differently sized hands of healthy volunteers, ranging from small to large. Hand scale values, as automatically detected by the Nimble VR software, were 0.81, 0.84, and 0.87, respectively. The hand lengths, measured from the wrist to the tip of the middle finger, were 16.5 cm, 18.6 cm, and 19.6 cm, and the breadths were 8.4 cm, 9.2 cm, and 10.0 cm, respectively. In these tests, an additional Marker Tracking camera was used. This camera, capturing RGB data at 30 Hz with a resolution of 640 × 480 pixels, was aimed at the side of the hand. The positions of colored markers, attached to the joint locations of the index finger, pinky finger, and thumb of the Data Glove, were extracted from the camera footage using RGB thresholding and Mean Shift Cluster detection [
<xref rid="B57-sensors-15-29868" ref-type="bibr">57</xref>
]. Calculating the joint angles from the marker locations provided a reference to which the Nimble VR and Kalman Filtered data could be compared. Marker Tracking analysis was not performed online, hence the tracking results are free from any time delay. More information on this Marker Tracking algorithm is provided in
<xref ref-type="app" rid="app3-sensors-15-29868">Appendix C</xref>
. The joint angles were measured during active finger flexions with the Marker Tracking, Data Glove, and Nimble VR system. Each of the three participants performed five sets of ten repetitive full finger flexions (
<italic>i.e.</italic>
, from flat hand to closed fist and back) at a relaxed pace with the palm of the hand facing down. In between each set of ten flexions, the participant was asked to move his/her hand freely before going to the next set. A single glove calibration was performed prior to the measurements, calibrating the measurement range of the glove sensors to the movement range of the fingers of the participant.</p>
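<p>For reference, a joint flexion angle can be derived from three tracked marker positions as sketched below. This is a generic reconstruction of the idea (two segment vectors meeting at the joint marker) and not the specific Appendix C algorithm.</p>
<preformat>
import numpy as np

def joint_angle(p_proximal, p_joint, p_distal):
    # Flexion angle (deg) at a joint from three 2D marker positions in the
    # side-view image: 180 deg minus the angle enclosed by the two segments.
    v1 = np.asarray(p_proximal, float) - np.asarray(p_joint, float)
    v2 = np.asarray(p_distal, float) - np.asarray(p_joint, float)
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return 180.0 - np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Example: markers on the proximal segment, the joint, and the distal segment
print(joint_angle((0.0, 0.0), (30.0, 0.0), (55.0, -15.0)))  # roughly 31 deg
</preformat>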
</sec>
<sec id="sec3dot4-sensors-15-29868">
<title>3.4. Dependent Measures</title>
<p>The dependent measures are precision, accuracy, completeness of the data, and time delay. Completeness was defined as the ability to provide (reliable) joint angle estimates for the finger joints as a function of orientation, posture, and degree of visual self-occlusion of the hand. In the analyses, a distinction was made between the MCP and PIP joints of the fingers.</p>
<p>For the wooden hand model test, at each 5 deg step, the mean orientation of the hand was calculated over the 200 samples. The mean hand orientation angle per step was calculated by averaging over the 25 measurement sets performed (
<italic>i.e.</italic>
, 5 per posture and 5 different postures). The standard deviation (SD) was calculated as the mean of the 25 standard deviations.</p>
<p>Regarding the joint angle, at each 5 deg step of the hand orientation angle, the mean and standard deviation of the joint angle were calculated over the 200 samples. These values were then again averaged over the 5 measurement sessions performed for each of the 5 deg steps. This procedure was performed for each orientation range and each of the five postures.</p>
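<p>The aggregation described above amounts to a mean of means and a mean of standard deviations, as sketched below for one posture and one orientation range; the array shapes and values are assumptions used purely for illustration.</p>
<preformat>
import numpy as np

def per_step_statistics(samples):
    # samples: joint angles shaped (sessions, steps, 200); per 5 deg step the
    # mean and SD are taken over the 200 samples and then averaged over the
    # repeated measurement sessions.
    step_means = samples.mean(axis=2)
    step_sds = samples.std(axis=2, ddof=1)
    return step_means.mean(axis=0), step_sds.mean(axis=0)

# Synthetic example: 5 sessions, 19 pitch steps, 200 samples per step
samples = np.random.default_rng(1).normal(45.0, 3.0, size=(5, 19, 200))
mean_per_step, sd_per_step = per_step_statistics(samples)
</preformat>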
<p>Comparing the resulting mean joint angles to the actual joint angles provides a measure of the accuracy of the system. The standard deviation (and variance) around these mean joint angles are measures of precision. Additionally, comparing these performance measures of the Nimble VR system with the Kalman filtered data gives insight into the completeness of the data as a function of hand orientation and posture. At hand orientations where visual self-occlusion degrades the joint angle approximations, we expected for the Kalman filtered data lower standard deviations as well as more accurate joint angle approximations compared to the Nimble VR.</p>
<p>Independent two-sample
<italic>t</italic>
tests were performed to assess whether the difference in mean calculated joint angles between Nimble VR and the filter output were statistically significantly different from each other. The compared vectors (being of equal lengths) were each composed of the mean joint angles, calculated at each of the five measurements. A
<italic>t</italic>
test was performed between these vectors for each posture at every orientation angle, totaling 984 tests (index: 82 orientation angles * 5 postures * 2 joints; thumb: 82 angles * 1 posture * 2 joints). The accompanying degrees of freedom in each of the
<italic>t</italic>
tests was 8 (
<italic>i.e.</italic>
,
<italic>df</italic>
= 2
<italic>n</italic>
− 2, with
<italic>n</italic>
= 5). A
<italic>p</italic>
value smaller than 0.01 was deemed statistically significant. We selected a conservative significance level in order to reduce the probability of false positives.</p>
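<p>Each of these tests compares the five per-session mean joint angles of the Nimble VR output with those of the filter output; a short sketch of a single such test, with made-up angle values, is shown below.</p>
<preformat>
import numpy as np
from scipy import stats

# Five per-session mean joint angles (deg) for one posture and one orientation
# angle; the values are illustrative only.
nvr_means = np.array([41.2, 43.5, 40.8, 42.1, 44.0])
kalman_means = np.array([36.4, 35.9, 37.2, 36.8, 36.1])

# Independent two-sample t test, df = 2n - 2 = 8; a p value below 0.01 is
# considered statistically significant.
res = stats.ttest_ind(nvr_means, kalman_means)
print(res.statistic, res.pvalue)
</preformat>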
<p>In order to assess the overall benefit of the Kalman filter with respect to the Nimble VR system, we calculated for each orientation range the mean of the mean and the mean of the standard deviations taken over the entire range (the pitch, yaw and roll sample sizes were 19, 25 and 38, respectively). This represents the accuracy and precision respectively of the measurements for a specific pose and orientation range.</p>
<p>For the human hand test, the time delay of both the Nimble VR system and the Kalman filter was compared to the Marker Tracking measurements, which were not performed online, but were obtained through video post analysis. Hence, the Marker Tracking results are free from time delay. The root-mean-square error and the Pearson’s correlation coefficient of the Nimble VR and Kalman filter output with respect to the Marker Tracking measurements were calculated. Lastly, the maximum angular under- or overestimation of the measurement systems, occurring at full finger flexion, were extracted.</p>
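<p>The comparison against the Marker Tracking reference reduces to the root-mean-square error and the Pearson correlation of two angle traces, as in the short sketch below; synthetic signals stand in for the recorded data.</p>
<preformat>
import numpy as np

def compare_to_reference(measured, reference):
    # RMSE and Pearson correlation of a measured joint-angle trace against the
    # delay-free Marker Tracking reference; both traces in degrees.
    measured = np.asarray(measured, float)
    reference = np.asarray(reference, float)
    rmse = np.sqrt(np.mean((measured - reference) ** 2))
    r = np.corrcoef(measured, reference)[0, 1]
    return rmse, r

t = np.linspace(0.0, 10.0, 300)
reference = 45.0 + 45.0 * np.sin(t)          # synthetic flexion trace
measured = reference + np.random.default_rng(2).normal(0.0, 5.0, t.size)
print(compare_to_reference(measured, reference))
</preformat>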
</sec>
</sec>
<sec id="sec4-sensors-15-29868">
<title>4. Results</title>
<p>The results for the posture measurements are shown in
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
as a series of plots. The top three plots indicate the measured orientation of the hand model as a function of the input hand orientations, that is, across the pitch, yaw, and roll ranges. The remainder of the plots in
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
show, for each of the hand postures, the index MCP joint angles (in blue) and PIP joint angles (in green) as a function of the actual hand orientation.</p>
<p>A distinction is made between the joint angles determined with the Nimble VR software only (square markers) and joint angles determined with the Kalman filter (asterisk markers). The actual joint angles of the wooden hand model are represented by horizontal lines (MCP: dashed line; PIP: dash-dotted line). Measurements lying closer to these lines are by definition more accurate. In each plot, at the top left corner, two means and two standard deviations are shown per joint. The first mean and standard deviation are those of the Nimble VR joint angle measurements taken over the entire range, and the second mean and standard deviation are those of the Kalman filtered data. A mean that lies closer to the actual joint angle indicates an overall improvement in joint angle approximation accuracy, and a lower standard deviation indicates an improvement in precision. Lastly, a solid triangle marker on the horizontal axes was used to indicate that the difference between the mean joint angle of the PIP joint obtained with Nimble VR and the Kalman filter is not statistically significant. The same display method was not used for the MCP joint, because for this joint the measured angles were identical in approximately 50% of the cases, which would have cluttered the graphs.</p>
<sec id="sec4dot1-sensors-15-29868">
<title>4.1. Wooden Hand Model Orientation Measurements</title>
<p>The measured hand orientation angles are shown as a function of actual hand angle in
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
a–c. There are several orientation ranges of the hand at which the measurements of the hand angles are imprecise. For the pitch orientation (
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
a), all measured angles below −35 deg show very large standard deviations, that is, when the hand was pitched far downwards. The yaw angles (
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
b) show high precision for the range [−50, 50] deg. At angles below −50 deg, when the thumb was angled away from the screen, the measurements show large standard deviations, whereas above 50 deg the standard deviation increases slightly. Lastly, for roll angles (
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
c) in the range [−110, −60] deg, the measured angles have slightly larger standard deviations, which is because of visual occlusion of the fingers. At −90 deg, the hand is vertically aligned with the thumb on top. In this condition, the observed surface area of the hand is small, and only the thumb and index fingers can be distinguished by the Nimble VR software. As a result, in this range the orientation measurement becomes somewhat less reliable.</p>
<p>For the yaw measurements, a constant mean difference of about 9 deg is observed between the measured and actual yaw angle. Moreover, for the roll measurements a misalignment is seen at −30 deg, which is on account of the roll orientation having been measured in two separate sessions. A slight drift from the actual roll angle can be seen in the first range, where at −30 deg the measurement was stopped, the hand was rotated 90 deg and reoriented, and the measurements (as well as the software) were reinitialized. The re-measured roll angle of the hand is then free from drift and closer to the actual angle.</p>
</sec>
<sec id="sec4dot2-sensors-15-29868">
<title>4.2. Wooden Hand Model Finger Joint Measurements—Index and Thumb Fingers</title>
<p>At hand orientations yielding a low precision (SD > 5 deg, see
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
, graphs a to c), a similar effect on precision can be seen for most of the finger joint angle estimates of the Nimble VR. The consequence of imprecise hand orientation measurements is either a decrease in finger joint angle estimation precision (
<italic>i.e.</italic>
, SD > 10 deg) or an unrealistically high precision (
<italic>i.e.</italic>
, SD < 1 deg) combined with a poor accuracy (> 30 deg shift from the true angle). This high precision is the result of visual self-occlusion of the finger, and the Nimble VR software accordingly making an estimation of the joint angles based on the estimated posture. This can for example clearly be seen in
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
, graph f2, at angles −60 to −15 deg, where for the PIP joint a 50 deg difference is observed between the measured and true angle (thus a low accuracy), while the observed precision is around 1 deg. Due to the orientation-independent standard deviation of the Data Glove, the Kalman filter output has a low standard deviation, even when the standard deviation of the Nimble VR data is high. Furthermore, because the Data Glove output is independent of the hand orientation, the Data Glove contributes to improved accuracy of the Kalman filter output over all the hand orientation ranges, in particular for the PIP joint. In order to assess both the accuracy and the precision before and after implementation of the filter,
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
, subfigures d2-f4, should be referred to, as accuracy and precision are dependent on hand pose and assessed orientation range. The accuracy for a given joint, hand pose, or orientation range is equal to the mean joint angle (provided in the top left of every graph) minus the true joint angle. The precision is given by the standard deviations provided in the top left of every plot.</p>
<fig id="sensors-15-29868-f002" position="float">
<label>Figure 2</label>
<caption>
<p>(
<bold>a–c</bold>
) Three plots providing measured orientation of the wooden hand model for varying actual hand orientations (
<italic>i.e.</italic>
, pitch, yaw, and roll), indicated by red dashed unity lines; (
<bold>d1–i1</bold>
) Photos of the measured hand orientations; (
<bold>d2–i4</bold>
) Plots showing the measured metacarpophalangeal joint (MCP, blue) and proximal interphalangeal (PIP, green) joint angles, as determined with the Nimble VR system (square markers, □) and after fusion of the Data Glove data through the application of the Kalman filter (asterisk markers, *). Data is presented with error bars ranging from mean – 1·SD to mean + 1·SD. The actual angles at which the fingers were placed are indicated with the red dotted and dash-dotted lines, and are illustrated in the photos provided on the left. All plots show data collected on the index finger, unless otherwise specified below the photo on the left. Indicated in the top left of every graph are the mean and standard deviation (format: mean ± SD | mean ± SD) calculated over the entire hand orientation range before (left) and after implementation of the Kalman filter (right), for both joints. Note that these mean and standard deviations are calculated as the mean of the mean, and the mean of the standard deviations, calculated per 5 deg step. The triangle markers on the horizontal axes indicate whether the difference in mean PIP joint angle between Nimble NR and the Kalman filter is not statistically significant (note: MCP joint is not visualised in this way).</p>
</caption>
<graphic xlink:href="sensors-15-29868-g002"></graphic>
</fig>
<p>Looking at the index MCP joint, the two mean joint angles lie close together, and although an improvement in precision can be seen, a significant difference between the measurement systems was observed only 49% of the time, mostly when the MCP joint differences were larger than about 10 deg. The PIP joint is more affected by the filter, and in a substantial portion of cases (83%) a significant improvement was observed. As indicated by the triangular markers on the horizontal axes in (
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
, graphs d2 to i4), no significant difference is present when the Nimble VR output overlaps with the filter output, which occurs when the Nimble VR measurements already approach the true PIP joint angle. Moreover, at high standard deviations of the Nimble VR data, statistically significant differences with the filter data are not always obtained.</p>
<p>In the following, the separate hand postures are discussed.</p>
<p>In the Flat Hand posture (
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
, graphs d
<sub>1</sub>
to d
<sub>4</sub>
), with both MCP and PIP joint angles being 0 deg, the Nimble VR PIP joint estimate shows the poorest accuracy, especially at low hand pitch angles (graph d
<sub>2</sub>
). The Kalman filter output adjusts this and keeps both the MCP and PIP joint estimates around 0 deg, even in the ranges where hand orientation measurements are imprecise. This is most clearly shown by the decrease in standard deviation for both joints at graphs d
<sub>2</sub>
to d
<sub>4</sub>
.</p>
<p>At Pure PIP Flexion (
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
, graphs e
<sub>1</sub>
to e
<sub>4</sub>
), where the MCP joint angles are kept at 0 deg and the PIP joint at 90 deg, one can see large fluctuations of accuracy in the PIP flexion angle estimate. For the pitch range (graph e
<sub>2</sub>
), in the region below −50 deg, the PIP angle is grossly underestimated, but for the remainder of the range, it is close to the actual angle. The Kalman filter output decreases the large variations over this range, keeping the joint estimate relatively accurate with some fluctuations around the actual PIP angle. For both the yaw (graph e
<sub>3</sub>
) and roll (graph e
<sub>4</sub>
) orientation ranges an improvement in precision and less variation in the accuracy can be seen. The MCP joint estimate deviates from the actual angle at high pitch angles (graph e
<sub>2</sub>
), but is relatively accurate for the other orientations (graphs e
<sub>3</sub>
and e
<sub>4</sub>
).</p>
<p>At Pure MCP Flexion, (
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
, graphs f
<sub>1</sub>
to f
<sub>4</sub>
), with the PIP joint angles kept at 0 deg, a bias is seen during pitch (graph f
<sub>2</sub>
). As with pure PIP flexion, the PIP joint angle is wrongly estimated by the Nimble VR up until −10 deg, after which it correctly approaches the actual angle. The Kalman filter output adequately corrects for this bias, and keeps the estimated PIP joint angle around 0 deg at all angles. This comes, however, at the expense of the accuracy with which the MCP joint angle is estimated, which slightly worsens due to the Kalman filter. This is exemplified by the mean and standard deviations of the MCP joint angles taken over the entire range (see top left of graph f
<sub>2</sub>
), showing a slight increase in angle underestimation (
<italic>i.e.</italic>
, a lower accuracy) and an increase in standard deviation (
<italic>i.e.</italic>
, a lower precision). The reverse is, however, true for the PIP joint. This same effect is seen to a lesser extent for the yaw and roll orientations (graphs f
<sub>3</sub>
and f
<sub>4</sub>
).</p>
<p>For Combined MCP and PIP Flexion
<italic>,</italic>
(
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
graphs g
<sub>1</sub>
to g
<sub>4</sub>
), which is a more natural hand closure posture than the pure flexion of either the MCP or PIP joints, the advantage of using the Kalman filter is most pronounced in the PIP joint estimate. Where the Nimble VR measurements for this joint greatly vary for all orientations and are grossly overestimated, the Kalman filter yields a reliable and more accurate PIP joint angle estimate.</p>
<p>Lastly, for the Pinch Grip posture (
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
, graphs h and i), we show both the index (graphs h
<sub>1</sub>
to h
<sub>4</sub>
) and thumb fingers (graphs i
<sub>1</sub>
to i
<sub>4</sub>
). Again, the Kalman filter increases the precision of the PIP joint output of the index finger (graphs h
<sub>2</sub>
to h
<sub>4</sub>
). However, there is a significant overestimation of the joint angle for all orientations. For the thumb (graphs i
<sub>2</sub>
to i
<sub>4</sub>
), the Kalman filter slightly increases precision for both joint estimates and slightly improves the MCP joint accuracy. However, the filter’s effect is less pronounced here than for the index finger.</p>
</sec>
<sec id="sec4dot3-sensors-15-29868">
<title>4.3. Wooden Hand Model Finger Joint Measurements—All Fingers</title>
<p>In order to assess the improvements gained for all fingers, in
<xref ref-type="fig" rid="sensors-15-29868-f003">Figure 3</xref>
the difference between the true joint angles and the mean joint angles taken over the range of all assessed hand orientation ranges are given. These differences are equal to the mean joint angle in the top left of every graph in
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
minus the true joint angle. The accompanying standard deviation is provided in
<xref ref-type="fig" rid="sensors-15-29868-f003">Figure 3</xref>
as well. Hence,
<xref ref-type="fig" rid="sensors-15-29868-f003">Figure 3</xref>
shows the mean accuracy and precision measures for all fingers, joints and poses, for the respective orientation range. It can be seen that for all fingers, the filter increases precision for the PIP joints and to a lesser extent for the MCP joints, regardless of hand posture. Accuracy improvements are seen for the PIP joints for the flat hand, pure PIP flexion, pure MCP flexion, and combined MCP and PIP flexion postures, but not for the pinch grip posture.</p>
<p>Lastly, we calculated the overall mean accuracy and precision improvements per joint gained by implementation of the filter. The overall accuracy and overall precision estimates were calculated across 2050 means and 2050 SDs, respectively (82 angles (19 pitch angles + 25 yaw angles + 38 roll angles) * 5 postures * 5 fingers). The results show that the accuracy of the MCP joint slightly worsens by 6%, from 12.7 deg (SD = 11.5 deg) to 13.5 deg (SD = 12.9 deg). This is offset by an accuracy improvement for the PIP joint of 31%, from 24.4 deg (SD = 17.4 deg) to 16.8 deg (SD = 15.7 deg). The precision of the MCP joint assessment improves by 5%, from 2.3 deg (SD = 2.5 deg) to 2.2 deg (SD = 2.2 deg), whereas the precision of the PIP joint improves by 79%, from 4.5 deg (SD = 4.1 deg) to 0.9 deg (SD = 1.1 deg). Overall, the filter thus marginally affects the MCP joint estimation, but strongly improves PIP joint estimation.</p>
<fig id="sensors-15-29868-f003" position="float">
<label>Figure 3</label>
<caption>
<p>Precision and accuracy for all fingers of the hand model and for all orientation ranges. (
<bold>a</bold>
) Pitch range; (
<bold>b</bold>
) Yaw range; (
<bold>c</bold>
) Roll range. For each graph, the absolute difference is given between the mean calculated joint angle (
<italic>i.e.</italic>
, mean of the mean joint angles) and the true joint angle. The accompanying standard deviation is given as well (
<italic>i.e.</italic>
, the mean of the standard deviations at all angles), shown as ± 2·SD. Note that the values provided here are equal to the values given in the top left of the graphs in
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
minus the true joint angles of the assessed hand pose. A distinction is made between the Nimble VR data (left) and the Kalman filtered data (right), as well as between the metacarpophalangeal (MCP, blue) and proximal interphalangeal (PIP, green) joints. From left to right the fingers are presented; t = thumb, i = index, m = middle, r = ring, p = pinky.</p>
</caption>
<graphic xlink:href="sensors-15-29868-g003"></graphic>
</fig>
</sec>
<sec id="sec4dot4-sensors-15-29868">
<title>4.4. Human Hand Active Finger Flexion Measurements</title>
<p>Dynamic flexing of the fingers while performing Marker Tracking of the joint angles and measuring the Nimble VR, Data Glove, and Kalman filter output yields the results shown in
<xref ref-type="fig" rid="sensors-15-29868-f004">Figure 4</xref>
and
<xref ref-type="fig" rid="sensors-15-29868-f005">Figure 5</xref>
. In
<xref ref-type="fig" rid="sensors-15-29868-f004">Figure 4</xref>
, one of the five sessions is shown for each of the three differently sized hands. Additionally, all fifty full finger flexions per hand, as measured with the Nimble VR system and obtained through the Kalman filter, are plotted
<italic>vs.</italic>
the marker tracked angles. In
<xref ref-type="fig" rid="sensors-15-29868-f005">Figure 5</xref>
the MCP and PIP joints are shown separately as well as in combination for 10 flexions performed by the medium-sized hand (Hs = 0.84). Note that the Marker Tracking results are free from any time delay.</p>
<fig id="sensors-15-29868-f004" position="float">
<label>Figure 4</label>
<caption>
<p>Finger joint estimates at dynamic finger flexions. (
<bold>a1–a4</bold>
) small hand size; (
<bold>b1–b4</bold>
) medium hand size; (
<bold>c1–c4</bold>
) large hand size. Five sessions with 10 full finger flexion repetitions each were performed. All graphs show the Nimble VR system (red dash-dotted line), Kalman filter output (green dashed line), and Marker Tracking measurements (blue continuous line). The left three graphs show the combined metacarpophalangeal (MCP) and proximal interphalangeal (PIP) joint angles of the third measurement session. The remaining nine plots show the Nimble VR and Kalman filter output plotted
<italic>vs.</italic>
the Marker Tracked angles of all fifty finger flexions performed per hand. The black line represents the true angle, and the plots are given for separate and combined joints: (a2–c2) MCP; (a3–c3) PIP; (a4–c4) combined MCP plus PIP joints. In the top right, the Pearson correlation coefficients between the measured angles and the marker-tracked angles are given (red = Nimble VR, green = Kalman filter output).</p>
</caption>
<graphic xlink:href="sensors-15-29868-g004"></graphic>
</fig>
<p>In
<xref ref-type="fig" rid="sensors-15-29868-f004">Figure 4</xref>
, at the peaks of the graphs (
<italic>i.e.</italic>
, the points of maximum finger flexion), a noticeable effect of the hand size can be seen on the degree of under- or overestimation of the joint angles as compared to the Marker Tracking angles. The under- and overestimation values presented below were calculated over all 50 flexions combined for each hand. For the small hand (graphs a1-a4), one can see that both the Nimble VR and the filter output underestimate the MCP joint angle considerably, by 51 deg. However, this is compensated by an overestimation of the PIP joint (Nimble VR: 13 deg, SD = 17 deg; Kalman filter: 14 deg, SD = 11 deg), resulting in an overall underestimation of the full finger flexion by 38 deg (SD = 12 deg) with the Nimble VR system and 36 deg (SD = 8 deg) for the Kalman filter output. This underestimation of the MCP joint angle is less prominent for the medium-sized hand (graphs b1-b4), where the Nimble VR underestimates the MCP joint by 15 deg (SD = 15 deg) and overestimates the PIP joint by 28 deg (SD = 8 deg), leading to an overall overestimation of 13 deg (SD = 16 deg). For the medium-sized hand, the output from the Kalman filter underestimates the MCP joint by 16 deg (SD = 15 deg) and overestimates the PIP joint by 26 deg (SD = 10 deg), adding up to a combined overestimation of 9 deg (SD = 9 deg). Lastly, for the large hand (graphs c1-c4), an underestimation is again seen for the MCP joint (filter: 23 deg, SD = 11 deg), but the filter output overestimates the PIP joint (filter: 27 deg, SD = 9 deg), leading to an overall small overestimation of 4 deg (SD = 12 deg) (compared with the Nimble VR output, which provides an underestimation of 29 deg, SD = 20 deg).
<p>Summarizing, the Kalman filter system underestimates the MCP joint angle (small, medium, and large hand underestimation: 61%, 20%, and 50%, respectively), while the PIP joint is overestimated (small, medium, and large hand overestimation: 18%, 34%, and 28%, respectively). The combined finger flexion approximation is underestimated for the small hand (22%), but marginally overestimated for the medium and large hands (medium hand: 6%; large hand: 3%). The overall contribution of the Kalman filter compared to the Nimble VR data is relatively limited for the small and medium-sized hands, providing a 1% and 3% reduction in under- and overestimation, respectively. However, for the large hand, the Nimble VR data show an underestimation of 20%, which changes to a small overestimation of 3% after implementation of the filter.</p>
<fig id="sensors-15-29868-f005" position="float">
<label>Figure 5</label>
<caption>
<p>Active index finger flexion comparison between Marker Tracking, Nimble VR, and Kalman filtered joint angles. Shown data are from the medium sized hand, first measurement session. (
<bold>a</bold>
) sum of the metacarpophalangeal (MCP) and proximal interphalangeal (PIP) joint angles plotted versus time; (
<bold>b</bold>
) MCP joint angles versus time; (
<bold>c</bold>
) PIP joint angles versus time.</p>
</caption>
<graphic xlink:href="sensors-15-29868-g005"></graphic>
</fig>
<p>The correlation coefficients provided in the top right of the graphs in
<xref ref-type="fig" rid="sensors-15-29868-f004">Figure 4</xref>
indicate the degree of linearity in the datasets. For all hands and finger joints, a higher correlation coefficient was found for the Kalman filter output than for the Nimble VR. The correlation for the Kalman filter output is relatively low at the MCP joint of the small hand (
<italic>r</italic>
= 0.64), and strongest at the PIP joint of the large hand (
<italic>r</italic>
= 0.97).</p>
<p>The root-mean-square errors (RMSE) of the Nimble VR and Kalman filter outputs with respect to the Marker Tracked angles were 43 deg and 33 deg for the small hand, 38 deg and 20 deg for the medium hand, and 39 deg and 16 deg for the large hand, respectively. A substantial improvement is thus visible when using the Kalman filter. Note that these RMSE values were calculated over the entire set of 50 hand flexions, where the maximum absolute error at a particular point in time was 167 deg for the Nimble VR system and 158 deg for the Kalman filter.</p>
<p>Lastly, in
<xref ref-type="fig" rid="sensors-15-29868-f005">Figure 5</xref>
, the discrepancy in time between the Nimble VR angle measurements and the Marker Tracking output is shown. A delay is present at both joints. For the Kalman filter output at the MCP joint, the delay of 0.4 s persists (SD = 0.2 s), calculated over the 50 finger flexions of the medium sized hand, but the effect is less pronounced at the PIP joint. For the PIP joint, the delays before and after implementation of the filter are 0.17 s (SD = 0.07 s) and 0.07 s (SD = 0.04 s), respectively. The resulting combined finger flexion estimate has a delay of 0.23 s (SD = 0.07 s) before implementation of the filter and 0.12 s (SD = 0.03 s) after. Looking at the Nimble VR data at the PIP joint, one can also see that this joint is at times measured unsteadily and that some erratic fluctuations occur, which are smoothed in the filter output.</p>
</sec>
</sec>
<sec id="sec5-sensors-15-29868">
<title>5. Discussion</title>
<p>The results showed that the application of the Kalman filter for fusing vision based with contact based tracking data provided substantial improvements in precision, and to a lesser extent improvements in accuracy. Using the Data Glove improved hand posture and finger flexion estimates, especially at the occurrence of visual self-occlusion of the fingers. Depending on the orientation and posture of the hand, these precision and accuracy improvements varied somewhat.</p>
<sec id="sec5dot1-sensors-15-29868">
<title>5.1. Setup Limitations</title>
<p>The measurements of the Nimble VR are influenced by the chosen depth camera and its resolution. Aside from this, two limitations were present in our setup: (1) the anatomical dissimilarities of the wooden hand model with respect to real hands; and (2) the sensor limitations of the Data Glove.</p>
<p>The first limitation was the fact that we used a wooden hand model. Although the hand model was anatomically correct in terms of dimensions and locations of the MCP, PIP and DIP joints of the fingers, it is less representative for the thumb. The model lacks the carpometacarpal (CMC) joint of the thumb, which connects the first metacarpal bone of the thumb to the carpal bone of the wrist. This joint, allowing for about 55° of flexion and 10° of hyperextension, is important for the action of opposition (
<italic>i.e.</italic>
, touching the tip of the pinky finger with the tip of the thumb). Due to the absence of the CMC joint, both palmar and radial abduction are impossible in the hand model, limiting the thumb’s movements to flexion of the MCP and IP joints. As a result, the pinch grip pose that we assessed deviated from an actual pinch grip, which relies on the CMC joint. The Nimble VR software (which compares the obtained Kinect camera output to a precomputed database of hand postures) is able to detect this pinch grip posture, and automatically assumes the CMC joint to play a role in it. As a result, the software output better reflects real index finger and thumb joint angles than those of the model fingers. If readers were to place their own hand in the pinch grip pose and compare it to the hand model pinch grip (shown in
<xref ref-type="fig" rid="sensors-15-29868-f002">Figure 2</xref>
graphs h
<sub>1</sub>
and i
<sub>1</sub>
), they would see that the MCP joint of the index finger flexes to a nonzero angle and the PIP joint to around 8°, whereas the model had 0° and 50°, respectively. When calculating the accuracy of the MCP and PIP joints over all hand poses combined, but excluding the pinch grip pose, the Nimble VR provides 12.8 deg and 22.1 deg, respectively, and after implementation of the filter 13.4 deg and 10.5 deg. Compared to the previous results, the accuracy for the PIP joint thus improves by 6 deg when the pinch grip is not taken into consideration. The precision stays approximately the same.</p>
<p>An additional restriction of the hand model is that its fingers are not able to perform abduction and adduction at the MCP joints. This did not affect our measurements of the MCP and PIP joints, but limited us in the selection of the postures. For future research, it would be interesting to use a hand model that is able to make such joint movements, and to use a Data Glove with additional sensors that measure ab- and adduction of the fingers, such as the 5DT Data Glove 14 Ultra [
<xref rid="B52-sensors-15-29868" ref-type="bibr">52</xref>
]. However, a glove with more sensors is more likely to impede the naturalness of motion of the user, especially considering the fact that ab- and adduction sensors need to be placed in between the fingers.</p>
<p>The second limitation of the used measurement setup was inherent in the Data Glove. Because the glove itself is made of stretch Lycra and made to fit most hand sizes, its measurement accuracy depends on the quality with which it is calibrated. Furthermore, its fibre-optics-based bending sensors are positioned inside small hollow sleeves along the glove fingers. Consequently, the sensors are able to shift slightly inside these sleeves, potentially creating drift in the measurements during prolonged usage. As the measurements presented in this research were acquired during passive hand postures, this drift could not be quantified.</p>
<p>Another disadvantage of using the Data Glove is that the predetermined Data Glove weights
<inline-formula>
<mml:math id="mm48">
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
and
<inline-formula>
<mml:math id="mm49">
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>PIP</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
(see
<xref ref-type="app" rid="app3-sensors-15-29868">Appendix C</xref>
) are person-specific to a certain extent. As the size of people’s hands varies, the degree of sensor flexion and the sensors’ exact positioning with respect to the fingers tend to vary as well. The current weights were determined based on a medium sized hand. Whereas the sensor on the pinky, for example, mainly measures PIP flexion on such a hand, for persons with smaller hands the same sensor will overlay more of the MCP joint. Based on the active finger flexion measurements performed on hands of different sizes, we found that the degree of overall finger flexion is underestimated for small hands and slightly overestimated for medium and large hands. As such, the glove weights will likely not have to be recalibrated for all test participants. To improve the joint angle estimates for small hands, a second set of glove weights may be measured and used in future tests.</p>
</sec>
<sec id="sec5dot2-sensors-15-29868">
<title>5.2. Active Finger Flexion Measurements</title>
<p>When assessing the influence of the Kalman filter on the active finger flexions (
<xref ref-type="fig" rid="sensors-15-29868-f004">Figure 4</xref>
), it can be seen that the time delay inherent in the Nimble VR remains present at the MCP joint estimate, but is reduced at the PIP joint. Similarly, for all the fingers (
<xref ref-type="fig" rid="sensors-15-29868-f003">Figure 3</xref>
), the MCP joint’s increase in precision and accuracy is not as pronounced as for the PIP joint. The explanation for this can be found in the implementation of the variance matrix
<inline-formula>
<mml:math id="mm50">
<mml:mrow>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
(see
<xref ref-type="app" rid="app1-sensors-15-29868">Appendix A</xref>
). The observed variance of the Nimble VR PIP joint angles is higher than that of the MCP joint angles. The filter therefore assumes the Nimble VR MCP joint estimates to be more reliable than those of the PIP joint. As a result, the Data Glove measurements are predominantly used to smooth out and correct the PIP joint estimates.</p>
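<p>To illustrate this mechanism, the following minimal sketch (not the code used in this research) shows a single Kalman measurement update for one finger, with a state consisting of the MCP and PIP joint angles and a measurement vector containing the two Nimble VR joint angles and the combined Data Glove flexion signal. The numerical values and glove weights are illustrative assumptions only; with a large variance assigned to the Nimble VR PIP channel, the PIP correction is driven mainly by the glove.</p>
<preformat>
import numpy as np

# State: [MCP angle, PIP angle] in degrees (prior estimate and covariance).
x_prior = np.array([20.0, 30.0])
P_prior = np.diag([25.0, 25.0])

# Hypothetical glove weights (Appendix C gives roughly 25%/75% for the index finger).
w_mcp, w_pip = 0.25, 0.75

# Measurement model z = H x + noise.
H = np.array([[1.0, 0.0],        # Nimble VR MCP angle
              [0.0, 1.0],        # Nimble VR PIP angle
              [w_mcp, w_pip]])   # Data Glove combined flexion

# R_k: a much larger variance on the Nimble VR PIP channel makes the filter
# trust the camera for MCP and the Data Glove for PIP (values illustrative).
R = np.diag([2.0, 60.0, 4.0])    # deg^2

z = np.array([22.0, 55.0, w_mcp * 25.0 + w_pip * 48.0])  # example measurements

# Standard Kalman measurement update.
S = H @ P_prior @ H.T + R
K = P_prior @ H.T @ np.linalg.inv(S)
x_post = x_prior + K @ (z - H @ x_prior)
P_post = (np.eye(2) - K @ H) @ P_prior
print(np.round(x_post, 1))
</preformat>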
<p>During active finger flexions, both the MCP and PIP joints are flexed simultaneously. It is likely that for the Kinect camera, which has a top down view of the hand, the flexion of the PIP joints is initially visually more pronounced than the flexion of the MCP joint. The flexion of the MCP joint is not detected directly, providing an explanation for the time delay (which, as shown in
<xref ref-type="fig" rid="sensors-15-29868-f005">Figure 5</xref>
, is larger for the MCP joint compared to the PIP joint). As the Data Glove mostly corrects for the PIP joints, this time delay persists in the filter output for the MCP joints. In order to correct for this, one could adjust the variance terms in matrix
<inline-formula>
<mml:math id="mm5000">
<mml:mrow>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
accordingly, allowing the Data Glove measurements to better influence the MCP joint estimates. However, this would come at the cost of the quality of the PIP joint estimates.</p>
<p>The time delays for both joints can be further reduced, as the setup in its current iteration has not yet been optimized in terms of computational efficiency. The time delays were obtained on a computer with an i7-2640M CPU (2.80 GHz), 8 GB of RAM and a 64-bit Windows 7 operating system, using custom written software to capture the data streams from both the camera and the data glove system. The time delays will likely decrease with further iterations of the Nimble VR software, as well as through implementation of a dedicated processing unit to perform the relatively heavy Nimble VR calculations.</p>
</sec>
<sec id="sec5dot3-sensors-15-29868">
<title>5.3. Data Fusion Improvements</title>
<p>Sensor redundancy through data fusion of a contact based device with a vision based device provides data completeness during partial and full visual occlusion of the fingers. Although the Kalman filter method is easy to implement, it by definition requires the state prediction model to be a linear function of the measurements. It is possible that an extended Kalman filter, using non-linear functions, would provide more accurate state estimates. An interesting way to extend the model could be to use heuristics and to change the model based on the hand pose detected by the Nimble VR software. For example, if a person is pointing with the index finger, one could assume him or her to have the remaining fingers closed towards the palm of the hand. It is then known that there is a high likelihood that some of those fingers are visually occluded, and the model could take this into account, as sketched below.</p>
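<p>A minimal sketch of such a heuristic is given below, under assumed pose labels and inflation factors: when the detected pose implies that certain fingers are likely occluded, the corresponding Nimble VR measurement variances are inflated so that the filter leans on the Data Glove for those fingers.</p>
<preformat>
import numpy as np

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

# Fingers assumed likely occluded for a given detected pose (illustrative only).
LIKELY_OCCLUDED = {
    "pointing": {"middle", "ring", "pinky"},    # closed towards the palm
    "pinch_grip": {"middle", "ring", "pinky"},
    "flat_hand": set(),
}

def adjust_nimble_variances(base_variances, detected_pose, inflation=10.0):
    """Return per-finger Nimble VR variances, inflated for likely-occluded fingers."""
    occluded = LIKELY_OCCLUDED.get(detected_pose, set())
    return {finger: variance * (inflation if finger in occluded else 1.0)
            for finger, variance in base_variances.items()}

base = {finger: 5.0 for finger in FINGERS}      # deg^2, illustrative
print(adjust_nimble_variances(base, "pointing"))
</preformat>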
<p>Another way of improving the current setup may be to use alternative contact based measurement devices that directly measure finger flexion without impeding the naturalness of motion of the user. Although the 5DT Data Glove used here is not very intrusive, studies have shown that gloves may have negative effects on manual dexterity, comfort, and possibly the range of finger and wrist movements [
<xref rid="B58-sensors-15-29868" ref-type="bibr">58</xref>
]. One interesting method presented in literature is the use of adhesives to attach sensor sleeves to the back of the fingers whilst leaving the joints free of adhesives that would restrict movements [
<xref rid="B59-sensors-15-29868" ref-type="bibr">59</xref>
]. These sensors would leave the fingers largely unimpeded, benefiting potential uses for tactile feedback.</p>
<p>An alternative may also be to exchange the contact based device for a locally functioning vision based device such as the Digits wrist-worn gloveless sensor [
<xref rid="B60-sensors-15-29868" ref-type="bibr">60</xref>
]. Digits is a small camera-based sensor attached to the wrist that images a large part of the user’s bare hand. Its infrared camera is placed such that the upper part of the palm and fingers are imaged as they bend inwards towards the device. As the device is not restricted to a fixed space around the user, but moves with the user’s hand, it can image the fingers that may be occluded from vision for the Nimble VR.</p>
<p>A third potential device for integration with the Nimble VR system is the Myo armband from Thalmic Labs [
<xref rid="B61-sensors-15-29868" ref-type="bibr">61</xref>
,
<xref rid="B62-sensors-15-29868" ref-type="bibr">62</xref>
]. Instead of approximating the flexion of the fingers, this system extracts user gestures from measured EMG signals. Using these observed gestures to update the Nimble VR’s calculated skeletal model may be a different route for obtaining improved finger joint angle estimates.</p>
<p>Regardless of which second measurement system is used to improve data completeness through data fusion, in the case of a contact based system its influence on the naturalness of motion needs to be taken into account. Especially in applications such as laparoscopic training, such (minor) physical limitations can easily become a hindrance to the participants (or surgeons) and influence task performance. The advantages of more complex measurement systems should thus be considered within the context and duration of the target application(s).</p>
<p>The Nimble VR software and accompanying hardware are still under development, as the company Nimble VR has recently joined Oculus VR, LLC [
<xref rid="B38-sensors-15-29868" ref-type="bibr">38</xref>
]. It is expected that the accuracy and precision will improve prior to consumer market introduction. Additionally, especially with respect to the consumer market, fusing simple unobtrusive contact based sensors with “low-budget” vision based systems (e.g., Leap Motion and Nimble VR) may be an easy and computationally efficient way of obtaining data completeness for applications such as 3D computer interaction and gaming [
<xref rid="B63-sensors-15-29868" ref-type="bibr">63</xref>
].</p>
</sec>
<sec id="sec5dot4-sensors-15-29868">
<title>5.4. Application in Medical Field</title>
<p>One of the fields where precision and data completeness of measured hand and finger motions are of prime importance is the medical domain. As described in the introduction, hand motions and postures as HCI input have already been applied in several cases [
<xref rid="B10-sensors-15-29868" ref-type="bibr">10</xref>
,
<xref rid="B11-sensors-15-29868" ref-type="bibr">11</xref>
], but not yet in virtual laparoscopic training. The Nimble VR system used in this research has a working area large enough to allow surgeons to make the same surgical routine motions they would make using the already available Da Vinci master robot (Intuitive Surgical Inc., Sunnyvale, CA, USA) to which some surgeons have already grown accustomed [
<xref rid="B47-sensors-15-29868" ref-type="bibr">47</xref>
,
<xref rid="B48-sensors-15-29868" ref-type="bibr">48</xref>
]. Before the eventual implementation of vision based devices in applications such as laparoscopic training, however, several limitations will need to be overcome. Foremost is the time delay in the detection of the hands by the camera and the (computationally heavy) extraction of the skeletal model of the hand. A time delay of 300 ms is likely to adversely affect skill training [
<xref rid="B64-sensors-15-29868" ref-type="bibr">64</xref>
]. In terms of surgical performance, it has been shown that surgeons are able to compensate for time delays up to 700 ms through a “move and wait” strategy. However, the number of operator errors increases with increasing time delay [
<xref rid="B65-sensors-15-29868" ref-type="bibr">65</xref>
,
<xref rid="B66-sensors-15-29868" ref-type="bibr">66</xref>
,
<xref rid="B67-sensors-15-29868" ref-type="bibr">67</xref>
]. Hence, ideally the time delay has to be minimized. Secondly, inherent in the use of a vision based device is the lack of haptic feedback, a property which has been shown to improve hand–eye coordination [
<xref rid="B68-sensors-15-29868" ref-type="bibr">68</xref>
]. To certain extents, this limitation can be addressed through the use of visual force feedback [
<xref rid="B69-sensors-15-29868" ref-type="bibr">69</xref>
], implementation of pseudo-haptic feedback in the virtual environment [
<xref rid="B70-sensors-15-29868" ref-type="bibr">70</xref>
], or the integration of a haptic-feedback mechanism in a touch based glove [
<xref rid="B71-sensors-15-29868" ref-type="bibr">71</xref>
].</p>
<p>The precision of the joint angle estimates obtained through implementation of the data glove and the Kalman Filter in this research is generally around 1 to 3 deg, depending on which joint and finger we are looking at. For medical practice, a control precision of 2 mm for standard laparoscopic instruments has been reported [
<xref rid="B72-sensors-15-29868" ref-type="bibr">72</xref>
]. As an example, when controlling a joint incorporated in the shaft of a laparoscopic instrument through finger flexion, a standard deviation of 1 to 3 deg is acceptable, depending on the length of the segment attached to the controlled joint (e.g., with a segment length of 20 mm, which is the approximate length of a functional tool tip of a laparoscopic instrument, an angular standard deviation of 3 deg corresponds to a tip positional standard deviation of about 1.0 mm, as worked out below).</p>
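<p>The tip deviation quoted above follows from rotating a rigid 20 mm segment over the angular standard deviation; the short check below assumes this simple rigid-segment model.</p>
<preformat>
import math

# Assumed rigid segment rotating about the controlled joint.
segment_length_mm = 20.0
angular_sd_deg = 3.0
tip_sd_mm = segment_length_mm * math.sin(math.radians(angular_sd_deg))
print(round(tip_sd_mm, 2))  # ~1.05 mm, i.e., roughly 1.0 mm at the tip
</preformat>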
<p>The accuracy improvements through implementation of the filter were less pronounced than the precision improvements. Using the filter, the mean deviation from the true joint angle was 13.5 deg for the MCP joint and around 16.8 deg for the PIP joint. Due to the good precision, however, touchless control of simulated instruments through flexion of the fingers, combined with movements of the hand and concomitant visual feedback, should be possible, especially considering that humans are able to use visual feedback to correct for unforeseen perturbations during continuous hand movements [
<xref rid="B73-sensors-15-29868" ref-type="bibr">73</xref>
]. Therefore, considering the precision of the joint angle estimates obtained in this research, our aim is to implement and further study the presented measurement setup for VR medical simulator purposes.</p>
</sec>
</sec>
<sec id="sec6-sensors-15-29868">
<title>6. Conclusions</title>
<p>In this study, we performed a Kalman filter data fusion of hand and finger motion measurements obtained with the 5DT Data Glove and the Nimble VR using a Kinect camera. Measurements were obtained using a wooden hand model placed in various postures across various orientation ranges, as well as on three differently sized hands performing active finger flexions. Through sensor redundancy, more accurate and substantially more precise joint flexion estimates could be obtained compared to the Nimble VR alone. The obtained accuracies for the MCP and PIP joints after implementation of the filter were 13.5 deg (SD = 12.9 deg) and 16.8 deg (SD = 15.7 deg), respectively, and the precisions 2.2 deg (SD = 2.2 deg) and 0.9 deg (SD = 1.1 deg). Thus, for the PIP joint, a 31% improvement in accuracy and a 79% improvement in precision were observed. The MCP accuracy worsened by 6% and the precision improved by 5%, showing the filter to only marginally influence this joint. Due to the use of the contact based Data Glove, visual self-occlusion of the fingers for the vision based Nimble VR system could be mitigated, and data completeness obtained.</p>
</sec>
</body>
<back>
<ack>
<title>Acknowledgments</title>
<p>The research of Ewout A. Arkenbout is supported by the Dutch Technology Foundation STW, which is a part of the Netherlands Organisation for Scientific Research (NWO), and which is partly funded by the Ministry of Economic Affairs, Agriculture and Innovation (STW Project 12137).</p>
</ack>
<notes>
<title>Author Contributions</title>
<p>The work presented in this paper was carried out in collaboration between all authors. Ewout A. Arkenbout designed the test setup, performed the measurements and data analysis, and wrote the article. Joost C.F. de Winter and Paul Breedveld aided in the interpretation of the results and in writing the article.</p>
</notes>
<notes>
<title>Conflicts of Interest</title>
<p>The authors declare no conflict of interest.</p>
</notes>
<app-group>
<title>Appendix A</title>
<app id="app1-sensors-15-29868">
<title>Orientation Dependent Variance Quantification</title>
<sec>
<title>Methods</title>
<p>Finger joint variances for matrix
<inline-formula>
<mml:math id="mm51">
<mml:mrow>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
were measured using an anatomically correct wooden right-hand model mounted on a tripod with a 3-way pan/tilt head. The wooden hand had a length of 21.35 cm, measured from wrist to tip of the middle finger, and breadth of 8.3 cm.</p>
<p>Two different hand postures were assessed: (1) a hand with the fingers held stretched (“flat hand”); and (2) a hand with bent fingers (with index through pinky joint angles of 35° MCP and 55° PIP). The orientation of the hand model was varied by placing it at varying pitch, roll and yaw angles (ranges [–60, 30] deg, [–120, 60] deg and [–60, 60] deg, respectively). All three angles were varied at 5 deg intervals, while keeping the other two angles constant at 0 deg.</p>
<p>At every step, 200 samples were taken in approximately 13 s. Every measurement session was repeated five times. For every combined session, the mean and variance of the hand orientation and of the measured MCP and PIP joint angles were determined. Based on these measurements, the variances as a function of changing hand orientation and posture were determined, as sketched below.</p>
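<p>The following minimal sketch (not the original analysis code) outlines this computation: for every orientation step, the mean and variance over the 200 samples are taken, and these statistics are then averaged over the five repeated sessions. The array layout is an assumption made for illustration.</p>
<preformat>
import numpy as np

def orientation_statistics(sessions):
    """sessions: array of shape (5 sessions, n_steps, 200 samples) of joint angles in deg."""
    per_session_mean = sessions.mean(axis=2)           # mean over the 200 samples
    per_session_var = sessions.var(axis=2, ddof=1)     # variance over the 200 samples
    # Average over the five repeated sessions.
    return per_session_mean.mean(axis=0), per_session_var.mean(axis=0)

# Example with synthetic data: 5 sessions, 19 pitch steps, 200 samples each.
rng = np.random.default_rng(0)
demo = 35.0 + rng.normal(0.0, 3.0, size=(5, 19, 200))
means, variances = orientation_statistics(demo)
print(means.shape, variances.shape)  # (19,), (19,)
</preformat>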
</sec>
<sec>
<title>Results</title>
<p>
<xref ref-type="fig" rid="sensors-15-29868-f006">Figure A1</xref>
provides an overview of the measured hand orientation and the MCP joint angles of the index finger for varying hand orientation angles and both evaluated hand postures. In these figures, ranges are indicated using vertical (red) dotted lines. These ranges correspond to stable or unstable finger flexion measurements.</p>
<p>For varying pitch angles, and independent of posture, the MCP and PIP joints can adequately be estimated in the range [−35, 30] deg. In this stable range, at hand posture 1 (“flat hand”) no substantial difference is present between the MCP and PIP joints. However, at posture 2 (finger flexion) the PIP joints are subject to higher variances than the MCP joints (75.9 deg
<sup>2</sup>
<italic>vs.</italic>
9.3 deg
<sup>2</sup>
, respectively). However, no apparent relation is present between variance and pitch angle. At hand orientation angles below −35 deg, the hand is angled too far downwards for the camera to robustly detect the joint angles. Lastly, for posture 2, the PIP joints appear to be overestimated.</p>
<p>The yaw angles are measured robustly in range [−50, 50] deg. Outside this range, the measurement results are unstable. The yaw stable range can be divided into two separate ranges, depending on whether the thumb is pointing towards or away from the computer screen. These ranges are [−50, −10] deg and [−10, 50] deg, respectively, with measured MCP variances 2.4 deg
<sup>2</sup>
and 43.7 deg
<sup>2</sup>
for the index finger at posture 2. We thus measure a higher variance when the thumb is pointing away from the computer screen. The PIP flexions are overestimated at posture 2.</p>
<p>Lastly, the roll angle measurements are stable over the entire range [−120, 60] deg, except for the range [−110, −60] deg where uncertainty is present in the data. At −90 deg, the hand is angled vertically, with the thumb facing up, and the middle through pinky fingers are occluded for the camera by the thumb and index fingers. As a result, finger joint estimates are unstable in this range, and have higher variance as compared to the stable range (for the MCP joint 20.1 deg
<sup>2</sup>
<italic>vs.</italic>
0.3 deg
<sup>2</sup>
at posture 1 and 23.2 deg
<sup>2</sup>
<italic>vs.</italic>
8.1 deg
<sup>2</sup>
at posture 2). At range [40, 60] deg, when the thumb is pointing down, the software has significant difficulty detecting the thumb (resulting in either high or zero variance for the thumb). In this range, at high thumb measurement variance, all the other fingers are influenced as well, explaining the variance peak at posture 2 at 40 deg, see
<xref ref-type="fig" rid="sensors-15-29868-f006">Figure A1</xref>
. Again, in the stable range ([–120, –110] deg and [–60, 60] deg), the PIP variance is higher at posture 2 due to camera occlusion as compared to posture 1. This PIP variance is also higher in range [0, 60] deg, when the thumb is pointing down, as compared to [–60, 0] deg (65.4 deg
<sup>2</sup>
<italic>vs.</italic>
17.3 deg
<sup>2</sup>
for the PIP joint, respectively, at posture 2).</p>
<fig id="sensors-15-29868-f006" position="anchor">
<label>Figure A1</label>
<caption>
<p>(
<bold>a1–a3</bold>
) Measured hand orientations; (
<bold>b1–c3</bold>
) metacarpophalangeal (MCP) index finger joint angle as a function of hand orientations (pitch, roll and yaw) for two different postures. (
<bold>b1–b3</bold>
) posture 1, flat hand; (
<bold>c1–c3</bold>
) posture 2, flexed fingers. Means, standard deviations, and variances were calculated for each of the 5 measurement sessions, and subsequently averaged over all sessions. Posture 1 is with a “flat hand”, MCP 0 deg and proximal interphalangeal (PIP) 0 deg joint angles; posture 2 is with flexed fingers, MCP 35 deg, PIP 55 deg. The thick (blue) lines are the measured joint angles. At posture 1 (with MCP 0 deg) one would expect the measurement data as a function of changing hand orientation to be stable around 0 deg, and at posture 2 (with MCP 35 deg) stable around 35 deg. The dashed (green) lines are the measured variances. Vertical thick dashed lines (red) distinguish between stable and unstable orientation ranges, and the numbers provided therein are the mean variances for those ranges.</p>
</caption>
<graphic xlink:href="sensors-15-29868-g006"></graphic>
</fig>
<p>Based on these results, the variances of individual fingers and joints can be expressed as a function of hand orientation and degree of finger flexion. Where possible, the measured orientation is used as input in choosing the appropriate corresponding variance in matrix
<inline-formula>
<mml:math id="mm52">
<mml:mrow>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
. Moreover, where higher variance is measured at posture 2, the normalized measurement signal from the Data Glove is used as a weight in scaling up the joint variance at increasing finger flexion. Within the stable orientation ranges (pitch
<inline-formula>
<mml:math id="mm53">
<mml:mi>α</mml:mi>
</mml:math>
</inline-formula>
[−35, 30] deg, roll
<inline-formula>
<mml:math id="mm54">
<mml:mi>β</mml:mi>
</mml:math>
</inline-formula>
[−120, 60] deg, yaw
<inline-formula>
<mml:math id="mm55">
<mml:mi>γ</mml:mi>
</mml:math>
</inline-formula>
[−50, 50] deg) the parameters for matrix
<inline-formula>
<mml:math id="mm56">
<mml:mrow>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
for the MCP joint of the index finger are calculated as follows:
<disp-formula>
<label>(A1)</label>
<mml:math id="mm57">
<mml:mrow>
<mml:mrow>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mrow>
<mml:mi>N</mml:mi>
<mml:mi>V</mml:mi>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>α</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>β</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>γ</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mi>A</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>B</mml:mi>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mi>G</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mi>max</mml:mi>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mrow>
<mml:mi>N</mml:mi>
<mml:mi>V</mml:mi>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>α</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mrow>
<mml:mi>N</mml:mi>
<mml:mi>V</mml:mi>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>β</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mrow>
<mml:mi>N</mml:mi>
<mml:mi>V</mml:mi>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>β</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mtext>, with</mml:mtext>
</mml:mrow>
<mml:mspace linebreak="newline"></mml:mspace>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mrow>
<mml:mi>N</mml:mi>
<mml:mi>V</mml:mi>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>α</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>0.8</mml:mn>
<mml:mo>+</mml:mo>
<mml:mn>8.5</mml:mn>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mi>G</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mtext>if</mml:mtext>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mn>35</mml:mn>
<mml:mo></mml:mo>
<mml:mi>α</mml:mi>
<mml:mo></mml:mo>
<mml:mn>30</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mrow>
<mml:mi>N</mml:mi>
<mml:mi>V</mml:mi>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>β</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mtable columnalign="left">
<mml:mtr>
<mml:mtd>
<mml:mn>0.9</mml:mn>
<mml:mo>+</mml:mo>
<mml:mn>41.1</mml:mn>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mi>G</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn>23.2</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn>0.9</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn>0.3</mml:mn>
<mml:mo>+</mml:mo>
<mml:mn>14.9</mml:mn>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mi>G</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mtable columnalign="left">
<mml:mtr>
<mml:mtd>
<mml:mtext>if</mml:mtext>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mtext>if</mml:mtext>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mtext>if</mml:mtext>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mtext>if</mml:mtext>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mtd>
<mml:mtd>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mo></mml:mo>
<mml:mn>120</mml:mn>
<mml:mo></mml:mo>
<mml:mi>β</mml:mi>
<mml:mo></mml:mo>
<mml:mo></mml:mo>
<mml:mn>110</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mo></mml:mo>
<mml:mn>110</mml:mn>
<mml:mo><</mml:mo>
<mml:mi>β</mml:mi>
<mml:mo></mml:mo>
<mml:mo></mml:mo>
<mml:mn>60</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mo></mml:mo>
<mml:mn>60</mml:mn>
<mml:mo><</mml:mo>
<mml:mi>β</mml:mi>
<mml:mo></mml:mo>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn>0</mml:mn>
<mml:mo><</mml:mo>
<mml:mi>β</mml:mi>
<mml:mo></mml:mo>
<mml:mn>60</mml:mn>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mrow>
<mml:mi>N</mml:mi>
<mml:mi>V</mml:mi>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>γ</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mtable columnalign="left">
<mml:mtr>
<mml:mtd>
<mml:mn>0.4</mml:mn>
<mml:mo>+</mml:mo>
<mml:mn>2.0</mml:mn>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mi>G</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn>6.4</mml:mn>
<mml:mo>+</mml:mo>
<mml:mn>37.3</mml:mn>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi>z</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mi>G</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mtext>if</mml:mtext>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mtext>if</mml:mtext>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mo></mml:mo>
<mml:mn>50</mml:mn>
<mml:mo></mml:mo>
<mml:mi>γ</mml:mi>
<mml:mo></mml:mo>
<mml:mo></mml:mo>
<mml:mn>10</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mo></mml:mo>
<mml:mn>10</mml:mn>
<mml:mo><</mml:mo>
<mml:mi>γ</mml:mi>
<mml:mo></mml:mo>
<mml:mn>50</mml:mn>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
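<p>For illustration, a minimal sketch of evaluating Equation (A1) for the MCP joint of the index finger within the stable orientation ranges is given below; z_DG denotes the normalized Data Glove flexion signal (0 = extended, 1 = fully flexed), and behaviour outside the stable ranges is not covered here.</p>
<preformat>
def index_mcp_variance(alpha, beta, gamma, z_dg):
    """Nimble VR index-finger MCP variance (deg^2) as a function of hand orientation,
    following Equation (A1): the maximum of the pitch, roll and yaw contributions."""
    candidates = []
    # Pitch contribution (stable range only).
    if -35 <= alpha <= 30:
        candidates.append(0.8 + 8.5 * z_dg)
    # Roll contribution (piecewise over the stable roll ranges).
    if -120 <= beta <= -110:
        candidates.append(0.9 + 41.1 * z_dg)
    elif -110 < beta <= -60:
        candidates.append(23.2)
    elif -60 < beta <= 0:
        candidates.append(0.9)
    elif 0 < beta <= 60:
        candidates.append(0.3 + 14.9 * z_dg)
    # Yaw contribution (thumb towards vs. away from the screen).
    if -50 <= gamma <= -10:
        candidates.append(0.4 + 2.0 * z_dg)
    elif -10 < gamma <= 50:
        candidates.append(6.4 + 37.3 * z_dg)
    return max(candidates) if candidates else None

print(index_mcp_variance(alpha=0.0, beta=-30.0, gamma=20.0, z_dg=0.5))  # 25.05 deg^2
</preformat>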
<p>The
<italic>A</italic>
and
<italic>B</italic>
values (with unit deg<sup>2</sup>) for varying orientation angles have been determined separately for all fingers, and are given below in the format
<inline-formula>
<mml:math id="mm59">
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mrow>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi>B</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>|</mml:mo>
</mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi>B</mml:mi>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>
for each finger.</p>
<disp-formula id="FD11">
<label>(A2)</label>
<mml:math id="mm60">
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mi>thumb</mml:mi>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>index</mml:mi>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>middle</mml:mi>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>ring</mml:mi>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>pink</mml:mi>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mrow>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>3.7</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>37.5</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.1</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>5.9</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>21.8</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>1.9</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>2.4</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>23.3</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>8.1</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>19.1</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>23.3</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>|</mml:mo>
</mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>11.7</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>62.4</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>3.5</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>7.8</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>15.9</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>6.4</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>5.5</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>16.2</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>19.6</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>33.8</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>45.1</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mrow>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.8</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>8.5</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.9</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>41.1</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>23.2</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.9</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.3</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>14.9</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.4</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>2.0</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>6.4</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>37.3</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>|</mml:mo>
</mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>2.3</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>73.6</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>2.8</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>85.3</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>41.2</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.7</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>16.6</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.7</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>55.7</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.6</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>25.0</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>11.5</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>214.8</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mrow>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.8</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>11.6</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.5</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>19.1</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>16.6</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>1.3</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.3</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>14.8</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.3</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>2.0</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>5.8</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>43.7</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>|</mml:mo>
</mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>2.0</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>58.1</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.7</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>26.9</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>19.7</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.4</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>41.1</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.2</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>95.1</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.2</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>33.8</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>29.2</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>143.9</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mrow>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.9</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>32.4</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.5</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>31.7</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>26.9</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.7</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.4</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>11.7</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.4</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>1.0</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>4.0</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>75.3</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>|</mml:mo>
</mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>1.5</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>61.2</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>1.8</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>103.1</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>22.5</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>72.8</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>1.0</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>14.6</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>1.5</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>44.2</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.8</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>28.3</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>36.5</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>80</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mrow>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>1.4</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>66</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>2.4</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>83.1</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>9.3</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>48.6</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>1.1</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>2.9</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>1.1</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>6.6</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.9</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>0.9</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>7.8</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>70.4</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>|</mml:mo>
</mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>2.0</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>38.4</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>4.7</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>58.7</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>47.0</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>25.7</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>2.6</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>27.9</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>1.5</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>27.8</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>2.9</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>19.6</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>39.3</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>88.2</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:math>
</disp-formula>
</sec>
</app>
</app-group>
<app-group>
<title>Appendix B</title>
<app id="app2-sensors-15-29868">
<title>Fingers Maximum Acceleration Determination</title>
<sec>
<title>Methods</title>
<p>The process noise covariance matrix
<inline-formula>
<mml:math id="mm61">
<mml:mrow>
<mml:msub>
<mml:mi>Q</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
is calculated based on the maximum possible accelerations of the MCP and PIP joints. Measurements of these accelerations were performed using the 5DT Data Glove. Ten healthy young participants were asked to flex and extend their fingers ten times at a normal pace and ten times as fast as possible. The measurement frequency of the Data Glove was 200 Hz. The measured flexion data were resampled to 1000 Hz using cubic interpolation, and filtered using a 2nd order Butterworth filter with a cut-off frequency of 10 Hz. For both movement tasks, the peak accelerations at every flexion and extension of every finger were determined for each participant, and the mean (± standard deviation (SD)) calculated. All mean accelerations and SDs were subsequently averaged over all participants and used as input for matrix
<inline-formula>
<mml:math id="mm62">
<mml:mrow>
<mml:msub>
<mml:mi>Q</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
.</p>
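<p>A minimal sketch (not the original analysis script) of this processing chain is given below: the 200 Hz glove signal is resampled to 1000 Hz with cubic interpolation, low-pass filtered with a 2nd order Butterworth filter (10 Hz cut-off), differentiated twice, and the peak acceleration of the movement is taken.</p>
<preformat>
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import butter, filtfilt

def peak_acceleration(angles_deg, fs_in=200.0, fs_out=1000.0, cutoff_hz=10.0):
    """Peak absolute angular acceleration (deg/s^2) of a flexion/extension segment."""
    t_in = np.arange(len(angles_deg)) / fs_in
    t_out = np.arange(0.0, t_in[-1], 1.0 / fs_out)
    resampled = interp1d(t_in, angles_deg, kind="cubic")(t_out)
    b, a = butter(2, cutoff_hz / (fs_out / 2.0))       # 2nd order low-pass, 10 Hz
    filtered = filtfilt(b, a, resampled)
    accel = np.gradient(np.gradient(filtered, 1.0 / fs_out), 1.0 / fs_out)
    return np.max(np.abs(accel))

# Example with a synthetic 0.5 s flexion from 0 to 90 deg sampled at 200 Hz.
t = np.arange(0, 0.5, 1 / 200.0)
demo_flexion = 45.0 * (1.0 - np.cos(np.pi * t / 0.5))
print(peak_acceleration(demo_flexion))
</preformat>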
</sec>
<sec>
<title>Results</title>
<p>An example of a single measurement session for the index finger is shown in
<xref ref-type="fig" rid="sensors-15-29868-f007">Figure B1</xref>
, with accompanying calculated finger flexion and extension accelerations. The participant was asked to flex his fingers as fast as possible. Accelerations and standard deviations averaged over all 10 participants are shown in
<xref ref-type="fig" rid="sensors-15-29868-f008">Figure B2</xref>
. The accelerations of the index, middle and ring fingers lie in the range 2.5 to 3.2·10
<sup>5</sup>
deg/s
<sup>2</sup>
, and accelerations of the thumb and pinky are generally lower.</p>
<p>In order to make a distinction between
<inline-formula>
<mml:math id="mm63">
<mml:mrow>
<mml:msub>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
and
<inline-formula>
<mml:math id="mm64">
<mml:mrow>
<mml:msub>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
, we use the weights calculated in
<xref ref-type="app" rid="app3-sensors-15-29868">Appendix C</xref>
that indicate the individual joint contributions to the overall finger flexion angle as measured with the Data Glove. Recalculating these weights, which are given in equation (C.2), to percentages gives a MCP vs PIP division of 44%
<italic>vs.</italic>
56% for the thumb, around 25%
<italic>vs.</italic>
75% for the index, middle and ring finger, and 6%
<italic>vs.</italic>
94% for the pinky. The MCP angle contribution to the Data Glove sensor readings is always lower than the PIP angle contribution, due to shifting of the sensors in the glove and the MCP joint flexing more gradually than the PIP joint.</p>
<fig id="sensors-15-29868-f007" position="anchor">
<label>Figure B1</label>
<caption>
<p>
<bold>Top</bold>
: (
<bold>a</bold>
) Index finger motion measured with 5DT Data Glove during 10 times as-fast-as-possible finger flexions of a single participant. Indicated with a bold line are the flexion and extension movements with their respective marker indicators (o and *) showing starting and stopping points of the finger motions; (
<bold>b</bold>
) calculated accelerations over the course of every finger flexion superimposed over each other; (
<bold>c</bold>
) similar to subfigure (b), but for finger extensions. The horizontal (red) dotted lines show the mean peak accelerations determined from the datasets. (Note: this participant was faster than average.)</p>
</caption>
<graphic xlink:href="sensors-15-29868-g007"></graphic>
</fig>
<fig id="sensors-15-29868-f008" position="anchor">
<label>Figure B2</label>
<caption>
<p>Mean peak accelerations (±SD) during finger flexion and extension performed at normal speed (blue) and as fast as possible (green), calculated per participant and subsequently averaged over all 10 participants. (
<bold>a</bold>
) finger flexion measurements; (
<bold>b</bold>
) finger extension measurements.</p>
</caption>
<graphic xlink:href="sensors-15-29868-g008"></graphic>
</fig>
<p>For matrix
<inline-formula>
<mml:math id="mm65">
<mml:mrow>
<mml:msub>
<mml:mi>Q</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
we take the PIP joint contribution percentages and calculate the maximum joint accelerations by taking the same percentage of the measured overall flexion acceleration. For the index finger, with a total acceleration of 3.1·10
<sup>4</sup>
deg/s
<sup>2</sup>
we thus find a PIP joint acceleration of 0.75 * 3.1·10
<sup>4</sup>
= 2.3·10
<sup>4</sup>
deg/s
<sup>2</sup>
. The following accelerations were found for the respective fingers:
<disp-formula id="FD12">
<label>(B1)</label>
<mml:math id="mm66">
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd></mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>thumb</mml:mi>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>index</mml:mi>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>middle</mml:mi>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>ring</mml:mi>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>pink</mml:mi>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mn>1.05</mml:mn>
</mml:mrow>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>2.34</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>2.10</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mn>2.38</mml:mn>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mrow>
<mml:mn>2.31</mml:mn>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
<mml:mtable>
<mml:mtr>
<mml:mtd></mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mo></mml:mo>
<mml:msup>
<mml:mrow>
<mml:mn>10</mml:mn>
</mml:mrow>
<mml:mn>4</mml:mn>
</mml:msup>
<mml:mo> </mml:mo>
<mml:mi>d</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>g</mml:mi>
<mml:mo>/</mml:mo>
<mml:msup>
<mml:mi>s</mml:mi>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
<p>Because of the large spread in maximum accelerations (see
<xref ref-type="fig" rid="sensors-15-29868-f008">Figure B2</xref>
) between participants, these results need to be interpreted with caution. For implementation into the Kalman filter we will assume the MCP joint accelerations to be on par with the PIP accelerations,
<italic>i.e.</italic>
,
<inline-formula>
<mml:math id="mm67">
<mml:mrow>
<mml:msub>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>.</mml:mo>
</mml:mrow>
</mml:math>
</inline-formula>
Moreover, considering that these accelerations are the maximum possible and that normal movement accelerations are generally much lower, we can downscale the values given in (B.1) for input into
<inline-formula>
<mml:math id="mm68">
<mml:mrow>
<mml:msub>
<mml:mi>Q</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
to increase joint angle approximation precision under the assumption of normal task operation speeds.</p>
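<p>A minimal sketch of this downscaling is given below; the downscale factor, the filter update interval, and the diagonal structure of the process noise are assumptions made for illustration and do not reproduce the exact state layout of the filter.</p>
<preformat>
import numpy as np

# Maximum PIP accelerations from (B1), in deg/s^2 (thumb .. pinky).
max_accel = np.array([1.05, 2.34, 2.10, 2.38, 2.31]) * 1e4

downscale = 0.1        # assumed factor for normal task operation speeds
dt = 1.0 / 30.0        # assumed filter update interval (camera frame rate)

# Assumed per-joint model driven by acceleration noise: the angle changes by at
# most 0.5 * a * dt^2 in one step, used here as the process noise standard
# deviation on each joint angle (the same value for MCP and PIP, as argued above).
sigma_angle = 0.5 * downscale * max_accel * dt**2
Q_k = np.diag(np.repeat(sigma_angle**2, 2))   # one MCP and one PIP entry per finger
print(np.round(np.diag(Q_k), 4))
</preformat>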
</sec>
</app>
</app-group>
<app-group>
<title>Appendix C</title>
<app id="app3-sensors-15-29868">
<title>Determination of Data Glove weights</title>
<p>Data Glove weights
<inline-formula>
<mml:math id="mm69">
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
and
<inline-formula>
<mml:math id="mm70">
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>PIP</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
were determined from reference measurements performed through colored Marker Tracking (MT) of the MCP and PIP joint rotations. The weights were calculated as follows:
<disp-formula id="FD13">
<label>(C1)</label>
<mml:math id="mm71">
<mml:mtable columnalign="left">
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:msub>
<mml:mi>T</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
<mml:mtext>, at pure MCP flexion </mml:mtext>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn>0</mml:mn>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:msub>
<mml:mi>T</mml:mi>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
<mml:mtext>, at pure PIP flexion </mml:mtext>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn>0</mml:mn>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
where
<inline-formula>
<mml:math id="mm72">
<mml:mrow>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:msub>
<mml:mi>T</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
and
<inline-formula>
<mml:math id="mm73">
<mml:mrow>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:msub>
<mml:mi>T</mml:mi>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
are the joint angles as measured through Marker Tracking and
<inline-formula>
<mml:math id="mm74">
<mml:mrow>
<mml:msub>
<mml:mi>φ</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
the combined MCP and PIP angle calculated from the Data Glove sensor readings. As these functions describe, by flexing, for example, the MCP joint (while consciously keeping the PIP joint flexion as close to zero as possible), the ratio between the Data Glove measurement and the actual finger joint angle was determined. Note that consciously keeping one joint unflexed whilst flexing the other is physically challenging; hence, the weights calculated from actual measurements should be interpreted as approximations. If an obtained value is lower than one, the Data Glove is biased towards underestimating the joint angle, and vice versa.</p>
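<p>A minimal Python sketch of the ratio in (C1) is given below; the synthetic frame data and the small-angle cutoff are assumptions made for illustration only and are not specified in the text:</p>

```python
import numpy as np

# Sketch (assumption: not the authors' code) of the (C1) ratio: during a
# pure-MCP flexion trial (PIP held near zero), the weight is the combined
# Data Glove MCP+PIP angle divided by the Marker Tracking MCP angle.
def data_glove_weight(dg_combined_deg: np.ndarray,
                      mt_joint_deg: np.ndarray) -> float:
    """Mean ratio of the DG combined angle to the MT joint angle over frames
    in which only that joint is flexed."""
    mask = mt_joint_deg > 5.0  # assumed cutoff to avoid dividing by near-zero angles
    return float(np.mean(dg_combined_deg[mask] / mt_joint_deg[mask]))

# Example with synthetic frames of a pure-MCP flexion trial.
mt = np.array([10.0, 30.0, 60.0, 90.0])
dg = np.array([7.5, 22.0, 44.0, 66.0])
print(round(data_glove_weight(dg, mt), 2))  # ~0.74, comparable to the index MCP weight in (C2)
```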
<p>Measurements were performed on the hand of the author, with colored markers attached to the joint locations of the index finger, pinky, and thumb of the Data Glove. The MCP and PIP joints of the index through pinky fingers were flexed and extended separately at a relaxed pace, with the hand in view of the camera. The RGB data of the video footage was then analyzed frame by frame (at 30 frames per second) through subtraction of the background and the black glove, followed by marker detection using an RGB threshold and Mean Shift Cluster detection (“Mean Shift Clustering” function code available at MATLAB Central) [
<xref rid="B57-sensors-15-29868" ref-type="bibr">57</xref>
]. The analysis of a single frame is depicted in
<xref ref-type="fig" rid="sensors-15-29868-f009">Figure C1</xref>
. Drawing straight lines between the marker locations and calculating the relative angles between those lines yielded the MCP and PIP joint angles (
<xref ref-type="fig" rid="sensors-15-29868-f010">Figure C2</xref>
). The joint angles measured through Marker Tracking were plotted
<italic>vs.</italic>
the measured Data Glove joint angles rescaled to the range [0 210] deg; see
<xref ref-type="fig" rid="sensors-15-29868-f011">Figure C3</xref>
, right plots. This range is equal to the sum of the natural MCP and PIP joint angle limits, which are 90 deg and 120 deg, respectively, for the index finger.</p>
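<p>The joint angle computation from marker centers can be illustrated with the short sketch below; the marker ordering, the example pixel coordinates, and the sign convention are assumptions made for illustration only:</p>

```python
import numpy as np

# Illustrative sketch: MCP and PIP flexion angles from tracked marker centers,
# taken as the relative angle between the straight lines joining consecutive
# markers (hand, MCP, PIP, distal marker), as described in the text.
def segment_angle_deg(p0, p1, p2):
    """Angle at p1 between segments p0->p1 and p1->p2, in degrees."""
    u = np.asarray(p1, float) - np.asarray(p0, float)
    v = np.asarray(p2, float) - np.asarray(p1, float)
    cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

markers = [(120, 300), (180, 260), (230, 250), (270, 270)]  # assumed example pixel coordinates
alpha_mcp = segment_angle_deg(*markers[0:3])  # MCP flexion (alpha in Figure C2)
beta_pip = segment_angle_deg(*markers[1:4])   # PIP flexion (beta in Figure C2)
print(round(alpha_mcp, 1), round(beta_pip, 1))
```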
<fig id="sensors-15-29868-f009" position="anchor">
<label>Figure C1</label>
<caption>
<p>Marker Tracking video analyses. Example of one analyzed frame for the tracking of a single marker. From left to right: (
<bold>a</bold>
) original frame; (
<bold>b</bold>
) detected background in white, residual image information in black; (
<bold>c</bold>
) detected black glove where the black dots inside the hand show the edges of the markers; (
<bold>d</bold>
) residual image information after background and glove subtraction; and (
<bold>e</bold>
) red marker pixels detected from (d) overlaid on the original image, with mean shift cluster detection used to determine the center of the marker (shown with a green + in the image).</p>
</caption>
<graphic xlink:href="sensors-15-29868-g009"></graphic>
</fig>
<fig id="sensors-15-29868-f010" position="anchor">
<label>Figure C2</label>
<caption>
<p>Example of detected metacarpophalangeal (MCP, α) and proximal interphalangeal (PIP, β) joint angles as a function of detected marker locations. Angles are provided in degrees.</p>
</caption>
<graphic xlink:href="sensors-15-29868-g010"></graphic>
</fig>
<p>The Data Glove weights were subsequently calculated as the slope (gradient) of the linear least-squares fit representing the relation between the measured joint angles. Although the middle and ring fingers were not measured with Marker Tracking because of visual occlusion, for the purpose of these analyses the Data Glove measurements of those fingers were compared to the index finger Marker Tracking angles. This is valid because the middle and ring finger joint angles were approximately equal to those of the index finger, as these fingers were flexed and extended simultaneously and equally during the measurements. The pinky and thumb were measured separately.</p>
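<p>A minimal sketch of this fitting step, assuming NumPy's polyfit as the least-squares routine (the original analysis tooling is not specified beyond MATLAB), is given below:</p>

```python
import numpy as np

# Sketch (assumption: exact fitting details are not given beyond "linear
# least-squares fit"): the weight is the slope of the fit of the rescaled
# Data Glove reading against the Marker Tracking joint angle.
def weight_from_fit(mt_angle_deg: np.ndarray, dg_rescaled_deg: np.ndarray) -> float:
    slope, _intercept = np.polyfit(mt_angle_deg, dg_rescaled_deg, 1)
    return float(slope)

# Usage: weight_from_fit(mt_angles, dg_angles) with the per-frame angle
# series from one flexion-extension trial of a single joint.
```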
<p>From this analysis, the weights calculated for all fingers are:
<disp-formula id="FD14">
<label>(C2)</label>
<mml:math id="mm75">
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd></mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mtext>thumb</mml:mtext>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mtext>index</mml:mtext>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mtext>middle</mml:mtext>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mtext>ring</mml:mtext>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mtext>pinky</mml:mtext>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>1.07</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>1.38</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.73</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>2.14</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.77</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>2.12</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.49</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>1.86</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>0.10</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mn>1.47</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
<fig id="sensors-15-29868-f011" position="anchor">
<label>Figure C3</label>
<caption>
<p>(
<bold>a1–b1</bold>
) Raw metacarpophalangeal (MCP,
<bold>a1</bold>
) and proximal interphalangeal (PIP,
<bold>b1</bold>
) joint measurement data collected through Marker Tracking and normalized Data Glove (DG) sensor readings for the index finger; (
<bold>a2–b2</bold>
) Joint flexions measured through Marker Tracking versus Data Glove sensor readings rescaled to the range [0 210] deg. The slope of the linear least-squares fit provides the weights
<inline-formula>
<mml:math id="mm76">
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>M</mml:mi>
<mml:mi>C</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
and
<inline-formula>
<mml:math id="mm77">
<mml:mrow>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>I</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
.</p>
</caption>
<graphic xlink:href="sensors-15-29868-g011"></graphic>
</fig>
</app>
</app-group>
<ref-list>
<title>References</title>
<ref id="B1-sensors-15-29868">
<label>1.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Rautaray</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Agrawal</surname>
<given-names>A.</given-names>
</name>
</person-group>
<article-title>Vision based hand gesture recognition for human computer interaction: A survey</article-title>
<source>Artif. Intell. Rev.</source>
<year>2015</year>
<volume>43</volume>
<fpage>1</fpage>
<lpage>54</lpage>
<pub-id pub-id-type="doi">10.1007/s10462-012-9356-9</pub-id>
</element-citation>
</ref>
<ref id="B2-sensors-15-29868">
<label>2.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Suarez</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Murphy</surname>
<given-names>R.R.</given-names>
</name>
</person-group>
<article-title>Hand gesture recognition with depth images: A review</article-title>
<source>Proceedings of the RO-MAN, 2012 IEEE</source>
<conf-loc>Paris, France</conf-loc>
<conf-date>9–13 September 2012</conf-date>
<fpage>411</fpage>
<lpage>417</lpage>
</element-citation>
</ref>
<ref id="B3-sensors-15-29868">
<label>3.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Erol</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Bebis</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Nicolescu</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Boyle</surname>
<given-names>R.D.</given-names>
</name>
<name>
<surname>Twombly</surname>
<given-names>X.</given-names>
</name>
</person-group>
<article-title>Vision-based hand pose estimation: A review</article-title>
<source>Comput. Vis. Image Underst.</source>
<year>2007</year>
<volume>108</volume>
<fpage>52</fpage>
<lpage>73</lpage>
<pub-id pub-id-type="doi">10.1016/j.cviu.2006.10.012</pub-id>
</element-citation>
</ref>
<ref id="B4-sensors-15-29868">
<label>4.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Palacios</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Sagüés</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Montijano</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Llorente</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>Human-computer interaction based on hand gestures using RGB-D sensors</article-title>
<source>Sensors</source>
<year>2013</year>
<volume>13</volume>
<fpage>11842</fpage>
<pub-id pub-id-type="doi">10.3390/s130911842</pub-id>
<pub-id pub-id-type="pmid">24018953</pub-id>
</element-citation>
</ref>
<ref id="B5-sensors-15-29868">
<label>5.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Sturman</surname>
<given-names>D.J.</given-names>
</name>
</person-group>
<source>Whole-Hand Input</source>
<publisher-name>Massachusetts Institute of Technology</publisher-name>
<publisher-loc>Cambridge, MA, USA</publisher-loc>
<year>1991</year>
</element-citation>
</ref>
<ref id="B6-sensors-15-29868">
<label>6.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pintzos</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Rentzos</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Papakostas</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Chryssolouris</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>A novel approach for the combined use of ar goggles and mobile devices as communication tools on the shopfloor</article-title>
<source>Procedia CIRP</source>
<year>2014</year>
<volume>25</volume>
<fpage>132</fpage>
<lpage>137</lpage>
<pub-id pub-id-type="doi">10.1016/j.procir.2014.10.021</pub-id>
</element-citation>
</ref>
<ref id="B7-sensors-15-29868">
<label>7.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kalra</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Magnenat-Thalmann</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Moccozet</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Sannier</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Aubel</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Thalmann</surname>
<given-names>D.</given-names>
</name>
</person-group>
<article-title>Real-time animation of realistic virtual humans</article-title>
<source>Comput. Graph. Appl. IEEE</source>
<year>1998</year>
<volume>18</volume>
<fpage>42</fpage>
<lpage>56</lpage>
<pub-id pub-id-type="doi">10.1109/38.708560</pub-id>
</element-citation>
</ref>
<ref id="B8-sensors-15-29868">
<label>8.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Menache</surname>
<given-names>A.</given-names>
</name>
</person-group>
<source>Understanding Motion Capture for Computer Animation and Video Games</source>
<publisher-name>Morgan Kaufmann</publisher-name>
<publisher-loc>San Diego, CA, USA</publisher-loc>
<year>2000</year>
</element-citation>
</ref>
<ref id="B9-sensors-15-29868">
<label>9.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ohn-Bar</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Trivedi</surname>
<given-names>M.M.</given-names>
</name>
</person-group>
<article-title>Hand gesture recognition in real time for automotive interfaces: A multimodal vision-based approach and evaluations</article-title>
<source>IEEE Trans. Intell. Transp. Syst.</source>
<year>2014</year>
<volume>15</volume>
<fpage>2368</fpage>
<lpage>2377</lpage>
<pub-id pub-id-type="doi">10.1109/TITS.2014.2337331</pub-id>
</element-citation>
</ref>
<ref id="B10-sensors-15-29868">
<label>10.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Grätzel</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Fong</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Grange</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Baur</surname>
<given-names>C.</given-names>
</name>
</person-group>
<article-title>A non-contact mouse for surgeon-computer interaction</article-title>
<source>Technol. Health Care</source>
<year>2004</year>
<volume>12</volume>
<fpage>245</fpage>
<lpage>257</lpage>
<pub-id pub-id-type="pmid">15328453</pub-id>
</element-citation>
</ref>
<ref id="B11-sensors-15-29868">
<label>11.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Rosa</surname>
<given-names>G.M.</given-names>
</name>
<name>
<surname>Elizondo</surname>
<given-names>M.L.</given-names>
</name>
</person-group>
<article-title>Use of a gesture user interface as a touchless image navigation system in dental surgery: Case series report</article-title>
<source>Imaging Sci. Dent.</source>
<year>2014</year>
<volume>44</volume>
<fpage>155</fpage>
<lpage>160</lpage>
<pub-id pub-id-type="doi">10.5624/isd.2014.44.2.155</pub-id>
<pub-id pub-id-type="pmid">24944966</pub-id>
</element-citation>
</ref>
<ref id="B12-sensors-15-29868">
<label>12.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Adhikarla</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Sodnik</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Szolgay</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Jakus</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>Exploring direct 3d interaction for full horizontal parallax light field displays using leap motion controller</article-title>
<source>Sensors</source>
<year>2015</year>
<volume>15</volume>
<fpage>8642</fpage>
<pub-id pub-id-type="doi">10.3390/s150408642</pub-id>
<pub-id pub-id-type="pmid">25875189</pub-id>
</element-citation>
</ref>
<ref id="B13-sensors-15-29868">
<label>13.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bachmann</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Weichert</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Rinkenauer</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>Evaluation of the leap motion controller as a new contact-free pointing device</article-title>
<source>Sensors</source>
<year>2014</year>
<volume>15</volume>
<fpage>214</fpage>
<lpage>233</lpage>
<pub-id pub-id-type="doi">10.3390/s150100214</pub-id>
<pub-id pub-id-type="pmid">25609043</pub-id>
</element-citation>
</ref>
<ref id="B14-sensors-15-29868">
<label>14.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Guna</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Jakus</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Pogačnik</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Tomažič</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Sodnik</surname>
<given-names>J.</given-names>
</name>
</person-group>
<article-title>An analysis of the precision and reliability of the leap motion sensor and its suitability for static and dynamic tracking</article-title>
<source>Sensors</source>
<year>2014</year>
<volume>14</volume>
<fpage>3702</fpage>
<lpage>3720</lpage>
<pub-id pub-id-type="doi">10.3390/s140203702</pub-id>
<pub-id pub-id-type="pmid">24566635</pub-id>
</element-citation>
</ref>
<ref id="B15-sensors-15-29868">
<label>15.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Dipietro</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Sabatini</surname>
<given-names>A.M.</given-names>
</name>
<name>
<surname>Dario</surname>
<given-names>P.</given-names>
</name>
</person-group>
<article-title>A survey of glove-based systems and their applications</article-title>
<source>IEEE Trans. Syst. Man Cybern. Part C Appl. Rev.</source>
<year>2008</year>
<volume>38</volume>
<fpage>461</fpage>
<lpage>482</lpage>
<pub-id pub-id-type="doi">10.1109/TSMCC.2008.923862</pub-id>
</element-citation>
</ref>
<ref id="B16-sensors-15-29868">
<label>16.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pavlovic</surname>
<given-names>V.I.</given-names>
</name>
<name>
<surname>Sharma</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Huang</surname>
<given-names>T.S.</given-names>
</name>
</person-group>
<article-title>Visual interpretation of hand gestures for human-computer interaction: A review</article-title>
<source>IEEE Trans. Pattern Anal. Mach. Intell.</source>
<year>1997</year>
<volume>19</volume>
<fpage>677</fpage>
<lpage>695</lpage>
<pub-id pub-id-type="doi">10.1109/34.598226</pub-id>
</element-citation>
</ref>
<ref id="B17-sensors-15-29868">
<label>17.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Wu</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Huang</surname>
<given-names>T.</given-names>
</name>
</person-group>
<source>Vision-Based Gesture Recognition: A Review</source>
<publisher-name>Springer Heidelberg</publisher-name>
<publisher-loc>Berlin, Germany</publisher-loc>
<year>1999</year>
<fpage>103</fpage>
<lpage>115</lpage>
</element-citation>
</ref>
<ref id="B18-sensors-15-29868">
<label>18.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Preil</surname>
<given-names>M.</given-names>
</name>
</person-group>
<article-title>Optimum dose for EUV: Technical
<italic>vs.</italic>
Economic drivers</article-title>
<source>Future Fab Int.</source>
<year>2012</year>
<volume>41</volume>
</element-citation>
</ref>
<ref id="B19-sensors-15-29868">
<label>19.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Kurzweil</surname>
<given-names>R.</given-names>
</name>
</person-group>
<source>The Singularity is Near: When Humans Transcend Biology</source>
<publisher-name>Penguin</publisher-name>
<publisher-loc>New York, NY, USA</publisher-loc>
<year>2005</year>
</element-citation>
</ref>
<ref id="B20-sensors-15-29868">
<label>20.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Zhao</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Chai</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Xu</surname>
<given-names>Y.-Q.</given-names>
</name>
</person-group>
<article-title>Combining Marker-Based Mocap and RGB-D Camera for Acquiring High-Fidelity Hand Motion Data</article-title>
<source>Proceedings of the ACM SIGGRAPH/Eurographics Symposium on Computer Animation</source>
<conf-loc>Eurographics Association, Lausanne, Switzerland</conf-loc>
<conf-date>29–31 July 2012</conf-date>
<fpage>33</fpage>
<lpage>42</lpage>
</element-citation>
</ref>
<ref id="B21-sensors-15-29868">
<label>21.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Rogalla</surname>
<given-names>O.</given-names>
</name>
<name>
<surname>Ehrenmann</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Dillmann</surname>
<given-names>R.</given-names>
</name>
</person-group>
<article-title>A sensor fusion approach for PbD</article-title>
<source>Proceedings of the 1998 IEEE/RSJ International Conference on Intelligent Robots and Systems</source>
<conf-loc>Victoria, BC, Canada</conf-loc>
<conf-date>13–17 October 1998</conf-date>
<fpage>1040</fpage>
<lpage>1045</lpage>
</element-citation>
</ref>
<ref id="B22-sensors-15-29868">
<label>22.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Ehrenmann</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Zollner</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Knoop</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Dillmann</surname>
<given-names>R.</given-names>
</name>
</person-group>
<article-title>Sensor Fusion Approaches for Observation of User Actions in Programming by Demonstration</article-title>
<source>Proceedings of the International Conference on Multisensor Fusion and Integration for Intelligent Systems</source>
<conf-loc>Baden-Baden, Germany</conf-loc>
<conf-date>20–22 August 2001</conf-date>
<fpage>227</fpage>
<lpage>232</lpage>
</element-citation>
</ref>
<ref id="B23-sensors-15-29868">
<label>23.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Hebert</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Hudson</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Ma</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Burdick</surname>
<given-names>J.</given-names>
</name>
</person-group>
<article-title>Fusion of Stereo Vision, Force-Torque, and Joint Sensors for Estimation of in-Hand Object Location</article-title>
<source>Proceedings of the 2011 IEEE International Conference on Robotics and Automation (ICRA)</source>
<conf-loc>Shanghai, China</conf-loc>
<conf-date>9–13 May 2011</conf-date>
<fpage>5935</fpage>
<lpage>5941</lpage>
</element-citation>
</ref>
<ref id="B24-sensors-15-29868">
<label>24.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zhou</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Fei</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>W.</given-names>
</name>
</person-group>
<article-title>Hand-writing motion tracking with vision-inertial sensor fusion: Calibration and error correction</article-title>
<source>Sensors</source>
<year>2014</year>
<volume>14</volume>
<fpage>15641</fpage>
<lpage>15657</lpage>
<pub-id pub-id-type="doi">10.3390/s140915641</pub-id>
<pub-id pub-id-type="pmid">25157546</pub-id>
</element-citation>
</ref>
<ref id="B25-sensors-15-29868">
<label>25.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Fan</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>X.</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>W.-H.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>X.</given-names>
</name>
<name>
<surname>Yang</surname>
<given-names>J.-H.</given-names>
</name>
<name>
<surname>Lantz</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>K.-Q.</given-names>
</name>
</person-group>
<article-title>A Method of Hand Gesture Recognition Based on Multiple Sensors</article-title>
<source>Proceedings of the 4th International Conference on Bioinformatics and Biomedical Engineering (iCBBE)</source>
<conf-loc>Chengdu, China</conf-loc>
<conf-date>18–20 June 2010</conf-date>
<fpage>1</fpage>
<lpage>4</lpage>
</element-citation>
</ref>
<ref id="B26-sensors-15-29868">
<label>26.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Zou</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Yuan</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Luo</surname>
<given-names>B.</given-names>
</name>
</person-group>
<article-title>A Method for Hand Tracking and Motion Recognizing in Chinese Sign Language</article-title>
<source>Proceedings of the 2001 International Conferences on Info-Tech and Info-Net</source>
<conf-loc>Beijing, China</conf-loc>
<conf-date>29 October 2001</conf-date>
<fpage>543</fpage>
<lpage>549</lpage>
</element-citation>
</ref>
<ref id="B27-sensors-15-29868">
<label>27.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Brashear</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Starner</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Lukowicz</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Junker</surname>
<given-names>H.</given-names>
</name>
</person-group>
<article-title>Using Multiple Sensors for Mobile Sign Language Recognition</article-title>
<source>Proceedings of the 7th IEEE International Symposium on Wearable Computers (ISWC 2003)</source>
<conf-loc>White Plains, NY, USA</conf-loc>
<conf-date>21–23 October 2003</conf-date>
</element-citation>
</ref>
<ref id="B28-sensors-15-29868">
<label>28.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Khaleghi</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Khamis</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Karray</surname>
<given-names>F.O.</given-names>
</name>
<name>
<surname>Razavi</surname>
<given-names>S.N.</given-names>
</name>
</person-group>
<article-title>Multisensor data fusion: A review of the state-of-the-art</article-title>
<source>Inf. Fusion</source>
<year>2013</year>
<volume>14</volume>
<fpage>28</fpage>
<lpage>44</lpage>
<pub-id pub-id-type="doi">10.1016/j.inffus.2011.08.001</pub-id>
</element-citation>
</ref>
<ref id="B29-sensors-15-29868">
<label>29.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kunkler</surname>
<given-names>K.</given-names>
</name>
</person-group>
<article-title>The role of medical simulation: An overview</article-title>
<source>Int. J. Med. Robot. Comput. Assist. Surg.</source>
<year>2006</year>
<volume>2</volume>
<fpage>203</fpage>
<lpage>210</lpage>
<pub-id pub-id-type="doi">10.1002/rcs.101</pub-id>
<pub-id pub-id-type="pmid">17520633</pub-id>
</element-citation>
</ref>
<ref id="B30-sensors-15-29868">
<label>30.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Diesen</surname>
<given-names>D.L.</given-names>
</name>
<name>
<surname>Erhunmwunsee</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Bennett</surname>
<given-names>K.M.</given-names>
</name>
<name>
<surname>Ben-David</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Yurcisin</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Ceppa</surname>
<given-names>E.P.</given-names>
</name>
<name>
<surname>Omotosho</surname>
<given-names>P.A.</given-names>
</name>
<name>
<surname>Perez</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Pryor</surname>
<given-names>A.</given-names>
</name>
</person-group>
<article-title>Effectiveness of laparoscopic computer simulator
<italic>versus</italic>
usage of box trainer for endoscopic surgery training of novices</article-title>
<source>J. Surg. Educ.</source>
<year>2011</year>
<volume>68</volume>
<fpage>282</fpage>
<lpage>289</lpage>
<pub-id pub-id-type="doi">10.1016/j.jsurg.2011.02.007</pub-id>
<pub-id pub-id-type="pmid">21708364</pub-id>
</element-citation>
</ref>
<ref id="B31-sensors-15-29868">
<label>31.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Newmark</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Dandolu</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Milner</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Grewal</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Harbison</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Hernandez</surname>
<given-names>E.</given-names>
</name>
</person-group>
<article-title>Correlating virtual reality and box trainer tasks in the assessment of laparoscopic surgical skills</article-title>
<source>Am. J. Obst. Gynecol.</source>
<year>2007</year>
<volume>197</volume>
<fpage>546.e541</fpage>
<lpage>546.e544</lpage>
<pub-id pub-id-type="doi">10.1016/j.ajog.2007.07.026</pub-id>
<pub-id pub-id-type="pmid">17980205</pub-id>
</element-citation>
</ref>
<ref id="B32-sensors-15-29868">
<label>32.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Munz</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Kumar</surname>
<given-names>B.D.</given-names>
</name>
<name>
<surname>Moorthy</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Bann</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Darzi</surname>
<given-names>A.</given-names>
</name>
</person-group>
<article-title>Laparoscopic virtual reality and box trainers: Is one superior to the other?</article-title>
<source>Surg. Endosc. Interv. Tech.</source>
<year>2004</year>
<volume>18</volume>
<fpage>485</fpage>
<lpage>494</lpage>
</element-citation>
</ref>
<ref id="B33-sensors-15-29868">
<label>33.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hull</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Kassab</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Arora</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Kneebone</surname>
<given-names>R.</given-names>
</name>
</person-group>
<article-title>Increasing the realism of a laparoscopic box trainer: A simple, inexpensive method</article-title>
<source>J. Laparoendosc. Ad. Surg. Tech.</source>
<year>2010</year>
<volume>20</volume>
<fpage>559</fpage>
<lpage>562</lpage>
<pub-id pub-id-type="doi">10.1089/lap.2010.0069</pub-id>
<pub-id pub-id-type="pmid">20687817</pub-id>
</element-citation>
</ref>
<ref id="B34-sensors-15-29868">
<label>34.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Bent</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Chan</surname>
<given-names>K.</given-names>
</name>
</person-group>
<source>Human Factors in Aviation, Chapter 10: Flight Training and Simulation as Safety Generators</source>
<publisher-name>Academic Press</publisher-name>
<publisher-loc>Burlington, MA, USA</publisher-loc>
<year>2010</year>
</element-citation>
</ref>
<ref id="B35-sensors-15-29868">
<label>35.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Crothers</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Gallagher</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>McClure</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>James</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>McGuigan</surname>
<given-names>J.</given-names>
</name>
</person-group>
<article-title>Experienced laparoscopic surgeons are automated to the "fulcrum effect": An ergonomic demonstration</article-title>
<source>Endoscopy</source>
<year>1999</year>
<volume>31</volume>
<fpage>365</fpage>
<lpage>369</lpage>
<pub-id pub-id-type="doi">10.1055/s-1999-26</pub-id>
<pub-id pub-id-type="pmid">10433045</pub-id>
</element-citation>
</ref>
<ref id="B36-sensors-15-29868">
<label>36.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gallagher</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>McClure</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>McGuigan</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Ritchie</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Sheehy</surname>
<given-names>N.</given-names>
</name>
</person-group>
<article-title>An ergonomic analysis of the fulcrum effect in the acquisition of endoscopic skills</article-title>
<source>Endoscopy</source>
<year>1998</year>
<volume>30</volume>
<fpage>617</fpage>
<lpage>620</lpage>
<pub-id pub-id-type="doi">10.1055/s-2007-1001366</pub-id>
<pub-id pub-id-type="pmid">9826140</pub-id>
</element-citation>
</ref>
<ref id="B37-sensors-15-29868">
<label>37.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Eyal</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Tendick</surname>
<given-names>F.</given-names>
</name>
</person-group>
<article-title>Spatial ability and learning the use of an angled laparoscope in a virtual environment</article-title>
<source>Med. Meets Virtual Real. Outer Space Inner Space Virtual Space</source>
<year>2001</year>
<volume>81</volume>
<fpage>146</fpage>
</element-citation>
</ref>
<ref id="B38-sensors-15-29868">
<label>38.</label>
<element-citation publication-type="webpage">
<person-group person-group-type="author">
<collab>Oculus VR. LLC</collab>
</person-group>
<article-title>Step into the Rift</article-title>
<comment>Available online:
<ext-link ext-link-type="uri" xlink:href="www.oculus.com">www.oculus.com</ext-link>
</comment>
<date-in-citation>(accessed on 7 December 2015)</date-in-citation>
</element-citation>
</ref>
<ref id="B39-sensors-15-29868">
<label>39.</label>
<element-citation publication-type="webpage">
<person-group person-group-type="author">
<collab>Sony Computer Entertainment America LLC</collab>
</person-group>
<article-title>Project Morpheus</article-title>
<comment>Available online:
<ext-link ext-link-type="uri" xlink:href="https://www.playstation.com/en-us/explore/project-morpheus/">https://www.playstation.com/en-us/explore/project-morpheus/</ext-link>
</comment>
<date-in-citation>(accessed on 7 December 2015)</date-in-citation>
</element-citation>
</ref>
<ref id="B40-sensors-15-29868">
<label>40.</label>
<element-citation publication-type="webpage">
<person-group person-group-type="author">
<collab>HTC Corporation</collab>
</person-group>
<article-title>HTC Vive</article-title>
<comment>Available online:
<ext-link ext-link-type="uri" xlink:href="http://www.htcvr.com">http://www.htcvr.com</ext-link>
</comment>
<date-in-citation>(accessed on 7 December 2015)</date-in-citation>
</element-citation>
</ref>
<ref id="B41-sensors-15-29868">
<label>41.</label>
<element-citation publication-type="webpage">
<person-group person-group-type="author">
<collab>Samsung Electronics Co., Ltd.</collab>
</person-group>
<article-title>Samsung Gear VR</article-title>
<comment>Available online:
<ext-link ext-link-type="uri" xlink:href="http://www.samsung.com/global/microsite/gearvr/gearvr_features.html">http://www.samsung.com/global/microsite/gearvr/gearvr_features.html</ext-link>
</comment>
<date-in-citation>(accessed on 7 December 2015)</date-in-citation>
</element-citation>
</ref>
<ref id="B42-sensors-15-29868">
<label>42.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ahlberg</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Enochsson</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Gallagher</surname>
<given-names>A.G.</given-names>
</name>
<name>
<surname>Hedman</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Hogman</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>McClusky</surname>
<given-names>D.A.</given-names>
<suffix>III</suffix>
</name>
<name>
<surname>Ramel</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Smith</surname>
<given-names>C.D.</given-names>
</name>
<name>
<surname>Arvidsson</surname>
<given-names>D.</given-names>
</name>
</person-group>
<article-title>Proficiency-based virtual reality training significantly reduces the error rate for residents during their first 10 laparoscopic cholecystectomies</article-title>
<source>Am. J. Surg.</source>
<year>2007</year>
<volume>193</volume>
<fpage>797</fpage>
<lpage>804</lpage>
<pub-id pub-id-type="doi">10.1016/j.amjsurg.2006.06.050</pub-id>
<pub-id pub-id-type="pmid">17512301</pub-id>
</element-citation>
</ref>
<ref id="B43-sensors-15-29868">
<label>43.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gallagher</surname>
<given-names>A.G.</given-names>
</name>
<name>
<surname>Ritter</surname>
<given-names>E.M.</given-names>
</name>
<name>
<surname>Champion</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Higgins</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Fried</surname>
<given-names>M.P.</given-names>
</name>
<name>
<surname>Moses</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Smith</surname>
<given-names>C.D.</given-names>
</name>
<name>
<surname>Satava</surname>
<given-names>R.M.</given-names>
</name>
</person-group>
<article-title>Virtual reality simulation for the operating room: Proficiency-based training as a paradigm shift in surgical skills training</article-title>
<source>Ann. Surg.</source>
<year>2005</year>
<volume>241</volume>
<fpage>364</fpage>
<lpage>372</lpage>
<pub-id pub-id-type="doi">10.1097/01.sla.0000151982.85062.80</pub-id>
<pub-id pub-id-type="pmid">15650649</pub-id>
</element-citation>
</ref>
<ref id="B44-sensors-15-29868">
<label>44.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Seymour</surname>
<given-names>N.E.</given-names>
</name>
<name>
<surname>Gallagher</surname>
<given-names>A.G.</given-names>
</name>
<name>
<surname>Roman</surname>
<given-names>S.A.</given-names>
</name>
<name>
<surname>O’Brien</surname>
<given-names>M.K.</given-names>
</name>
<name>
<surname>Bansal</surname>
<given-names>V.K.</given-names>
</name>
<name>
<surname>Andersen</surname>
<given-names>D.K.</given-names>
</name>
<name>
<surname>Satava</surname>
<given-names>R.M.</given-names>
</name>
</person-group>
<article-title>Virtual reality training improves operating room performance: Results of a randomized, double-blinded study</article-title>
<source>Ann. Surg.</source>
<year>2002</year>
<volume>236</volume>
<fpage>458</fpage>
<lpage>464</lpage>
<pub-id pub-id-type="doi">10.1097/00000658-200210000-00008</pub-id>
<pub-id pub-id-type="pmid">12368674</pub-id>
</element-citation>
</ref>
<ref id="B45-sensors-15-29868">
<label>45.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Munz</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Almoudaris</surname>
<given-names>A.M.</given-names>
</name>
<name>
<surname>Moorthy</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Dosis</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Liddle</surname>
<given-names>A.D.</given-names>
</name>
<name>
<surname>Darzi</surname>
<given-names>A.W.</given-names>
</name>
</person-group>
<article-title>Curriculum-based solo virtual reality training for laparoscopic intracorporeal knot tying: Objective assessment of the transfer of skill from virtual reality to reality</article-title>
<source>Am. J. Surg.</source>
<year>2007</year>
<volume>193</volume>
<fpage>774</fpage>
<lpage>783</lpage>
<pub-id pub-id-type="doi">10.1016/j.amjsurg.2007.01.022</pub-id>
<pub-id pub-id-type="pmid">17512295</pub-id>
</element-citation>
</ref>
<ref id="B46-sensors-15-29868">
<label>46.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Wang</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Paris</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Popović</surname>
<given-names>J.</given-names>
</name>
</person-group>
<article-title>6d Hands: Markerless Hand-Tracking for Computer Aided Design</article-title>
<source>Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology</source>
<conf-loc>Santa Barbara, CA, USA</conf-loc>
<conf-date>16–19 October 2011</conf-date>
<fpage>549</fpage>
<lpage>558</lpage>
</element-citation>
</ref>
<ref id="B47-sensors-15-29868">
<label>47.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>El-laithy</surname>
<given-names>R.A.</given-names>
</name>
<name>
<surname>Jidong</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Yeh</surname>
<given-names>M.</given-names>
</name>
</person-group>
<article-title>Study on the Use of Microsoft Kinect for Robotics Applications</article-title>
<source>Proceedings of the 2012 IEEE/ION Position Location and Navigation Symposium (PLANS)</source>
<conf-loc>Myrtle Beach, SC, USA</conf-loc>
<conf-date>23–26 April 2012</conf-date>
<fpage>1280</fpage>
<lpage>1288</lpage>
</element-citation>
</ref>
<ref id="B48-sensors-15-29868">
<label>48.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Yonjae</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Kim</surname>
<given-names>P.C.W.</given-names>
</name>
<name>
<surname>Selle</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Shademan</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Krieger</surname>
<given-names>A.</given-names>
</name>
</person-group>
<article-title>Experimental Evaluation of Contact-Less hand Tracking Systems for Tele-Operation of Surgical Tasks</article-title>
<source>Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA)</source>
<conf-loc>Hong Kong, China</conf-loc>
<conf-date>31 May–7 June 2014</conf-date>
<fpage>3502</fpage>
<lpage>3509</lpage>
</element-citation>
</ref>
<ref id="B49-sensors-15-29868">
<label>49.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Arkenbout</surname>
<given-names>E.A.</given-names>
</name>
<name>
<surname>Winter</surname>
<given-names>J.C.F.D.</given-names>
</name>
<name>
<surname>Breedveld</surname>
<given-names>P.</given-names>
</name>
</person-group>
<article-title>Using Kinect with 3Gear Systems software to determine hand and finger movement: An assessment for minimally invasive surgery applications</article-title>
<source>Des. Med. Devices Eur.</source>
<year>2014</year>
<pub-id pub-id-type="doi">10.1371/journal.pone.0134501</pub-id>
</element-citation>
</ref>
<ref id="B50-sensors-15-29868">
<label>50.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Weichert</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Bachmann</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Rudak</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Fisseler</surname>
<given-names>D.</given-names>
</name>
</person-group>
<article-title>Analysis of the accuracy and robustness of the leap motion controller</article-title>
<source>Sensors</source>
<year>2013</year>
<volume>13</volume>
<fpage>6380</fpage>
<lpage>6393</lpage>
<pub-id pub-id-type="doi">10.3390/s130506380</pub-id>
<pub-id pub-id-type="pmid">23673678</pub-id>
</element-citation>
</ref>
<ref id="B51-sensors-15-29868">
<label>51.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kim</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Leonard</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Shademan</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Krieger</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Kim</surname>
<given-names>P.W.</given-names>
</name>
</person-group>
<article-title>Kinect technology for hand tracking control of surgical robots: Technical and surgical skill comparison to current robotic masters</article-title>
<source>Surg. Endosc.</source>
<year>2014</year>
<volume>28</volume>
<fpage>1993</fpage>
<lpage>2000</lpage>
<pub-id pub-id-type="doi">10.1007/s00464-013-3383-8</pub-id>
<pub-id pub-id-type="pmid">24380997</pub-id>
</element-citation>
</ref>
<ref id="B52-sensors-15-29868">
<label>52.</label>
<element-citation publication-type="webpage">
<person-group person-group-type="author">
<collab>Fifth Dimension Technologies (5DT)</collab>
</person-group>
<article-title>Data gloves</article-title>
<comment>Available online:
<ext-link ext-link-type="uri" xlink:href="http://www.5dt.com/?page_id=34">http://www.5dt.com/?page_id=34</ext-link>
</comment>
<date-in-citation>(accessed on 14 January 2015)</date-in-citation>
</element-citation>
</ref>
<ref id="B53-sensors-15-29868">
<label>53.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bishop</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Welch</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>An introduction to the Kalman filter</article-title>
<source>Proc. SIGGRAPH Course</source>
<year>2001</year>
<volume>8</volume>
<fpage>1</fpage>
<lpage>47</lpage>
</element-citation>
</ref>
<ref id="B54-sensors-15-29868">
<label>54.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Harvey</surname>
<given-names>A.C.</given-names>
</name>
</person-group>
<source>Forecasting, Structural Time Series Models and the Kalman Filter</source>
<publisher-name>Cambridge University Press</publisher-name>
<publisher-loc>Cambridge, UK</publisher-loc>
<year>1990</year>
</element-citation>
</ref>
<ref id="B55-sensors-15-29868">
<label>55.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Haykin</surname>
<given-names>S.</given-names>
</name>
</person-group>
<source>Kalman Filtering and Neural Networks</source>
<publisher-name>John Wiley & Sons</publisher-name>
<publisher-loc>New York, NY, USA</publisher-loc>
<year>2004</year>
</element-citation>
</ref>
<ref id="B56-sensors-15-29868">
<label>56.</label>
<element-citation publication-type="webpage">
<person-group person-group-type="author">
<collab>Nimble VR</collab>
</person-group>
<article-title>Nimble VR SDK v0.9.36</article-title>
<comment>Available online:
<ext-link ext-link-type="uri" xlink:href="http://nimblevr.com/download.html">http://nimblevr.com/download.html</ext-link>
</comment>
<date-in-citation>(accessed on 14 January 2015)</date-in-citation>
</element-citation>
</ref>
<ref id="B57-sensors-15-29868">
<label>57.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fukunaga</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Hostetler</surname>
<given-names>L.</given-names>
</name>
</person-group>
<article-title>The estimation of the gradient of a density function, with applications in pattern recognition</article-title>
<source>IEEE Trans. Inf. Theory</source>
<year>1975</year>
<volume>21</volume>
<fpage>32</fpage>
<lpage>40</lpage>
<pub-id pub-id-type="doi">10.1109/TIT.1975.1055330</pub-id>
</element-citation>
</ref>
<ref id="B58-sensors-15-29868">
<label>58.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Dianat</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Haslegrave</surname>
<given-names>C.M.</given-names>
</name>
<name>
<surname>Stedmon</surname>
<given-names>A.W.</given-names>
</name>
</person-group>
<article-title>Methodology for evaluating gloves in relation to the effects on hand performance capabilities: A literature review</article-title>
<source>Ergonomics</source>
<year>2012</year>
<volume>55</volume>
<fpage>1429</fpage>
<lpage>1451</lpage>
<pub-id pub-id-type="doi">10.1080/00140139.2012.708058</pub-id>
<pub-id pub-id-type="pmid">22897425</pub-id>
</element-citation>
</ref>
<ref id="B59-sensors-15-29868">
<label>59.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Simone</surname>
<given-names>L.K.</given-names>
</name>
<name>
<surname>Elovic</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Kalambur</surname>
<given-names>U.</given-names>
</name>
<name>
<surname>Kamper</surname>
<given-names>D.</given-names>
</name>
</person-group>
<article-title>A Low Cost Method to Measure Finger Flexion in Individuals with Reduced Hand and Finger Range of Motion</article-title>
<source>Proceedings of the IEMBS 04. 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society</source>
<conf-loc>San Francisco, CA, USA</conf-loc>
<conf-date>1–5 September 2004</conf-date>
<fpage>4791</fpage>
<lpage>4794</lpage>
</element-citation>
</ref>
<ref id="B60-sensors-15-29868">
<label>60.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Kim</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Hilliges</surname>
<given-names>O.</given-names>
</name>
<name>
<surname>Izadi</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Butler</surname>
<given-names>A.D.</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Oikonomidis</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Olivier</surname>
<given-names>P.</given-names>
</name>
</person-group>
<article-title>Digits: Freehand 3D Interactions Anywhere Using a Wrist-Worn Gloveless Sensor</article-title>
<source>Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology</source>
<conf-loc>Cambridge, MA, USA</conf-loc>
<conf-date>7–10 October 2012</conf-date>
<fpage>167</fpage>
<lpage>176</lpage>
</element-citation>
</ref>
<ref id="B61-sensors-15-29868">
<label>61.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Nymoen</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Haugen</surname>
<given-names>M.R.</given-names>
</name>
<name>
<surname>Jensenius</surname>
<given-names>A.R.</given-names>
</name>
</person-group>
<article-title>MuMYO: Evaluating and Exploring the MYO Armband for Musical Interaction</article-title>
<source>Proceedings of the International Conference on New Interfaces For Musical Expression</source>
<conf-loc>Baton Rouge, LA, USA</conf-loc>
<conf-date>31 May–3 June 2015</conf-date>
<fpage>1</fpage>
<lpage>4</lpage>
</element-citation>
</ref>
<ref id="B62-sensors-15-29868">
<label>62.</label>
<element-citation publication-type="webpage">
<person-group person-group-type="author">
<collab>Thalmic Labs Inc.</collab>
</person-group>
<article-title>Myo-Touch-Free Control</article-title>
<comment>Available online:
<ext-link ext-link-type="uri" xlink:href="www.thalmic.com">www.thalmic.com</ext-link>
</comment>
<date-in-citation>(accessed on 7 December 2015)</date-in-citation>
</element-citation>
</ref>
<ref id="B63-sensors-15-29868">
<label>63.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lynch</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Aughwane</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Hammond</surname>
<given-names>T.M.</given-names>
</name>
</person-group>
<article-title>Video games and surgical ability: A literature review</article-title>
<source>J. Surg. Educ.</source>
<year>2010</year>
<volume>67</volume>
<fpage>184</fpage>
<lpage>189</lpage>
<pub-id pub-id-type="doi">10.1016/j.jsurg.2010.02.010</pub-id>
<pub-id pub-id-type="pmid">20630431</pub-id>
</element-citation>
</ref>
<ref id="B64-sensors-15-29868">
<label>64.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Levison</surname>
<given-names>W.H.</given-names>
</name>
<name>
<surname>Lancraft</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Junker</surname>
<given-names>A.</given-names>
</name>
</person-group>
<article-title>Effects of Simulator Delays on Performance and Learning in a Roll-Axis Tracking Task</article-title>
<source>Proceedings of the 15th Annual Conference on Manual Control, Wright State University</source>
<conf-loc>Dayton, OH, USA</conf-loc>
<conf-date>20–22 March 1979</conf-date>
<fpage>168</fpage>
<lpage>186</lpage>
</element-citation>
</ref>
<ref id="B65-sensors-15-29868">
<label>65.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bowersox</surname>
<given-names>J.C.</given-names>
</name>
<name>
<surname>Cordts</surname>
<given-names>P.R.</given-names>
</name>
<name>
<surname>LaPorta</surname>
<given-names>A.J.</given-names>
</name>
</person-group>
<article-title>Use of an intuitive telemanipulator system for remote trauma surgery: An experimental study</article-title>
<source>J. Am. Coll. Surg.</source>
<year>1998</year>
<volume>186</volume>
<fpage>615</fpage>
<lpage>621</lpage>
<pub-id pub-id-type="doi">10.1016/S1072-7515(98)00105-7</pub-id>
<pub-id pub-id-type="pmid">9632146</pub-id>
</element-citation>
</ref>
<ref id="B66-sensors-15-29868">
<label>66.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fabrizio</surname>
<given-names>M.D.</given-names>
</name>
<name>
<surname>Lee</surname>
<given-names>B.R.</given-names>
</name>
<name>
<surname>Chan</surname>
<given-names>D.Y.</given-names>
</name>
<name>
<surname>Stoianovici</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Jarrett</surname>
<given-names>T.W.</given-names>
</name>
<name>
<surname>Yang</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Kavoussi</surname>
<given-names>L.R.</given-names>
</name>
</person-group>
<article-title>Effect of time delay on surgical performance during telesurgical manipulation</article-title>
<source>J. Endourol.</source>
<year>2000</year>
<volume>14</volume>
<fpage>133</fpage>
<lpage>138</lpage>
<pub-id pub-id-type="doi">10.1089/end.2000.14.133</pub-id>
<pub-id pub-id-type="pmid">10772504</pub-id>
</element-citation>
</ref>
<ref id="B67-sensors-15-29868">
<label>67.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ottensmeyer</surname>
<given-names>M.P.</given-names>
</name>
<name>
<surname>Hu</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Thompson</surname>
<given-names>J.M.</given-names>
</name>
<name>
<surname>Ren</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Sheridan</surname>
<given-names>T.B.</given-names>
</name>
</person-group>
<article-title>Investigations into performance of minimally invasive telesurgery with feedback time delays</article-title>
<source>Presence Teleoper. Virtual Environ.</source>
<year>2000</year>
<volume>9</volume>
<fpage>369</fpage>
<lpage>382</lpage>
<pub-id pub-id-type="doi">10.1162/105474600566871</pub-id>
</element-citation>
</ref>
<ref id="B68-sensors-15-29868">
<label>68.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Arsenault</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Ware</surname>
<given-names>C.</given-names>
</name>
</person-group>
<article-title>Eye-Hand Co-Ordination with Force Feedback</article-title>
<source>Proceedings of the SIGCHI Conference on Human Factors in Computing Systems</source>
<conf-loc>The Hague, The Netherlands</conf-loc>
<conf-date>1–6 April 2000</conf-date>
<fpage>408</fpage>
<lpage>414</lpage>
</element-citation>
</ref>
<ref id="B69-sensors-15-29868">
<label>69.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Reiley</surname>
<given-names>C.E.</given-names>
</name>
<name>
<surname>Akinbiyi</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Burschka</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Chang</surname>
<given-names>D.C.</given-names>
</name>
<name>
<surname>Okamura</surname>
<given-names>A.M.</given-names>
</name>
<name>
<surname>Yuh</surname>
<given-names>D.D.</given-names>
</name>
</person-group>
<article-title>Effects of visual force feedback on robot-assisted surgical task performance</article-title>
<source>J. Thorac. Cardiovasc. Surg.</source>
<year>2008</year>
<volume>135</volume>
<fpage>196</fpage>
<lpage>202</lpage>
<pub-id pub-id-type="doi">10.1016/j.jtcvs.2007.08.043</pub-id>
<pub-id pub-id-type="pmid">18179942</pub-id>
</element-citation>
</ref>
<ref id="B70-sensors-15-29868">
<label>70.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lécuyer</surname>
<given-names>A.</given-names>
</name>
</person-group>
<article-title>Simulating haptic feedback using vision: A survey of research and applications of pseudo-haptic feedback</article-title>
<source>Presence Teleoper. Virtual Environ.</source>
<year>2009</year>
<volume>18</volume>
<fpage>39</fpage>
<lpage>53</lpage>
<pub-id pub-id-type="doi">10.1162/pres.18.1.39</pub-id>
</element-citation>
</ref>
<ref id="B71-sensors-15-29868">
<label>71.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Buchmann</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Violich</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Billinghurst</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Cockburn</surname>
<given-names>A.</given-names>
</name>
</person-group>
<article-title>FingARtips: Gesture Based Direct Manipulation in Augmented Reality</article-title>
<source>Proceedings of the 2nd International Conference on Computer Graphics and Interactive Techniques in Australasia and South East Asia</source>
<conf-loc>Singapore</conf-loc>
<conf-date>15–18 June 2004</conf-date>
<fpage>212</fpage>
<lpage>221</lpage>
</element-citation>
</ref>
<ref id="B72-sensors-15-29868">
<label>72.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Judkins</surname>
<given-names>T.N.</given-names>
</name>
<name>
<surname>DiMartino</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Doné</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Hallbeck</surname>
<given-names>M.S.</given-names>
</name>
<name>
<surname>Oleynikov</surname>
<given-names>D.</given-names>
</name>
</person-group>
<article-title>Effect of handle design and target location on wrist posture during aiming with a laparoscopic tool</article-title>
<source>Proc. Hum. Factors Ergon. Soc. Annu. Meet.</source>
<year>2004</year>
<volume>48</volume>
<fpage>1464</fpage>
<lpage>1468</lpage>
<pub-id pub-id-type="doi">10.1177/154193120404801242</pub-id>
</element-citation>
</ref>
<ref id="B73-sensors-15-29868">
<label>73.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Saunders</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Knill</surname>
<given-names>D.</given-names>
</name>
</person-group>
<article-title>Humans use continuous visual feedback from the hand to control fast reaching movements</article-title>
<source>Exp. Brain Res.</source>
<year>2003</year>
<volume>152</volume>
<fpage>341</fpage>
<lpage>352</lpage>
<pub-id pub-id-type="doi">10.1007/s00221-003-1525-2</pub-id>
<pub-id pub-id-type="pmid">12904935</pub-id>
</element-citation>
</ref>
</ref-list>
</back>
</pmc>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Pmc/Curation
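# Extract record 000390 from the curation step's biblio.hfd, re-indent the XML with SxmlIndent, and page through it with more.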
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000390 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd -nk 000390 | SxmlIndent | more
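
This alternative form assumes that $EXPLOR_AREA is already set in the environment. A minimal sketch of that definition, inferred from the EXPLOR_STEP path above (hypothetical; adjust to the local installation):

# Assumption: the area root is the parent of the Data/Pmc/Curation step shown above.
EXPLOR_AREA=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1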

To put a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Pmc
   |étape=   Curation
   |type=    RBID
   |clé=     PMC:4721788
   |texte=   Robust Hand Motion Tracking through Data Fusion of 5DT Data Glove and Nimble VR Kinect Camera Measurements
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Curation/RBID.i   -Sk "pubmed:26694395" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 
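
This pipeline looks up the record by its PubMed identifier (pubmed:26694395) in the RBID index, fetches the corresponding entry from biblio.hfd, and converts it into Wicri wiki pages for the HapticV1 area.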

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024