Exploration server on haptic devices


An Analysis of the Precision and Reliability of the Leap Motion Sensor and Its Suitability for Static and Dynamic Tracking

Internal identifier: 002505 (Pmc/Curation)

Authors: Jože Guna; Grega Jakus; Matevž Pogačnik; Sašo Tomažič; Jaka Sodnik

Source :

RBID : PMC:3958287

Abstract

We present the results of an evaluation of the performance of the Leap Motion Controller with the aid of a professional, high-precision, fast motion tracking system. A set of static and dynamic measurements was performed with different numbers of tracking objects and configurations. For the static measurements, a plastic arm model simulating a human arm was used. A set of 37 reference locations was selected to cover the controller's sensory space. For the dynamic measurements, a special V-shaped tool, consisting of two tracking objects maintaining a constant distance between them, was created to simulate two human fingers. In the static scenario, the standard deviation was less than 0.5 mm. The linear correlation revealed a significant increase in the standard deviation when moving away from the controller. The results of the dynamic scenario revealed the inconsistent performance of the controller, with a significant drop in accuracy for samples taken more than 250 mm above the controller's surface. The Leap Motion Controller undoubtedly represents a revolutionary input device for gesture-based human-computer interaction; however, due to its rather limited sensory space and inconsistent sampling frequency, in its current configuration it cannot be used as a professional tracking system.


Url:
DOI: 10.3390/s140203702
PubMed: 24566635
PubMed Central: 3958287


The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">An Analysis of the Precision and Reliability of the Leap Motion Sensor and Its Suitability for Static and Dynamic Tracking</title>
<author>
<name sortKey="Guna, Joze" sort="Guna, Joze" uniqKey="Guna J" first="Jože" last="Guna">Jože Guna</name>
</author>
<author>
<name sortKey="Jakus, Grega" sort="Jakus, Grega" uniqKey="Jakus G" first="Grega" last="Jakus">Grega Jakus</name>
</author>
<author>
<name sortKey="Poga Nik, Matevz" sort="Poga Nik, Matevz" uniqKey="Poga Nik M" first="Matevž" last="Poga Nik">Matevž Poga Nik</name>
</author>
<author>
<name sortKey="Tomazi, Saso" sort="Tomazi, Saso" uniqKey="Tomazi S" first="Sašo" last="Tomaži">Sašo Tomaži</name>
</author>
<author>
<name sortKey="Sodnik, Jaka" sort="Sodnik, Jaka" uniqKey="Sodnik J" first="Jaka" last="Sodnik">Jaka Sodnik</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">24566635</idno>
<idno type="pmc">3958287</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3958287</idno>
<idno type="RBID">PMC:3958287</idno>
<idno type="doi">10.3390/s140203702</idno>
<date when="2014">2014</date>
<idno type="wicri:Area/Pmc/Corpus">002505</idno>
<idno type="wicri:Area/Pmc/Curation">002505</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">An Analysis of the Precision and Reliability of the Leap Motion Sensor and Its Suitability for Static and Dynamic Tracking</title>
<author>
<name sortKey="Guna, Joze" sort="Guna, Joze" uniqKey="Guna J" first="Jože" last="Guna">Jože Guna</name>
</author>
<author>
<name sortKey="Jakus, Grega" sort="Jakus, Grega" uniqKey="Jakus G" first="Grega" last="Jakus">Grega Jakus</name>
</author>
<author>
<name sortKey="Poga Nik, Matevz" sort="Poga Nik, Matevz" uniqKey="Poga Nik M" first="Matevž" last="Poga Nik">Matevž Poga Nik</name>
</author>
<author>
<name sortKey="Tomazi, Saso" sort="Tomazi, Saso" uniqKey="Tomazi S" first="Sašo" last="Tomaži">Sašo Tomaži</name>
</author>
<author>
<name sortKey="Sodnik, Jaka" sort="Sodnik, Jaka" uniqKey="Sodnik J" first="Jaka" last="Sodnik">Jaka Sodnik</name>
</author>
</analytic>
<series>
<title level="j">Sensors (Basel, Switzerland)</title>
<idno type="eISSN">1424-8220</idno>
<imprint>
<date when="2014">2014</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>We present the results of an evaluation of the performance of the Leap Motion Controller with the aid of a professional, high-precision, fast motion tracking system. A set of static and dynamic measurements was performed with different numbers of tracking objects and configurations. For the static measurements, a plastic arm model simulating a human arm was used. A set of 37 reference locations was selected to cover the controller's sensory space. For the dynamic measurements, a special V-shaped tool, consisting of two tracking objects maintaining a constant distance between them, was created to simulate two human fingers. In the static scenario, the standard deviation was less than 0.5 mm. The linear correlation revealed a significant increase in the standard deviation when moving away from the controller. The results of the dynamic scenario revealed the inconsistent performance of the controller, with a significant drop in accuracy for samples taken more than 250 mm above the controller's surface. The Leap Motion Controller undoubtedly represents a revolutionary input device for gesture-based human-computer interaction; however, due to its rather limited sensory space and inconsistent sampling frequency, in its current configuration it cannot be used as a professional tracking system.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Oviatt, S" uniqKey="Oviatt S">S. Oviatt</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Matthew, T" uniqKey="Matthew T">T. Matthew</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bhuiyan, M" uniqKey="Bhuiyan M">M. Bhuiyan</name>
</author>
<author>
<name sortKey="Picking, R" uniqKey="Picking R">R. Picking</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wingrave, C A" uniqKey="Wingrave C">C.A. Wingrave</name>
</author>
<author>
<name sortKey="Williamson, B" uniqKey="Williamson B">B. Williamson</name>
</author>
<author>
<name sortKey="Varcholik, P D" uniqKey="Varcholik P">P.D. Varcholik</name>
</author>
<author>
<name sortKey="Rose, J" uniqKey="Rose J">J. Rose</name>
</author>
<author>
<name sortKey="Miller, A" uniqKey="Miller A">A. Miller</name>
</author>
<author>
<name sortKey="Charbonneau, E" uniqKey="Charbonneau E">E. Charbonneau</name>
</author>
<author>
<name sortKey="Bott, J" uniqKey="Bott J">J. Bott</name>
</author>
<author>
<name sortKey="Laviola, J J" uniqKey="Laviola J">J.J. LaViola</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Zhang, Z" uniqKey="Zhang Z">Z. Zhang</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hodson, H" uniqKey="Hodson H">H. Hodson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Aziz, A A" uniqKey="Aziz A">A.A. Aziz</name>
</author>
<author>
<name sortKey="Wan, K" uniqKey="Wan K">K. Wan</name>
</author>
<author>
<name sortKey="Zaaba, S K" uniqKey="Zaaba S">S.K. Zaaba</name>
</author>
<author>
<name sortKey="Shahriman, A B" uniqKey="Shahriman A">A.B Shahriman</name>
</author>
<author>
<name sortKey="Adnan, N H" uniqKey="Adnan N">N.H. Adnan</name>
</author>
<author>
<name sortKey="Nor, R M" uniqKey="Nor R">R.M. Nor</name>
</author>
<author>
<name sortKey="Ayob, M N" uniqKey="Ayob M">M.N. Ayob</name>
</author>
<author>
<name sortKey="Ismail, A H" uniqKey="Ismail A">A.H. Ismail</name>
</author>
<author>
<name sortKey="Ramly, M F" uniqKey="Ramly M">M.F. Ramly</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Khoshelham, K" uniqKey="Khoshelham K">K. Khoshelham</name>
</author>
<author>
<name sortKey="Elberink, S O" uniqKey="Elberink S">S.O. Elberink</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Teather, R J" uniqKey="Teather R">R.J. Teather</name>
</author>
<author>
<name sortKey="Pavlovych, A" uniqKey="Pavlovych A">A. Pavlovych</name>
</author>
<author>
<name sortKey="Stuerzlinger, W" uniqKey="Stuerzlinger W">W. Stuerzlinger</name>
</author>
<author>
<name sortKey="Mackenzie, I S" uniqKey="Mackenzie I">I.S. MacKenzie</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hernoux, F" uniqKey="Hernoux F">F. Hernoux</name>
</author>
<author>
<name sortKey="Bearee, R" uniqKey="Bearee R">R. Béarée</name>
</author>
<author>
<name sortKey="Gajny, L" uniqKey="Gajny L">L. Gajny</name>
</author>
<author>
<name sortKey="Nyiri, E" uniqKey="Nyiri E">E. Nyiri</name>
</author>
<author>
<name sortKey="Bancalin, J" uniqKey="Bancalin J">J. Bancalin</name>
</author>
<author>
<name sortKey="Gibaru, O" uniqKey="Gibaru O">O. Gibaru</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Vikram, S" uniqKey="Vikram S">S. Vikram</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Regenbrecht, H" uniqKey="Regenbrecht H">H. Regenbrecht</name>
</author>
<author>
<name sortKey="Collins, J" uniqKey="Collins J">J. Collins</name>
</author>
<author>
<name sortKey="Hoermann, S" uniqKey="Hoermann S">S. Hoermann</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Weichert, F" uniqKey="Weichert F">F. Weichert</name>
</author>
<author>
<name sortKey="Bachmann, D" uniqKey="Bachmann D">D. Bachmann</name>
</author>
<author>
<name sortKey="Rudak, B" uniqKey="Rudak B">B. Rudak</name>
</author>
<author>
<name sortKey="Fisseler, D" uniqKey="Fisseler D">D. Fisseler</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">Sensors (Basel)</journal-id>
<journal-id journal-id-type="iso-abbrev">Sensors (Basel)</journal-id>
<journal-title-group>
<journal-title>Sensors (Basel, Switzerland)</journal-title>
</journal-title-group>
<issn pub-type="epub">1424-8220</issn>
<publisher>
<publisher-name>Molecular Diversity Preservation International (MDPI)</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">24566635</article-id>
<article-id pub-id-type="pmc">3958287</article-id>
<article-id pub-id-type="doi">10.3390/s140203702</article-id>
<article-id pub-id-type="publisher-id">sensors-14-03702</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>An Analysis of the Precision and Reliability of the Leap Motion Sensor and Its Suitability for Static and Dynamic Tracking</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Guna</surname>
<given-names>Jože</given-names>
</name>
<xref rid="c1-sensors-14-03702" ref-type="corresp">
<sup>*</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Jakus</surname>
<given-names>Grega</given-names>
</name>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Pogačnik</surname>
<given-names>Matevž</given-names>
</name>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Tomažič</surname>
<given-names>Sašo</given-names>
</name>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Sodnik</surname>
<given-names>Jaka</given-names>
</name>
</contrib>
<aff id="af1-sensors-14-03702">Faculty of Electrical Engineering, University of Ljubljana, Tržaška 25, Ljubljana 1000, Slovenia; E-Mails:
<email>grega.jakus@fe.uni-lj.si</email>
(G.J.);
<email>matevz.pogacnik@fe.uni-lj.si</email>
(M.P.);
<email>saso.tomazic@fe.uni-lj.si</email>
(S.T.);
<email>jaka.sodnik@fe.uni-lj.si</email>
(J.S.)</aff>
</contrib-group>
<author-notes>
<fn id="fn1-sensors-14-03702" fn-type="con">
<p>
<bold>Author Contributions:</bold>
The work presented in this paper was carried out in collaboration among all authors. Guna and Pogačnik designed and implemented the experimental setup and developed custom tracking software for the Leap Motion Controller. Jakus, Tomažič and Sodnik were responsible for the data post-processing and the corresponding statistical analysis. All contributions were made by the authors.</p>
</fn>
<corresp id="c1-sensors-14-03702">
<label>*</label>
Author to whom correspondence should be addressed; E-Mail:
<email>joze.guna@fe.uni-lj.si</email>
; Tel.: +386-147-681-16; Fax: +386-147-687-32.</corresp>
</author-notes>
<pub-date pub-type="epub">
<day>21</day>
<month>2</month>
<year>2014</year>
</pub-date>
<pub-date pub-type="collection">
<month>2</month>
<year>2014</year>
</pub-date>
<volume>14</volume>
<issue>2</issue>
<fpage>3702</fpage>
<lpage>3720</lpage>
<history>
<date date-type="received">
<day>13</day>
<month>12</month>
<year>2013</year>
</date>
<date date-type="rev-recd">
<day>30</day>
<month>1</month>
<year>2014</year>
</date>
<date date-type="accepted">
<day>12</day>
<month>2</month>
<year>2014</year>
</date>
</history>
<permissions>
<copyright-statement>© 2014 by the authors; licensee MDPI, Basel, Switzerland.</copyright-statement>
<copyright-year>2014</copyright-year>
<license>
<license-p>This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by/3.0/">http://creativecommons.org/licenses/by/3.0/</ext-link>
).</license-p>
</license>
</permissions>
<abstract>
<p>We present the results of an evaluation of the performance of the Leap Motion Controller with the aid of a professional, high-precision, fast motion tracking system. A set of static and dynamic measurements was performed with different numbers of tracking objects and configurations. For the static measurements, a plastic arm model simulating a human arm was used. A set of 37 reference locations was selected to cover the controller's sensory space. For the dynamic measurements, a special V-shaped tool, consisting of two tracking objects maintaining a constant distance between them, was created to simulate two human fingers. In the static scenario, the standard deviation was less than 0.5 mm. The linear correlation revealed a significant increase in the standard deviation when moving away from the controller. The results of the dynamic scenario revealed the inconsistent performance of the controller, with a significant drop in accuracy for samples taken more than 250 mm above the controller's surface. The Leap Motion Controller undoubtedly represents a revolutionary input device for gesture-based human-computer interaction; however, due to its rather limited sensory space and inconsistent sampling frequency, in its current configuration it cannot be used as a professional tracking system.</p>
</abstract>
<kwd-group>
<kwd>Leap Motion Controller</kwd>
<kwd>motion capture system</kwd>
<kwd>precision measurement</kwd>
<kwd>spatial distortion measurement</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec>
<label>1.</label>
<title>Introduction</title>
<p>The user interface and the corresponding interaction modalities play an essential role in the human-computer relationship. Advanced multimodal interfaces present yet another step in this equation, providing users with the freedom and flexibility to choose the best input modality for specific tasks. Users generally prefer multimodal interaction when it is available and intuitive to use [
<xref rid="b1-sensors-14-03702" ref-type="bibr">1</xref>
,
<xref rid="b2-sensors-14-03702" ref-type="bibr">2</xref>
].</p>
<p>Gesture-based user interfaces, in combination with the latest technical advances that incorporate accurate yet affordable new types of input devices, provide realistic new opportunities for specific application areas (e.g., entertainment, learning, health, engineering), especially for users who are uncomfortable with more commonly used input devices and/or technology [
<xref rid="b3-sensors-14-03702" ref-type="bibr">3</xref>
].</p>
<p>Gesture input devices and sensors are of special interest. Gesture acquisition methods can, in general, be divided into methods that rely on a specific device the user must physically hold or wear on his/her body, and hands-/body-free methods. The latter are becoming increasingly popular, as the user becomes the controller rather than an operator. One of the first widespread, accurate, and commercially viable solutions was the Nintendo WiiMote controller, bundled with the Wii console and released in 2006. The WiiMote, besides its vocal and haptic modalities, incorporates an accelerometer that allows the acquisition of full 3D gestures. It can operate as a separate device and has been successfully used for many atypical applications [
<xref rid="b4-sensors-14-03702" ref-type="bibr">4</xref>
]. Another important milestone is the Microsoft Kinect sensor, an add-on for the Xbox 360 console, which was released in late 2010. The Kinect, among its visual and auditory inputs, includes a depth-sensing camera. In combination with an open SDK, it can be used to acquire and recognize full body gestures for multiple users at a time [
<xref rid="b5-sensors-14-03702" ref-type="bibr">5</xref>
]. The latest technological breakthrough in gesture-sensing devices has come in the form of the Leap Motion Controller (Leap Motion, San Francisco, CA, USA) [
<xref rid="b6-sensors-14-03702" ref-type="bibr">6</xref>
]. The controller, approximately the size of a box of matches, allows for the precise and fluid tracking of multiple hands, fingers, and small objects in free space with sub-millimeter accuracy. According to [
<xref rid="b7-sensors-14-03702" ref-type="bibr">7</xref>
], the Leap Motion Controller represents a major leap in input technology that could, with its enhanced interaction possibilities, trigger a new generation of far more useful 3D displays and possibly surpass the mouse as a primary input device.</p>
<p>The main goal of the research presented in this paper was to analyze the precision and reliability of the Leap Motion Controller in static and dynamic conditions and to determine its suitability as an economically attractive finger/hand and object tracking sensor. The evaluation was performed with the aid of a high-speed and highly accurate optical motion capture system. To the best of the authors' knowledge, no study has yet been conducted with the Leap Motion Controller in combination with an optical motion capture system.</p>
<p>The main contributions of this paper are analyses of the following:
<list list-type="bullet">
<list-item>
<p>Precision and reliability (spatial dispersion of measurements through time) of the controller</p>
</list-item>
<list-item>
<p>The spatial distortion of accuracy (variation of accuracy in different regions of sensory space)</p>
</list-item>
<list-item>
<p>Sampling frequency and its consistency.</p>
</list-item>
</list>
</p>
<p>The rest of the paper is organized as follows: previous related work is presented in Section 2. The experimental environment, measurement methodology, and measurement scenarios used in this study are described in Section 3. The detailed results are analyzed and discussed in Section 4. Finally, key conclusions are drawn and recommendations are offered in Section 5.</p>
</sec>
<sec>
<label>2.</label>
<title>Related Work</title>
<p>To explain the choice of a motion capture system for use in combination with the Leap Motion Controller, the results of the research in [
<xref rid="b8-sensors-14-03702" ref-type="bibr">8</xref>
] are discussed. The authors of that study focused on an adaptive gesture recognition system while developing a gesture database to eliminate the individual factors that affect the efficiency of the recognition system. In particular, hand gestures were investigated. To acquire input data for the experiment, a Qualisys™ Motion Capture System [
<xref rid="b9-sensors-14-03702" ref-type="bibr">9</xref>
] was used, similar to the one in our own setup.</p>
<p>The Microsoft Kinect sensor was developed with hand/arm and full-body gesture recognition in mind. The authors of [
<xref rid="b10-sensors-14-03702" ref-type="bibr">10</xref>
] provide a detailed analysis of the accuracy and resolution of the Kinect sensor's depth data for indoor mapping applications. The experimental results show that the random error in depth measurement increases with increasing distance to the sensor, ranging from a few millimeters to approximately 4 cm at the maximum range of the sensor. The quality of the data is also found to be influenced by the low resolution of the depth measurements. The obtained accuracy is, in general, sufficient for detecting arm and full body gestures, but is not sufficient for precise finger gestures such as handwriting. The input device latency and spatial jitter are also important factors [
<xref rid="b11-sensors-14-03702" ref-type="bibr">11</xref>
].</p>
<p>The Leap Motion Controller represents a milestone in consumer finger/object and gesture tracking input technology. The device itself was made publicly available in the summer of 2013, and therefore little scientific work on it has been published yet. In [
<xref rid="b12-sensors-14-03702" ref-type="bibr">12</xref>
], the authors describe an application of the Leap Motion Controller for the direct manipulation of an industrial robot arm with six degrees of freedom. The Leap Motion Controller is used for finger position tracking. To increase the tracking precision, an interpolation of the acquired data is performed using polynomial splines. The aim of the research was to reproduce complex tasks in 3D without constraints on the operator. This goal reflects the importance of gesture-based interfaces that utilize low-cost, consumer-grade input sensor devices for industrial use.</p>
<p>Another study of the Leap Motion Controller in [
<xref rid="b13-sensors-14-03702" ref-type="bibr">13</xref>
] shows its potential in gesture and handwriting recognition applications. The acquired input data are treated as a time series of 3D positions and processed using the Dynamic Time Warping algorithm. The authors report promising recognition accuracy and performance results.</p>
<p>In [
<xref rid="b14-sensors-14-03702" ref-type="bibr">14</xref>
], a novel interface approach that combines 2D video-based augmented reality with a partial voxel model to allow more convincing interactions with 3D objects and worlds is presented. The interface enables users to interact with a virtual environment through a hand-controlled interface and allows for correct mutual occlusions between the interacting fingers and the virtual environment. A combination of the Leap Motion Controller and a webcam is used to track the users' fingers and overlay the appropriate video for an augmented view.</p>
<p>Finally, in [
<xref rid="b15-sensors-14-03702" ref-type="bibr">15</xref>
], the authors present a study of the accuracy and robustness of the Leap Motion Controller. An industrial robot with a reference pen allowing suitable position accuracy was used for the experiment. The results show a deviation between the desired 3D position and the average measured positions below 0.2 mm for static setups and of 1.2 mm for dynamic setups.</p>
</sec>
<sec>
<label>3.</label>
<title>Experimental Design</title>
<p>The controller's performance was evaluated through two types of measurements. In the first measurement, a series of fixed static points in space was tracked and recorded over an extended period of time to evaluate the consistency and dispersion of the results. The coordinates of the points were systematically chosen to cover the majority of the controller's sensory space. In the second measurement, a constant distance was maintained between two objects, which were then moved freely around the sensory space. The tracking accuracy of the controller was then evaluated based on the distortion of the distance between the two objects. The reference system (a professional optical motion capture system) was used to determine the exact spatial positions of the tracked objects and the distances between them.</p>
<sec>
<label>3.1.</label>
<title>The Leap Motion Controller</title>
<p>The Leap Motion Controller uses infrared (IR) imaging to determine the position of predefined objects in a limited space in real time. Technically, very few details are known about the precise nature of the algorithms used due to patent and trade secret restrictions. However, from inspection of the controller, it is clear that three separate IR LED emitters are used in conjunction with two IR cameras. Therefore, the controller can be categorized as an optical tracking system based on the stereo vision principle. According to the official information [
<xref rid="b6-sensors-14-03702" ref-type="bibr">6</xref>
], the Leap software analyzes the objects observed in the device's field of view. It recognizes hands, fingers, and tools, reporting discrete positions, gestures, and motion. The controller's field of view is an inverted pyramid centered on the device. The effective range of the controller extends from approximately 25 to 600 millimeters above the device (1 inch to 2 feet). The controller itself is accessed and programmed through Application Programming Interfaces (APIs), with support for a variety of programming languages, ranging from C++ to Python. The positions of the recognized objects are acquired through these APIs. The Cartesian and spherical coordinate systems used to describe positions in the controller's sensory space are shown in
<xref rid="f1-sensors-14-03702" ref-type="fig">Figure 1</xref>
. However, it should be noted that the sampling frequency is not stable, cannot be set, and varies significantly.</p>
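<p>As an illustration of how position data can be obtained through these APIs, the following minimal Python sketch polls the controller for fingertip positions and timestamps. It assumes the Leap SDK v1/v2 Python bindings (the Leap module with Controller, frame(), fingers and tip_position) that were current at the time of the study; the authors' own scripts are not published, so all names here are illustrative. Positions are reported in millimeters in the Cartesian frame of Figure 1, and successive frame timestamps expose the variable sampling frequency noted above.</p>
<preformat>
import time
import Leap  # Python bindings shipped with the Leap Motion SDK (v1/v2), assumed here

controller = Leap.Controller()
time.sleep(1.0)  # allow the background tracking service to connect

samples = []
for _ in range(1000):
    frame = controller.frame()  # most recent tracking frame
    if frame.is_valid and not frame.fingers.is_empty:
        tip = frame.fingers[0].tip_position  # Leap.Vector, coordinates in mm
        samples.append((frame.timestamp, tip.x, tip.y, tip.z))
    time.sleep(0.002)

# successive timestamp differences (in microseconds) reveal the variable
# sampling period of the controller
periods = [b[0] - a[0] for a, b in zip(samples, samples[1:])]
print(len(samples), sum(periods) / max(len(periods), 1))
</preformat>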
</sec>
<sec>
<label>3.2.</label>
<title>The Reference System</title>
<p>A high-precision optical tracking system [
<xref rid="b9-sensors-14-03702" ref-type="bibr">9</xref>
] consisting of eight Oqus 3+ high-speed cameras and Qualisys Track Manager software (version 2.8—build 1065) was used as the reference system (Qualisys Inc., Gothenburg, Sweden). Such systems are widely used for the fast and precise tracking of various objects in industrial applications, biomechanics, and media and entertainment applications. The tracking precision depends on the number of cameras used, their spatial layout, the calibration process, and the lighting conditions. In our case, only three markers were used, one for static measurement and two for dynamic measurement. In the dynamic measurement, a simple Automatic Identification of Markers (AIM) model was created from the two selected markers and their connecting bone. All markers were seen by all cameras at all times. The standard deviation of the noise for the static marker was measured for each individual coordinate: std
<italic>
<sub>x</sub>
</italic>
= 0.018 mm, std
<italic>
<sub>y</sub>
</italic>
= 0.016 mm and std
<italic>
<sub>z</sub>
</italic>
= 0.029 mm.</p>
</sec>
<sec>
<label>3.3.</label>
<title>Technical Setup</title>
<p>The Leap Motion controller was placed on a table with a 60 × 60 cm top and a height of 73 cm. The controller was firmly attached to the table, ensuring no undesired movement of the device. The controller transmitted data on the identified objects to a desktop computer (Intel
<sup>®</sup>
Core™ i7-2600 CPU 3.40 GHz with 8 GB of RAM). A set of scripts was written in the Python programming language using the Leap Motion APIs specifically for this study. The scripts were used for real-time data acquisition and logging. The operation of the controller was monitored in real time using the Leap Motion Visualizer software.</p>
<p>The optical reference system provided a calibrated measurement volume of approximately 1 × 1 × 1 m in size, with a resolution of 1.3 million pixels and a constant frame rate of 500 frames per second. The cameras were set up uniformly, encircling the Leap Motion controller so that each camera's point of view was directed towards the controller. A set of hard passive markers with diameters of 12.5 mm was used in the measurements. The coordinate systems of the reference system and the controller were aligned at the origin of the controller's coordinate system.</p>
<p>Two types of measurements were performed within the experiment, under two experimental conditions:
<list list-type="bullet">
<list-item>
<p>Static conditions: acquisition of a limited number of static points in space</p>
</list-item>
<list-item>
<p>Dynamic conditions: tracking of moving objects with constant inter-object distance within the calibrated space</p>
</list-item>
</list>
</p>
<p>Our pre-experiment trials indicated the controller's inability to track static objects that do not resemble the human hand. We can only speculate that this limitation is due to the controller's internal algorithms, as they are protected by patents and therefore not publicly disclosed. A pointed object, such as a pen tip (used for tracking in [
<xref rid="b15-sensors-14-03702" ref-type="bibr">15</xref>
]), was successfully tracked only if constantly in motion. When it was stationary and mounted on a stand, it was successfully tracked for only approximately 8–10 s. After this period of time, the measurement was automatically stopped by the controller. Therefore, a plastic arm model was used (
<xref rid="f2-sensors-14-03702" ref-type="fig">Figure 2</xref>
) instead of a simpler object.</p>
<p>During the measurement of static locations, the arm model was firmly attached using a stand (
<xref rid="f3-sensors-14-03702" ref-type="fig">Figure 3</xref>
) and directed perpendicular to the
<italic>z</italic>
= 0 plane in the opposite direction from the
<italic>z</italic>
axis. Additionally, a reflective marker was attached to the index fingertip of the plastic arm for simultaneous tracking by the controller and by the reference motion capture system. The stability of the stand was measured using the reference system, which indicated the dispersion of the measured index fingertip location to be below 20 μm.</p>
<p>For dynamic measurements, the tracking objects were moved around the sensory space with an approximately constant speed of 100 mm/s. Instead of the plastic arm, a special tool was used to mimic two human fingers. It consisted of two wooden sticks with markers fixed together to form a V-shape (hereafter: “the V-tool”) (
<xref rid="f4-sensors-14-03702" ref-type="fig">Figure 4</xref>
). This tool provided a constant distance between the two tracked objects, which was used to evaluate the tracking performance. It was perfectly tracked by the controller and the reference system simultaneously. The exact distance was acquired using the reference system (
<italic>d</italic>
= 21.36 mm, std
<italic>
<sub>d</sub>
</italic>
= 0.023 mm). The arm model with five fingers proved to be very impractical for this type of measurement, as the controller usually tracked the five fingers as five individual points that could not be identified separately. It was therefore almost impossible to isolate the results for two selected fingers and calculate the distance between them.</p>
</sec>
<sec>
<label>3.4.</label>
<title>The Methodology</title>
<p>All measurements were conducted in an environment with a constant temperature of 22 °C and an illumination intensity of approximately 500 lux, a common legal requirement for workplaces. The sampling frequency of the reference system was set to 500 Hz.</p>
<sec>
<label>3.4.1.</label>
<title>Static Measurements</title>
<p>The 37 reference locations where static measurements were performed are shown in
<xref rid="f5-sensors-14-03702" ref-type="fig">Figure 5</xref>
. The locations of the reference points were selected systematically to cover the majority of the sensory space of the controller. The number at each location in the figure indicates the height (the position on the
<italic>y</italic>
axis in cm) at which the individual measurements were taken. The actual measurements were taken close to the reference points in the measurement grid, with an offset of less than 5 mm. At least 4,000 samples were measured at each location. A total of 214,546 samples were obtained for the entire sensory space.</p>
<p>It was initially planned to take measurements along a 3-dimensional grid with 5 cm spacing between the measured locations. However, the pre-experiment trials revealed that it is difficult to obtain stable tracking of static objects at some locations, especially locations in front of the controller (
<italic>z</italic>
> 0). The measurement grid was therefore modified to include only locations at which the controller was able to provide stable tracking over a longer time period.</p>
<p>The analysis of the collected samples was primarily focused on evaluating the dispersion (the temporal distribution) of the recorded positions at each reference location, which characterizes the repeatability of the measurements, i.e., the ability to reproduce the same location in a series of sequential measurements in the controller's sensory space.</p>
<p>For the purpose of the analysis and the presentation of its results, the following mathematical operations and notions are used.</p>
<p>The measured positions are denoted by a set
<italic>p</italic>
[
<italic>i,j</italic>
]=(
<italic>p
<sub>x</sub>
</italic>
[
<italic>i,j</italic>
],
<italic>p
<sub>y</sub>
</italic>
[
<italic>i,j</italic>
],
<italic>p
<sub>z</sub>
</italic>
[
<italic>i,j</italic>
]) ∈
<italic>R</italic>
<sup>3</sup>
, where the components
<italic>p
<sub>x</sub>
</italic>
[
<italic>i,j</italic>
],
<italic>p
<sub>y</sub>
</italic>
[
<italic>i,j</italic>
] and
<italic>p
<sub>z</sub>
</italic>
[
<italic>i,j</italic>
] represent the coordinates in the Cartesian coordinate system of the
<italic>j</italic>
-th sample (1 ≤
<italic>j</italic>
≤
<italic>N
<sub>i</sub>
</italic>
,
<italic>j</italic>
∈
<italic>N</italic>
) taken at the
<italic>i</italic>
-th position (1 ≤
<italic>i</italic>
≤ 37,
<italic>i</italic>
∈
<italic>N</italic>
), and
<italic>N
<sub>i</sub>
</italic>
stands for the total number of samples taken at the
<italic>i</italic>
-th position.</p>
<p>The standard deviation of the
<italic>i</italic>
-th three-dimensional spatial position is calculated by:
<disp-formula id="FD1">
<label>(1)</label>
<mml:math id="mm1">
<mml:mrow>
<mml:mtext mathvariant="italic">std</mml:mtext>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>=</mml:mo>
<mml:msqrt>
<mml:mrow>
<mml:mfrac>
<mml:mn>1</mml:mn>
<mml:mrow>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>−</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:mfrac>
<mml:msubsup>
<mml:mo>∑</mml:mo>
<mml:mrow>
<mml:mi>j</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msubsup>
<mml:mrow>
<mml:msup>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>E</mml:mi>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>−</mml:mo>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>E</mml:mi>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ι</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
<mml:mo>,</mml:mo>
</mml:mrow>
</mml:msqrt>
<mml:mtext>where</mml:mtext>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="FD2">
<label>(2)</label>
<mml:math id="mm2">
<mml:mrow>
<mml:mi>E</mml:mi>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>=</mml:mo>
<mml:msqrt>
<mml:mrow>
<mml:msup>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>x</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>−</mml:mo>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>x</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ι</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
<mml:mo>+</mml:mo>
<mml:msup>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>y</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>−</mml:mo>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>y</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ι</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
<mml:mo>+</mml:mo>
<mml:msup>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>z</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>−</mml:mo>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>z</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ι</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:msqrt>
</mml:mrow>
</mml:math>
</disp-formula>
where
<inline-formula>
<mml:math id="mm3">
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>x</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ι</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
</mml:mrow>
</mml:math>
</inline-formula>
,
<inline-formula>
<mml:math id="mm4">
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>y</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ι</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
</mml:mrow>
</mml:math>
</inline-formula>
and
<inline-formula>
<mml:math id="mm5">
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>z</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ι</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
</mml:mrow>
</mml:math>
</inline-formula>
represent the average coordinates calculated by the arithmetic mean over
<italic>N
<sub>i</sub>
</italic>
samples.</p>
<p>
<inline-formula>
<mml:math id="mm6">
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>E</mml:mi>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ι</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
</mml:mrow>
</mml:math>
</inline-formula>
stands for the arithmetic mean of
<italic>E</italic>
[
<italic>i,j</italic>
] over
<italic>N
<sub>i</sub>
</italic>
samples. The standard deviations for individual coordinates (
<italic>std
<sub>x</sub>
</italic>
[
<italic>i</italic>
],
<italic>std
<sub>y</sub>
</italic>
[
<italic>i</italic>
], and
<italic>std
<sub>z</sub>
</italic>
[
<italic>i</italic>
]) were calculated by:
<disp-formula id="FD3">
<label>(3)</label>
<mml:math id="mm7">
<mml:mrow>
<mml:msub>
<mml:mtext mathvariant="italic">std</mml:mtext>
<mml:mi>x</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>=</mml:mo>
<mml:msqrt>
<mml:mrow>
<mml:mfrac>
<mml:mn>1</mml:mn>
<mml:mrow>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>−</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:mfrac>
<mml:munderover>
<mml:mo>∑</mml:mo>
<mml:mrow>
<mml:mi>j</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:munderover>
<mml:mrow>
<mml:msup>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>x</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>−</mml:mo>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>x</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ι</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mrow>
</mml:msqrt>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="FD4">
<label>(4)</label>
<mml:math id="mm8">
<mml:mrow>
<mml:msub>
<mml:mtext mathvariant="italic">std</mml:mtext>
<mml:mi>y</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>=</mml:mo>
<mml:msqrt>
<mml:mrow>
<mml:mfrac>
<mml:mn>1</mml:mn>
<mml:mrow>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>−</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:mfrac>
<mml:munderover>
<mml:mo>∑</mml:mo>
<mml:mrow>
<mml:mi>j</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:munderover>
<mml:mrow>
<mml:msup>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>y</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>−</mml:mo>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>y</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ι</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mrow>
</mml:msqrt>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="FD5">
<label>(5)</label>
<mml:math id="mm9">
<mml:mrow>
<mml:msub>
<mml:mtext mathvariant="italic">std</mml:mtext>
<mml:mi>z</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>=</mml:mo>
<mml:msqrt>
<mml:mrow>
<mml:mfrac>
<mml:mn>1</mml:mn>
<mml:mrow>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>−</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:mfrac>
<mml:munderover>
<mml:mo>∑</mml:mo>
<mml:mrow>
<mml:mi>j</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:munderover>
<mml:mrow>
<mml:msup>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>z</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>−</mml:mo>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>z</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ι</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mrow>
</mml:msqrt>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
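<p>As an illustration, Equations (1)-(5) can be computed directly with NumPy. The sketch below is not the authors' code, but a straightforward implementation of the definitions above for the samples recorded at a single reference location.</p>
<preformat>
import numpy as np

def static_dispersion(samples):
    # samples: (N_i x 3) array of positions p[i, j] measured at location i (mm)
    samples = np.asarray(samples, dtype=float)
    mean_pos = samples.mean(axis=0)          # mean position (px, py, pz)
    # per-axis standard deviations, Equations (3)-(5); ddof=1 gives the
    # 1/(N_i - 1) normalisation used above
    std_xyz = samples.std(axis=0, ddof=1)
    # Euclidean deviation of each sample from the mean position, Equation (2)
    e = np.linalg.norm(samples - mean_pos, axis=1)
    # standard deviation of the 3D spatial position, Equation (1)
    std_3d = e.std(ddof=1)
    return std_xyz, std_3d

# example with synthetic data standing in for roughly 4,000 samples
rng = np.random.default_rng(0)
fake = rng.normal(loc=[0.0, 300.0, 0.0], scale=0.2, size=(4000, 3))
print(static_dispersion(fake))
</preformat>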
<p>For the analysis in a spherical coordinate system, the following three conversion equations were used (note that, according to the controller's coordinate system, the
<italic>y</italic>
and
<italic>z</italic>
axes are mutually switched compared to the standard Cartesian system; therefore,
<italic>y</italic>
represents the height, and
<italic>z</italic>
represents the depth):
<disp-formula id="FD6">
<label>(6)</label>
<mml:math id="mm10">
<mml:mrow>
<mml:mi>r</mml:mi>
<mml:mo>=</mml:mo>
<mml:msqrt>
<mml:mrow>
<mml:msup>
<mml:mi>x</mml:mi>
<mml:mn>2</mml:mn>
</mml:msup>
<mml:mo>+</mml:mo>
<mml:msup>
<mml:mi>y</mml:mi>
<mml:mn>2</mml:mn>
</mml:msup>
<mml:mo>+</mml:mo>
<mml:msup>
<mml:mi>z</mml:mi>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:msqrt>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="FD7">
<label>(7)</label>
<mml:math id="mm11">
<mml:mrow>
<mml:mi>θ</mml:mi>
<mml:mo>=</mml:mo>
<mml:mo>arccos</mml:mo>
<mml:mo stretchy="false">(</mml:mo>
<mml:mfrac>
<mml:mi>y</mml:mi>
<mml:mi>r</mml:mi>
</mml:mfrac>
<mml:mo stretchy="false">)</mml:mo>
<mml:mtext>and</mml:mtext>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="FD8">
<label>(8)</label>
<mml:math id="mm12">
<mml:mrow>
<mml:mi>φ</mml:mi>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mo>arctan</mml:mo>
<mml:mo stretchy="false">(</mml:mo>
<mml:mfrac>
<mml:mi>z</mml:mi>
<mml:mi>x</mml:mi>
</mml:mfrac>
<mml:mo stretchy="false">)</mml:mo>
<mml:mo>,</mml:mo>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mo>></mml:mo>
<mml:mn>0</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mo>arctan</mml:mo>
<mml:mo stretchy="false">(</mml:mo>
<mml:mfrac>
<mml:mi>z</mml:mi>
<mml:mi>x</mml:mi>
</mml:mfrac>
<mml:mo stretchy="false">)</mml:mo>
<mml:mo>+</mml:mo>
<mml:mi>π</mml:mi>
<mml:mo>,</mml:mo>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>z</mml:mi>
<mml:mo>≥</mml:mo>
<mml:mn>0</mml:mn>
<mml:mo>,</mml:mo>
<mml:mi>x</mml:mi>
<mml:mo><</mml:mo>
<mml:mn>0</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mo>arctan</mml:mo>
<mml:mo stretchy="false">(</mml:mo>
<mml:mfrac>
<mml:mi>z</mml:mi>
<mml:mi>x</mml:mi>
</mml:mfrac>
<mml:mo stretchy="false">)</mml:mo>
<mml:mo>−</mml:mo>
<mml:mi>π</mml:mi>
<mml:mo>,</mml:mo>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>z</mml:mi>
<mml:mo><</mml:mo>
<mml:mn>0</mml:mn>
<mml:mo>,</mml:mo>
<mml:mi>x</mml:mi>
<mml:mo><</mml:mo>
<mml:mn>0</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mfrac>
<mml:mi>π</mml:mi>
<mml:mn>2</mml:mn>
</mml:mfrac>
<mml:mo>,</mml:mo>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>z</mml:mi>
<mml:mo>></mml:mo>
<mml:mn>0</mml:mn>
<mml:mo>,</mml:mo>
<mml:mi>x</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>0</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mo>−</mml:mo>
<mml:mfrac>
<mml:mi>π</mml:mi>
<mml:mn>2</mml:mn>
</mml:mfrac>
<mml:mo>,</mml:mo>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>z</mml:mi>
<mml:mo><</mml:mo>
<mml:mn>0</mml:mn>
<mml:mo>,</mml:mo>
<mml:mi>x</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>0</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mtext mathvariant="italic">not defined</mml:mtext>
<mml:mo>,</mml:mo>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>z</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>0</mml:mn>
<mml:mo>,</mml:mo>
<mml:mi>x</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>0</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
<p>Assuming symmetry in the controller's performance over the
<italic>x</italic>
and
<italic>z</italic>
axes, we additionally define the modified azimuth angle as follows:
<disp-formula id="FD9">
<label>(9)</label>
<mml:math id="mm13">
<mml:mrow>
<mml:mi>φ</mml:mi>
<mml:mo>′</mml:mo>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mo>arctan</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mrow>
<mml:mo>|</mml:mo>
<mml:mrow>
<mml:mfrac>
<mml:mi>x</mml:mi>
<mml:mi>z</mml:mi>
</mml:mfrac>
</mml:mrow>
<mml:mo>|</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>,</mml:mo>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>z</mml:mi>
<mml:mo>≠</mml:mo>
<mml:mn>0</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mfrac>
<mml:mi>π</mml:mi>
<mml:mn>2</mml:mn>
</mml:mfrac>
<mml:mo>,</mml:mo>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mo>≠</mml:mo>
<mml:mn>0</mml:mn>
<mml:mo>,</mml:mo>
<mml:mi>z</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>0</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mtext mathvariant="italic">not defined</mml:mtext>
<mml:mo>,</mml:mo>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>0</mml:mn>
<mml:mo>,</mml:mo>
<mml:mi>z</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>0</mml:mn>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
<p>The angle φ′ is measured from the x axis (not from the
<italic>z</italic>
axis, as in the case of the φ angle) and the line connecting the coordinate origin with the projection of the measured location in the
<italic>x</italic>
-
<italic>z</italic>
plane. As the angle φ′ is defined under the assumption of symmetry in the controller's performance over the
<italic>x</italic>
and
<italic>z</italic>
axes, it is defined in the range of (0,
<inline-formula>
<mml:math id="mm14">
<mml:mrow>
<mml:mfrac>
<mml:mi>π</mml:mi>
<mml:mn>2</mml:mn>
</mml:mfrac>
</mml:mrow>
</mml:math>
</inline-formula>
) rad.</p>
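<p>A short sketch of the conversion in Equations (6)-(9) is given below, following the controller's convention in which y is the height and z is the depth. The case analysis of Equation (8) is simply the two-argument arctangent, while the modified azimuth of Equation (9) is folded into (0, π/2) under the assumed x/z symmetry. This is an illustrative implementation, not the authors' code.</p>
<preformat>
import numpy as np

def to_spherical(x, y, z):
    # Equation (6): radius
    r = np.sqrt(x * x + y * y + z * z)
    # Equation (7): inclination, measured from the y (height) axis
    theta = np.arccos(y / r) if r else float("nan")
    # Equation (8): azimuth; arctan2 reproduces the case analysis, except that
    # it returns 0 at the origin, where Equation (8) is undefined
    phi = np.arctan2(z, x)
    # Equation (9): modified azimuth, measured from the x axis and folded
    # into (0, pi/2) by the assumed symmetry over the x and z axes
    if z != 0.0:
        phi_mod = np.arctan(np.abs(x / z))
    elif x != 0.0:
        phi_mod = np.pi / 2.0
    else:
        phi_mod = float("nan")  # undefined at x = z = 0
    return r, theta, phi, phi_mod
</preformat>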
</sec>
<sec>
<label>3.4.2.</label>
<title>Dynamic Measurements</title>
<p>In the dynamic measurements, the experimenter moved the V-tool randomly but with a constant speed within the selected region of the controller's sensory space. The V-tool was held in the fist (
<xref rid="f4-sensors-14-03702" ref-type="fig">Figure 4</xref>
) to simulate two extended fingers and was therefore detected by the controller. The moving speed of the V-tool was approximately 100 mm/s.</p>
<p>The measured sensory space included a volume of 100,000 cm
<sup>3</sup>
(−250 mm <
<italic>x</italic>
< 250 mm, −250 mm <
<italic>z</italic>
< 250 mm and 0 mm <
<italic>y</italic>
< 400 mm). This space was systematically covered in a series of four continuous measurements, each covering a layer approximately 100 mm in height (
<italic>y</italic>
dimension). The data from the individual measurements were then combined for analysis. A total of 119,360 valid positions were recorded with an average density of 1.2 samples per cm
<sup>3</sup>
.</p>
<p>The primary goal of the dynamic measurements was to evaluate the distortion of the controller's perception of space. As previously explained, the distortion was measured as the deviation of the distance between the two markers located at the tips of the V-tool. The distance between the markers at the
<italic>i</italic>
-th position was defined as:
<disp-formula id="FD10">
<label>(10)</label>
<mml:math id="mm15">
<mml:mrow>
<mml:mi>d</mml:mi>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>=</mml:mo>
<mml:msqrt>
<mml:mrow>
<mml:msup>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>−</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
<mml:mo>+</mml:mo>
<mml:msup>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>−</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
<mml:mo>+</mml:mo>
<mml:msup>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mrow>
<mml:mi>z</mml:mi>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>−</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mrow>
<mml:mi>z</mml:mi>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:msqrt>
</mml:mrow>
</mml:math>
</disp-formula>
where
<italic>p
<sub>x</sub>
</italic>
<sub>1</sub>
[
<italic>i</italic>
],
<italic>p
<sub>y</sub>
</italic>
<sub>1</sub>
[
<italic>i</italic>
] and
<italic>p
<sub>z</sub>
</italic>
<sub>1</sub>
[
<italic>i</italic>
] represent the coordinates of the first marker, and
<italic>p
<sub>x</sub>
</italic>
<sub>2</sub>
[
<italic>i</italic>
],
<italic>p
<sub>y</sub>
</italic>
<sub>2</sub>
[
<italic>i</italic>
] and
<italic>p
<sub>z</sub>
</italic>
<sub>2</sub>
[
<italic>i</italic>
] represent the coordinates of the second marker. The exact location of the
<italic>i</italic>
-th position was defined as the midpoint between the two markers, as obtained from the reference tracking system:
<disp-formula id="FD11">
<label>(11)</label>
<mml:math id="mm16">
<mml:mrow>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mo>_</mml:mo>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mn>1</mml:mn>
<mml:mo>_</mml:mo>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mn>2</mml:mn>
<mml:mo>_</mml:mo>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:mfrac>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="FD12">
<label>(12)</label>
<mml:math id="mm17">
<mml:mrow>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mo>_</mml:mo>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mn>1</mml:mn>
<mml:mo>_</mml:mo>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mn>2</mml:mn>
<mml:mo>_</mml:mo>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:mfrac>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="FD13">
<label>(13)</label>
<mml:math id="mm18">
<mml:mrow>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mrow>
<mml:mi>z</mml:mi>
<mml:mo>_</mml:mo>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mrow>
<mml:mi>z</mml:mi>
<mml:mn>1</mml:mn>
<mml:mo>_</mml:mo>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mrow>
<mml:mi>z</mml:mi>
<mml:mn>2</mml:mn>
<mml:mo>_</mml:mo>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:mfrac>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
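<p>For illustration only, the following minimal sketch (not code from the original study; it assumes the marker coordinates are available as NumPy arrays and uses illustrative names) shows how the inter-marker distance of Equation (10) and the reference mid-point of Equations (11)–(13) can be computed:</p>
<preformat>
import numpy as np

def distance_and_midpoint(p1, p2):
    # p1, p2: arrays of shape (N, 3) holding the x, y, z coordinates of the
    # two markers for N samples, as logged by the reference tracking system.
    dist = np.linalg.norm(p1 - p2, axis=1)   # Equation (10): Euclidean inter-marker distance
    p_ref = (p1 + p2) / 2.0                  # Equations (11)-(13): mid-point used as the reference position
    return dist, p_ref
</preformat>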
</sec>
</sec>
</sec>
<sec sec-type="results">
<label>4.</label>
<title>Results</title>
<p>This section presents the measurement results based on the experimental design described in the previous section. The results of the static measurements are presented first, followed by the results of the dynamic measurement scenario.</p>
<sec>
<label>4.1.</label>
<title>Static Measurements</title>
<p>The upper two rows in
<xref rid="t1-sensors-14-03702" ref-type="table">Table 1</xref>
show the minimum and maximum standard deviations of the measured static positions. The standard deviations are given for the individual axes as well as for the three-dimensional spatial position. The lower two rows show the spatial positions with the minimal and maximal standard deviations for the individual axes.</p>
<p>The lowest standard deviation (0.0081 mm) was measured on the
<italic>x</italic>
axis 30 cm above the controller, while the highest standard deviation (0.49 mm) was measured on the
<italic>y</italic>
axis at the leftmost and topmost positions.</p>
<p>
<xref rid="f6-sensors-14-03702" ref-type="fig">Figure 6</xref>
shows the probability density of the deviation for the individual axes. By deviation, we mean the difference between each measured sample and its corresponding mean measured position (
<italic>p</italic>
[
<italic>i,j</italic>
]−
<inline-formula>
<mml:math id="mm19">
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>p</mml:mi>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ι</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
</mml:mrow>
</mml:math>
</inline-formula>
, 1 ≤
<italic>i</italic>
≤ 37, 1 ≤
<italic>j</italic>
≤
<italic>N
<sub>i</sub>
</italic>
). The figure therefore indicates the deviation probability on each of the three axes when making a single measurement in the controller's sensory space. The narrowness and height of each curve can be directly interpreted as the controller's consistency along that dimension when tracking static spatial points.</p>
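<p>A minimal sketch of this computation, assuming the samples logged at one static location are available as a NumPy array (names are illustrative, not taken from the original analysis):</p>
<preformat>
import numpy as np

def per_axis_deviation(samples):
    # samples: array of shape (N_i, 3) with the positions logged by the
    # controller at one static location i.
    mean_pos = samples.mean(axis=0)      # mean measured position
    dev = samples - mean_pos             # deviation of each sample from the mean
    std = samples.std(axis=0, ddof=1)    # per-axis standard deviations (std_x, std_y, std_z)
    return dev, std
</preformat>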
<p>Our further analysis focused on determining the spatial dependency of the standard deviation of the static measurements. For this purpose, a spherical coordinate system was used instead of the Cartesian coordinate system.
<xref rid="f7-sensors-14-03702" ref-type="fig">Figure 7</xref>
shows the impact of the radius (
<italic>r</italic>
), inclination (θ), and azimuth (φ′) on the standard deviation. As the angle φ′ is not defined for the
<italic>y</italic>
axis, the five locations above the coordinate origin with
<italic>x</italic>
= 0 and
<italic>z</italic>
= 0 were excluded from the analysis involving the azimuth angle.</p>
<p>The figures indicate a dependency of the standard deviation on the radius and the azimuth. In both cases, the standard deviation increases when the radius or azimuth increases. The latter is also confirmed by the linear correlations listed in
<xref rid="t2-sensors-14-03702" ref-type="table">Table 2</xref>
.</p>
<p>The results indicate a significant weak positive correlation between the radius and the standard deviation, and a significant moderate positive correlation between the azimuth angle φ′ and the standard deviation. These results show that the consistency of the controller drops with distance (radius) and when the tracking objects are far to the left or right (higher φ′) in the sensory space. Interestingly, no such dependency can be found when changing the inclination (θ) of the tracking objects.</p>
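<p>A possible sketch of such a correlation analysis, assuming SciPy is available and that the per-location spherical coordinates and standard deviations are stored in arrays (names are illustrative, not from the original study):</p>
<preformat>
import numpy as np
from scipy.stats import pearsonr

def spatial_correlations(r, theta, phi, std_spatial):
    # r, theta, phi: spherical coordinates of the 37 reference locations
    # (defined as in Figure 1); std_spatial: their spatial standard deviations.
    # Returns the Pearson coefficient and p-value for each dimension, as reported in Table 2.
    return {name: pearsonr(dim, std_spatial)
            for name, dim in (("r", r), ("theta", theta), ("phi", phi))}
</preformat>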
<p>We also analyzed the sampling performance and sampling frequency of the controller. Each measurement of the controller was logged with the corresponding absolute timestamp, which enabled us to determine the exact time gap between two sequential samples and to calculate the corresponding sample frequency.
<xref rid="f8-sensors-14-03702" ref-type="fig">Figure 8</xref>
shows the progress of the measurements and the total time required to collect the initial 3,000 samples for each of the 37 measured positions. The sampling frequency is clearly unstable: it varies from measurement to measurement and also within individual measurements. The shortest logged period between two samples was 14 ms, corresponding to a sampling frequency of 71.43 Hz. The red line at the bottom of the figure shows the optimal sampling performance predicted from this highest measured sampling frequency.
<xref rid="f9-sensors-14-03702" ref-type="fig">Figure 9</xref>
shows the distribution of the time intervals between two consecutive samples across all 37 positions. The mean sampling frequency was 39.0 Hz, with a standard deviation of 12.8 Hz.</p>
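<p>The sampling statistics can be derived from the logged timestamps along the following lines (a sketch with illustrative names, assuming timestamps in milliseconds):</p>
<preformat>
import numpy as np

def sampling_stats(timestamps_ms):
    # timestamps_ms: absolute timestamps logged with each controller sample.
    periods = np.diff(np.asarray(timestamps_ms, dtype=float))  # gaps between consecutive samples (ms)
    freqs = 1000.0 / periods                                   # instantaneous sampling frequencies (Hz)
    return freqs.mean(), freqs.std(ddof=1), periods.min()      # mean Hz, std of Hz, shortest period (ms)
</preformat>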
</sec>
<sec>
<label>4.2.</label>
<title>Dynamic Measurements</title>
<p>A total of 119,360 measurements were taken within the dynamic measurement scenario in an attempt to cover the estimated useful sensory space of the controller, as described in the methodology section. Two markers with a constant inter-marker distance were used for tracking, and variations of that distance were used to analyze the controller's accuracy.
<xref rid="f10-sensors-14-03702" ref-type="fig">Figure 10</xref>
demonstrates the distributions of the deviation of the distance.
<xref rid="f10-sensors-14-03702" ref-type="fig">Figure 10a</xref>
shows the overall distribution of samples for all the positions recorded by the controller.
<xref rid="f10-sensors-14-03702" ref-type="fig">Figure 10b–d</xref>
display the distributions of the deviation on the individual axes. In these cases, the brightness of the color indicates the density of the samples (higher brightness represents higher sample density).</p>
<p>The most interesting phenomenon, which can be noted in
<xref rid="f10-sensors-14-03702" ref-type="fig">Figure 10a</xref>
, is the non-Gaussian deviation distribution, which was not expected. In addition to the global peak at a deviation of approximately 0 mm, another local peak is evident at a deviation of approximately −5 mm. Further analysis shows (
<xref rid="f10-sensors-14-03702" ref-type="fig">Figure 10b–d</xref>
) that this phenomenon originates in the measurements taken at
<italic>y</italic>
> 250 mm over the entire covered area of the
<italic>x</italic>
-
<italic>z</italic>
plane (−250 mm <
<italic>x</italic>
,
<italic>z</italic>
< 250 mm).</p>
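<p>The origin of this second peak can be inspected by splitting the samples at the 250 mm height, for example as in the following sketch (illustrative names, assuming NumPy arrays of sample heights and distance deviations):</p>
<preformat>
import numpy as np

def deviation_by_height(y, dist_dev, threshold_mm=250.0):
    # y: sample heights above the controller (mm);
    # dist_dev: deviation of the measured inter-marker distance from its constant value (mm).
    low, high = dist_dev[y <= threshold_mm], dist_dev[y > threshold_mm]
    return {"y <= 250 mm": (low.mean(), low.std(ddof=1)),
            "y > 250 mm": (high.mean(), high.std(ddof=1))}
</preformat>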
<p>The analysis of the spatial dependency of the measured distance deviation was based on computing the correlations between the spatial dimensions and the distance deviation (
<xref rid="t3-sensors-14-03702" ref-type="table">Table 3</xref>
). The results reveal statistically significant moderate negative linear correlations between the distance deviation and the height above the controller (
<italic>y</italic>
) and the distance from the coordinate origin (
<italic>r</italic>
). The correlations between the distance deviation and the remaining spatial dimensions are negligible.</p>
<p>The additional volumetric analysis reveals the local distribution of the distance deviation on different planes.
<xref rid="f11-sensors-14-03702" ref-type="fig">Figure 11</xref>
displays the deviation distribution on the
<italic>x</italic>
-
<italic>z</italic>
plane (different heights above the controller) at
<italic>y</italic>
= 150 mm (
<xref rid="f11-sensors-14-03702" ref-type="fig">Figure 11a</xref>
) and
<italic>y</italic>
= 250 mm (
<xref rid="f11-sensors-14-03702" ref-type="fig">Figure 11b</xref>
). The previously described “local peak anomaly” can be observed in
<xref rid="f11-sensors-14-03702" ref-type="fig">Figure 11b</xref>
, where the distance deviation tends towards lower values (blue color).</p>
<p>
<xref rid="f12-sensors-14-03702" ref-type="fig">Figure 12</xref>
displays the deviation on the
<italic>x</italic>
= 0 (side view) and
<italic>z</italic>
= 0 (front view) planes. The figure reveals the highest deviation at the edges of the useful sensory space and at heights above
<italic>y</italic>
= 250 mm.</p>
<p>
<xref rid="f13-sensors-14-03702" ref-type="fig">Figure 13</xref>
displays the controller's sampling performance when tracking moving objects in four different layers above the controller. The red broken line indicates the optimal sampling performance defined with a constant sampling period of 15 ms, which corresponds to the minimum time interval between two consecutive samples logged in the dynamic measurements. The figure indicates the best sampling performance between the heights of
<italic>y</italic>
= 100 mm and
<italic>y</italic>
= 300 mm, with significantly reduced efficiency above this height.</p>
<p>
<xref rid="f14-sensors-14-03702" ref-type="fig">Figure 14</xref>
compares the sampling performance for the static and dynamic conditions. The initial 5,000 samples of the best cases from both conditions were taken for this analysis. As expected, the sampling performance in the static condition proved to be more robust and uniform than in the dynamic condition.</p>
</sec>
</sec>
<sec>
<label>5.</label>
<title>Discussion and Conclusions</title>
<p>In this paper, we have described an extensive evaluation of the performance of the Leap Motion Controller with the aid of a professional, fast, high-accuracy motion tracking system. The main goal of our research was to systematically analyze the controller's sensory space and to define the spatial dependency of its accuracy and reliability. We performed a set of static and dynamic measurements with different numbers and configurations of tracking objects.</p>
<p>In the static scenario, the standard deviation was shown to be less than 0.5 mm at all times and, in the best cases, less than 0.01 mm. Together with the high accuracy (below 0.2 mm) reported in [
<xref rid="b15-sensors-14-03702" ref-type="bibr">15</xref>
], our results characterize the controller as a reliable and accurate system for tracking static points. Our analysis also revealed an important spatial dependency of the controller's consistency and performance: the linear correlation analysis showed a significant increase in the standard deviation when moving away from the controller (radius) and when moving to the far left or right of the controller (φ′).</p>
<p>A sharp pen mounted on the robotic arm was used in [
<xref rid="b15-sensors-14-03702" ref-type="bibr">15</xref>
], while we had to perform our measurements with a plastic arm model with a pointing finger. The controller's algorithm appears to have been updated since then and now requires a “hand-like” object in order to track static points. In many cases, we were unable to establish a stable setup: the controller tracked the static points for only a few seconds and then stopped. The main criterion for choosing the final 37 spatial locations was therefore whether a stable position of the tracking arm could be established that allowed successful tracking and logging over a longer period of time. The majority of the successfully selected points were located behind the controller (
<italic>z</italic>
< 0), where the whole hand was located above the controller and was therefore fully visible inside the sensory space. It was very difficult to set up a measurement in which the hand was located in front of the controller and only the tracking finger remained in the sensory space.</p>
<p>The set of measurements in the dynamic scenario also revealed the inconsistent performance of the controller. Its accuracy was evaluated through the deviation of the measured distance between two moving points that maintained a constant inter-point distance. In this case, the accuracy drops as the objects move away from the sensor, with a significant drop in accuracy for samples taken more than 250 mm above the controller. Because this phenomenon was unexpected, we repeated the measurements for this area and obtained the same results. We can only speculate about its primary cause; measurements with objects at different inter-object distances might shed more light on it.</p>
<p>An important limitation of the controller's performance is its inconsistent sampling frequency. Its mean value of less than 40 Hz is relatively low and varies significantly under both static and dynamic conditions. The main drawback of the non-uniform sampling is that it makes the controller difficult to synchronize with other real-time systems, since doing so requires additional post-processing and re-sampling of the data.</p>
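<p>One simple way to obtain a uniform timeline from the controller's non-uniform stream is linear interpolation, sketched below (illustrative names; the target rate and the method are assumptions for the example, not a recommendation from the original study):</p>
<preformat>
import numpy as np

def resample_uniform(timestamps_ms, positions, target_hz=50.0):
    # timestamps_ms: non-uniform absolute timestamps (ms);
    # positions: array of shape (N, 3) with the tracked positions.
    t = np.asarray(timestamps_ms, dtype=float)
    t_uniform = np.arange(t[0], t[-1], 1000.0 / target_hz)
    resampled = np.column_stack(
        [np.interp(t_uniform, t, positions[:, k]) for k in range(positions.shape[1])])
    return t_uniform, resampled
</preformat>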
<p>Based on the insights gained from these experiments, further study of the Leap Motion Controller may include research on the precision and reliability of tracking more complex hand, finger and tool movements, as well as on its suitability for applications that rely heavily on the gesture input modality.</p>
<p>The Leap Motion Controller undoubtedly represents a revolutionary input device for gesture-based human-computer interaction. In this study, we evaluated the controller as a possible replacement for a fast and high-precision optical motion capture system in a limited space and with a limited number of objects. Based on the current results and the overall experience, we conclude that the controller in its current state cannot be used as a professional tracking system, primarily due to its rather limited sensory space and inconsistent sampling frequency.</p>
</sec>
</body>
<back>
<ack>
<p>This work was supported by the Slovenian Research Agency within the following research program: Algorithms and Optimization Methods in Telecommunications.</p>
</ack>
<notes>
<title>Conflicts of Interest</title>
<p>The authors declare no conflict of interest.</p>
</notes>
<ref-list>
<title>References</title>
<ref id="b1-sensors-14-03702">
<label>1.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Oviatt</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>User-centered modeling and evaluation of multimodal interfaces</article-title>
<source>Proc. IEEE</source>
<year>2003</year>
<volume>91</volume>
<fpage>1457</fpage>
<lpage>1468</lpage>
</element-citation>
</ref>
<ref id="b2-sensors-14-03702">
<label>2.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Matthew</surname>
<given-names>T.</given-names>
</name>
</person-group>
<article-title>Multimodal interaction: A review</article-title>
<source>Pattern Recognit. Lett.</source>
<year>2014</year>
<volume>36</volume>
<fpage>189</fpage>
<lpage>195</lpage>
</element-citation>
</ref>
<ref id="b3-sensors-14-03702">
<label>3.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Bhuiyan</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Picking</surname>
<given-names>R.</given-names>
</name>
</person-group>
<article-title>Gesture-Controlled User Interfaces, What Have We Done and What's Next?</article-title>
<conf-name>Proceedings of the Fifth Collaborative Research Symposium on Security, E-Learning, Internet and Networking (SEIN 2009)</conf-name>
<conf-loc>Darmstadt, Germany</conf-loc>
<conf-date>25–29 November 2009</conf-date>
<fpage>26</fpage>
<lpage>27</lpage>
</element-citation>
</ref>
<ref id="b4-sensors-14-03702">
<label>4.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wingrave</surname>
<given-names>C.A.</given-names>
</name>
<name>
<surname>Williamson</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Varcholik</surname>
<given-names>P.D.</given-names>
</name>
<name>
<surname>Rose</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Miller</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Charbonneau</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Bott</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>LaViola</surname>
<given-names>J.J.</given-names>
</name>
</person-group>
<article-title>The wiimote and beyond: Spatially convenient devices for 3D user interfaces</article-title>
<source>IEEE Comput. Graph. Appl.</source>
<year>2010</year>
<volume>30</volume>
<fpage>71</fpage>
<lpage>85</lpage>
<pub-id pub-id-type="pmid">20669534</pub-id>
</element-citation>
</ref>
<ref id="b5-sensors-14-03702">
<label>5.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zhang</surname>
<given-names>Z.</given-names>
</name>
</person-group>
<article-title>Microsoft kinect sensor and its effect</article-title>
<source>IEEE MultiMedia</source>
<year>2012</year>
<volume>19</volume>
<fpage>4</fpage>
<lpage>10</lpage>
</element-citation>
</ref>
<ref id="b6-sensors-14-03702">
<label>6.</label>
<element-citation publication-type="webpage">
<article-title>Leap Motion Controller</article-title>
<comment>Available online:
<ext-link ext-link-type="uri" xlink:href="https://www.leapmotion.com">https://www.leapmotion.com</ext-link>
</comment>
<date-in-citation>(accessed on 19 February 2014)</date-in-citation>
</element-citation>
</ref>
<ref id="b7-sensors-14-03702">
<label>7.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hodson</surname>
<given-names>H.</given-names>
</name>
</person-group>
<article-title>Leap motion hacks show potential of new gesture tech</article-title>
<source>New Sci.</source>
<year>2013</year>
<volume>218</volume>
<fpage>21</fpage>
</element-citation>
</ref>
<ref id="b8-sensors-14-03702">
<label>8.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Aziz</surname>
<given-names>A.A.</given-names>
</name>
<name>
<surname>Wan</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Zaaba</surname>
<given-names>S.K.</given-names>
</name>
<name>
<surname>Shahriman</surname>
<given-names>A.B</given-names>
</name>
<name>
<surname>Adnan</surname>
<given-names>N.H.</given-names>
</name>
<name>
<surname>Nor</surname>
<given-names>R.M.</given-names>
</name>
<name>
<surname>Ayob</surname>
<given-names>M.N.</given-names>
</name>
<name>
<surname>Ismail</surname>
<given-names>A.H.</given-names>
</name>
<name>
<surname>Ramly</surname>
<given-names>M.F.</given-names>
</name>
</person-group>
<article-title>Development of a gesture database for an adaptive gesture recognition system</article-title>
<source>Int. J. Electr. Comput. Sci.</source>
<year>2012</year>
<volume>12</volume>
<fpage>38</fpage>
<lpage>44</lpage>
</element-citation>
</ref>
<ref id="b9-sensors-14-03702">
<label>9.</label>
<element-citation publication-type="webpage">
<article-title>Qualisys Motion Capture System</article-title>
<comment>Available online:
<ext-link ext-link-type="uri" xlink:href="http://www.qualisys.com">http://www.qualisys.com</ext-link>
</comment>
<date-in-citation>(accessed on 19 February 2014)</date-in-citation>
</element-citation>
</ref>
<ref id="b10-sensors-14-03702">
<label>10.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Khoshelham</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Elberink</surname>
<given-names>S.O.</given-names>
</name>
</person-group>
<article-title>Accuracy and resolution of kinect depth data for indoor mapping applications</article-title>
<source>Sensors</source>
<year>2012</year>
<volume>12</volume>
<fpage>1437</fpage>
<lpage>1454</lpage>
<pub-id pub-id-type="pmid">22438718</pub-id>
</element-citation>
</ref>
<ref id="b11-sensors-14-03702">
<label>11.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Teather</surname>
<given-names>R.J.</given-names>
</name>
<name>
<surname>Pavlovych</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Stuerzlinger</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>MacKenzie</surname>
<given-names>I.S.</given-names>
</name>
</person-group>
<article-title>Effects of Tracking Technology Latency, and Spatial Jitter on Object Movement</article-title>
<conf-name>Proceedings of the IEEE Symposium on 3D User Interfaces</conf-name>
<conf-loc>Lafayette, LA, USA</conf-loc>
<conf-date>14–15 March 2009</conf-date>
<fpage>43</fpage>
<lpage>50</lpage>
</element-citation>
</ref>
<ref id="b12-sensors-14-03702">
<label>12.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hernoux</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Béarée</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Gajny</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Nyiri</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Bancalin</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Gibaru</surname>
<given-names>O.</given-names>
</name>
</person-group>
<article-title>Leap Motion pour la capture de mouvement 3D par spline L1. Application à la robotique</article-title>
<source>Journées du Groupe de Travail en Modélisation Géométrique 2013, Marseille</source>
<comment>(in French)</comment>
</element-citation>
</ref>
<ref id="b13-sensors-14-03702">
<label>13.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Vikram</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>Handwriting and Gestures in the Air, Recognizing on the Fly</article-title>
<conf-name>Proceedings of the CHI 2013 Extended Abstracts</conf-name>
<conf-loc>Paris, France</conf-loc>
<conf-date>27 April–2 May 2013</conf-date>
</element-citation>
</ref>
<ref id="b14-sensors-14-03702">
<label>14.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Regenbrecht</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Collins</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Hoermann</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>A Leap-Supported, Hybrid AR Interface Approach</article-title>
<conf-name>Proceedings of the OZCHI'13, 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration</conf-name>
<conf-loc>Adelaide, Australia</conf-loc>
<conf-date>25–29 November 2013</conf-date>
</element-citation>
</ref>
<ref id="b15-sensors-14-03702">
<label>15.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Weichert</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Bachmann</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Rudak</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Fisseler</surname>
<given-names>D.</given-names>
</name>
</person-group>
<article-title>Analysis of the accuracy and robustness of the leap motion controller</article-title>
<source>Sensors</source>
<year>2013</year>
<volume>13</volume>
<fpage>6380</fpage>
<lpage>6393</lpage>
<pub-id pub-id-type="pmid">23673678</pub-id>
</element-citation>
</ref>
</ref-list>
</back>
<floats-group>
<fig id="f1-sensors-14-03702" position="float">
<label>Figure 1.</label>
<caption>
<p>The Cartesian and spherical coordinate systems used to describe positions in the controller's sensory space.</p>
</caption>
<graphic xlink:href="sensors-14-03702f1"></graphic>
</fig>
<fig id="f2-sensors-14-03702" position="float">
<label>Figure 2.</label>
<caption>
<p>The setup of the experimental environment.</p>
</caption>
<graphic xlink:href="sensors-14-03702f2"></graphic>
</fig>
<fig id="f3-sensors-14-03702" position="float">
<label>Figure 3.</label>
<caption>
<p>To improve the tracking capabilities of the Leap Motion Controller, the marker was placed at the tip of the index finger of a plastic arm model. During the measurement of static locations, the arm was fixed in place using a stand.</p>
</caption>
<graphic xlink:href="sensors-14-03702f3"></graphic>
</fig>
<fig id="f4-sensors-14-03702" position="float">
<label>Figure 4.</label>
<caption>
<p>The V-tool used for dynamic measurements.</p>
</caption>
<graphic xlink:href="sensors-14-03702f4"></graphic>
</fig>
<fig id="f5-sensors-14-03702" position="float">
<label>Figure 5.</label>
<caption>
<p>The measurement grid displaying the reference locations of the static measurements.</p>
</caption>
<graphic xlink:href="sensors-14-03702f5"></graphic>
</fig>
<fig id="f6-sensors-14-03702" position="float">
<label>Figure 6.</label>
<caption>
<p>The probability density of the deviations, including all 37 locations.</p>
</caption>
<graphic xlink:href="sensors-14-03702f6"></graphic>
</fig>
<fig id="f7-sensors-14-03702" position="float">
<label>Figure 7.</label>
<caption>
<p>The impact of (
<bold>a</bold>
) the radius—
<italic>r</italic>
; (
<bold>b</bold>
) the inclination—θ; and (
<bold>c</bold>
) the azimuth—φ′ on the standard deviation.</p>
</caption>
<graphic xlink:href="sensors-14-03702f7a"></graphic>
<graphic xlink:href="sensors-14-03702f7b"></graphic>
</fig>
<fig id="f8-sensors-14-03702" position="float">
<label>Figure 8.</label>
<caption>
<p>The progress of the measurements in the static scenario (the total time required to collect the initial 3,000 samples at different points in space).</p>
</caption>
<graphic xlink:href="sensors-14-03702f8"></graphic>
</fig>
<fig id="f9-sensors-14-03702" position="float">
<label>Figure 9.</label>
<caption>
<p>The distribution of the time intervals between two consecutive samples.</p>
</caption>
<graphic xlink:href="sensors-14-03702f9"></graphic>
</fig>
<fig id="f10-sensors-14-03702" position="float">
<label>Figure 10.</label>
<caption>
<p>Distributions of deviation within dynamic measurements: (
<bold>a</bold>
) the overall distribution; and (
<bold>b</bold>
<bold>d</bold>
) the distributions on the individual axes.</p>
</caption>
<graphic xlink:href="sensors-14-03702f10"></graphic>
</fig>
<fig id="f11-sensors-14-03702" position="float">
<label>Figure 11.</label>
<caption>
<p>Distance deviation distributions in
<italic>x</italic>
-
<italic>z</italic>
plane at (
<bold>a</bold>
)
<italic>y</italic>
= 150 mm; and (
<bold>b</bold>
)
<italic>y</italic>
= 250 mm.</p>
</caption>
<graphic xlink:href="sensors-14-03702f11"></graphic>
</fig>
<fig id="f12-sensors-14-03702" position="float">
<label>Figure 12.</label>
<caption>
<p>Distance deviation distributions at (
<bold>a</bold>
)
<italic>x</italic>
= 0; and (
<bold>b</bold>
)
<italic>z</italic>
= 0.</p>
</caption>
<graphic xlink:href="sensors-14-03702f12"></graphic>
</fig>
<fig id="f13-sensors-14-03702" position="float">
<label>Figure 13.</label>
<caption>
<p>The progress of the measurements in the dynamic scenario (the total time required to collect the initial 10,000 samples in different height regions).</p>
</caption>
<graphic xlink:href="sensors-14-03702f13"></graphic>
</fig>
<fig id="f14-sensors-14-03702" position="float">
<label>Figure 14.</label>
<caption>
<p>Comparison between the progress of static and dynamic measurements (total time required to collect the initial 5,000 samples).</p>
</caption>
<graphic xlink:href="sensors-14-03702f14"></graphic>
</fig>
<table-wrap id="t1-sensors-14-03702" position="float">
<label>Table 1.</label>
<caption>
<p>Standard deviations of static positions.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th colspan="2" align="center" valign="middle" rowspan="1">
<bold>Standard deviation and position</bold>
</th>
<th align="center" valign="middle" rowspan="1" colspan="1">
<bold>
<italic>x</italic>
Axis (std
<italic>
<sub>x</sub>
</italic>
)</bold>
</th>
<th align="center" valign="middle" rowspan="1" colspan="1">
<bold>
<italic>y</italic>
Axis (std
<italic>
<sub>y</sub>
</italic>
)</bold>
</th>
<th align="center" valign="middle" rowspan="1" colspan="1">
<bold>
<italic>z</italic>
Axis (std
<italic>
<sub>z</sub>
</italic>
)</bold>
</th>
<th align="center" valign="middle" rowspan="1" colspan="1">
<bold>Spatial position (std)</bold>
</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center" valign="middle" rowspan="3" colspan="1">
<bold>Minimal std</bold>
</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">std (mm)</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">
<bold>0.0081</bold>
</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">0.0093</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">0.015</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">0.013</td>
</tr>
<tr>
<td valign="bottom" colspan="5" rowspan="1">
<hr></hr>
</td>
</tr>
<tr>
<td align="center" valign="bottom" rowspan="1" colspan="1">location (
<italic>x</italic>
,
<italic>y</italic>
,
<italic>z</italic>
) (cm)</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">(0, 30, 0)</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">(−10, 10, −5)</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">(0, 20, −5)</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">(0, 15, 0)</td>
</tr>
<tr>
<td valign="bottom" colspan="6" rowspan="1">
<hr></hr>
</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="3" colspan="1">
<bold>Maximal std</bold>
</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">std (mm)</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">0.39</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">
<bold>0.49</bold>
</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">0.37</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">0.38</td>
</tr>
<tr>
<td valign="bottom" colspan="5" rowspan="1">
<hr></hr>
</td>
</tr>
<tr>
<td align="center" valign="bottom" rowspan="1" colspan="1">location (
<italic>x</italic>
,
<italic>y</italic>
,
<italic>z</italic>
) (cm)</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">(−20, 20, 0)</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">(−20, 30, 0)</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">(−20, 30, 0)</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">(−20, 30, 0)</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap id="t2-sensors-14-03702" position="float">
<label>Table 2.</label>
<caption>
<p>Correlations between the dimensions of the spherical coordinate system and standard deviation.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="center" valign="bottom" rowspan="1" colspan="1">
<bold>Correlation variables</bold>
</th>
<th align="center" valign="bottom" rowspan="1" colspan="1">
<bold>Pearson coefficient</bold>
</th>
<th align="center" valign="bottom" rowspan="1" colspan="1">
<bold>
<italic>p</italic>
-value</bold>
</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center" valign="bottom" rowspan="1" colspan="1">
<mml:math id="mm20">
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>r</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ι</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
</mml:mrow>
</mml:math>
</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">0.338</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">0.044</td>
</tr>
<tr>
<td align="center" valign="bottom" rowspan="1" colspan="1">
<mml:math id="mm21">
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mi>θ</mml:mi>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ι</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
</mml:mrow>
</mml:math>
</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">0.163</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">0.34</td>
</tr>
<tr>
<td align="center" valign="bottom" rowspan="1" colspan="1">
<mml:math id="mm22">
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mrow>
<mml:mi>φ</mml:mi>
<mml:mo></mml:mo>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ι</mml:mi>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
</mml:mrow>
</mml:math>
</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">0.433</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">0.051</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap id="t3-sensors-14-03702" position="float">
<label>Table 3.</label>
<caption>
<p>Correlations between spatial dimensions and the distance deviation.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="center" valign="bottom" rowspan="1" colspan="1">
<bold>Correlation variables</bold>
</th>
<th align="center" valign="bottom" rowspan="1" colspan="1">
<bold>Pearson coefficient</bold>
</th>
<th align="center" valign="bottom" rowspan="1" colspan="1">
<bold>
<italic>p</italic>
-value</bold>
</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center" valign="bottom" rowspan="1" colspan="1">
<italic>x</italic>
[j], dist[j]</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">−0.0658</td>
<td align="center" valign="bottom" rowspan="1" colspan="1"><0.000</td>
</tr>
<tr>
<td align="center" valign="bottom" rowspan="1" colspan="1">
<italic>y</italic>
[j], dist[j]</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">
<bold>−0.612</bold>
</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">
<bold><0.000</bold>
</td>
</tr>
<tr>
<td align="center" valign="bottom" rowspan="1" colspan="1">
<italic>z</italic>
[j], dist[j]</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">0.00350</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">0.223</td>
</tr>
<tr>
<td align="center" valign="bottom" rowspan="1" colspan="1">
<italic>r</italic>
[j], dist[j]</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">
<bold>−0.595</bold>
</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">
<bold><0.000</bold>
</td>
</tr>
<tr>
<td align="center" valign="bottom" rowspan="1" colspan="1">θ[j], dist[j]</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">0.192</td>
<td align="center" valign="bottom" rowspan="1" colspan="1"><0.000</td>
</tr>
<tr>
<td align="center" valign="bottom" rowspan="1" colspan="1">φ′[j], dist[j]</td>
<td align="center" valign="bottom" rowspan="1" colspan="1">−0.0792</td>
<td align="center" valign="bottom" rowspan="1" colspan="1"><0.000</td>
</tr>
</tbody>
</table>
</table-wrap>
</floats-group>
</pmc>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Pmc/Curation
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 002505 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd -nk 002505 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Pmc
   |étape=   Curation
   |type=    RBID
   |clé=     PMC:3958287
   |texte=   An Analysis of the Precision and Reliability of the Leap Motion Sensor and Its Suitability for Static and Dynamic Tracking
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Curation/RBID.i   -Sk "pubmed:24566635" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024