Exploration server on haptic devices

Warning: this site is under development!
Warning: this site was generated by computational means from raw corpora.
The information has therefore not been validated.

The adaptation of GDL motion recognition system to sport and rehabilitation techniques analysis

Internal identifier: 000024 (Pmc/Checkpoint); previous: 000023; next: 000025

The adaptation of GDL motion recognition system to sport and rehabilitation techniques analysis

Authors: Tomasz Hachaj [Poland]; Marek R. Ogiela [Poland]

Source:

RBID: PMC:4841835

Abstract

The main novelty of this paper is the adaptation of the Gesture Description Language (GDL) methodology to sport and rehabilitation data analysis and classification. In this paper we show that the Lua language can be successfully used to adapt the GDL classifier to those tasks. The newly applied scripting language allows easy extension and integration of the classifier with other software technologies and applications. The obtained execution speed allows using the methodology for real-time motion capture data processing, where the capturing frequency ranges from 100 Hz to as much as 500 Hz depending on the number of features or classes to be calculated and recognized. Due to this fact the proposed methodology can be used with high-end motion capture systems. We anticipate that using this novel, efficient and effective method will greatly help both sport trainers and physiotherapists in their practice. The proposed approach can be directly applied to motion capture data kinematics analysis (evaluation of motion without regard to the forces that cause that motion). The ability to apply pattern recognition methods to GDL descriptions can be utilized in virtual reality environments and used for sport training or rehabilitation treatment.


URL:
DOI: 10.1007/s10916-016-0493-6
PubMed: 27106581
PubMed Central: 4841835


Affiliations:


Links toward previous steps (curation, corpus...)


Links to Exploration step

PMC:4841835

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">The adaptation of GDL motion recognition system to sport and rehabilitation techniques analysis</title>
<author>
<name sortKey="Hachaj, Tomasz" sort="Hachaj, Tomasz" uniqKey="Hachaj T" first="Tomasz" last="Hachaj">Tomasz Hachaj</name>
<affiliation wicri:level="1">
<nlm:aff id="Aff1">Institute of Computer Science and Computer Methods, Pedagogical University of Krakow, 2 Podchorazych Ave, 30-084 Krakow, Poland</nlm:aff>
<country xml:lang="fr">Pologne</country>
<wicri:regionArea>Institute of Computer Science and Computer Methods, Pedagogical University of Krakow, 2 Podchorazych Ave, 30-084 Krakow</wicri:regionArea>
<wicri:noRegion>30-084 Krakow</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Ogiela, Marek R" sort="Ogiela, Marek R" uniqKey="Ogiela M" first="Marek R." last="Ogiela">Marek R. Ogiela</name>
<affiliation wicri:level="1">
<nlm:aff id="Aff2">Cryptography and Cognitive Informatics Research Group, AGH University of Science and Technology, 30 Mickiewicza Ave, 30-059 Krakow, Poland</nlm:aff>
<country xml:lang="fr">Pologne</country>
<wicri:regionArea>Cryptography and Cognitive Informatics Research Group, AGH University of Science and Technology, 30 Mickiewicza Ave, 30-059 Krakow</wicri:regionArea>
<wicri:noRegion>30-059 Krakow</wicri:noRegion>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">27106581</idno>
<idno type="pmc">4841835</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4841835</idno>
<idno type="RBID">PMC:4841835</idno>
<idno type="doi">10.1007/s10916-016-0493-6</idno>
<date when="2016">2016</date>
<idno type="wicri:Area/Pmc/Corpus">000829</idno>
<idno type="wicri:Area/Pmc/Curation">000829</idno>
<idno type="wicri:Area/Pmc/Checkpoint">000024</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">The adaptation of GDL motion recognition system to sport and rehabilitation techniques analysis</title>
<author>
<name sortKey="Hachaj, Tomasz" sort="Hachaj, Tomasz" uniqKey="Hachaj T" first="Tomasz" last="Hachaj">Tomasz Hachaj</name>
<affiliation wicri:level="1">
<nlm:aff id="Aff1">Institute of Computer Science and Computer Methods, Pedagogical University of Krakow, 2 Podchorazych Ave, 30-084 Krakow, Poland</nlm:aff>
<country xml:lang="fr">Pologne</country>
<wicri:regionArea>Institute of Computer Science and Computer Methods, Pedagogical University of Krakow, 2 Podchorazych Ave, 30-084 Krakow</wicri:regionArea>
<wicri:noRegion>30-084 Krakow</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Ogiela, Marek R" sort="Ogiela, Marek R" uniqKey="Ogiela M" first="Marek R." last="Ogiela">Marek R. Ogiela</name>
<affiliation wicri:level="1">
<nlm:aff id="Aff2">Cryptography and Cognitive Informatics Research Group, AGH University of Science and Technology, 30 Mickiewicza Ave, 30-059 Krakow, Poland</nlm:aff>
<country xml:lang="fr">Pologne</country>
<wicri:regionArea>Cryptography and Cognitive Informatics Research Group, AGH University of Science and Technology, 30 Mickiewicza Ave, 30-059 Krakow</wicri:regionArea>
<wicri:noRegion>30-059 Krakow</wicri:noRegion>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Journal of Medical Systems</title>
<idno type="ISSN">0148-5598</idno>
<idno type="eISSN">1573-689X</idno>
<imprint>
<date when="2016">2016</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>The main novelty of this paper is the adaptation of the Gesture Description Language (GDL) methodology to sport and rehabilitation data analysis and classification. In this paper we show that the Lua language can be successfully used to adapt the GDL classifier to those tasks. The newly applied scripting language allows easy extension and integration of the classifier with other software technologies and applications. The obtained execution speed allows using the methodology for real-time motion capture data processing, where the capturing frequency ranges from 100 Hz to as much as 500 Hz depending on the number of features or classes to be calculated and recognized. Due to this fact the proposed methodology can be used with high-end motion capture systems. We anticipate that using this novel, efficient and effective method will greatly help both sport trainers and physiotherapists in their practice. The proposed approach can be directly applied to motion capture data kinematics analysis (evaluation of motion without regard to the forces that cause that motion). The ability to apply pattern recognition methods to GDL descriptions can be utilized in virtual reality environments and used for sport training or rehabilitation treatment.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Zhang, Q" uniqKey="Zhang Q">Q Zhang</name>
</author>
<author>
<name sortKey="Song, X" uniqKey="Song X">X Song</name>
</author>
<author>
<name sortKey="Shibasaki, R" uniqKey="Shibasaki R">R Shibasaki</name>
</author>
<author>
<name sortKey="Zhao, H" uniqKey="Zhao H">H Zhao</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schwarz, La" uniqKey="Schwarz L">LA Schwarz</name>
</author>
<author>
<name sortKey="Mkhitaryan, A" uniqKey="Mkhitaryan A">A Mkhitaryan</name>
</author>
<author>
<name sortKey="Mateus, D" uniqKey="Mateus D">D Mateus</name>
</author>
<author>
<name sortKey="Navab, N" uniqKey="Navab N">N Navab</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gupta, S" uniqKey="Gupta S">S Gupta</name>
</author>
<author>
<name sortKey="Jaafar, J" uniqKey="Jaafar J">J Jaafar</name>
</author>
<author>
<name sortKey="Fatimah, W" uniqKey="Fatimah W">W Fatimah</name>
</author>
<author>
<name sortKey="Ahmad, W" uniqKey="Ahmad W">W Ahmad</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Miranda, L" uniqKey="Miranda L">L Miranda</name>
</author>
<author>
<name sortKey="Vieira, T" uniqKey="Vieira T">T Vieira</name>
</author>
<author>
<name sortKey="Martinez, D" uniqKey="Martinez D">D Martinez</name>
</author>
<author>
<name sortKey="Lewiner, T" uniqKey="Lewiner T">T Lewiner</name>
</author>
<author>
<name sortKey="Vieira, Aw" uniqKey="Vieira A">AW Vieira</name>
</author>
<author>
<name sortKey="Campos, Mfm" uniqKey="Campos M">MFM Campos</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Li, Z" uniqKey="Li Z">Z Li</name>
</author>
<author>
<name sortKey="Wei, Z" uniqKey="Wei Z">Z Wei</name>
</author>
<author>
<name sortKey="Yue, Y" uniqKey="Yue Y">Y Yue</name>
</author>
<author>
<name sortKey="Wang, H" uniqKey="Wang H">H Wang</name>
</author>
<author>
<name sortKey="Jia, W" uniqKey="Jia W">W Jia</name>
</author>
<author>
<name sortKey="Burke, Le" uniqKey="Burke L">LE Burke</name>
</author>
<author>
<name sortKey="Baranowski, T" uniqKey="Baranowski T">T Baranowski</name>
</author>
<author>
<name sortKey="Sun, M" uniqKey="Sun M">M Sun</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Rajanna, V" uniqKey="Rajanna V">V Rajanna</name>
</author>
<author>
<name sortKey="Vo, P" uniqKey="Vo P">P Vo</name>
</author>
<author>
<name sortKey="Barth, J" uniqKey="Barth J">J Barth</name>
</author>
<author>
<name sortKey="Mjelde, M" uniqKey="Mjelde M">M Mjelde</name>
</author>
<author>
<name sortKey="Grey, T" uniqKey="Grey T">T Grey</name>
</author>
<author>
<name sortKey="Oduola, C" uniqKey="Oduola C">C Oduola</name>
</author>
<author>
<name sortKey="Hammond, T" uniqKey="Hammond T">T Hammond</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cholewa, M" uniqKey="Cholewa M">M Cholewa</name>
</author>
<author>
<name sortKey="Glomb, P" uniqKey="Glomb P">P Głomb</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kang, J" uniqKey="Kang J">J Kang</name>
</author>
<author>
<name sortKey="Zhong, K" uniqKey="Zhong K">K Zhong</name>
</author>
<author>
<name sortKey="Qin, S" uniqKey="Qin S">S Qin</name>
</author>
<author>
<name sortKey="Wang, H" uniqKey="Wang H">H Wang</name>
</author>
<author>
<name sortKey="Wright, D" uniqKey="Wright D">D Wright</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="L Pez Mendez, A" uniqKey="L Pez Mendez A">A López-Méndez</name>
</author>
<author>
<name sortKey="Casas, Jr" uniqKey="Casas J">JR Casas</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Zhu, F" uniqKey="Zhu F">F Zhu</name>
</author>
<author>
<name sortKey="Shao, L" uniqKey="Shao L">L Shao</name>
</author>
<author>
<name sortKey="Lin, M" uniqKey="Lin M">M Lin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gamage, N" uniqKey="Gamage N">N Gamage</name>
</author>
<author>
<name sortKey="Chow Kuang, Y" uniqKey="Chow Kuang Y">Y Chow Kuang</name>
</author>
<author>
<name sortKey="Akmeliawati, R" uniqKey="Akmeliawati R">R Akmeliawati</name>
</author>
<author>
<name sortKey="Demidenko, S" uniqKey="Demidenko S">S Demidenko</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Glowacz, A" uniqKey="Glowacz A">A Glowacz</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Du, Y" uniqKey="Du Y">Y Du</name>
</author>
<author>
<name sortKey="Chen, F" uniqKey="Chen F">F Chen</name>
</author>
<author>
<name sortKey="Xu, W" uniqKey="Xu W">W Xu</name>
</author>
<author>
<name sortKey="Zhang, W" uniqKey="Zhang W">W Zhang</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Suma, Ea" uniqKey="Suma E">EA Suma</name>
</author>
<author>
<name sortKey="Krum, Dm" uniqKey="Krum D">DM Krum</name>
</author>
<author>
<name sortKey="Lange, B" uniqKey="Lange B">B Lange</name>
</author>
<author>
<name sortKey="Koenig, S" uniqKey="Koenig S">S Koenig</name>
</author>
<author>
<name sortKey="Rizzo, A" uniqKey="Rizzo A">A Rizzo</name>
</author>
<author>
<name sortKey="Bolas, M" uniqKey="Bolas M">M Bolas</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bickerstaffe, A" uniqKey="Bickerstaffe A">A Bickerstaffe</name>
</author>
<author>
<name sortKey="Lane, A" uniqKey="Lane A">A Lane</name>
</author>
<author>
<name sortKey="Meyer, B" uniqKey="Meyer B">B Meyer</name>
</author>
<author>
<name sortKey="Marriott, K" uniqKey="Marriott K">K Marriott</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hachaj, T" uniqKey="Hachaj T">T Hachaj</name>
</author>
<author>
<name sortKey="Ogiela, Mr" uniqKey="Ogiela M">MR Ogiela</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hachaj, T" uniqKey="Hachaj T">T Hachaj</name>
</author>
<author>
<name sortKey="Ogiela, Mr" uniqKey="Ogiela M">MR Ogiela</name>
</author>
<author>
<name sortKey="Piekarczyk, M" uniqKey="Piekarczyk M">M Piekarczyk</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hachaj, T" uniqKey="Hachaj T">T Hachaj</name>
</author>
<author>
<name sortKey="Ogiela, Mr" uniqKey="Ogiela M">MR Ogiela</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hachaj, T" uniqKey="Hachaj T">T Hachaj</name>
</author>
<author>
<name sortKey="Ogiela, Mr" uniqKey="Ogiela M">MR Ogiela</name>
</author>
<author>
<name sortKey="Koptyra, K" uniqKey="Koptyra K">K Koptyra</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hachaj, T" uniqKey="Hachaj T">T Hachaj</name>
</author>
<author>
<name sortKey="Baraniewicz, D" uniqKey="Baraniewicz D">D Baraniewicz</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Palacios Navarro, G" uniqKey="Palacios Navarro G">G Palacios-Navarro</name>
</author>
<author>
<name sortKey="Garcia Magari O, I" uniqKey="Garcia Magari O I">I García-Magariño</name>
</author>
<author>
<name sortKey="Ramos Lorente, P" uniqKey="Ramos Lorente P">P Ramos-Lorente</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="De La Torre Diez, I" uniqKey="De La Torre Diez I">I De la Torre-Díez</name>
</author>
<author>
<name sortKey="Ant N Rodriguez, M" uniqKey="Ant N Rodriguez M">M Antón-Rodríguez</name>
</author>
<author>
<name sortKey="Diaz Pernas, Fj" uniqKey="Diaz Pernas F">FJ Díaz-Pernas</name>
</author>
<author>
<name sortKey="Perozo Rond N, Fj" uniqKey="Perozo Rond N F">FJ Perozo-Rondón</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hauser, Jr" uniqKey="Hauser J">JR Hauser</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cacho, N" uniqKey="Cacho N">N Cacho</name>
</author>
<author>
<name sortKey="Batista, T" uniqKey="Batista T">T Batista</name>
</author>
<author>
<name sortKey="Fernandes, F" uniqKey="Fernandes F">F Fernandes</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Maia, R" uniqKey="Maia R">R Maia</name>
</author>
<author>
<name sortKey="Cerqueira, R" uniqKey="Cerqueira R">R Cerqueira</name>
</author>
<author>
<name sortKey="Sieckenius De Souza, C" uniqKey="Sieckenius De Souza C">C Sieckenius de Souza</name>
</author>
<author>
<name sortKey="Guisasola Gorham, T" uniqKey="Guisasola Gorham T">T Guisasola-Gorham</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Soares, Lfg" uniqKey="Soares L">LFG Soares</name>
</author>
<author>
<name sortKey="Rodrigues, Rf" uniqKey="Rodrigues R">RF Rodrigues</name>
</author>
<author>
<name sortKey="Cerqueira, R" uniqKey="Cerqueira R">R Cerqueira</name>
</author>
<author>
<name sortKey="Barbosa, Sdj" uniqKey="Barbosa S">SDJ Barbosa</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lin, C" uniqKey="Lin C">C Lin</name>
</author>
<author>
<name sortKey="Song, Z" uniqKey="Song Z">Z Song</name>
</author>
<author>
<name sortKey="Song, H" uniqKey="Song H">H Song</name>
</author>
<author>
<name sortKey="Zhou, Y" uniqKey="Zhou Y">Y Zhou</name>
</author>
<author>
<name sortKey="Wang, Y" uniqKey="Wang Y">Y Wang</name>
</author>
<author>
<name sortKey="Wu, G" uniqKey="Wu G">G Wu</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ullah, S" uniqKey="Ullah S">S Ullah</name>
</author>
<author>
<name sortKey="Higgins, H" uniqKey="Higgins H">H Higgins</name>
</author>
<author>
<name sortKey="Braem, B" uniqKey="Braem B">B Braem</name>
</author>
<author>
<name sortKey="Latre, B" uniqKey="Latre B">B Latre</name>
</author>
<author>
<name sortKey="Blondia, C" uniqKey="Blondia C">C Blondia</name>
</author>
<author>
<name sortKey="Moerman, I" uniqKey="Moerman I">I Moerman</name>
</author>
<author>
<name sortKey="Saleem, S" uniqKey="Saleem S">S Saleem</name>
</author>
<author>
<name sortKey="Rahman, Z" uniqKey="Rahman Z">Z Rahman</name>
</author>
<author>
<name sortKey="Kwak, Ks" uniqKey="Kwak K">KS Kwak</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hachaj, T" uniqKey="Hachaj T">T Hachaj</name>
</author>
<author>
<name sortKey="Ogiela, Mr" uniqKey="Ogiela M">MR Ogiela</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">J Med Syst</journal-id>
<journal-id journal-id-type="iso-abbrev">J Med Syst</journal-id>
<journal-title-group>
<journal-title>Journal of Medical Systems</journal-title>
</journal-title-group>
<issn pub-type="ppub">0148-5598</issn>
<issn pub-type="epub">1573-689X</issn>
<publisher>
<publisher-name>Springer US</publisher-name>
<publisher-loc>New York</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">27106581</article-id>
<article-id pub-id-type="pmc">4841835</article-id>
<article-id pub-id-type="publisher-id">493</article-id>
<article-id pub-id-type="doi">10.1007/s10916-016-0493-6</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Patient Facing Systems</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>The adaptation of GDL motion recognition system to sport and rehabilitation techniques analysis</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Hachaj</surname>
<given-names>Tomasz</given-names>
</name>
<address>
<phone>(+48) 12 662 63 22</phone>
<email>tomekhachaj@o2.pl</email>
</address>
<xref ref-type="aff" rid="Aff1"></xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Ogiela</surname>
<given-names>Marek R.</given-names>
</name>
<xref ref-type="aff" rid="Aff2"></xref>
</contrib>
<aff id="Aff1">
<label></label>
Institute of Computer Science and Computer Methods, Pedagogical University of Krakow, 2 Podchorazych Ave, 30-084 Krakow, Poland</aff>
<aff id="Aff2">
<label></label>
Cryptography and Cognitive Informatics Research Group, AGH University of Science and Technology, 30 Mickiewicza Ave, 30-059 Krakow, Poland</aff>
</contrib-group>
<pub-date pub-type="epub">
<day>22</day>
<month>4</month>
<year>2016</year>
</pub-date>
<pub-date pub-type="pmc-release">
<day>22</day>
<month>4</month>
<year>2016</year>
</pub-date>
<pub-date pub-type="ppub">
<year>2016</year>
</pub-date>
<volume>40</volume>
<elocation-id>137</elocation-id>
<history>
<date date-type="received">
<day>2</day>
<month>3</month>
<year>2016</year>
</date>
<date date-type="accepted">
<day>6</day>
<month>4</month>
<year>2016</year>
</date>
</history>
<permissions>
<copyright-statement>© The Author(s) 2016</copyright-statement>
<license license-type="OpenAccess">
<license-p>
<bold>Open Access</bold>
This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.</license-p>
</license>
</permissions>
<abstract id="Abs1">
<p>The main novelty of this paper is the adaptation of the Gesture Description Language (GDL) methodology to sport and rehabilitation data analysis and classification. In this paper we show that the Lua language can be successfully used to adapt the GDL classifier to those tasks. The newly applied scripting language allows easy extension and integration of the classifier with other software technologies and applications. The obtained execution speed allows using the methodology for real-time motion capture data processing, where the capturing frequency ranges from 100 Hz to as much as 500 Hz depending on the number of features or classes to be calculated and recognized. Due to this fact the proposed methodology can be used with high-end motion capture systems. We anticipate that using this novel, efficient and effective method will greatly help both sport trainers and physiotherapists in their practice. The proposed approach can be directly applied to motion capture data kinematics analysis (evaluation of motion without regard to the forces that cause that motion). The ability to apply pattern recognition methods to GDL descriptions can be utilized in virtual reality environments and used for sport training or rehabilitation treatment.</p>
</abstract>
<kwd-group xml:lang="en">
<title>Keywords</title>
<kwd>Sport data analysis</kwd>
<kwd>Rehabilitation data analysis</kwd>
<kwd>Motion capture</kwd>
<kwd>Signal classification</kwd>
<kwd>Gesture description language</kwd>
</kwd-group>
<funding-group>
<award-group>
<funding-source>
<institution>National Science Centre, Poland</institution>
</funding-source>
<award-id>2015/17/D/ST6/04051</award-id>
</award-group>
</funding-group>
<custom-meta-group>
<custom-meta>
<meta-name>issue-copyright-statement</meta-name>
<meta-value>© Springer Science+Business Media New York 2016</meta-value>
</custom-meta>
</custom-meta-group>
</article-meta>
</front>
<body>
<sec id="Sec1" sec-type="introduction">
<title>Introduction</title>
<p>Motion capture (MoCap) is a powerful technology with many possible applications. The dimensionality of the output signal stream from a MoCap system depends on the number and type of sensors, or of tracked body joints in the virtual skeleton, that are used [
<xref ref-type="bibr" rid="CR1">1</xref>
<xref ref-type="bibr" rid="CR4">4</xref>
]. Most often each body joint has three or six degrees of freedom: three are linear coordinates in a Cartesian frame with versors (x, y, z), and the other three are angles that define the orientation of the body segments. Those values are usually not used directly by the system; rather, feature values are calculated as derivatives of the original data. Feature selection and extraction methods include, for example, Gabor [
<xref ref-type="bibr" rid="CR5">5</xref>
] or Haar filters [
<xref ref-type="bibr" rid="CR6">6</xref>
]. Dimensionality reduction can be done with principal components analysis (PCA) [
<xref ref-type="bibr" rid="CR7">7</xref>
] or other approaches [
<xref ref-type="bibr" rid="CR8">8</xref>
]. The movement representation is often invariant under rigid transformation [
<xref ref-type="bibr" rid="CR8">8</xref>
] and can be, for example, an angular representation of the skeleton joints [
<xref ref-type="bibr" rid="CR9">9</xref>
], where each pose is described using the angles of the skeleton joints.</p>
<p>Many methods have already been proposed for the evaluation and recognition of human actions and movements. That type of analysis is important for calculating the biomechanical parameters of actions, for evaluating one's activities and lifestyle, or during rehabilitation [
<xref ref-type="bibr" rid="CR10">10</xref>
,
<xref ref-type="bibr" rid="CR11">11</xref>
]. Among the proposed methods for signal and action recognition are approaches commonly used for signal identification and pattern recognition. The most popular are Hidden Markov Models (HMM) [
<xref ref-type="bibr" rid="CR10">10</xref>
,
<xref ref-type="bibr" rid="CR12">12</xref>
,
<xref ref-type="bibr" rid="CR13">13</xref>
], support vector machines (SVM) [
<xref ref-type="bibr" rid="CR5">5</xref>
,
<xref ref-type="bibr" rid="CR9">9</xref>
,
<xref ref-type="bibr" rid="CR14">14</xref>
], decision forests [
<xref ref-type="bibr" rid="CR9">9</xref>
,
<xref ref-type="bibr" rid="CR15">15</xref>
], Gaussian process dynamical models [
<xref ref-type="bibr" rid="CR16">16</xref>
], K-means clustering [
<xref ref-type="bibr" rid="CR6">6</xref>
], nearest neighbor classifier [
<xref ref-type="bibr" rid="CR17">17</xref>
], Bayes classifier [
<xref ref-type="bibr" rid="CR18">18</xref>
], dynamic Bayesian networks [
<xref ref-type="bibr" rid="CR19">19</xref>
], syntactic method [
<xref ref-type="bibr" rid="CR6">6</xref>
,
<xref ref-type="bibr" rid="CR20">20</xref>
,
<xref ref-type="bibr" rid="CR21">21</xref>
] and rule-based methods, for example the Gesture Description Language (GDL) [
<xref ref-type="bibr" rid="CR22">22</xref>
<xref ref-type="bibr" rid="CR24">24</xref>
]. The GDL classifier uses a rule-based approach with a memory stack. The memory stack holds the captured MoCap data frames, the computed features, and the classes to which sequences of MoCap frames are classified. The GDL method uses specially designed scripts that hold the definitions of features calculated from the MoCap input stream. Those features are used to design rules in if-else form that define the key frames of actions. Key frames are ordered in sequences; if a sequence of key frames appears in the memory stack within a given time restriction, the ongoing action is classified to the corresponding class. This approach is somewhat similar to an HMM classifier.</p>
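As a minimal sketch of the key-frame idea, the following Lua fragment checks whether a sequence of key-frame labels appears on a memory stack within a time window. The names and data layout here are illustrative assumptions for exposition, not the actual GDL engine API:

```lua
-- Hypothetical sketch of the GDL key-frame mechanism (names are
-- illustrative, not the real engine's). Each stack entry records a
-- timestamp and which key-frame labels fired for that MoCap frame.
local stack = {
  { t = 0.00, keyframes = { KneeRaised = false } },
  { t = 0.25, keyframes = { KneeRaised = true  } },
  { t = 0.50, keyframes = { KickExtended = true } },
}

-- Returns true when the labels appear on the stack in the given order
-- and the whole sequence fits inside `window` seconds.
local function sequence_found(stack, labels, window)
  local idx, first_t = 1, nil
  for _, frame in ipairs(stack) do
    if frame.keyframes[labels[idx]] then
      first_t = first_t or frame.t
      if frame.t - first_t > window then return false end
      idx = idx + 1
      if idx > #labels then return true end
    end
  end
  return false
end

print(sequence_found(stack, { "KneeRaised", "KickExtended" }, 1.0)) -- true
```

If the sequence is found, the ongoing action would be classified to the class associated with that key-frame sequence.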
<p>An extensive comparison of the GDL methodology with other recognition systems is given in other papers [
<xref ref-type="bibr" rid="CR25">25</xref>
<xref ref-type="bibr" rid="CR27">27</xref>
]. This comparison covers the most important aspects, such as the action description methodology, the geometric interpretation of those descriptions, the training algorithm, and applications. Example results for various physical activities are presented in papers [
<xref ref-type="bibr" rid="CR22">22</xref>
<xref ref-type="bibr" rid="CR26">26</xref>
,
<xref ref-type="bibr" rid="CR28">28</xref>
<xref ref-type="bibr" rid="CR31">31</xref>
] and include everyday actions, gym exercises, and Oyama and Shorin-Ryu karate techniques.</p>
<p>So far, GDL has been used mainly for classifying MoCap data streams from multimedia devices (for example, Microsoft Kinect); however, we need a unified approach that enables not only classification but also analysis of MoCap signals from high-end hardware. That type of device is often used to support sport coaches in training optimization and physicians in the rehabilitation process [
<xref ref-type="bibr" rid="CR10">10</xref>
,
<xref ref-type="bibr" rid="CR11">11</xref>
,
<xref ref-type="bibr" rid="CR32">32</xref>
]. The requirements for this new approach are that it must be capable of expressing complex feature definitions (which could, for example, be used for kinematic analysis) and must be fast enough for real-time data analysis (performance is a very important aspect of every medical system [
<xref ref-type="bibr" rid="CR33">33</xref>
]). The main novelty of this paper is the adaptation of the Gesture Description Language (GDL) methodology to sport and rehabilitation data analysis and classification. In the following sections we present, evaluate, and discuss such a methodology.</p>
</sec>
<sec id="Sec2" sec-type="materials|methods">
<title>Materials and methods</title>
<p>In this section we present the novel adaptation of the GDL methodology to the analysis and classification of MoCap data for sport and rehabilitation applications. To do so, we need to enhance the feature-definition capabilities of the scripting language that is an inherent part of the GDL classifier. We did this by replacing the old GDLs scripting language with Lua.</p>
<sec id="Sec3">
<title>Lua application in GDL paradigm</title>
<p>Lua is a dynamically typed language that can be easily integrated with other computer languages and applications. Due to its simplicity and easy extensibility, it is a popular scripting technology for other computer systems. Lua is used in scientific computing, for example for linear algebra, neural networks, numeric optimization routines, and more [
<xref ref-type="bibr" rid="CR34">34</xref>
<xref ref-type="bibr" rid="CR36">36</xref>
]. Lua was also used in an aspect-oriented infrastructure to handle dynamic programming tasks [
<xref ref-type="bibr" rid="CR37">37</xref>
]. Lua is also used in middleware design and development [
<xref ref-type="bibr" rid="CR38">38</xref>
,
<xref ref-type="bibr" rid="CR39">39</xref>
], as well as in robotics and embedded systems [
<xref ref-type="bibr" rid="CR40">40</xref>
<xref ref-type="bibr" rid="CR43">43</xref>
]. This programming language is a very popular tool for writing high-level scripts for other computer systems [
<xref ref-type="bibr" rid="CR44">44</xref>
<xref ref-type="bibr" rid="CR46">46</xref>
]. In paper [
<xref ref-type="bibr" rid="CR47">47</xref>
] the authors discuss which mechanisms Lua offers to achieve its flexibility and how programmers use them in different paradigms.</p>
<p>The proposed Lua implementation is based on Lua 5.2 and a Java hosting application. Lua functions are called through the LuaJ library. The GDL engine uses five classes and one script file, Engine.lua, with global variables and functions; see the class diagram presented in Fig. 
<xref rid="Fig1" ref-type="fig">1</xref>
. A very basic script that detects the situation when the right hand is below the head might look as follows:
<fig id="Fig1">
<label>Fig. 1</label>
<caption>
<p>This figure presents a class diagram for the Lua implementation of the GDL classifier</p>
</caption>
<graphic xlink:href="10916_2016_493_Fig1_HTML" id="MO1"></graphic>
</fig>
<graphic position="anchor" xlink:href="10916_2016_493_Figa_HTML" id="MO2"></graphic>
</p>
<p>The ReturnConclusions function is called by the hosting application to pass the tracking parameters. The proposed Lua-based framework can easily be adapted to any set of body joints simply by configuring the joint definitions in the SkeletonData class and the functions in Engine.lua.</p>
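The actual "right hand below head" script is reproduced only as a figure graphic above. As a hedged illustration of what such a rule could look like, the following sketch assumes a hypothetical GetBodyPart accessor and joint layout; the real engine's accessors and conclusion-reporting API may differ:

```lua
-- Hedged sketch only: GetBodyPart and the joint table are hypothetical
-- stand-ins for the engine's real tracked-joint accessors.
local joints = {
  RightHand = { x = 0.3, y = 1.1, z = 0.0 },
  Head      = { x = 0.0, y = 1.7, z = 0.0 },
}

local function GetBodyPart(name)  -- hypothetical accessor
  return joints[name]
end

-- Logical feature: evaluates to true when the right hand is below the head.
local function RightHandUnderHead()
  return GetBodyPart("RightHand").y < GetBodyPart("Head").y
end

-- In the real engine, only logical features that evaluate to true would be
-- collected by ReturnConclusions and passed back to the Java host.
print(RightHandUnderHead()) -- true for the sample joints above
```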
</sec>
<sec id="Sec4">
<title>Features calculation</title>
<p>The adapted GDL classifier uses standard Lua syntax to define features. There are three types of features: logical, numeric, and vector. The role of a logical feature is identical to that of a conclusion in the GDL specification, and it is represented by the ‘boolean’ data type in Lua. Only those logical values that equal ‘true’ are passed to the hosting application. The numeric feature has a floating-point value and is represented by the ‘number’ data type in Lua. The vector feature has three floating-point values and is represented by the user-defined class Vector3D. There are also definitions of the most important vector operations, such as vector sum, multiplication by a number, dot product, and cross product, which are useful for kinematics or kinetics analysis [
<xref ref-type="bibr" rid="CR48">48</xref>
] (basic trigonometric functions such as sine are already provided by Lua).</p>
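The vector operations listed above can be sketched as follows. This is a Python rendering of a Vector3D-like type for illustration only; the method names are assumptions, and the paper's actual class is written in Lua.

```python
import math

# Illustrative Vector3D-like type; the paper's class is written in Lua
# and these Python method names are assumptions.
class Vector3D:
    def __init__(self, x, y, z):
        self.x, self.y, self.z = x, y, z

    def __add__(self, o):  # vector sum
        return Vector3D(self.x + o.x, self.y + o.y, self.z + o.z)

    def scale(self, s):  # multiplication by a number
        return Vector3D(self.x * s, self.y * s, self.z * s)

    def dot(self, o):  # dot product
        return self.x * o.x + self.y * o.y + self.z * o.z

    def cross(self, o):  # cross product
        return Vector3D(self.y * o.z - self.z * o.y,
                        self.z * o.x - self.x * o.z,
                        self.x * o.y - self.y * o.x)

    def norm(self):  # Euclidean length
        return math.sqrt(self.dot(self))
```

Operations of this kind are the building blocks for the angle-based features described next.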
<p>In Fig. 
<xref rid="Fig2" ref-type="fig">2</xref>
we present the vector set used to generate the features for recognition of the hiza-geri karate kick. We have defined six angle-based features:
<disp-formula id="Equ1">
<label>1</label>
<alternatives>
<tex-math id="M1">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ \left\{\begin{array}{c}\hfill {A}_1=\measuredangle \left({v}_1,{v}_2\right)\ \hfill \\ {}\hfill {A}_2=\measuredangle \left({v}_3,{v}_2\right)\hfill \\ {}\hfill {A}_3=\measuredangle \left({v}_4,{v}_2\right)\hfill \\ {}\hfill {A}_4=\measuredangle \left({v}_1,{v}_5\right)\hfill \\ {}\hfill {A}_5=\measuredangle \left({v}_3,{v}_5\right)\hfill \\ {}\hfill {A}_6=\measuredangle \left({v}_4,{v}_5\right)\hfill \end{array}\right. $$\end{document}</tex-math>
<mml:math id="M2">
<mml:mfenced close="" open="{">
<mml:mtable columnalign="center">
<mml:mtr columnalign="center">
<mml:mtd columnalign="center">
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mn>1</mml:mn>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mo></mml:mo>
<mml:mfenced close=")" open="(" separators=",">
<mml:msub>
<mml:mi>v</mml:mi>
<mml:mn>1</mml:mn>
</mml:msub>
<mml:msub>
<mml:mi>v</mml:mi>
<mml:mn>2</mml:mn>
</mml:msub>
</mml:mfenced>
<mml:mspace width="0.25em"></mml:mspace>
</mml:mtd>
</mml:mtr>
<mml:mtr columnalign="center">
<mml:mtd columnalign="center">
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mn>2</mml:mn>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mo></mml:mo>
<mml:mfenced close=")" open="(" separators=",">
<mml:msub>
<mml:mi>v</mml:mi>
<mml:mn>3</mml:mn>
</mml:msub>
<mml:msub>
<mml:mi>v</mml:mi>
<mml:mn>2</mml:mn>
</mml:msub>
</mml:mfenced>
</mml:mtd>
</mml:mtr>
<mml:mtr columnalign="center">
<mml:mtd columnalign="center">
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mn>3</mml:mn>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mo></mml:mo>
<mml:mfenced close=")" open="(" separators=",">
<mml:msub>
<mml:mi>v</mml:mi>
<mml:mn>4</mml:mn>
</mml:msub>
<mml:msub>
<mml:mi>v</mml:mi>
<mml:mn>2</mml:mn>
</mml:msub>
</mml:mfenced>
</mml:mtd>
</mml:mtr>
<mml:mtr columnalign="center">
<mml:mtd columnalign="center">
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mn>4</mml:mn>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mo></mml:mo>
<mml:mfenced close=")" open="(" separators=",">
<mml:msub>
<mml:mi>v</mml:mi>
<mml:mn>1</mml:mn>
</mml:msub>
<mml:msub>
<mml:mi>v</mml:mi>
<mml:mn>5</mml:mn>
</mml:msub>
</mml:mfenced>
</mml:mtd>
</mml:mtr>
<mml:mtr columnalign="center">
<mml:mtd columnalign="center">
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mn>5</mml:mn>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mo></mml:mo>
<mml:mfenced close=")" open="(" separators=",">
<mml:msub>
<mml:mi>v</mml:mi>
<mml:mn>3</mml:mn>
</mml:msub>
<mml:msub>
<mml:mi>v</mml:mi>
<mml:mn>5</mml:mn>
</mml:msub>
</mml:mfenced>
</mml:mtd>
</mml:mtr>
<mml:mtr columnalign="center">
<mml:mtd columnalign="center">
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mn>6</mml:mn>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mo></mml:mo>
<mml:mfenced close=")" open="(" separators=",">
<mml:msub>
<mml:mi>v</mml:mi>
<mml:mn>4</mml:mn>
</mml:msub>
<mml:msub>
<mml:mi>v</mml:mi>
<mml:mn>5</mml:mn>
</mml:msub>
</mml:mfenced>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mfenced>
</mml:math>
<graphic xlink:href="10916_2016_493_Article_Equ1.gif" position="anchor"></graphic>
</alternatives>
</disp-formula>
<fig id="Fig2">
<label>Fig. 2</label>
<caption>
<p>This figure presents vectors set that was used to generate example features</p>
</caption>
<graphic xlink:href="10916_2016_493_Fig2_HTML" id="MO3"></graphic>
</fig>
</p>
<p>Where
<inline-formula id="IEq1">
<alternatives>
<tex-math id="M3">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ {A}_1,\dots, {A}_6 $$\end{document}</tex-math>
<mml:math id="M4">
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mn>1</mml:mn>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:mo></mml:mo>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mn>6</mml:mn>
</mml:msub>
</mml:math>
<inline-graphic xlink:href="10916_2016_493_Article_IEq1.gif"></inline-graphic>
</alternatives>
</inline-formula>
are angles calculated between vectors
<inline-formula id="IEq2">
<alternatives>
<tex-math id="M5">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ {v}_1,\dots, {v}_5 $$\end{document}</tex-math>
<mml:math id="M6">
<mml:msub>
<mml:mi>v</mml:mi>
<mml:mn>1</mml:mn>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:mo></mml:mo>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>v</mml:mi>
<mml:mn>5</mml:mn>
</mml:msub>
</mml:math>
<inline-graphic xlink:href="10916_2016_493_Article_IEq2.gif"></inline-graphic>
</alternatives>
</inline-formula>
visualized in Fig. 
<xref rid="Fig2" ref-type="fig">2</xref>
. The vectors are defined by tracked body joints.</p>
<p>The Lua implementation looks as follows:
<graphic position="anchor" xlink:href="10916_2016_493_Figb_HTML" id="MO4"></graphic>
</p>
<p>Where angle is a function that finds the angle between two vectors in the plane designated by those vectors.</p>
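Such an angle function can be sketched from the standard dot-product formula. This Python version is an illustration, not the paper's Lua code; reporting the angle in degrees is an assumption.

```python
import math

def angle(v, w):
    """Angle (in degrees) between 3D vectors given as (x, y, z) tuples,
    computed from the dot-product formula; degrees are an assumption."""
    dot = sum(a * b for a, b in zip(v, w))
    nv = math.sqrt(sum(a * a for a in v))
    nw = math.sqrt(sum(a * a for a in w))
    # Clamp to [-1, 1] to guard against floating-point round-off.
    c = max(-1.0, min(1.0, dot / (nv * nw)))
    return math.degrees(math.acos(c))

# e.g. A1 = angle(v1, v2) for the vectors of Eq. (1)
a1 = angle((1.0, 0.0, 0.0), (0.0, 1.0, 0.0))  # a right angle
```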
<p>In Fig. 
<xref rid="Fig3" ref-type="fig">3</xref>
we present 3D visualizations of important phases of the selected karate actions used in the evaluation of our methodology.
<fig id="Fig3">
<label>Fig. 3</label>
<caption>
<p>This figure presents important phases of karate actions: Hiza-Geri kick and Kiba-Dachi stance. The Mo-Cap data is visualized in 3D virtual environment</p>
</caption>
<graphic xlink:href="10916_2016_493_Fig3_HTML" id="MO5"></graphic>
</fig>
</p>
<p>In Fig. 
<xref rid="Fig4" ref-type="fig">4</xref>
we present a plot of the feature values defined by (1) for a recording of a single Hiza-Geri kick captured with the Kinect 2 depth camera. Above the plot are horizontal bars with color-coded information about the key frames to which the current frame was assigned by the GDL classifier. Brown is the first key frame, yellow the second and cyan the third. Blue stands for lack of assignment (N – not assigned).
<fig id="Fig4">
<label>Fig. 4</label>
<caption>
<p>This figure presents features time series generated for single Hiza-Geri kick recording. The
<italic>horizontal axis</italic>
represents time and the vertical axis the angle. Each time series stands for one of the feature from (1). On the top of the plot there are
<italic>color bars</italic>
that indicate to which GDL key frame the signal sample has been classified. Color codes are the same as in Fig. 
<xref rid="Fig5" ref-type="fig">5</xref>
. Numbers 1, 2 and 3 are key frame numbers (there are three key frames in total in this particular Hiza-Geri definition). The symbol N marks time samples in which the signals have not been classified to any key frame</p>
</caption>
<graphic xlink:href="10916_2016_493_Fig4_HTML" id="MO6"></graphic>
</fig>
</p>
</sec>
<sec id="Sec5">
<title>Pattern recognition with GDL approach</title>
<p>The recognition process for action patterns in the adapted implementation is largely the same as in [
<xref ref-type="bibr" rid="CR25">25</xref>
]. The only difference is the reasoning module. We have found that in the previous implementations users seldom used some of its features and might not even be aware of their existence. In particular, users hardly ever design GDL scripts that reference a rule conclusion before it is defined in a later rule. Such constructions are also not required by the automatic training algorithm (R-GDL) described below.</p>
<p>Wearable body sensors enable the collection of large data sets to which approaches known from other fields of big-data analysis can be applied [
<xref ref-type="bibr" rid="CR49">49</xref>
,
<xref ref-type="bibr" rid="CR50">50</xref>
]. This property is also utilized in the automatic training algorithm for the GDL technology. When using GDL for a classification task, one of the most challenging aspects is designing an appropriate script that defines the key frames of actions. To find those key frames automatically we often use the Reverse-GDL (R-GDL) approach published in [
<xref ref-type="bibr" rid="CR25">25</xref>
]. R-GDL utilizes the fact that, after transferring the original MoCap data to the feature space, key frames can be detected with the k-means clustering algorithm. This situation is visualized in Fig. 
<xref rid="Fig5" ref-type="fig">5</xref>
where data assigned to key frames is color coded with the same color pattern as in Fig. 
<xref rid="Fig4" ref-type="fig">4</xref>
.
<fig id="Fig5">
<label>Fig. 5</label>
<caption>
<p>This figure presents three-dimensional projection of six-dimensional feature space (1) using principal component analysis. Each point represents a single MoCap frame with color-coded GDL key frame</p>
</caption>
<graphic xlink:href="10916_2016_493_Fig5_HTML" id="MO7"></graphic>
</fig>
</p>
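The key-frame detection step can be sketched as follows: each MoCap frame becomes a point in the feature space of (1), and the cluster centers returned by k-means serve as key-frame candidates. This plain, standard-library k-means is a sketch under those assumptions, not the authors' implementation.

```python
import math
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means over feature vectors (lists of floats); the
    resulting cluster centers approximate key-frame candidates."""
    rnd = random.Random(seed)
    centers = rnd.sample(points, k)
    for _ in range(iters):
        # Assign each frame to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: math.dist(p, centers[j]))
            clusters[nearest].append(p)
        # Recompute centers; keep the old center for an empty cluster.
        centers = [[sum(col) / len(c) for col in zip(*c)] if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers
```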
</sec>
</sec>
<sec id="Sec6" sec-type="results">
<title>Results</title>
<p>In this section we evaluate the average performance of our adapted methodology. The implementation can be found on the official website of the GDL technology [
<xref ref-type="bibr" rid="CR51">51</xref>
]. We are mostly interested in how much time is required to calculate features and to classify the input dataset. We have taken into account the time of transferring data from the hosting application to the GDL engine, the Lua script execution time, and the time of transferring data from GDL back to the host application. After this whole cycle the application obtains data that it can use directly. We have used 20-joint data with gym exercise recordings acquired with Kinect version 1 (K1), the same as in [
<xref ref-type="bibr" rid="CR25">25</xref>
] and 25-joint data with karate technique recordings acquired with Kinect version 2 (K2) [
<xref ref-type="bibr" rid="CR26">26</xref>
]. The Lua scripts in which the actions were defined varied in the number of feature definitions and the number of GDL instruction calls. The most basic scripts were 10-feature, 20-feature, 30-feature and 40-feature sets defined for both K1 and K2. All features were angles defined by vectors calculated from neighboring joints. The other scripts were the code that defines the kiba-dachi stance (about 4 kB of code), the definitions of 12 karate actions (about 33 kB of code) [
<xref ref-type="bibr" rid="CR30">30</xref>
], jumping jacks exercise (4 kB) and 9 gym exercises (40 kB) [
<xref ref-type="bibr" rid="CR25">25</xref>
]. We have used action recordings consisting of 100, 200, 500, 1000, 2000, 5000, 10,000 and 20,000 frames. The evaluation was repeated 20 times for each Lua script and recording. The proposed method was evaluated on a standard PC equipped with an Intel Core i7-4770 CPU at 3.40 GHz and 8 GB of RAM, running Windows 7 Home Premium 64-bit. The results in Table
<xref rid="Tab1" ref-type="table">1</xref>
and Fig. 
<xref rid="Fig6" ref-type="fig">6</xref>
are averaged results plus-minus standard deviation.
<table-wrap id="Tab1">
<label>Table 1</label>
<caption>
<p>This table presents averaged execution time (in milliseconds) plus-minus standard deviation of various Lua scripts that use the GDL implementation</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th></th>
<th>100</th>
<th>200</th>
<th>500</th>
<th>1000</th>
<th>2000</th>
<th>5000</th>
<th>10,000</th>
<th>20,000</th>
</tr>
</thead>
<tbody>
<tr>
<td>kiba-dachi (K2)</td>
<td>171 ± 26</td>
<td>290 ± 35</td>
<td>754 ± 44</td>
<td>1683 ± 51</td>
<td>3062 ± 96</td>
<td>7824 ± 191</td>
<td>15,316 ± 278</td>
<td>31,249 ± 1218</td>
</tr>
<tr>
<td>karate (K2)</td>
<td>1173 ± 330</td>
<td>2225 ± 299</td>
<td>5913 ± 146</td>
<td>11,627 ± 358</td>
<td>23,114 ± 458</td>
<td>60,020 ± 788</td>
<td>116,892 ± 2052</td>
<td>231,344 ± 1046</td>
</tr>
<tr>
<td>10 features (K2)</td>
<td>125 ± 21</td>
<td>201 ± 22</td>
<td>505 ± 26</td>
<td>1123 ± 36</td>
<td>2064 ± 38</td>
<td>5116 ± 31</td>
<td>10,511 ± 377</td>
<td>20,566 ± 164</td>
</tr>
<tr>
<td>20 features (K2)</td>
<td>220 ± 21</td>
<td>366 ± 45</td>
<td>912 ± 43</td>
<td>2069 ± 56</td>
<td>3717 ± 57</td>
<td>9234 ± 55</td>
<td>18,677 ± 252</td>
<td>37,526 ± 915</td>
</tr>
<tr>
<td>30 features (K2)</td>
<td>301 ± 31</td>
<td>524 ± 65</td>
<td>1346 ± 69</td>
<td>2996 ± 83</td>
<td>5429 ± 83</td>
<td>13,550 ± 81</td>
<td>27,322 ± 374</td>
<td>54,419 ± 203</td>
</tr>
<tr>
<td>40 features (K2)</td>
<td>394 ± 35</td>
<td>778 ± 100</td>
<td>1977 ± 109</td>
<td>4010 ± 125</td>
<td>7999 ± 117</td>
<td>20,018 ± 140</td>
<td>40,615 ± 618</td>
<td>80,120 ± 500</td>
</tr>
<tr>
<td>jumping jacks (K1)</td>
<td>104 ± 16</td>
<td>166 ± 18</td>
<td>425 ± 25</td>
<td>1099 ± 149</td>
<td>1659 ± 21</td>
<td>4220 ± 73</td>
<td>8615 ± 409</td>
<td>16,684 ± 69</td>
</tr>
<tr>
<td>gym (K1)</td>
<td>722 ± 260</td>
<td>1251 ± 164</td>
<td>3710 ± 185</td>
<td>8806 ± 160</td>
<td>17,655 ± 221</td>
<td>43,832 ± 187</td>
<td>88,156 ± 646</td>
<td>175,715 ± 2004</td>
</tr>
<tr>
<td>10 features (K1)</td>
<td>121 ± 12</td>
<td>195 ± 22</td>
<td>498 ± 27</td>
<td>1148 ± 35</td>
<td>2024 ± 49</td>
<td>5087 ± 117</td>
<td>10,200 ± 288</td>
<td>20,211 ± 183</td>
</tr>
<tr>
<td>20 features (K1)</td>
<td>215 ± 25</td>
<td>359 ± 45</td>
<td>901 ± 50</td>
<td>2097 ± 63</td>
<td>3652 ± 52</td>
<td>9092 ± 38</td>
<td>18,439 ± 253</td>
<td>36,333 ± 38</td>
</tr>
<tr>
<td>30 features (K1)</td>
<td>309 ± 24</td>
<td>534 ± 67</td>
<td>1368 ± 65</td>
<td>3059 ± 100</td>
<td>5504 ± 85</td>
<td>13,744 ± 88</td>
<td>27,231 ± 500</td>
<td>55,012 ± 126</td>
</tr>
<tr>
<td>40 features (K1)</td>
<td>377 ± 31</td>
<td>671 ± 84</td>
<td>1710 ± 95</td>
<td>3828 ± 112</td>
<td>6921 ± 103</td>
<td>17,249 ± 115</td>
<td>34,723 ± 157</td>
<td>69,997 ± 448</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p>Each row represents various features, action and actions groups that are evaluated for different number of motion capture frames (in columns)</p>
</table-wrap-foot>
</table-wrap>
<fig id="Fig6">
<label>Fig. 6</label>
<caption>
<p>This figure visualizes data from Table
<xref rid="Tab1" ref-type="table">1</xref>
</p>
</caption>
<graphic xlink:href="10916_2016_493_Fig6_HTML" id="MO8"></graphic>
</fig>
</p>
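The averaged values reported in Table 1 are means with standard deviations over the 20 repetitions. A minimal sketch of that summary computation follows; the run times below are illustrative, not measured data, and whether the paper uses the sample or population standard deviation is not stated, so the sample version is assumed here.

```python
import statistics

def summarize(times_ms):
    """Mean plus-minus sample standard deviation, as in Table 1."""
    return statistics.mean(times_ms), statistics.stdev(times_ms)

runs = [171.0, 169.0, 174.0, 170.0]  # illustrative values, not measured
mean_ms, std_ms = summarize(runs)
```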
<p>Based on the obtained data, in Table
<xref rid="Tab2" ref-type="table">2</xref>
and Fig. 
<xref rid="Fig7" ref-type="fig">7</xref>
we show the average execution time of a single MoCap frame calculation plus-minus standard deviation.
<table-wrap id="Tab2">
<label>Table 2</label>
<caption>
<p>This table presents averaged execution time (in milliseconds) plus-minus standard deviation of various Lua scripts that use the GDL implementation, per single motion capture frame</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th>Feature, action or action group name</th>
<th>Execution time (in milliseconds)</th>
</tr>
</thead>
<tbody>
<tr>
<td>kiba-dachi (K2)</td>
<td>1.57 ± 0.08</td>
</tr>
<tr>
<td>karate (K2)</td>
<td>11.64 ± 0.24</td>
</tr>
<tr>
<td>10 features (K2)</td>
<td>1.06 ± 0.08</td>
</tr>
<tr>
<td>20 features (K2)</td>
<td>1.92 ± 0.13</td>
</tr>
<tr>
<td>30 features (K2)</td>
<td>2.77 ± 0.13</td>
</tr>
<tr>
<td>40 features (K2)</td>
<td>3.98 ± 0.05</td>
</tr>
<tr>
<td>jumping jacks (K1)</td>
<td>0.90 ± 0.10</td>
</tr>
<tr>
<td>gym (K1)</td>
<td>8.11 ± 0.94</td>
</tr>
<tr>
<td>10 features (K1)</td>
<td>1.05 ± 0.08</td>
</tr>
<tr>
<td>20 features (K1)</td>
<td>1.89 ± 0.13</td>
</tr>
<tr>
<td>30 features (K1)</td>
<td>2.82 ± 0.15</td>
</tr>
<tr>
<td>40 features (K1)</td>
<td>3.53 ± 0.16</td>
</tr>
</tbody>
</table>
</table-wrap>
<fig id="Fig7">
<label>Fig. 7</label>
<caption>
<p>This figure visualizes data from Table
<xref rid="Tab2" ref-type="table">2</xref>
</p>
</caption>
<graphic xlink:href="10916_2016_493_Fig7_HTML" id="MO9"></graphic>
</fig>
</p>
</sec>
<sec id="Sec7" sec-type="discussion">
<title>Discussion</title>
<p>The results presented in the previous section show that the Lua implementation of the GDL methodology operates in a fast and reliable way. As can be seen in Table
<xref rid="Tab1" ref-type="table">1</xref>
and Fig. 
<xref rid="Fig6" ref-type="fig">6</xref>
there is a nearly linear dependence between the number of processed frames and the processing time. This shows that the method is stable and can operate continuously without disturbance. The tests performed on the 10-, 20-, 30- and 40-feature sets show that there is no significant difference in processing time between the 20-joint and the 25-joint dataset. It is also quite natural that the larger the movement description, the more time is required to process a single MoCap frame. The execution time for the 40-feature set is 3.53 ± 0.16 milliseconds for K1 and 3.98 ± 0.05 milliseconds for K2, which means that it is possible to process a MoCap dataset at a frequency of over 250 Hz; this is sufficient for most up-to-date hardware of that type. The slowest processing was obtained for the karate action classification dataset that recognizes 12 different actions (11.64 ± 0.24 milliseconds per frame). This corresponds to a frame processing frequency of about 85 Hz, which is fast enough for the signal classification task. Summing up, the GDL methodology adapted to the new functionalities satisfies the needs of sport and rehabilitation data analysis and classification.</p>
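The frequencies quoted above follow directly from the per-frame execution times; a small sketch of the conversion:

```python
def max_frequency_hz(frame_time_ms):
    """Maximum processing frequency for a given per-frame time."""
    return 1000.0 / frame_time_ms

f40 = max_frequency_hz(3.98)       # 40-feature K2 set: just over 250 Hz
fkarate = max_frequency_hz(11.64)  # 12-action karate set: about 86 Hz
```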
</sec>
<sec id="Sec8" sec-type="conclusion">
<title>Conclusions</title>
<p>In this paper we showed that the Lua language can be successfully used to adapt the GDL classifier to new scientific tasks. The newly applied scripting language allows easy extension and integration of the classifier with other software technologies and applications. As discussed in the previous section, the obtained execution speed allows using the methodology in real-time motion capture data processing, where the capture frequency ranges from 100 Hz to even 500 Hz depending on the number of features or classes to be calculated and recognized. Due to this fact the proposed methodology can be used with high-end motion capture systems. We anticipate that this novel, efficient and effective method will greatly help both sport trainers and physiotherapists in their everyday tasks. For example, pattern recognition and data mining methods can support both sport and rehabilitation data evaluation. The proposed approach can be directly applied to MoCap data kinematics analysis (evaluation of motion without regard to the forces that cause that motion). Kinetics (the study of movements under the action of forces), however, will require an additional data source besides MoCap, for example ground reaction forces acquired with a force plate or other forces collected with a dynamometer. That additional data stream can be easily integrated with our methodology, and the forces can be calculated without changing the already established framework. However, kinematics alone is sufficient for many applications in sport, medicine, physiotherapy and rehabilitation. Actions can be described using derivatives of displacement, such as velocity or acceleration. The ability to apply pattern recognition methods to GDL descriptions can be utilized in virtual reality environments similar to those described in [
<xref ref-type="bibr" rid="CR27">27</xref>
,
<xref ref-type="bibr" rid="CR52">52</xref>
] and used for training or treatment.</p>
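The derivatives of displacement mentioned above can be estimated from MoCap joint positions by finite differences; a minimal sketch, assuming a uniform capture rate:

```python
def velocity(positions, fps):
    """Per-axis joint velocity estimated by forward differences from a
    sequence of (x, y, z) positions sampled at fps frames per second."""
    dt = 1.0 / fps
    return [tuple((b - a) / dt for a, b in zip(p0, p1))
            for p0, p1 in zip(positions, positions[1:])]

track = [(0.0, 1.0, 0.0), (0.1, 1.0, 0.0), (0.3, 1.0, 0.0)]
vel = velocity(track, fps=100)  # two velocity samples for three frames
```

Acceleration can be obtained the same way by differencing the velocity sequence once more.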
</sec>
</body>
<back>
<fn-group>
<fn>
<p>This article is part of the Topical Collection on
<italic>Patient Facing Systems</italic>
</p>
</fn>
</fn-group>
<ack>
<p>This work has been supported by the National Science Centre, Poland, under project number 2015/17/D/ST6/04051.</p>
</ack>
<ref-list id="Bib1">
<title>References</title>
<ref id="CR1">
<label>1.</label>
<mixed-citation publication-type="other">Artner, N. M., Ion, A., and Kropatsch, W. G., Multi-scale 2D tracking of articulated objects using hierarchical spring systems. Pattern Recognit.: 800–810. doi: 10.1016/j.patcog.2010.10.025, 2011.</mixed-citation>
</ref>
<ref id="CR2">
<label>2.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zhang</surname>
<given-names>Q</given-names>
</name>
<name>
<surname>Song</surname>
<given-names>X</given-names>
</name>
<name>
<surname>Shibasaki</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Zhao</surname>
<given-names>H</given-names>
</name>
</person-group>
<article-title>Unsupervised skeleton extraction and motion capture from 3D deformable matching</article-title>
<source>Neurocomputing</source>
<year>2013</year>
<volume>100</volume>
<issue>16</issue>
<fpage>170</fpage>
<lpage>182</lpage>
<pub-id pub-id-type="doi">10.1016/j.neucom.2011.11.032</pub-id>
</element-citation>
</ref>
<ref id="CR3">
<label>3.</label>
<mixed-citation publication-type="other">Shotton, J., Fitzgibbon, A., Cook, M., Sharp, T., Finocchio, M., Moore, R., Kipman, A., and Blake, A., Real-time human pose recognition in parts from single depth images, CVPR ‘11 Proceedings of the 2011 I.E. Conference on Computer Vision and Pattern Recognition, pp. 1297–1304, IEEE Computer Society Washington, DC, USA, 2011.</mixed-citation>
</ref>
<ref id="CR4">
<label>4.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Schwarz</surname>
<given-names>LA</given-names>
</name>
<name>
<surname>Mkhitaryan</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Mateus</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Navab</surname>
<given-names>N</given-names>
</name>
</person-group>
<article-title>Human skeleton tracking from depth data using geodesic distances and optical flow</article-title>
<source>Image Vis. Comput.</source>
<year>2012</year>
<volume>30</volume>
<fpage>217</fpage>
<lpage>226</lpage>
<pub-id pub-id-type="doi">10.1016/j.imavis.2011.12.001</pub-id>
</element-citation>
</ref>
<ref id="CR5">
<label>5.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gupta</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Jaafar</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Fatimah</surname>
<given-names>W</given-names>
</name>
<name>
<surname>Ahmad</surname>
<given-names>W</given-names>
</name>
</person-group>
<article-title>Static hand gesture recognition using local gabor filter</article-title>
<source>Proc. Eng.</source>
<year>2012</year>
<volume>41</volume>
<fpage>827</fpage>
<lpage>832</lpage>
<pub-id pub-id-type="doi">10.1016/j.proeng.2012.07.250</pub-id>
</element-citation>
</ref>
<ref id="CR6">
<label>6.</label>
<mixed-citation publication-type="other">Arulkarthick, V. J., and Sangeetha, D., Sign language recognition using K-means clustered haar-like features and a stochastic context-free grammar. Eur. J. Sci.</mixed-citation>
</ref>
<ref id="CR7">
<label>7.</label>
<mixed-citation publication-type="other">Taubert, N., Löffler, M., Ludolph, N., Christensen, A., Endres, D., and Giese, M. A., A virtual reality setup for controllable, stylized real-time interactions between humans and avatars with sparse gaussian process dynamical models. Proceedings of the ACM Symposium on Applied Perception, pp. 41–44, 2013.</mixed-citation>
</ref>
<ref id="CR8">
<label>8.</label>
<mixed-citation publication-type="other">Vieira, W. A., Lewiner, T., Schwartz, W. R., and Campos M. F. M., Distance matrices as invariant features for classifying MoCap data, Pattern Recognition (ICPR), 2012 21st International Conference on, pp. 2934–2937. IEEE, 2012.</mixed-citation>
</ref>
<ref id="CR9">
<label>9.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Miranda</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Vieira</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Martinez</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Lewiner</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Vieira</surname>
<given-names>AW</given-names>
</name>
<name>
<surname>Campos</surname>
<given-names>MFM</given-names>
</name>
</person-group>
<article-title>Online gesture recognition from pose kernel learning and decision forests</article-title>
<source>Pattern Recogn. Lett.</source>
<year>2014</year>
<volume>39</volume>
<issue>1</issue>
<fpage>65</fpage>
<lpage>73</lpage>
<pub-id pub-id-type="doi">10.1016/j.patrec.2013.10.005</pub-id>
</element-citation>
</ref>
<ref id="CR10">
<label>10.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Li</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Wei</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Yue</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Jia</surname>
<given-names>W</given-names>
</name>
<name>
<surname>Burke</surname>
<given-names>LE</given-names>
</name>
<name>
<surname>Baranowski</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Sun</surname>
<given-names>M</given-names>
</name>
</person-group>
<article-title>An adaptive hidden Markov model for activity recognition based on a wearable multi-sensor device</article-title>
<source>J. Med. Syst.</source>
<year>2015</year>
<volume>39</volume>
<issue>5</issue>
<fpage>57</fpage>
<pub-id pub-id-type="doi">10.1007/s10916-015-0239-x</pub-id>
<pub-id pub-id-type="pmid">25787786</pub-id>
</element-citation>
</ref>
<ref id="CR11">
<label>11.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Rajanna</surname>
<given-names>V</given-names>
</name>
<name>
<surname>Vo</surname>
<given-names>P</given-names>
</name>
<name>
<surname>Barth</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Mjelde</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Grey</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Oduola</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Hammond</surname>
<given-names>T</given-names>
</name>
</person-group>
<article-title>KinoHaptics: An automated, wearable, Haptic assisted, physio-therapeutic system for post-surgery rehabilitation and self-care</article-title>
<source>J. Med. Syst.</source>
<year>2016</year>
<volume>40</volume>
<issue>3</issue>
<fpage>60</fpage>
<pub-id pub-id-type="doi">10.1007/s10916-015-0391-3</pub-id>
<pub-id pub-id-type="pmid">26660691</pub-id>
</element-citation>
</ref>
<ref id="CR12">
<label>12.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cholewa</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Głomb</surname>
<given-names>P</given-names>
</name>
</person-group>
<article-title>Estimation of the number of states for gesture recognition with Hidden Markov Models based on the number of critical points in time sequence</article-title>
<source>Pattern Recogn. Lett.</source>
<year>2013</year>
<volume>34</volume>
<issue>5</issue>
<fpage>574</fpage>
<lpage>579</lpage>
<pub-id pub-id-type="doi">10.1016/j.patrec.2012.12.002</pub-id>
</element-citation>
</ref>
<ref id="CR13">
<label>13.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kang</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Zhong</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Qin</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Wright</surname>
<given-names>D</given-names>
</name>
</person-group>
<article-title>Instant 3D design concept generation and visualization by real-time hand gesture recognition</article-title>
<source>Comput. Ind.</source>
<year>2013</year>
<volume>64</volume>
<issue>7</issue>
<fpage>785</fpage>
<lpage>797</lpage>
<pub-id pub-id-type="doi">10.1016/j.compind.2013.04.012</pub-id>
</element-citation>
</ref>
<ref id="CR14">
<label>14.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>López-Méndez</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Casas</surname>
<given-names>JR</given-names>
</name>
</person-group>
<article-title>Model-based recognition of human actions by trajectory matching in phase spaces</article-title>
<source>Image Vis. Comput.</source>
<year>2012</year>
<volume>30</volume>
<fpage>808</fpage>
<lpage>816</lpage>
<pub-id pub-id-type="doi">10.1016/j.imavis.2012.06.007</pub-id>
</element-citation>
</ref>
<ref id="CR15">
<label>15.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zhu</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Shao</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Lin</surname>
<given-names>M</given-names>
</name>
</person-group>
<article-title>Multi-view action recognition using local similarity random forests and sensor fusion</article-title>
<source>Pattern Recogn. Lett.</source>
<year>2013</year>
<volume>34</volume>
<fpage>20</fpage>
<lpage>24</lpage>
<pub-id pub-id-type="doi">10.1016/j.patrec.2012.04.016</pub-id>
</element-citation>
</ref>
<ref id="CR16">
<label>16.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gamage</surname>
<given-names>N</given-names>
</name>
<name>
<surname>Chow Kuang</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Akmeliawati</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Demidenko</surname>
<given-names>S</given-names>
</name>
</person-group>
<article-title>Gaussian process dynamical models for hand gesture interpretation in sign language</article-title>
<source>Pattern Recogn. Lett.</source>
<year>2011</year>
<volume>32</volume>
<fpage>2009</fpage>
<lpage>2014</lpage>
<pub-id pub-id-type="doi">10.1016/j.patrec.2011.08.015</pub-id>
</element-citation>
</ref>
<ref id="CR17">
<label>17.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Glowacz</surname>
<given-names>A</given-names>
</name>
</person-group>
<article-title>Diagnostics of synchronous motor based on analysis of acoustic signals with the use of line spectral frequencies and K-nearest neighbor classifier</article-title>
<source>Arch. Acoust.</source>
<year>2014</year>
<volume>39</volume>
<issue>2</issue>
<fpage>189</fpage>
<lpage>194</lpage>
</element-citation>
</ref>
<ref id="CR18">
<label>18.</label>
<mixed-citation publication-type="other">Glowacz, A., Glowacz, A., and Glowacz, Z., Recognition of thermal images of direct current motor with application of area perimeter vector and Bayes classifier, measurement science review. 15(3): 119–126, ISSN (Online) 1335–8871. doi: 10.1515/msr-2015-0018, 2015.</mixed-citation>
</ref>
<ref id="CR19">
<label>19.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Du</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Xu</surname>
<given-names>W</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>W</given-names>
</name>
</person-group>
<article-title>Activity recognition through multi-scale motion detail analysis</article-title>
<source>Neurocomputing</source>
<year>2008</year>
<volume>71</volume>
<fpage>3561</fpage>
<lpage>3574</lpage>
<pub-id pub-id-type="doi">10.1016/j.neucom.2007.09.012</pub-id>
</element-citation>
</ref>
<ref id="CR20">
<label>20.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Suma</surname>
<given-names>EA</given-names>
</name>
<name>
<surname>Krum</surname>
<given-names>DM</given-names>
</name>
<name>
<surname>Lange</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Koenig</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Rizzo</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Bolas</surname>
<given-names>M</given-names>
</name>
</person-group>
<article-title>Adapting user interfaces for gestural interaction with the flexible action and articulated skeleton toolkit</article-title>
<source>Comput. Graph.</source>
<year>2013</year>
<volume>37</volume>
<issue>3</issue>
<fpage>193</fpage>
<lpage>201</lpage>
<pub-id pub-id-type="doi">10.1016/j.cag.2012.11.004</pub-id>
</element-citation>
</ref>
<ref id="CR21">
<label>21.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bickerstaffe</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Lane</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Meyer</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Marriott</surname>
<given-names>K</given-names>
</name>
</person-group>
<article-title>Developing domain-specific gesture recognizers for smart diagram environments, Graphics Recognition. Recent Advances and New Opportunities</article-title>
<source>Lect. Notes Comput. Sci</source>
<year>2008</year>
<volume>5046</volume>
<fpage>145</fpage>
<lpage>156</lpage>
<pub-id pub-id-type="doi">10.1007/978-3-540-88188-9_15</pub-id>
</element-citation>
</ref>
<ref id="CR22">
<label>22.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hachaj</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Ogiela</surname>
<given-names>MR</given-names>
</name>
</person-group>
<article-title>Rule-based approach to recognizing human body poses and gestures in real time</article-title>
<source>Multimedia Systems</source>
<year>2014</year>
<volume>20</volume>
<issue>1</issue>
<fpage>81</fpage>
<lpage>99</lpage>
<pub-id pub-id-type="doi">10.1007/s00530-013-0332-2</pub-id>
</element-citation>
</ref>
<ref id="CR23">
<label>23.</label>
<mixed-citation publication-type="other">Hachaj, T., and Ogiela, M. R., Computer karate trainer in tasks of personal and homeland security defense. In: Cuzzocrea, A., et al. (Eds.), CD-ARES 2013 Workshops, LNCS 8128, pp. 430–441, 2013.</mixed-citation>
</ref>
<ref id="CR24">
<label>24.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hachaj</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Ogiela</surname>
<given-names>MR</given-names>
</name>
<name>
<surname>Piekarczyk</surname>
<given-names>M</given-names>
</name>
</person-group>
<article-title>Real-time recognition of selected karate techniques using GDL approach, Image Processing and Communications Challenges 5</article-title>
<source>Adv. Intell. Syst. Comput.</source>
<year>2014</year>
<volume>233</volume>
<fpage>99</fpage>
<lpage>106</lpage>
<pub-id pub-id-type="doi">10.1007/978-3-319-01622-1_12</pub-id>
</element-citation>
</ref>
<ref id="CR25">
<label>25.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hachaj</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Ogiela</surname>
<given-names>MR</given-names>
</name>
</person-group>
<article-title>Full body movements recognition – unsupervised learning approach with heuristic R-GDL method</article-title>
<source>Digital Signal Process.</source>
<year>2015</year>
<volume>46</volume>
<fpage>239</fpage>
<lpage>252</lpage>
<pub-id pub-id-type="doi">10.1016/j.dsp.2015.07.004</pub-id>
</element-citation>
</ref>
<ref id="CR26">
<label>26.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hachaj</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Ogiela</surname>
<given-names>MR</given-names>
</name>
<name>
<surname>Koptyra</surname>
<given-names>K</given-names>
</name>
</person-group>
<article-title>Application of assistive computer vision methods to Oyama karate techniques recognition</article-title>
<source>Symmetry</source>
<year>2015</year>
<volume>7</volume>
<issue>4</issue>
<fpage>1670</fpage>
<lpage>1698</lpage>
<pub-id pub-id-type="doi">10.3390/sym7041670</pub-id>
</element-citation>
</ref>
<ref id="CR27">
<label>27.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hachaj</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Baraniewicz</surname>
<given-names>D</given-names>
</name>
</person-group>
<article-title>Knowledge bricks - educational immersive reality environment</article-title>
<source>Int. J. Inf. Manag.</source>
<year>2015</year>
<volume>35</volume>
<fpage>396</fpage>
<lpage>406</lpage>
<pub-id pub-id-type="doi">10.1016/j.ijinfomgt.2015.01.006</pub-id>
</element-citation>
</ref>
<ref id="CR28">
<label>28.</label>
<mixed-citation publication-type="other">Hachaj, T., and Ogiela, M. R., Recognition of body movements patterns for immersive virtual reality system interface, 2014 Ninth International Conference on P2P, Parallel, Grid, Cloud and Internet Computing, IEEE Computer Society, ISBN-13: 978-1-4799-4171-1, pp. 290–294. doi: 10.1109/3PGCIC.2014.79, 2014.</mixed-citation>
</ref>
<ref id="CR29">
<label>29.</label>
<mixed-citation publication-type="other">Hachaj, T., Ogiela, M. R., and Koptyra, K., Effectiveness comparison of Kinect and Kinect 2 for recognition of Oyama karate techniques, NBiS 2015 - The 18th International Conference on Network-Based Information Systems (NBiS 2015), September 2–4, Taipei, Taiwan, pp. 332–337. doi: 10.1109/NBiS.2015.51, ISBN: 978-1-4799-9942-2/15.</mixed-citation>
</ref>
<ref id="CR30">
<label>30.</label>
<mixed-citation publication-type="other">Hachaj, T., Ogiela, M. R., and Koptyra, K., Human actions modelling and recognition in low-dimensional feature space, BWCCA 2015, 10th International Conference on Broadband and Wireless Computing, Communication and Applications, November 4–6, 2015, Krakow, Poland, pp. 247–254. doi: 10.1109/BWCCA.2015.15, 2015.</mixed-citation>
</ref>
<ref id="CR31">
<label>31.</label>
<mixed-citation publication-type="other">Hachaj, T., Ogiela, M. R., and Koptyra, K., Application of hidden Markov models and gesture description language classifiers to Oyama karate techniques recognition, Innovative Mobile and Internet Services in Ubiquitous Computing (IMIS), 2015 9th International Conference on, 8–10 July 2015, Blumenau, pp. 160–165, ISBN 978-1-4799-8872-3. doi: 10.1109/IMIS.2015.26, 2015.</mixed-citation>
</ref>
<ref id="CR32">
<label>32.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Palacios-Navarro</surname>
<given-names>G</given-names>
</name>
<name>
<surname>García-Magariño</surname>
<given-names>I</given-names>
</name>
<name>
<surname>Ramos-Lorente</surname>
<given-names>P</given-names>
</name>
</person-group>
<article-title>A Kinect-based system for lower limb rehabilitation in Parkinson’s disease patients: a pilot study</article-title>
<source>J. Med. Syst.</source>
<year>2015</year>
<volume>39</volume>
<issue>9</issue>
<fpage>103</fpage>
<pub-id pub-id-type="doi">10.1007/s10916-015-0289-0</pub-id>
<pub-id pub-id-type="pmid">26265237</pub-id>
</element-citation>
</ref>
<ref id="CR33">
<label>33.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>De la Torre-Díez</surname>
<given-names>I</given-names>
</name>
<name>
<surname>Antón-Rodríguez</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Díaz-Pernas</surname>
<given-names>FJ</given-names>
</name>
<name>
<surname>Perozo-Rondón</surname>
<given-names>FJ</given-names>
</name>
</person-group>
<article-title>Comparison of response times of a mobile-web EHRs system using PHP and JSP languages</article-title>
<source>J. Med. Syst.</source>
<year>2012</year>
<volume>36</volume>
<issue>6</issue>
<fpage>3945</fpage>
<lpage>3953</lpage>
<pub-id pub-id-type="doi">10.1007/s10916-012-9866-7</pub-id>
<pub-id pub-id-type="pmid">22706897</pub-id>
</element-citation>
</ref>
<ref id="CR34">
<label>34.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Hauser</surname>
<given-names>JR</given-names>
</name>
</person-group>
<source>Numerical methods for nonlinear engineering models</source>
<year>2009</year>
<publisher-loc>Netherlands</publisher-loc>
<publisher-name>Springer</publisher-name>
</element-citation>
</ref>
<ref id="CR35">
<label>35.</label>
<mixed-citation publication-type="other">Torch official website (access date 25-02-2016)
<ext-link ext-link-type="uri" xlink:href="http://torch.ch/">http://torch.ch/</ext-link>
.</mixed-citation>
</ref>
<ref id="CR36">
<label>36.</label>
<mixed-citation publication-type="other">Collobert, R., Kavukcuoglu, K., and Farabet, C., Implementing neural networks efficiently. In: Neural Networks: Tricks of the Trade, Volume 7700 of the series Lecture Notes in Computer Science, pp. 537–557. doi: 10.1007/978-3-642-35289-8_28.</mixed-citation>
</ref>
<ref id="CR37">
<label>37.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cacho</surname>
<given-names>N</given-names>
</name>
<name>
<surname>Batista</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Fernandes</surname>
<given-names>F</given-names>
</name>
</person-group>
<article-title>A Lua-based AOP infrastructure</article-title>
<source>J. Braz. Comput. Soc.</source>
<year>2005</year>
<volume>11</volume>
<issue>3</issue>
<fpage>7</fpage>
<lpage>20</lpage>
<pub-id pub-id-type="doi">10.1007/BF03192379</pub-id>
</element-citation>
</ref>
<ref id="CR38">
<label>38.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Maia</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Cerqueira</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Sieckenius de Souza</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Guisasola-Gorham</surname>
<given-names>T</given-names>
</name>
</person-group>
<article-title>A qualitative human-centric evaluation of flexibility in middleware implementations</article-title>
<source>Empir. Softw. Eng.</source>
<year>2012</year>
<volume>17</volume>
<issue>3</issue>
<fpage>166</fpage>
<lpage>199</lpage>
<pub-id pub-id-type="doi">10.1007/s10664-011-9167-7</pub-id>
</element-citation>
</ref>
<ref id="CR39">
<label>39.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Soares</surname>
<given-names>LFG</given-names>
</name>
<name>
<surname>Rodrigues</surname>
<given-names>RF</given-names>
</name>
<name>
<surname>Cerqueira</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Barbosa</surname>
<given-names>SDJ</given-names>
</name>
</person-group>
<article-title>Variable and state handling in NCL</article-title>
<source>Multimed. Tools Appl.</source>
<year>2010</year>
<volume>50</volume>
<issue>3</issue>
<fpage>465</fpage>
<lpage>489</lpage>
<pub-id pub-id-type="doi">10.1007/s11042-010-0478-2</pub-id>
</element-citation>
</ref>
<ref id="CR40">
<label>40.</label>
<mixed-citation publication-type="other">Niemüller, T., Ferrein, A., and Lakemeyer, G., A Lua-based behavior engine for controlling the humanoid robot Nao. In: RoboCup 2009: Robot Soccer World Cup XIII, Volume 5949 of the series Lecture Notes in Computer Science, pp. 240–251. doi: 10.1007/978-3-642-11876-0_21.</mixed-citation>
</ref>
<ref id="CR41">
<label>41.</label>
<mixed-citation publication-type="other">Codd-Downey, R., Jenkin, M., Ansell, M., Ng, H. -K., and Jasiobedzki, P., Simulating the C2SM ‘Fast’ robot. In: Simulation, Modeling, and Programming for Autonomous Robots, Volume 6472 of the series Lecture Notes in Computer Science, pp. 26–37. doi: 10.1007/978-3-642-17319-6_6.</mixed-citation>
</ref>
<ref id="CR42">
<label>42.</label>
<mixed-citation publication-type="other">Freese, M., Singh, S., Ozaki, F., and Matsuhira, N., Virtual robot experimentation platform V-REP: a versatile 3D robot simulator. In: Simulation, Modeling, and Programming for Autonomous Robots, Volume 6472 of the series Lecture Notes in Computer Science, pp. 51–62. doi: 10.1007/978-3-642-17319-6_8.</mixed-citation>
</ref>
<ref id="CR43">
<label>43.</label>
<mixed-citation publication-type="other">Ferrein, A., and Steinbauer, G., On the way to high-level programming for resource-limited embedded systems with Golog, Simulation, Modeling, and Programming for Autonomous Robots, Volume 6472 of the series Lecture Notes in Computer Science, pp. 229–240. doi: 10.1007/978-3-642-17319-6_23.</mixed-citation>
</ref>
<ref id="CR44">
<label>44.</label>
<mixed-citation publication-type="other">Emmerich, P., Beginning Lua with World of Warcraft Add-ons. Apress. doi: 10.1007/978-1-4302-2372-6, 2009.</mixed-citation>
</ref>
<ref id="CR45">
<label>45.</label>
<mixed-citation publication-type="other">Jordan, L., and Greyling, P., Practical android projects. Apress. doi: 10.1007/978-1-4302-3244-5, 2011.</mixed-citation>
</ref>
<ref id="CR46">
<label>46.</label>
<mixed-citation publication-type="other">Smith, W., and Wakefield, G., Computational audiovisual composition using Lua, Transdisciplinary Digital Art. Sound, Vision and the New Screen, Volume 7 of the series Communications in Computer and Information Science, pp. 213–228. doi: 10.1007/978-3-540-79486-8_19, 2008.</mixed-citation>
</ref>
<ref id="CR47">
<label>47.</label>
<mixed-citation publication-type="other">Ierusalimschy, R., Programming with multiple paradigms in Lua. In: Functional and Constraint Logic Programming, Volume 5979 of the series Lecture Notes in Computer Science, pp. 1–12. doi: 10.1007/978-3-642-11999-6_1.</mixed-citation>
</ref>
<ref id="CR48">
<label>48.</label>
<mixed-citation publication-type="other">Karduna, A. R., Introduction to biomechanical analysis. In: Oatis, C. A., (Ed.),
<italic>Kinesiology: The Mechanics And Pathomechanics Of Human Movement</italic>
. Published by Lippincott Williams &amp; Wilkins (2004-06-01) ISBN 10: 0781755131 / ISBN 13: 9780781755139.</mixed-citation>
</ref>
<ref id="CR49">
<label>49.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lin</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Song</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Song</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Zhou</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Wu</surname>
<given-names>G</given-names>
</name>
</person-group>
<article-title>Differential privacy preserving in big data analytics for connected health</article-title>
<source>J. Med. Syst.</source>
<year>2016</year>
<volume>40</volume>
<issue>4</issue>
<fpage>97</fpage>
<pub-id pub-id-type="doi">10.1007/s10916-016-0446-0</pub-id>
<pub-id pub-id-type="pmid">26872779</pub-id>
</element-citation>
</ref>
<ref id="CR50">
<label>50.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ullah</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Higgins</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Braem</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Latre</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Blondia</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Moerman</surname>
<given-names>I</given-names>
</name>
<name>
<surname>Saleem</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Rahman</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Kwak</surname>
<given-names>KS</given-names>
</name>
</person-group>
<article-title>A comprehensive survey of wireless body area networks: on PHY, MAC, and network layers solutions</article-title>
<source>J. Med. Syst.</source>
<year>2012</year>
<volume>36</volume>
<issue>3</issue>
<fpage>1065</fpage>
<lpage>1094</lpage>
<pub-id pub-id-type="doi">10.1007/s10916-010-9571-3</pub-id>
<pub-id pub-id-type="pmid">20721685</pub-id>
</element-citation>
</ref>
<ref id="CR51">
<label>51.</label>
<mixed-citation publication-type="other">Official website of GDL technology (access date 25-02-2016)
<ext-link ext-link-type="uri" xlink:href="http://gdl.org.pl/">http://gdl.org.pl/</ext-link>
.</mixed-citation>
</ref>
<ref id="CR52">
<label>52.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hachaj</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Ogiela</surname>
<given-names>MR</given-names>
</name>
</person-group>
<article-title>Visualization of perfusion abnormalities with GPU-based volume rendering</article-title>
<source>Comput. Graph.</source>
<year>2012</year>
<volume>36</volume>
<issue>3</issue>
<fpage>163</fpage>
<lpage>169</lpage>
<pub-id pub-id-type="doi">10.1016/j.cag.2012.01.002</pub-id>
</element-citation>
</ref>
</ref-list>
</back>
</pmc>
<affiliations>
<list>
<country>
<li>Pologne</li>
</country>
</list>
<tree>
<country name="Pologne">
<noRegion>
<name sortKey="Hachaj, Tomasz" sort="Hachaj, Tomasz" uniqKey="Hachaj T" first="Tomasz" last="Hachaj">Tomasz Hachaj</name>
</noRegion>
<name sortKey="Ogiela, Marek R" sort="Ogiela, Marek R" uniqKey="Ogiela M" first="Marek R." last="Ogiela">Marek R. Ogiela</name>
</country>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Pmc/Checkpoint
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000024 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Checkpoint/biblio.hfd -nk 000024 | SxmlIndent | more
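The two invocations above differ only in which environment variable anchors the path to the `biblio.hfd` key store. A minimal sketch of the setup they assume (the install root `/srv/wicri` is a hypothetical placeholder, not part of this record):

```shell
# Hypothetical setup for the Dilib commands above.
# WICRI_ROOT points at the local Wicri install; /srv/wicri is a placeholder.
WICRI_ROOT=/srv/wicri
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Pmc/Checkpoint
# HfdSelect would then resolve record key 000024 inside biblio.hfd here:
echo "$EXPLOR_STEP/biblio.hfd"
```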

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Pmc
   |étape=   Checkpoint
   |type=    RBID
   |clé=     PMC:4841835
   |texte=   The adaptation of GDL motion recognition system to sport and rehabilitation techniques analysis
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Checkpoint/RBID.i   -Sk "pubmed:27106581" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Checkpoint/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024