Parkinson's Disease in Canada (exploration server)

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.
***** Access problem to record *****

Internal identifier: 000B44 (Pmc/Corpus); previous: 000B439; next: 000B450 ***** probable XML problem with record *****



The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Design and Implementation of Foot-Mounted Inertial Sensor Based Wearable Electronic Device for Game Play Application</title>
<author>
<name sortKey="Zhou, Qifan" sort="Zhou, Qifan" uniqKey="Zhou Q" first="Qifan" last="Zhou">Qifan Zhou</name>
<affiliation>
<nlm:aff id="af1-sensors-16-01752">School of Automation Science and Electrical Engineering, Beihang University, Beijing 100191, China;
<email>zhanghai@buaa.edu.cn</email>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="af2-sensors-16-01752">Geomatics Engineering Department, University of Calgary, Calgary, AB T2N 1N4, Canada;
<email>zlari@ucalgary.ca</email>
(Z.L.);
<email>zhenbo.liu2@ucalgary.ca</email>
(Z.L.);
<email>elsheimy@ucalgary.ca</email>
(N.E.-S.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Zhang, Hai" sort="Zhang, Hai" uniqKey="Zhang H" first="Hai" last="Zhang">Hai Zhang</name>
<affiliation>
<nlm:aff id="af1-sensors-16-01752">School of Automation Science and Electrical Engineering, Beihang University, Beijing 100191, China;
<email>zhanghai@buaa.edu.cn</email>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Lari, Zahra" sort="Lari, Zahra" uniqKey="Lari Z" first="Zahra" last="Lari">Zahra Lari</name>
<affiliation>
<nlm:aff id="af2-sensors-16-01752">Geomatics Engineering Department, University of Calgary, Calgary, AB T2N 1N4, Canada;
<email>zlari@ucalgary.ca</email>
(Z.L.);
<email>zhenbo.liu2@ucalgary.ca</email>
(Z.L.);
<email>elsheimy@ucalgary.ca</email>
(N.E.-S.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Liu, Zhenbo" sort="Liu, Zhenbo" uniqKey="Liu Z" first="Zhenbo" last="Liu">Zhenbo Liu</name>
<affiliation>
<nlm:aff id="af2-sensors-16-01752">Geomatics Engineering Department, University of Calgary, Calgary, AB T2N 1N4, Canada;
<email>zlari@ucalgary.ca</email>
(Z.L.);
<email>zhenbo.liu2@ucalgary.ca</email>
(Z.L.);
<email>elsheimy@ucalgary.ca</email>
(N.E.-S.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="El Sheimy, Naser" sort="El Sheimy, Naser" uniqKey="El Sheimy N" first="Naser" last="El-Sheimy">Naser El-Sheimy</name>
<affiliation>
<nlm:aff id="af2-sensors-16-01752">Geomatics Engineering Department, University of Calgary, Calgary, AB T2N 1N4, Canada;
<email>zlari@ucalgary.ca</email>
(Z.L.);
<email>zhenbo.liu2@ucalgary.ca</email>
(Z.L.);
<email>elsheimy@ucalgary.ca</email>
(N.E.-S.)</nlm:aff>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">27775673</idno>
<idno type="pmc">5087537</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5087537</idno>
<idno type="RBID">PMC:5087537</idno>
<idno type="doi">10.3390/s16101752</idno>
<date when="2016">2016</date>
<idno type="wicri:Area/Pmc/Corpus">000B44</idno>
<idno type="wicri:explorRef" wicri:stream="Pmc" wicri:step="Corpus" wicri:corpus="PMC">000B44</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Design and Implementation of Foot-Mounted Inertial Sensor Based Wearable Electronic Device for Game Play Application</title>
<author>
<name sortKey="Zhou, Qifan" sort="Zhou, Qifan" uniqKey="Zhou Q" first="Qifan" last="Zhou">Qifan Zhou</name>
<affiliation>
<nlm:aff id="af1-sensors-16-01752">School of Automation Science and Electrical Engineering, Beihang University, Beijing 100191, China;
<email>zhanghai@buaa.edu.cn</email>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="af2-sensors-16-01752">Geomatics Engineering Department, University of Calgary, Calgary, AB T2N 1N4, Canada;
<email>zlari@ucalgary.ca</email>
(Z.L.);
<email>zhenbo.liu2@ucalgary.ca</email>
(Z.L.);
<email>elsheimy@ucalgary.ca</email>
(N.E.-S.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Zhang, Hai" sort="Zhang, Hai" uniqKey="Zhang H" first="Hai" last="Zhang">Hai Zhang</name>
<affiliation>
<nlm:aff id="af1-sensors-16-01752">School of Automation Science and Electrical Engineering, Beihang University, Beijing 100191, China;
<email>zhanghai@buaa.edu.cn</email>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Lari, Zahra" sort="Lari, Zahra" uniqKey="Lari Z" first="Zahra" last="Lari">Zahra Lari</name>
<affiliation>
<nlm:aff id="af2-sensors-16-01752">Geomatics Engineering Department, University of Calgary, Calgary, AB T2N 1N4, Canada;
<email>zlari@ucalgary.ca</email>
(Z.L.);
<email>zhenbo.liu2@ucalgary.ca</email>
(Z.L.);
<email>elsheimy@ucalgary.ca</email>
(N.E.-S.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Liu, Zhenbo" sort="Liu, Zhenbo" uniqKey="Liu Z" first="Zhenbo" last="Liu">Zhenbo Liu</name>
<affiliation>
<nlm:aff id="af2-sensors-16-01752">Geomatics Engineering Department, University of Calgary, Calgary, AB T2N 1N4, Canada;
<email>zlari@ucalgary.ca</email>
(Z.L.);
<email>zhenbo.liu2@ucalgary.ca</email>
(Z.L.);
<email>elsheimy@ucalgary.ca</email>
(N.E.-S.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="El Sheimy, Naser" sort="El Sheimy, Naser" uniqKey="El Sheimy N" first="Naser" last="El-Sheimy">Naser El-Sheimy</name>
<affiliation>
<nlm:aff id="af2-sensors-16-01752">Geomatics Engineering Department, University of Calgary, Calgary, AB T2N 1N4, Canada;
<email>zlari@ucalgary.ca</email>
(Z.L.);
<email>zhenbo.liu2@ucalgary.ca</email>
(Z.L.);
<email>elsheimy@ucalgary.ca</email>
(N.E.-S.)</nlm:aff>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Sensors (Basel, Switzerland)</title>
<idno type="eISSN">1424-8220</idno>
<imprint>
<date when="2016">2016</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>Wearable electronic devices have seen increasing development with advances in the semiconductor industry and have received growing attention over the last decades. This paper presents the development and implementation of a novel inertial-sensor-based, foot-mounted wearable electronic device for a brand-new application: game playing. The main objective of the introduced system is to monitor and identify human foot stepping directions in real time and map these motions to player operations in games. The proposed system extends the application field of currently available wearable devices and introduces a convenient, portable medium for performing exercise in a more compelling way in the near future. This paper provides an overview of previously developed system platforms, introduces the main idea behind this novel application, and describes the implemented algorithm for identifying the direction of foot movement. Practical experimental results demonstrate that the proposed system is capable of recognizing five foot motions (jump, step left, step right, step forward, and step backward) and has achieved over 97% accuracy for different users. The functionality of the system for real-time application has also been verified through practical experiments.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Susi, M" uniqKey="Susi M">M. Susi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Junker, H" uniqKey="Junker H">H. Junker</name>
</author>
<author>
<name sortKey="Amft, O" uniqKey="Amft O">O. Amft</name>
</author>
<author>
<name sortKey="Lukowicz, P" uniqKey="Lukowicz P">P. Lukowicz</name>
</author>
<author>
<name sortKey="Troster, G" uniqKey="Troster G">G. Tröster</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Li, Y" uniqKey="Li Y">Y. Li</name>
</author>
<author>
<name sortKey="Georgy, J" uniqKey="Georgy J">J. Georgy</name>
</author>
<author>
<name sortKey="Niu, X" uniqKey="Niu X">X. Niu</name>
</author>
<author>
<name sortKey="Li, Q" uniqKey="Li Q">Q. Li</name>
</author>
<author>
<name sortKey="El Sheimy, N" uniqKey="El Sheimy N">N. El-Sheimy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Park, S" uniqKey="Park S">S. Park</name>
</author>
<author>
<name sortKey="Jayaraman, S" uniqKey="Jayaraman S">S. Jayaraman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Nilsson, J O" uniqKey="Nilsson J">J.O. Nilsson</name>
</author>
<author>
<name sortKey="Skog, I" uniqKey="Skog I">I. Skog</name>
</author>
<author>
<name sortKey="H Ndel, P" uniqKey="H Ndel P">P. Händel</name>
</author>
<author>
<name sortKey="Hari, K V S" uniqKey="Hari K">K.V.S. Hari</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ruppelt, J" uniqKey="Ruppelt J">J. Ruppelt</name>
</author>
<author>
<name sortKey="Kronenwett, N" uniqKey="Kronenwett N">N. Kronenwett</name>
</author>
<author>
<name sortKey="Scholz, G" uniqKey="Scholz G">G. Scholz</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Abdulrahim, K" uniqKey="Abdulrahim K">K. Abdulrahim</name>
</author>
<author>
<name sortKey="Hide, C" uniqKey="Hide C">C. Hide</name>
</author>
<author>
<name sortKey="Moore, T" uniqKey="Moore T">T. Moore</name>
</author>
<author>
<name sortKey="Hill, C" uniqKey="Hill C">C. Hill</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fischer, C" uniqKey="Fischer C">C. Fischer</name>
</author>
<author>
<name sortKey="Gellersen, H" uniqKey="Gellersen H">H. Gellersen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Skog, I" uniqKey="Skog I">I. Skog</name>
</author>
<author>
<name sortKey="H Ndel, P" uniqKey="H Ndel P">P. Händel</name>
</author>
<author>
<name sortKey="Nilsson, J O" uniqKey="Nilsson J">J.-O. Nilsson</name>
</author>
<author>
<name sortKey="Rantakokko, J" uniqKey="Rantakokko J">J. Rantakokko</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Norrdine, A" uniqKey="Norrdine A">A. Norrdine</name>
</author>
<author>
<name sortKey="Kasmi, Z" uniqKey="Kasmi Z">Z. Kasmi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gu, Y" uniqKey="Gu Y">Y. Gu</name>
</author>
<author>
<name sortKey="Song, Q" uniqKey="Song Q">Q. Song</name>
</author>
<author>
<name sortKey="Li, Y" uniqKey="Li Y">Y. Li</name>
</author>
<author>
<name sortKey="Ma, M" uniqKey="Ma M">M. Ma</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ruiz, A R J" uniqKey="Ruiz A">A.R.J. Ruiz</name>
</author>
<author>
<name sortKey="Granja, F S" uniqKey="Granja F">F.S. Granja</name>
</author>
<author>
<name sortKey="Honorato, J C P" uniqKey="Honorato J">J.C.P. Honorato</name>
</author>
<author>
<name sortKey="Guevara Rosas, J I" uniqKey="Guevara Rosas J">J.I. Guevara Rosas</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ascher, C" uniqKey="Ascher C">C. Ascher</name>
</author>
<author>
<name sortKey="Kessler, C" uniqKey="Kessler C">C. Kessler</name>
</author>
<author>
<name sortKey="Wankerl, M" uniqKey="Wankerl M">M. Wankerl</name>
</author>
<author>
<name sortKey="Trommer, G F" uniqKey="Trommer G">G.F. Trommer</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Nilsson, J O" uniqKey="Nilsson J">J.O. Nilsson</name>
</author>
<author>
<name sortKey="Gupta, A K" uniqKey="Gupta A">A.K. Gupta</name>
</author>
<author>
<name sortKey="Handel, P" uniqKey="Handel P">P. Handel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Harle, R" uniqKey="Harle R">R. Harle</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Yun, X" uniqKey="Yun X">X. Yun</name>
</author>
<author>
<name sortKey="Calusdian, J" uniqKey="Calusdian J">J. Calusdian</name>
</author>
<author>
<name sortKey="Bachmann, E R" uniqKey="Bachmann E">E.R. Bachmann</name>
</author>
<author>
<name sortKey="Mcghee, R B" uniqKey="Mcghee R">R.B. McGhee</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bancroft, J B" uniqKey="Bancroft J">J.B. Bancroft</name>
</author>
<author>
<name sortKey="Garrett, D" uniqKey="Garrett D">D. Garrett</name>
</author>
<author>
<name sortKey="Lachapelle, G" uniqKey="Lachapelle G">G. Lachapelle</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Avci, A" uniqKey="Avci A">A. Avci</name>
</author>
<author>
<name sortKey="Bosch, S" uniqKey="Bosch S">S. Bosch</name>
</author>
<author>
<name sortKey="Marin Perianu, M" uniqKey="Marin Perianu M">M. Marin-perianu</name>
</author>
<author>
<name sortKey="Marin Perianu, R" uniqKey="Marin Perianu R">R. Marin-perianu</name>
</author>
<author>
<name sortKey="Havinga, P" uniqKey="Havinga P">P. Havinga</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Choi, I" uniqKey="Choi I">I. Choi</name>
</author>
<author>
<name sortKey="Ricci, C" uniqKey="Ricci C">C. Ricci</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mannini, A" uniqKey="Mannini A">A. Mannini</name>
</author>
<author>
<name sortKey="Sabatini, A M" uniqKey="Sabatini A">A.M. Sabatini</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Porta, J P" uniqKey="Porta J">J.P. Porta</name>
</author>
<author>
<name sortKey="Acosta, D J" uniqKey="Acosta D">D.J. Acosta</name>
</author>
<author>
<name sortKey="Lehker, A N" uniqKey="Lehker A">A.N. Lehker</name>
</author>
<author>
<name sortKey="Miller, S T" uniqKey="Miller S">S.T. Miller</name>
</author>
<author>
<name sortKey="Tomaka, J" uniqKey="Tomaka J">J. Tomaka</name>
</author>
<author>
<name sortKey="King, G A" uniqKey="King G">G.A. King</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Tucker, W J" uniqKey="Tucker W">W.J. Tucker</name>
</author>
<author>
<name sortKey="Bhammar, D M" uniqKey="Bhammar D">D.M. Bhammar</name>
</author>
<author>
<name sortKey="Sawyer, B J" uniqKey="Sawyer B">B.J. Sawyer</name>
</author>
<author>
<name sortKey="Buman, M P" uniqKey="Buman M">M.P. Buman</name>
</author>
<author>
<name sortKey="Gaesser, G A" uniqKey="Gaesser G">G.A. Gaesser</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fong, D T" uniqKey="Fong D">D.T. Fong</name>
</author>
<author>
<name sortKey="Chan, Y" uniqKey="Chan Y">Y. Chan</name>
</author>
<author>
<name sortKey="Kong, H" uniqKey="Kong H">H. Kong</name>
</author>
<author>
<name sortKey="Hong, T" uniqKey="Hong T">T. Hong</name>
</author>
<author>
<name sortKey="Jockey, K" uniqKey="Jockey K">K. Jockey</name>
</author>
<author>
<name sortKey="Sports, C" uniqKey="Sports C">C. Sports</name>
</author>
<author>
<name sortKey="Centre, H S" uniqKey="Centre H">H.S. Centre</name>
</author>
<author>
<name sortKey="Kong, H" uniqKey="Kong H">H. Kong</name>
</author>
<author>
<name sortKey="Ho, A" uniqKey="Ho A">A. Ho</name>
</author>
<author>
<name sortKey="Ling, M" uniqKey="Ling M">M. Ling</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Casamassima, F" uniqKey="Casamassima F">F. Casamassima</name>
</author>
<author>
<name sortKey="Ferrari, A" uniqKey="Ferrari A">A. Ferrari</name>
</author>
<author>
<name sortKey="Milosevic, B" uniqKey="Milosevic B">B. Milosevic</name>
</author>
<author>
<name sortKey="Ginis, P" uniqKey="Ginis P">P. Ginis</name>
</author>
<author>
<name sortKey="Farella, E" uniqKey="Farella E">E. Farella</name>
</author>
<author>
<name sortKey="Rocchi, L" uniqKey="Rocchi L">L. Rocchi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bae, J" uniqKey="Bae J">J. Bae</name>
</author>
<author>
<name sortKey="Kong, K" uniqKey="Kong K">K. Kong</name>
</author>
<author>
<name sortKey="Byl, N" uniqKey="Byl N">N. Byl</name>
</author>
<author>
<name sortKey="Tomizuka, M" uniqKey="Tomizuka M">M. Tomizuka</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bae, J" uniqKey="Bae J">J. Bae</name>
</author>
<author>
<name sortKey="Tomizuka, M" uniqKey="Tomizuka M">M. Tomizuka</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Strohrmann, C" uniqKey="Strohrmann C">C. Strohrmann</name>
</author>
<author>
<name sortKey="Harms, H" uniqKey="Harms H">H. Harms</name>
</author>
<author>
<name sortKey="Troster, G" uniqKey="Troster G">G. Tröster</name>
</author>
<author>
<name sortKey="Hensler, S" uniqKey="Hensler S">S. Hensler</name>
</author>
<author>
<name sortKey="Muller, R" uniqKey="Muller R">R. Müller</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Chung, P C" uniqKey="Chung P">P.-C. Chung</name>
</author>
<author>
<name sortKey="Hsu, Y L" uniqKey="Hsu Y">Y.-L. Hsu</name>
</author>
<author>
<name sortKey="Wang, C Y" uniqKey="Wang C">C.-Y. Wang</name>
</author>
<author>
<name sortKey="Lin, C W" uniqKey="Lin C">C.-W. Lin</name>
</author>
<author>
<name sortKey="Wang, J S" uniqKey="Wang J">J.-S. Wang</name>
</author>
<author>
<name sortKey="Pai, M C" uniqKey="Pai M">M.-C. Pai</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schou, T" uniqKey="Schou T">T. Schou</name>
</author>
<author>
<name sortKey="Gardner, H J" uniqKey="Gardner H">H.J. Gardner</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schlomer, T" uniqKey="Schlomer T">T. Schlömer</name>
</author>
<author>
<name sortKey="Poppinga, B" uniqKey="Poppinga B">B. Poppinga</name>
</author>
<author>
<name sortKey="Henze, N" uniqKey="Henze N">N. Henze</name>
</author>
<author>
<name sortKey="Boll, S" uniqKey="Boll S">S. Boll</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Shum, H P H" uniqKey="Shum H">H.P.H. Shum</name>
</author>
<author>
<name sortKey="Komura, T" uniqKey="Komura T">T. Komura</name>
</author>
<author>
<name sortKey="Takagi, S" uniqKey="Takagi S">S. Takagi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Heinz, E A" uniqKey="Heinz E">E.A. Heinz</name>
</author>
<author>
<name sortKey="Kunze, K S" uniqKey="Kunze K">K.S. Kunze</name>
</author>
<author>
<name sortKey="Gruber, M" uniqKey="Gruber M">M. Gruber</name>
</author>
<author>
<name sortKey="Bannach, D" uniqKey="Bannach D">D. Bannach</name>
</author>
<author>
<name sortKey="Lukowicz, P" uniqKey="Lukowicz P">P. Lukowicz</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Board, E V" uniqKey="Board E">E.V. Board</name>
</author>
<author>
<name sortKey="Guide, U" uniqKey="Guide U">U. Guide</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Noureldin, A" uniqKey="Noureldin A">A. Noureldin</name>
</author>
<author>
<name sortKey="Karamat, T B" uniqKey="Karamat T">T.B. Karamat</name>
</author>
<author>
<name sortKey="Georgy, J" uniqKey="Georgy J">J. Georgy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Syed, Z F" uniqKey="Syed Z">Z.F. Syed</name>
</author>
<author>
<name sortKey="Aggarwal, P" uniqKey="Aggarwal P">P. Aggarwal</name>
</author>
<author>
<name sortKey="Goodall, C" uniqKey="Goodall C">C. Goodall</name>
</author>
<author>
<name sortKey="Niu, X" uniqKey="Niu X">X. Niu</name>
</author>
<author>
<name sortKey="El Sheimy, N" uniqKey="El Sheimy N">N. El-Sheimy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Shin, E H" uniqKey="Shin E">E.H. Shin</name>
</author>
<author>
<name sortKey="El Sheimy, N" uniqKey="El Sheimy N">N. El-Sheimy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Li, Y" uniqKey="Li Y">Y. Li</name>
</author>
<author>
<name sortKey="Niu, X" uniqKey="Niu X">X. Niu</name>
</author>
<author>
<name sortKey="Zhang, Q" uniqKey="Zhang Q">Q. Zhang</name>
</author>
<author>
<name sortKey="Zhang, H" uniqKey="Zhang H">H. Zhang</name>
</author>
<author>
<name sortKey="Shi, C" uniqKey="Shi C">C. Shi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gravina, R" uniqKey="Gravina R">R. Gravina</name>
</author>
<author>
<name sortKey="Alinia, P" uniqKey="Alinia P">P. Alinia</name>
</author>
<author>
<name sortKey="Ghasemzadeh, H" uniqKey="Ghasemzadeh H">H. Ghasemzadeh</name>
</author>
<author>
<name sortKey="Fortino, G" uniqKey="Fortino G">G. Fortino</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Foxlin, E" uniqKey="Foxlin E">E. Foxlin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Li, W" uniqKey="Li W">W. Li</name>
</author>
<author>
<name sortKey="Wang, J" uniqKey="Wang J">J. Wang</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wang, M" uniqKey="Wang M">M. Wang</name>
</author>
<author>
<name sortKey="Yang, Y" uniqKey="Yang Y">Y. Yang</name>
</author>
<author>
<name sortKey="Hatch, R R" uniqKey="Hatch R">R.R. Hatch</name>
</author>
<author>
<name sortKey="Zhang, Y" uniqKey="Zhang Y">Y. Zhang</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Godha, S" uniqKey="Godha S">S. Godha</name>
</author>
<author>
<name sortKey="Lachapelle, G" uniqKey="Lachapelle G">G. Lachapelle</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="El Sheimy, N" uniqKey="El Sheimy N">N. El-Sheimy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Safavian, S R" uniqKey="Safavian S">S.R. Safavian</name>
</author>
<author>
<name sortKey="Landgrebe, D" uniqKey="Landgrebe D">D. Landgrebe</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Chawla, N V" uniqKey="Chawla N">N.V. Chawla</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Larose, D T" uniqKey="Larose D">D.T. Larose</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Chang, C C" uniqKey="Chang C">C.-C. Chang</name>
</author>
<author>
<name sortKey="Lin, C J" uniqKey="Lin C">C.-J. Lin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hsu, C W" uniqKey="Hsu C">C.-W. Hsu</name>
</author>
<author>
<name sortKey="Chang, C C" uniqKey="Chang C">C.-C. Chang</name>
</author>
<author>
<name sortKey="Lin, C J" uniqKey="Lin C">C.-J. Lin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Suykens, J A K" uniqKey="Suykens J">J.A.K. Suykens</name>
</author>
<author>
<name sortKey="Vandewalle, J" uniqKey="Vandewalle J">J. Vandewalle</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">Sensors (Basel)</journal-id>
<journal-id journal-id-type="iso-abbrev">Sensors (Basel)</journal-id>
<journal-id journal-id-type="publisher-id">sensors</journal-id>
<journal-title-group>
<journal-title>Sensors (Basel, Switzerland)</journal-title>
</journal-title-group>
<issn pub-type="epub">1424-8220</issn>
<publisher>
<publisher-name>MDPI</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">27775673</article-id>
<article-id pub-id-type="pmc">5087537</article-id>
<article-id pub-id-type="doi">10.3390/s16101752</article-id>
<article-id pub-id-type="publisher-id">sensors-16-01752</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Design and Implementation of Foot-Mounted Inertial Sensor Based Wearable Electronic Device for Game Play Application</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Zhou</surname>
<given-names>Qifan</given-names>
</name>
<xref ref-type="aff" rid="af1-sensors-16-01752">1</xref>
<xref ref-type="aff" rid="af2-sensors-16-01752">2</xref>
<xref rid="c1-sensors-16-01752" ref-type="corresp">*</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Zhang</surname>
<given-names>Hai</given-names>
</name>
<xref ref-type="aff" rid="af1-sensors-16-01752">1</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Lari</surname>
<given-names>Zahra</given-names>
</name>
<xref ref-type="aff" rid="af2-sensors-16-01752">2</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Liu</surname>
<given-names>Zhenbo</given-names>
</name>
<xref ref-type="aff" rid="af2-sensors-16-01752">2</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>El-Sheimy</surname>
<given-names>Naser</given-names>
</name>
<xref ref-type="aff" rid="af2-sensors-16-01752">2</xref>
</contrib>
</contrib-group>
<contrib-group>
<contrib contrib-type="editor">
<name>
<surname>Passaro</surname>
<given-names>Vittorio M. N.</given-names>
</name>
<role>Academic Editor</role>
</contrib>
</contrib-group>
<aff id="af1-sensors-16-01752">
<label>1</label>
School of Automation Science and Electrical Engineering, Beihang University, Beijing 100191, China;
<email>zhanghai@buaa.edu.cn</email>
</aff>
<aff id="af2-sensors-16-01752">
<label>2</label>
Geomatics Engineering Department, University of Calgary, Calgary, AB T2N 1N4, Canada;
<email>zlari@ucalgary.ca</email>
(Z.L.);
<email>zhenbo.liu2@ucalgary.ca</email>
(Z.L.);
<email>elsheimy@ucalgary.ca</email>
(N.E.-S.)</aff>
<author-notes>
<corresp id="c1-sensors-16-01752">
<label>*</label>
Correspondence:
<email>qifan.zhou@ucalgary.ca</email>
; Tel.: +86-10-8233-9189</corresp>
</author-notes>
<pub-date pub-type="epub">
<day>21</day>
<month>10</month>
<year>2016</year>
</pub-date>
<pub-date pub-type="collection">
<month>10</month>
<year>2016</year>
</pub-date>
<volume>16</volume>
<issue>10</issue>
<elocation-id>1752</elocation-id>
<history>
<date date-type="received">
<day>18</day>
<month>9</month>
<year>2016</year>
</date>
<date date-type="accepted">
<day>18</day>
<month>10</month>
<year>2016</year>
</date>
</history>
<permissions>
<copyright-statement>© 2016 by the authors; licensee MDPI, Basel, Switzerland.</copyright-statement>
<copyright-year>2016</copyright-year>
<license>
<license-p>
<pmc-comment>CREATIVE COMMONS</pmc-comment>
This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by/4.0/">http://creativecommons.org/licenses/by/4.0/</ext-link>
).</license-p>
</license>
</permissions>
<abstract>
<p>Wearable electronic devices have seen increasing development with advances in the semiconductor industry and have received growing attention over the last decades. This paper presents the development and implementation of a novel inertial-sensor-based, foot-mounted wearable electronic device for a brand-new application: game playing. The main objective of the introduced system is to monitor and identify human foot stepping directions in real time and map these motions to player operations in games. The proposed system extends the application field of currently available wearable devices and introduces a convenient, portable medium for performing exercise in a more compelling way in the near future. This paper provides an overview of previously developed system platforms, introduces the main idea behind this novel application, and describes the implemented algorithm for identifying the direction of foot movement. Practical experimental results demonstrate that the proposed system is capable of recognizing five foot motions (jump, step left, step right, step forward, and step backward) and has achieved over 97% accuracy for different users. The functionality of the system for real-time application has also been verified through practical experiments.</p>
</abstract>
<kwd-group>
<kwd>wearable electronic device</kwd>
<kwd>foot moving direction</kwd>
<kwd>game play</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="sec1-sensors-16-01752">
<title>1. Introduction</title>
<p>In recent years, with the rapid development of MEMS (Micro-Electro-Mechanical System) technology, inertial sensor production has made a leap forward in chip-size miniaturization, low-cost manufacturing, low power consumption, and simplified operation. Due to these advancements, various types of MEMS inertial sensors have been adopted for multiple applications, such as vehicle and personal navigation [
<xref rid="B1-sensors-16-01752" ref-type="bibr">1</xref>
], motion tracking systems [
<xref rid="B2-sensors-16-01752" ref-type="bibr">2</xref>
], and consumer electronic devices (smartphones) [
<xref rid="B3-sensors-16-01752" ref-type="bibr">3</xref>
]. Wearable electronic devices, which have emerged during the last few years, also utilize low-cost MEMS inertial sensors and are becoming increasingly attractive in the consumer market.</p>
<p>Wearable electronic devices refer to electronic technologies or devices that are incorporated into items of clothing and accessories which can be comfortably worn [
<xref rid="B4-sensors-16-01752" ref-type="bibr">4</xref>
]. Generally, these devices can perform communications and allow wearers to access information about their activity and behavior. The foot-mounted inertial-sensor-based electronic device is one common type and has attracted attention for further study, development, and implementation. The application fields of foot-mounted wearable devices fall mainly into three categories: pedestrian navigation, recognition of daily or sports activities, and medical applications.</p>
<p>The foot-mounted personal navigation system is the most universal usage of this kind of device and has been reported several times before [
<xref rid="B5-sensors-16-01752" ref-type="bibr">5</xref>
,
<xref rid="B6-sensors-16-01752" ref-type="bibr">6</xref>
,
<xref rid="B7-sensors-16-01752" ref-type="bibr">7</xref>
]. It exploits the self-contained, autonomous nature of inertial sensors to derive a navigation solution, which effectively avoids the negative effects of environmental factors. Such a system is valuable for tracking firefighters, military personnel, first responders, and offenders, and for assisting visually impaired and blind people [
<xref rid="B8-sensors-16-01752" ref-type="bibr">8</xref>
]. The fundamental principle of such a system is to apply the Inertial Navigation System (INS) mechanization equations to calculate the navigation parameters (i.e., position, velocity, and attitude), combined with the Zero Velocity Update (ZUPT) technique to mitigate accumulated error and estimate sensor errors [
<xref rid="B9-sensors-16-01752" ref-type="bibr">9</xref>
,
<xref rid="B10-sensors-16-01752" ref-type="bibr">10</xref>
]. In order to improve the positioning performance, several other technical algorithms or estimation approaches, such as particle filter [
<xref rid="B11-sensors-16-01752" ref-type="bibr">11</xref>
], integration with RFID (Radio Frequency Identification) measurements [
<xref rid="B12-sensors-16-01752" ref-type="bibr">12</xref>
], map matching [
<xref rid="B13-sensors-16-01752" ref-type="bibr">13</xref>
], and improvement on hardware structure [
<xref rid="B14-sensors-16-01752" ref-type="bibr">14</xref>
] are also employed in foot-mounted navigation devices. However, due to the requirement of a known initial position and the INS error accumulated through integration [
<xref rid="B15-sensors-16-01752" ref-type="bibr">15</xref>
], foot-mounted navigation systems are rarely capable of acquiring a long-term stable solution, which limits their further industrial utilization. </p>
<p>Additionally, foot-mounted wearable devices are applied to tracking sports exercises and daily activities. With an inertial sensor attached to the shoe, the gait cycle, the daily energy expenditure of activities (e.g., running and walking), and sports performance can be fed back to the user [
<xref rid="B16-sensors-16-01752" ref-type="bibr">16</xref>
,
<xref rid="B17-sensors-16-01752" ref-type="bibr">17</xref>
]. The activity recognition process basically consists of segmenting the data, extracting features, and classifying human motions [
<xref rid="B18-sensors-16-01752" ref-type="bibr">18</xref>
]. Wang designed a walking-pattern classifier to determine the phases of a walking cycle: stance, push-off, swing, and heel-strike. Chen employed a Hidden Markov Model (HMM) to pre-process the inertial sensor data and classify common activities: standing, walking, going upstairs/downstairs, jogging, and running; other similar research can be found in the literature [
<xref rid="B19-sensors-16-01752" ref-type="bibr">19</xref>
,
<xref rid="B20-sensors-16-01752" ref-type="bibr">20</xref>
]. In industry, miCoach [
<xref rid="B21-sensors-16-01752" ref-type="bibr">21</xref>
] and Nike+ [
<xref rid="B22-sensors-16-01752" ref-type="bibr">22</xref>
] are examples of commercial foot-mounted wearable products for monitoring sports activities. These fitness products provide the user with information on speed, distance, and energy cost, and have achieved tremendous popularity among users. However, they are limited to providing the users’ general motion information, such as discriminating movement from rest, classifying activities (i.e., running, walking, and sleeping), and quantifying general movement intensity (i.e., percentage of time spent moving, sleeping, and sitting) [
<xref rid="B23-sensors-16-01752" ref-type="bibr">23</xref>
]. These systems are merely data recorders or monitors, and their performance does not directly affect the user experience because the user cannot tell whether the motion identification result is accurate.</p>
<p>In the medical field, since gait disturbances are very common in patients with Parkinson’s disease, several research works focus on Parkinson’s patients who suffer from walking abnormalities, aiming to help with diagnosis, monitoring, and rehabilitation. Filippo [
<xref rid="B24-sensors-16-01752" ref-type="bibr">24</xref>
] proposed a system that provides real-time computation of gait features and feeds them back to the user, with the purpose of helping him/her execute the most effective gait pattern. Joonbum [
<xref rid="B25-sensors-16-01752" ref-type="bibr">25</xref>
] made use of pressure sensors to monitor patients’ gait by observing the ground reaction force (GRF) and the center of GRF, giving quantitative information on gait abnormality. He [
<xref rid="B26-sensors-16-01752" ref-type="bibr">26</xref>
] improved this work by integrating an Inertial Measurement Unit (IMU), employing an HMM to identify gait phases, and developing it into a tele-monitoring system. Moreover, Strohrmann [
<xref rid="B27-sensors-16-01752" ref-type="bibr">27</xref>
] utilized the motion data measured by an inertial sensor for runners’ kinematic analysis, to avoid risks provoked by fatigue or improper technique. Chung [
<xref rid="B28-sensors-16-01752" ref-type="bibr">28</xref>
] compared the motion data of Alzheimer’s patients and healthy people collected during walking, and concluded that the Alzheimer’s patients exhibited a significantly shorter mean stride length and slower mean gait speed than the healthy controls. The foot-mounted device, which provides continuous physical monitoring in any environment, is beneficial in shortening the patient’s hospital stay, improving both recovery and diagnosis reliability, and raising patients’ quality of life. </p>
<p>This paper aims to introduce a novel application of the foot-mounted wearable electronic device: game play. Inertial sensors have been successfully applied in game play scenarios before and have received massive attention. The most popular and famous example is the Wii Remote controller, which integrates infrared and three-axis accelerometer information to capture users’ hand motions and enables them to play games such as golf, tennis, bowling, etc. [
<xref rid="B29-sensors-16-01752" ref-type="bibr">29</xref>
,
<xref rid="B30-sensors-16-01752" ref-type="bibr">30</xref>
]. The MotionCore controller, proposed by Movea, can play the role of an air mouse and is employed to play Fruit Ninja or shooting games. Shum [
<xref rid="B31-sensors-16-01752" ref-type="bibr">31</xref>
] introduced a fast accelerometer-based motion recognition approach and applied this technology to play a boxing game. Ernst [
<xref rid="B32-sensors-16-01752" ref-type="bibr">32</xref>
] attempted an initial experiment of using wearable inertial sensors in martial arts games. These inertial sensor based game applications all operate in a hand-operated mode, in which users shake and swing their hands to interact with the game in real time. To the best of the authors’ knowledge, attaching an inertial sensor to a shoe and using foot motion to play games has not been discussed in previous academic work or industrial products; hence, such a game play manner is a novelty. The main idea behind the proposed system is to identify human foot stepping directions in real time and map these directions to control the virtual player’s actions in the game. In our work, a single-IMU configuration is selected to make the device convenient and comfortable to wear. The proposed motion identification procedure is implemented in three successive steps: (1) the collected dynamic data are preprocessed to compensate for sensor errors (i.e., bias, scale factor error, and non-orthogonality error) and to correct the inertial sensor misalignment introduced during placement; (2) the peak points of the acceleration norm are detected for acceleration data segmentation, and selected features (e.g., mean, variance, position change, etc.) in the vicinity of each peak point are extracted; and (3) the extracted features are fed into a machine learning process to train the classifier. Notably, to improve the robustness of the proposed system, each stepping motion type has its own corresponding classifier.</p>
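The segmentation and feature extraction steps described above can be sketched in C++ as follows. This is a minimal illustration rather than the authors’ implementation; the peak threshold, window size and feature set (mean and variance of the acceleration norm) are assumed values.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

struct Sample { double ax, ay, az; };

// Norm of one 3-axis acceleration sample.
double norm(const Sample& s) {
    return std::sqrt(s.ax * s.ax + s.ay * s.ay + s.az * s.az);
}

// Step (2): detect local maxima of the acceleration norm above a
// threshold; each peak marks one candidate step event.
std::vector<std::size_t> detectPeaks(const std::vector<Sample>& data,
                                     double threshold) {
    std::vector<std::size_t> peaks;
    for (std::size_t i = 1; i + 1 < data.size(); ++i) {
        double n = norm(data[i]);
        if (n > threshold && n >= norm(data[i - 1]) && n > norm(data[i + 1]))
            peaks.push_back(i);
    }
    return peaks;
}

// Step (2) continued: extract simple features (mean and variance of the
// norm) in a window around a detected peak, to be fed to a classifier.
struct Features { double mean, variance; };

Features extractFeatures(const std::vector<Sample>& data,
                         std::size_t peak, std::size_t halfWindow) {
    std::size_t lo = peak > halfWindow ? peak - halfWindow : 0;
    std::size_t hi = std::min(peak + halfWindow + 1, data.size());
    double sum = 0.0, sumSq = 0.0;
    for (std::size_t i = lo; i < hi; ++i) {
        double n = norm(data[i]);
        sum += n;
        sumSq += n * n;
    }
    double count = static_cast<double>(hi - lo);
    double mean = sum / count;
    return {mean, sumSq / count - mean * mean};
}
```

In the actual system each stepping motion type would have its own trained classifier consuming such feature vectors.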
<p>The advantages of the proposed system can be described as follows: (1) it extends current foot-mounted electronic wearable devices beyond pedestrian navigation and human activity recognition and monitoring into the game play field; (2) some kinetic games will no longer be limited to a confined space (i.e., a living room) and specific game consoles (i.e., Xbox or Wii), and will instead be playable almost anywhere, anytime, on various terminals (i.e., smartphones or tablets); and (3) it introduces the possibility of building low-cost, portable, real-time wearable exercising and entertainment platforms, with which people can conveniently perform virtual sports or exercise in an engaging manner without environmental constraints.</p>
<p>The paper is structured as follows:
<xref ref-type="sec" rid="sec2-sensors-16-01752">Section 2</xref>
introduces the main concept of the proposed system.
<xref ref-type="sec" rid="sec3-sensors-16-01752">Section 3</xref>
describes the system overview with the introduction of both hardware and software platforms.
<xref ref-type="sec" rid="sec4-sensors-16-01752">Section 4</xref>
illustrates the specific implementation of the foot motion detection algorithm.
<xref ref-type="sec" rid="sec5-sensors-16-01752">Section 5</xref>
shows the experimental results and analysis.
<xref ref-type="sec" rid="sec6-sensors-16-01752">Section 6</xref>
presents conclusions and provides recommendations for future research work.</p>
</sec>
<sec id="sec2-sensors-16-01752">
<title>2. Main Concept</title>
<p>A commonly-used game operating mode is that the user controls the character’s movements (i.e., forward, backward, left or right) to avoid obstacles or collect more points, as in popular smartphone running games such as Temple Run and Subway Surfers. Following this operation mode, the main concept of the foot-mounted system is to utilize the user’s steps to control the virtual player in the game, instead of using the conventional manner (i.e., finger sliding and button presses). Specifically, the inertial sensor is attached to the user’s foot and the sensor data are collected during the moving phase; then, the stepping direction is derived from the collected data and used to control the character’s movement in the game.
<xref ref-type="fig" rid="sensors-16-01752-f001">Figure 1</xref>
illustrates the main concept of the proposed system.</p>
<p>This illustration shows that the human foot’s kinetic motions, detected by an inertial sensor, substitute for the traditional game controller. Concretely, the user’s stepping forward or jumping corresponds to pressing the up button (or sliding the finger up); stepping backward corresponds to pressing the down button (or sliding the finger down). Similarly, stepping left or right corresponds to pressing the left or right button (or sliding the finger left or right).</p>
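The correspondence between foot motions and game controls described above can be expressed as a simple mapping; the enum and function names below are illustrative, not taken from the actual software.

```cpp
#include <cassert>
#include <string>

// Detected foot motion types, as output by the step classifier.
enum class FootMotion { StepForward, Jump, StepBackward, StepLeft, StepRight };

// Map each motion to the game control it substitutes for, following the
// correspondence described in the text (forward/jump -> up, etc.).
std::string toGameControl(FootMotion m) {
    switch (m) {
        case FootMotion::StepForward:
        case FootMotion::Jump:         return "UP";
        case FootMotion::StepBackward: return "DOWN";
        case FootMotion::StepLeft:     return "LEFT";
        case FootMotion::StepRight:    return "RIGHT";
    }
    return "NONE";
}
```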
<p>This system has demanding real-time and detection accuracy requirements, because any lag or false detection of steps will prevent the user from playing the game normally and contribute to a poor user experience. Hence, the main challenge of this system is to correctly determine the step motion and moving direction with little delay when a step event happens, and to synchronize those motions with the game controls so as to provide favorable feedback to the user. Moreover, due to the diversity of shoe styles, sensor mounting manners and user habits, system robustness and algorithm compatibility are further challenges to overcome.</p>
</sec>
<sec id="sec3-sensors-16-01752">
<title>3. System Architecture</title>
<p>The proposed system architecture is shown in
<xref ref-type="fig" rid="sensors-16-01752-f002">Figure 2</xref>
. In this system, the foot movement dynamic data are captured by the inertial sensor and then wirelessly transmitted to various kinds of terminals (i.e., smartphones, tablets, computers, and smart TVs) through Bluetooth 4.0. The software, which is compatible with different platforms, plays the role of receiving data, performing the step motion detection algorithm, and interacting with games. Both the hardware and software platforms are included in this system and are described as follows.</p>
<sec id="sec3dot1-sensors-16-01752">
<title>3.1. Hardware Platform</title>
<p>The system hardware platform mainly combines a CC2540 microprocessor (Texas Instruments, Dallas, TX, USA), an MPU9150 9-axis inertial sensor (InvenSense, Sunnyvale, CA, USA), and other necessary electronic components. The CC2540 [
<xref rid="B33-sensors-16-01752" ref-type="bibr">33</xref>
] is a 2.4 GHz Bluetooth Low Energy System on Chip (SoC) with a high-performance, low-power 8051 microcontroller. It can run both the application and the BLE (Bluetooth Low Energy) protocol stack, so it is compatible with multiple mobile devices (i.e., smartphones and tablets). The MPU9150 [
<xref rid="B34-sensors-16-01752" ref-type="bibr">34</xref>
] is an integrated nine-axis MEMS motion tracking device that combines a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer.
<xref ref-type="fig" rid="sensors-16-01752-f003">Figure 3</xref>
shows the system hardware platform. In our system, the three tasks of the hardware platform are to read the inertial sensor data through the I2C interface at a pre-set sampling frequency (200 Hz), to package the data in a pre-defined user protocol, and to send the data via Bluetooth to the host. </p>
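The pre-defined user protocol is not specified in the text, so the following sketch shows one plausible framing of a sensor packet (header byte, length, payload, additive checksum); the field layout and header value are assumptions for illustration only.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Hypothetical frame layout (not the authors' actual protocol):
// [0xA5 header][payload length][payload bytes...][8-bit additive checksum]
std::vector<uint8_t> packFrame(const std::vector<uint8_t>& payload) {
    std::vector<uint8_t> frame;
    frame.push_back(0xA5);                                   // sync/header byte
    frame.push_back(static_cast<uint8_t>(payload.size()));   // payload length
    uint8_t sum = 0;
    for (uint8_t b : payload) {
        frame.push_back(b);
        sum = static_cast<uint8_t>(sum + b);                 // running checksum
    }
    frame.push_back(sum);
    return frame;
}
```

On the host side, the receiving thread would scan for the header byte, verify the checksum, and discard corrupted frames.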
</sec>
<sec id="sec3dot2-sensors-16-01752">
<title>3.2. Software Platform</title>
<p>The system software platform is developed in the C++ programming language in Visual Studio. The main functions of the software include the following: receiving and decoding data, logging the user’s motion data, calculating the attitude, running the human foot motion detection algorithm, and interacting with the game controls. For real-time processing, a multi-threaded program is designed to implement the listed tasks simultaneously. Multithreading is a widespread programming and execution model that allows multiple threads to exist within the context of a single process. These threads share the processor’s resources but execute their functions independently. This multi-threaded software guarantees the whole system’s real-time operation and, moreover, introduces a clear structure, which is beneficial for further revision or development.</p>
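The multi-threaded design can be sketched as a producer/consumer structure, in which the receiving thread pushes decoded packets into a shared queue and the processing thread consumes them. The class below is a minimal illustration under that assumption, not the actual software architecture.

```cpp
#include <cassert>
#include <condition_variable>
#include <cstdint>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Thread-safe queue shared between the receiving thread (producer) and
// the detection/processing thread (consumer).
class PacketQueue {
public:
    void push(std::vector<uint8_t> p) {
        std::lock_guard<std::mutex> lk(mtx_);
        q_.push(std::move(p));
        cv_.notify_one();           // wake a waiting consumer
    }
    std::vector<uint8_t> pop() {
        std::unique_lock<std::mutex> lk(mtx_);
        cv_.wait(lk, [this] { return !q_.empty(); });  // block until data
        std::vector<uint8_t> p = std::move(q_.front());
        q_.pop();
        return p;
    }
private:
    std::queue<std::vector<uint8_t>> q_;
    std::mutex mtx_;
    std::condition_variable cv_;
};
```

Decoupling reception from processing this way keeps Bluetooth reads from being delayed by the detection algorithm, which matters for the real-time requirement.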
</sec>
</sec>
<sec id="sec4-sensors-16-01752">
<title>4. Methodology</title>
<p>The inertial sensor is attached to the human foot, and the measured rotation and acceleration information is used for stepping direction classification. The motion recognition process of the proposed system is illustrated in
<xref ref-type="fig" rid="sensors-16-01752-f004">Figure 4</xref>
.</p>
<p>As shown in
<xref ref-type="fig" rid="sensors-16-01752-f004">Figure 4</xref>
, the identification process is executed as follows: first, the collected raw inertial data are pre-processed for error compensation, noise reduction and misalignment elimination; second, the peak points of the norm of the 3-axis acceleration are detected to segment the data; and, finally, the selected features in each divided data segment are extracted and fed into the classifier to derive the foot motion type. A detailed description of each procedure is provided in the following subsections. </p>
<sec id="sec4dot1-sensors-16-01752">
<title>4.1. Preprocessing</title>
<p>MEMS inertial sensors have the advantages of small size and low cost; however, they suffer from various error sources, which negatively affect their performance. Therefore, calibration experiments are indispensable to remove deterministic errors, such as bias, scale factor and misalignment, before using a MEMS sensor. The inertial sensor error model [
<xref rid="B35-sensors-16-01752" ref-type="bibr">35</xref>
] is employed for the error compensation and is described as follows:
<disp-formula id="FD1-sensors-16-01752">
<label>(1)</label>
<mml:math id="mm1">
<mml:mtable columnalign="left">
<mml:mtr>
<mml:mtd>
<mml:msup>
<mml:mover accent="true">
<mml:mi>ω</mml:mi>
<mml:mo>˜</mml:mo>
</mml:mover>
<mml:mi>b</mml:mi>
</mml:msup>
<mml:mo>=</mml:mo>
<mml:mi>ω</mml:mi>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mi>b</mml:mi>
<mml:mi>ω</mml:mi>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mi>S</mml:mi>
<mml:mi>ω</mml:mi>
</mml:msub>
<mml:mo></mml:mo>
<mml:msup>
<mml:mi>ω</mml:mi>
<mml:mi>b</mml:mi>
</mml:msup>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mi>ω</mml:mi>
</mml:msub>
<mml:mo></mml:mo>
<mml:msup>
<mml:mi>ω</mml:mi>
<mml:mi>b</mml:mi>
</mml:msup>
<mml:mo>+</mml:mo>
<mml:mi>ε</mml:mi>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>ω</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msup>
<mml:mover accent="true">
<mml:mi>f</mml:mi>
<mml:mo>˜</mml:mo>
</mml:mover>
<mml:mi>b</mml:mi>
</mml:msup>
<mml:mo>=</mml:mo>
<mml:mi>f</mml:mi>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mi>b</mml:mi>
<mml:mi>f</mml:mi>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mi>S</mml:mi>
<mml:mi>f</mml:mi>
</mml:msub>
<mml:mo></mml:mo>
<mml:mi>f</mml:mi>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mi>f</mml:mi>
</mml:msub>
<mml:mo></mml:mo>
<mml:mi>f</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>ε</mml:mi>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>f</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
where
<inline-formula>
<mml:math id="mm2">
<mml:mrow>
<mml:msup>
<mml:mover accent="true">
<mml:mi>f</mml:mi>
<mml:mo>˜</mml:mo>
</mml:mover>
<mml:mi>b</mml:mi>
</mml:msup>
<mml:mo>,</mml:mo>
<mml:msup>
<mml:mover accent="true">
<mml:mi>ω</mml:mi>
<mml:mo>˜</mml:mo>
</mml:mover>
<mml:mi>b</mml:mi>
</mml:msup>
</mml:mrow>
</mml:math>
</inline-formula>
denote the measured specific force and angular rate, and
<inline-formula>
<mml:math id="mm3">
<mml:mrow>
<mml:msup>
<mml:mi>f</mml:mi>
<mml:mi>b</mml:mi>
</mml:msup>
<mml:mo>,</mml:mo>
<mml:msup>
<mml:mi>ω</mml:mi>
<mml:mi>b</mml:mi>
</mml:msup>
</mml:mrow>
</mml:math>
</inline-formula>
denote the true specific force and angular velocity.
<inline-formula>
<mml:math id="mm4">
<mml:mrow>
<mml:msub>
<mml:mi>b</mml:mi>
<mml:mi>a</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>b</mml:mi>
<mml:mi>ω</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
, respectively, denote the accelerometer and gyroscope instrument biases;
<inline-formula>
<mml:math id="mm5">
<mml:mrow>
<mml:msub>
<mml:mi>S</mml:mi>
<mml:mi>b</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>S</mml:mi>
<mml:mi>ω</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
separately denote the linear scale factor error matrices of the gyroscope and accelerometer; and
<inline-formula>
<mml:math id="mm6">
<mml:mrow>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mi>b</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mi>ω</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
denote the matrices representing the axes’ non-orthogonality.
<inline-formula>
<mml:math id="mm7">
<mml:mrow>
<mml:mi>ε</mml:mi>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>ω</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
<mml:mo>,</mml:mo>
<mml:mi>ε</mml:mi>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>f</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:math>
</inline-formula>
denote the stochastic errors of the sensors. The parameters
<inline-formula>
<mml:math id="mm8">
<mml:mrow>
<mml:msub>
<mml:mi>S</mml:mi>
<mml:mi>b</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>S</mml:mi>
<mml:mi>ω</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>b</mml:mi>
<mml:mi>a</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>b</mml:mi>
<mml:mi>ω</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mi>b</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mi>ω</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
can be derived through a calibration experiment before sensor usage [
<xref rid="B36-sensors-16-01752" ref-type="bibr">36</xref>
,
<xref rid="B37-sensors-16-01752" ref-type="bibr">37</xref>
]. With a hand rotating calibration scheme, the experiment can be accomplished in approximately one minute [
<xref rid="B38-sensors-16-01752" ref-type="bibr">38</xref>
]. </p>
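A simplified version of the compensation implied by Equation (1) can be sketched as follows, keeping only the bias and a diagonal scale factor and omitting the non-orthogonality matrix and stochastic noise; this per-axis form is an assumption for illustration, not the full calibration model.

```cpp
#include <array>
#include <cassert>
#include <cmath>

// Simplified inversion of Equation (1): per axis,
//   measurement = true + bias + scale * true
// so the compensated value is
//   true = (measurement - bias) / (1 + scale)
// (cross-axis non-orthogonality terms are omitted here).
std::array<double, 3> compensate(const std::array<double, 3>& meas,
                                 const std::array<double, 3>& bias,
                                 const std::array<double, 3>& scale) {
    std::array<double, 3> out{};
    for (int i = 0; i < 3; ++i)
        out[i] = (meas[i] - bias[i]) / (1.0 + scale[i]);
    return out;
}
```

The bias and scale values would come from the calibration experiment described above; the same form applies to both the gyroscope and the accelerometer channels.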
<p>In the proposed system, the IMU is attached to the shoe to detect the user’s foot motions and control the game. However, due to the differences between shoe styles and sensor placements, the IMU orientation (pitch and roll) varies when mounted on different users’ shoes, which causes misalignment across users.</p>
<p>Hence, in order to achieve satisfactory identification results for different shoe styles or placement manners, data would have to be collected under various attachment conditions and put into the training process to derive the classifier. However, this process is time-consuming, and the performance is not guaranteed if the sensor is attached in a new placement that is not included in the training set. </p>
<p>To avoid such drawbacks, we propose to project the measured acceleration and rotation data from the sensor frame (shoe frame) to the user frame, where the user frame takes the user’s right, forward and up directions as its three axes to construct a right-handed coordinate system. Thus, no matter how the inertial sensor is placed on the shoe (the sensor frame is always different), the measured data can be unified and expressed in the same coordinate frame. During sensor installation, the forward axis of the IMU (
<italic>y</italic>
-axis in the proposed system) is always aligned with the foot’s forward moving direction, so we only need to consider the misalignment of the pitch and roll angles. This proposed data transformation from the sensor frame to the user frame effectively eliminates the misalignment caused by different shoe styles and sensor placements because it aligns all the collected data in the same frame.
<xref ref-type="fig" rid="sensors-16-01752-f005">Figure 5</xref>
shows this process.</p>
<p>
<xref ref-type="fig" rid="sensors-16-01752-f005">Figure 5</xref>
shows the alignment process with the rotation matrix
<inline-formula>
<mml:math id="mm9">
<mml:mrow>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mi>b</mml:mi>
<mml:mi>n</mml:mi>
</mml:msubsup>
</mml:mrow>
</mml:math>
</inline-formula>
, where the inertial data, collected under different misalignment conditions, are expressed in the same frame (Right-Forward-Up). More importantly, the data expressed in this frame directly reflect the user’s actual moving direction in the horizontal plane, which provides a better data basis for the subsequent signal processing and is beneficial for achieving a more robust result. </p>
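The frame alignment can be illustrated with a single elementary rotation: assuming the forward axis is already aligned, a pitch (or roll) rotation matrix maps a sensor-frame vector into the Right-Forward-Up user frame. The rotation axis and sign convention below are assumptions for illustration.

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Vec3 = std::array<double, 3>;
using Mat3 = std::array<std::array<double, 3>, 3>;

// Elementary rotation about the x-axis by angle a (radians). With the
// forward axis aligned during installation, rotations of this kind
// (pitch and roll) suffice to correct the mounting misalignment.
Mat3 rotX(double a) {
    double c = std::cos(a), s = std::sin(a);
    Mat3 m{};
    m[0] = {1.0, 0.0, 0.0};
    m[1] = {0.0, c, -s};
    m[2] = {0.0, s, c};
    return m;
}

// Apply the alignment matrix: v_user = C * v_sensor.
Vec3 apply(const Mat3& C, const Vec3& v) {
    Vec3 out{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            out[i] += C[i][j] * v[j];
    return out;
}
```

In the actual system the full matrix would be built from the pitch and roll angles estimated by the attitude filter described next.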
<p>Therefore, a reliable and accurate attitude result is significant and necessary, since it is used to correctly project the inertial measurements onto the user frame with the rotation matrix, performing the data standardization process (alignment in the same frame) and consequently supporting a dependable feature extraction step. Given the initial attitude and the gyroscope measurements, the orientation can be derived by integrating the angular velocity measured by the 3-axis gyroscope. However, due to the errors of the MEMS gyroscope, the attitude result drifts quickly with time and cannot provide a long-term solution. On the other hand, the accelerometer can provide attitude angles without suffering from long-term drift, which is complementary to the gyroscope and effective in compensating for the attitude drift error. Hence, an attitude filter is used to integrate the gyroscope and accelerometer measurements together and derive a drift-free attitude solution. A Kalman filter is used to blend the information in a feature-level fusion [
<xref rid="B39-sensors-16-01752" ref-type="bibr">39</xref>
]. The dynamic model, the measurement model of the filter and the implemented adaptive measurement noise tuning strategy are subsequently described as follows. </p>
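While the proposed system uses a Kalman filter, the gyroscope/accelerometer complementarity it exploits can be illustrated with a scalar complementary filter: the gyro integral tracks fast motion, while the accelerometer-derived angle bounds the long-term drift. The gain value below is an assumed tuning parameter, not taken from the paper.

```cpp
#include <cassert>
#include <cmath>

// One update of a scalar complementary filter for a single attitude
// angle: integrate the gyro rate over dt, then blend with the
// accelerometer-derived angle. alpha close to 1 trusts the gyro for
// short-term dynamics; (1 - alpha) lets the accelerometer slowly
// correct the drift.
double fuse(double anglePrev, double gyroRate, double accelAngle,
            double dt, double alpha = 0.98) {
    double gyroAngle = anglePrev + gyroRate * dt;   // short-term propagation
    return alpha * gyroAngle + (1.0 - alpha) * accelAngle;  // drift correction
}
```

The Kalman filter described in the following subsections plays the same role, but with the blending gain computed optimally from the dynamic and measurement models instead of being fixed.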
<sec id="sec4dot1dot1-sensors-16-01752">
<title>4.1.1. Dynamic Model</title>
<p>The attitude angle error model, in which the state is the angle difference between the true navigation frame and the computed navigation frame, is employed as the dynamic model [
<xref rid="B40-sensors-16-01752" ref-type="bibr">40</xref>
]. This model is expressed in linear form and is easy to implement. The 3-axis gyro biases are also included in the dynamic model; they are estimated in the filter and used in the feedback loop to mitigate the error in the raw measurements. The equation of the dynamic model is written as:
<disp-formula id="FD2-sensors-16-01752">
<label>(2)</label>
<mml:math id="mm10">
<mml:mtable columnalign="left">
<mml:mtr>
<mml:mtd>
<mml:mover accent="true">
<mml:mi>ψ</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mo>=</mml:mo>
<mml:mi>ψ</mml:mi>
<mml:mo>×</mml:mo>
<mml:msubsup>
<mml:mi>ω</mml:mi>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mi>n</mml:mi>
</mml:mrow>
<mml:mi>n</mml:mi>
</mml:msubsup>
<mml:mo>+</mml:mo>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mi>b</mml:mi>
<mml:mi>n</mml:mi>
</mml:msubsup>
<mml:msup>
<mml:mi>ε</mml:mi>
<mml:mi>b</mml:mi>
</mml:msup>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msup>
<mml:mover accent="true">
<mml:mi>ε</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mi>b</mml:mi>
</mml:msup>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>/</mml:mo>
<mml:msub>
<mml:mi>τ</mml:mi>
<mml:mi>b</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:msup>
<mml:mi>ε</mml:mi>
<mml:mi>b</mml:mi>
</mml:msup>
<mml:mtext></mml:mtext>
<mml:mo>+</mml:mo>
<mml:mtext></mml:mtext>
<mml:msub>
<mml:mi>ω</mml:mi>
<mml:mi>b</mml:mi>
</mml:msub>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
where
<inline-formula>
<mml:math id="mm11">
<mml:mi>ψ</mml:mi>
</mml:math>
</inline-formula>
denotes the attitude error.
<inline-formula>
<mml:math id="mm12">
<mml:mrow>
<mml:msubsup>
<mml:mi>ω</mml:mi>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mi>n</mml:mi>
</mml:mrow>
<mml:mi>n</mml:mi>
</mml:msubsup>
</mml:mrow>
</mml:math>
</inline-formula>
denotes the
<italic>n</italic>
-frame rotation angular rate vector relative to the inertial frame (
<italic>i</italic>
-frame) expressed in the
<italic>n</italic>
-frame.
<inline-formula>
<mml:math id="mm13">
<mml:mrow>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mi>b</mml:mi>
<mml:mi>n</mml:mi>
</mml:msubsup>
</mml:mrow>
</mml:math>
</inline-formula>
denotes the Direction Cosine Matrix (DCM) from
<italic>b</italic>
-frame (i.e., the body frame) to
<italic>n</italic>
-frame (i.e., the navigation frame). The symbol “×” denotes cross product of two vectors.
<inline-formula>
<mml:math id="mm14">
<mml:mrow>
<mml:msup>
<mml:mi>ε</mml:mi>
<mml:mi>b</mml:mi>
</mml:msup>
</mml:mrow>
</mml:math>
</inline-formula>
denotes the gyro output error. Here, we only consider the effect of the gyro bias, which is modeled as a first-order Gauss-Markov process. Finally,
<inline-formula>
<mml:math id="mm15">
<mml:mrow>
<mml:msub>
<mml:mi>τ</mml:mi>
<mml:mi>b</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
denotes the correlation time of the gyro biases and
<inline-formula>
<mml:math id="mm16">
<mml:mrow>
<mml:msub>
<mml:mi>ω</mml:mi>
<mml:mi>b</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
is the driving noise vector.</p>
</sec>
<sec id="sec4dot1dot2-sensors-16-01752">
<title>4.1.2. Measurement Model</title>
<p>The acceleration residuals in the body frame are used to derive the system measurement model. In our model, instead of using the attitude difference separately derived from the accelerometer and gyroscope, the acceleration difference is applied to avoid the singularity problem when the pitch angle is ±90° [
<xref rid="B41-sensors-16-01752" ref-type="bibr">41</xref>
]. The acceleration residuals in the body frame are defined as the difference between the direct accelerometer measurements and the projection of the local gravity onto the body frame:
<disp-formula id="FD3-sensors-16-01752">
<label>(3)</label>
<mml:math id="mm17">
<mml:mtable columnalign="left">
<mml:mtr>
<mml:mtd>
<mml:mi>δ</mml:mi>
<mml:mi>a</mml:mi>
<mml:mo>=</mml:mo>
<mml:msubsup>
<mml:mi>a</mml:mi>
<mml:mi>m</mml:mi>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:mo></mml:mo>
<mml:msubsup>
<mml:mi>a</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msubsup>
<mml:mi>a</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:mo>=</mml:mo>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:msup>
<mml:mi>a</mml:mi>
<mml:mi>n</mml:mi>
</mml:msup>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
where
<inline-formula>
<mml:math id="mm18">
<mml:mrow>
<mml:msubsup>
<mml:mi>a</mml:mi>
<mml:mi>m</mml:mi>
<mml:mi>b</mml:mi>
</mml:msubsup>
</mml:mrow>
</mml:math>
</inline-formula>
denotes the accelerometer measurement.
<inline-formula>
<mml:math id="mm19">
<mml:mrow>
<mml:msubsup>
<mml:mi>a</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
</mml:mrow>
</mml:math>
</inline-formula>
denotes the local gravity acceleration projected onto the body frame using the gyro-derived rotation matrix
<inline-formula>
<mml:math id="mm20">
<mml:mrow>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
</mml:mrow>
</mml:math>
</inline-formula>
. The subscript
<inline-formula>
<mml:math id="mm21">
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
denotes the computed frame. According to the DCM chain rule,
<inline-formula>
<mml:math id="mm22">
<mml:mrow>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mi>n</mml:mi>
<mml:mi>b</mml:mi>
</mml:msubsup>
</mml:mrow>
</mml:math>
</inline-formula>
is expressed as:
<disp-formula id="FD4-sensors-16-01752">
<label>(4)</label>
<mml:math id="mm23">
<mml:mtable columnalign="left">
<mml:mtr>
<mml:mtd>
<mml:mtext></mml:mtext>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mi>n</mml:mi>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:mo>=</mml:mo>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mi>n</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msubsup>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mi>n</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msubsup>
<mml:mo>=</mml:mo>
<mml:mi>I</mml:mi>
<mml:mo></mml:mo>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ψ</mml:mi>
<mml:mo>×</mml:mo>
<mml:mo stretchy="false">]</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
where
<inline-formula>
<mml:math id="mm24">
<mml:mrow>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ψ</mml:mi>
<mml:mo>×</mml:mo>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
</mml:math>
</inline-formula>
denotes the skew-symmetric matrix of the attitude error. Substituting Equation (4) into Equation (3), the relationship between the acceleration residuals in the body frame and the attitude error is written as:
<disp-formula id="FD5-sensors-16-01752">
<label>(5)</label>
<mml:math id="mm25">
<mml:mtable columnalign="left">
<mml:mtr>
<mml:mtd>
<mml:mi>δ</mml:mi>
<mml:mi>a</mml:mi>
</mml:mtd>
<mml:mtd>
<mml:mo>=</mml:mo>
<mml:msubsup>
<mml:mi>a</mml:mi>
<mml:mi>m</mml:mi>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:mo></mml:mo>
<mml:msubsup>
<mml:mi>a</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:mo>=</mml:mo>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mi>n</mml:mi>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:msup>
<mml:mi>a</mml:mi>
<mml:mi>n</mml:mi>
</mml:msup>
<mml:mo></mml:mo>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:msup>
<mml:mi>a</mml:mi>
<mml:mi>n</mml:mi>
</mml:msup>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd></mml:mtd>
<mml:mtd>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mi>n</mml:mi>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:mo></mml:mo>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:msup>
<mml:mi>a</mml:mi>
<mml:mi>n</mml:mi>
</mml:msup>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mi>n</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msubsup>
<mml:mo>−</mml:mo>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:msup>
<mml:mi>a</mml:mi>
<mml:mi>n</mml:mi>
</mml:msup>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd></mml:mtd>
<mml:mtd>
<mml:mo>=</mml:mo>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>I</mml:mi>
<mml:mo>−</mml:mo>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ψ</mml:mi>
<mml:mo>×</mml:mo>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>−</mml:mo>
<mml:mi>I</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:msup>
<mml:mi>a</mml:mi>
<mml:mi>n</mml:mi>
</mml:msup>
<mml:mo>=</mml:mo>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo>−</mml:mo>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>ψ</mml:mi>
<mml:mo>×</mml:mo>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:msup>
<mml:mi>a</mml:mi>
<mml:mi>n</mml:mi>
</mml:msup>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd></mml:mtd>
<mml:mtd>
<mml:mo>=</mml:mo>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">[</mml:mo>
<mml:msup>
<mml:mi>a</mml:mi>
<mml:mi>n</mml:mi>
</mml:msup>
<mml:mo>×</mml:mo>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mi>ψ</mml:mi>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
</p>
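As a minimal numerical sketch (not part of the paper), the skew-matrix identity behind Equation (5) can be checked in Python; the function names and the NumPy dependency are our own assumptions:

```python
import numpy as np

def skew(v):
    # Skew-symmetric matrix [v x], so that skew(v) @ w == np.cross(v, w)
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def acc_residual(C_nc_b, a_n, psi):
    # Equation (5): delta_a = C_nc_b @ [a^n x] @ psi for a small attitude error psi,
    # since -[psi x] a^n = a^n x psi = [a^n x] psi
    return C_nc_b @ skew(a_n) @ psi
```

The residual is thus linear in the attitude error ψ, which is what the Kalman measurement model exploits.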
<p>Then, the measurement model can be obtained from Equation (5). The measurement
<inline-formula>
<mml:math id="mm26">
<mml:mi>Z</mml:mi>
</mml:math>
</inline-formula>
is the acceleration in body frame
<inline-formula>
<mml:math id="mm27">
<mml:mrow>
<mml:msup>
<mml:mrow>
<mml:mo stretchy="false">[</mml:mo>
<mml:mi>δ</mml:mi>
<mml:msub>
<mml:mi>a</mml:mi>
<mml:mi>x</mml:mi>
</mml:msub>
<mml:mtext></mml:mtext>
<mml:mi>δ</mml:mi>
<mml:msub>
<mml:mi>a</mml:mi>
<mml:mi>y</mml:mi>
</mml:msub>
<mml:mtext></mml:mtext>
<mml:mi>δ</mml:mi>
<mml:msub>
<mml:mi>a</mml:mi>
<mml:mi>z</mml:mi>
</mml:msub>
<mml:mtext></mml:mtext>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
<mml:mi>T</mml:mi>
</mml:msup>
</mml:mrow>
</mml:math>
</inline-formula>
, and the measurement matrix
<inline-formula>
<mml:math id="mm28">
<mml:mi>H</mml:mi>
</mml:math>
</inline-formula>
is expressed as:
<disp-formula id="FD6-sensors-16-01752">
<label>(6)</label>
<mml:math id="mm29">
<mml:mrow>
<mml:mi>H</mml:mi>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mtable columnalign="left">
<mml:mtr>
<mml:mtd>
<mml:mo>−</mml:mo>
<mml:mi>g</mml:mi>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:mo stretchy="false">(</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>,</mml:mo>
<mml:mn>2</mml:mn>
<mml:mo stretchy="false">)</mml:mo>
<mml:mtext>  </mml:mtext>
<mml:mi>g</mml:mi>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:mo stretchy="false">(</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>,</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo stretchy="false">)</mml:mo>
<mml:mtext>    </mml:mtext>
<mml:mn>0</mml:mn>
<mml:mtext>   </mml:mtext>
<mml:mn>0</mml:mn>
<mml:mtext>   </mml:mtext>
<mml:mn>0</mml:mn>
<mml:mtext>   </mml:mtext>
<mml:mn>0</mml:mn>
<mml:mtext></mml:mtext>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mo>−</mml:mo>
<mml:mi>g</mml:mi>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:mo stretchy="false">(</mml:mo>
<mml:mn>2</mml:mn>
<mml:mo>,</mml:mo>
<mml:mn>2</mml:mn>
<mml:mo stretchy="false">)</mml:mo>
<mml:mtext>  </mml:mtext>
<mml:mi>g</mml:mi>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:mo stretchy="false">(</mml:mo>
<mml:mn>2</mml:mn>
<mml:mo>,</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo stretchy="false">)</mml:mo>
<mml:mtext>    </mml:mtext>
<mml:mn>0</mml:mn>
<mml:mtext>   </mml:mtext>
<mml:mn>0</mml:mn>
<mml:mtext>   </mml:mtext>
<mml:mn>0</mml:mn>
<mml:mtext>   </mml:mtext>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mo>−</mml:mo>
<mml:mi>g</mml:mi>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:mo stretchy="false">(</mml:mo>
<mml:mn>3</mml:mn>
<mml:mo>,</mml:mo>
<mml:mn>2</mml:mn>
<mml:mo stretchy="false">)</mml:mo>
<mml:mtext>  </mml:mtext>
<mml:mi>g</mml:mi>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>n</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mi>b</mml:mi>
</mml:msubsup>
<mml:mo stretchy="false">(</mml:mo>
<mml:mn>3</mml:mn>
<mml:mo>,</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo stretchy="false">)</mml:mo>
<mml:mtext>    </mml:mtext>
<mml:mn>0</mml:mn>
<mml:mtext>   </mml:mtext>
<mml:mn>0</mml:mn>
<mml:mtext>   </mml:mtext>
<mml:mn>0</mml:mn>
<mml:mtext>   </mml:mtext>
<mml:mn>0</mml:mn>
</mml:mtd>
</mml:mtr>
</mml:mtable>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
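A sketch of how Equation (6) could be assembled in code; the six-element error state (three attitude errors followed by three gyro biases, as the zero columns suggest) and the function name are our assumptions:

```python
import numpy as np

def measurement_matrix(C_nc_b, g=9.80665):
    # 3x6 H of Equation (6) for gravity a^n = (0, 0, -g):
    # row i is [-g*C(i,2), g*C(i,1), 0, 0, 0, 0] (1-based indices as in the paper)
    H = np.zeros((3, 6))
    for i in range(3):
        H[i, 0] = -g * C_nc_b[i, 1]
        H[i, 1] = g * C_nc_b[i, 0]
    return H
```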
<p>This attitude filter works effectively under stationary or low-acceleration conditions. In these situations, the specific force measured by the accelerometer equals the local gravity acceleration, so the pitch and roll angles derived from the accelerometer are accurate and effective in correcting the accumulated attitude error caused by gyroscope drift. In high-dynamic situations, however, the accelerometer also senses the external dynamic acceleration, which is undesirable in the filter. Hence, if the measurement update keeps the same weight as in the low-dynamic case, a side effect is introduced that degrades performance. To achieve an optimal attitude estimation result, we therefore propose to adaptively tune the measurement covariance matrix R according to a system dynamic index
<inline-formula>
<mml:math id="mm30">
<mml:mi>ε</mml:mi>
</mml:math>
</inline-formula>
[
<xref rid="B42-sensors-16-01752" ref-type="bibr">42</xref>
], which is designed as:
<disp-formula id="FD7-sensors-16-01752">
<label>(7)</label>
<mml:math id="mm31">
<mml:mrow>
<mml:mi>ε</mml:mi>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>|</mml:mo>
<mml:mrow>
<mml:mi>f</mml:mi>
<mml:mo>−</mml:mo>
<mml:mi>g</mml:mi>
</mml:mrow>
<mml:mo>|</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
where
<inline-formula>
<mml:math id="mm32">
<mml:mi>f</mml:mi>
</mml:math>
</inline-formula>
denotes the norm of measured acceleration and
<inline-formula>
<mml:math id="mm33">
<mml:mi>g</mml:mi>
</mml:math>
</inline-formula>
denotes the local gravity acceleration. Then the specific tuning strategy of covariance matrix R is described as follows:
<list list-type="order">
<list-item>
<p>Stationary mode: If the index satisfies
<inline-formula>
<mml:math id="mm34">
<mml:mrow>
<mml:mi>ε</mml:mi>
<mml:mo><</mml:mo>
<mml:mi>T</mml:mi>
<mml:mi>h</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>s</mml:mi>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:math>
</inline-formula>
, the system is considered to be stationary. Correspondingly, the covariance matrix
<inline-formula>
<mml:math id="mm35">
<mml:mi>R</mml:mi>
</mml:math>
</inline-formula>
is set as
<inline-formula>
<mml:math id="mm36">
<mml:mrow>
<mml:mi>R</mml:mi>
<mml:mo>=</mml:mo>
<mml:mi>d</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>g</mml:mi>
<mml:mo stretchy="false">[</mml:mo>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mi>x</mml:mi>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mtext></mml:mtext>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mi>y</mml:mi>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mtext></mml:mtext>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mi>z</mml:mi>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mtext></mml:mtext>
<mml:mo stretchy="false">]</mml:mo>
</mml:mrow>
</mml:math>
</inline-formula>
, where
<inline-formula>
<mml:math id="mm37">
<mml:mrow>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mi>x</mml:mi>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mo>,</mml:mo>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mi>y</mml:mi>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mo>,</mml:mo>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mi>z</mml:mi>
<mml:mn>2</mml:mn>
</mml:msubsup>
</mml:mrow>
</mml:math>
</inline-formula>
denote the velocity random walk of three-axis accelerometer. In our approach, the
<inline-formula>
<mml:math id="mm38">
<mml:mrow>
<mml:mi>T</mml:mi>
<mml:mi>h</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>s</mml:mi>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:math>
</inline-formula>
is set as
<inline-formula>
<mml:math id="mm39">
<mml:mrow>
<mml:mn>3</mml:mn>
<mml:mo>⋅</mml:mo>
<mml:mo stretchy="false">(</mml:mo>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mi>x</mml:mi>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mo>+</mml:mo>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mi>y</mml:mi>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mo>+</mml:mo>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mi>z</mml:mi>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mo stretchy="false">)</mml:mo>
<mml:mtext></mml:mtext>
</mml:mrow>
</mml:math>
</inline-formula>
.</p>
</list-item>
<list-item>
<p>Low acceleration mode: If the index satisfies the condition
<inline-formula>
<mml:math id="mm40">
<mml:mrow>
<mml:mi>T</mml:mi>
<mml:mi>h</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>s</mml:mi>
<mml:mn>1</mml:mn>
<mml:mo><</mml:mo>
<mml:mi>ε</mml:mi>
<mml:mo><</mml:mo>
<mml:mi>T</mml:mi>
<mml:mi>h</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>s</mml:mi>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:math>
</inline-formula>
, the system experiences low external acceleration, which is treated as measurement noise. The covariance matrix
<inline-formula>
<mml:math id="mm41">
<mml:mi>R</mml:mi>
</mml:math>
</inline-formula>
is set as
<inline-formula>
<mml:math id="mm42">
<mml:mrow>
<mml:mi>R</mml:mi>
<mml:mo>=</mml:mo>
<mml:mi>d</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>g</mml:mi>
<mml:mo stretchy="false">[</mml:mo>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mi>x</mml:mi>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mtext></mml:mtext>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mi>y</mml:mi>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mtext></mml:mtext>
<mml:msubsup>
<mml:mi>σ</mml:mi>
<mml:mi>z</mml:mi>
<mml:mn>2</mml:mn>
</mml:msubsup>
<mml:mtext></mml:mtext>
<mml:mo stretchy="false">]</mml:mo>
<mml:mo>+</mml:mo>
<mml:mi>k</mml:mi>
<mml:msup>
<mml:mi>ε</mml:mi>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:math>
</inline-formula>
, where
<inline-formula>
<mml:math id="mm43">
<mml:mi>k</mml:mi>
</mml:math>
</inline-formula>
is a scale factor.
<inline-formula>
<mml:math id="mm44">
<mml:mrow>
<mml:mi>T</mml:mi>
<mml:mi>h</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>s</mml:mi>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:math>
</inline-formula>
is set as
<inline-formula>
<mml:math id="mm45">
<mml:mrow>
<mml:mn>2</mml:mn>
<mml:mi>g</mml:mi>
</mml:mrow>
</mml:math>
</inline-formula>
.</p>
</list-item>
<list-item>
<p>High dynamic mode: If the index satisfies
<inline-formula>
<mml:math id="mm46">
<mml:mrow>
<mml:mi>ε</mml:mi>
<mml:mo>></mml:mo>
<mml:mi>T</mml:mi>
<mml:mi>h</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>s</mml:mi>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:math>
</inline-formula>
, the norm of the measured accelerations deviates significantly from the specific force at rest, which equals the gravity acceleration, so the acceleration residuals are not reliable. In this situation, only the angular velocity is used to calculate attitude, and the filter performs only the prediction loop without a measurement update.</p>
</list-item>
</list>
</p>
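The three-mode tuning strategy above can be sketched as follows; the exact form of the kε² term (here added to the diagonal of R) and the function signature are our assumptions:

```python
import numpy as np

def adapt_R(f_norm, g, sigma2, k, thres1, thres2=None):
    # sigma2: per-axis velocity random walk variances; k: scale factor.
    # Returns the measurement covariance R, or None when the
    # measurement update should be skipped (high dynamic mode).
    if thres2 is None:
        thres2 = 2.0 * g                  # Thres2 = 2g, as in the text
    eps = abs(f_norm - g)                 # dynamic index, Equation (7)
    if eps < thres1:                      # stationary mode
        return np.diag(sigma2)
    if eps < thres2:                      # low acceleration mode
        return np.diag(sigma2) + k * eps ** 2 * np.eye(3)
    return None                           # high dynamic: prediction only
```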
</sec>
</sec>
<sec id="sec4dot2-sensors-16-01752">
<title>4.2. Data Segmentation</title>
<p>Data segmentation divides the continuous stream of collected sensor data into multiple subsequences and retrieves the information that is important and useful for activity recognition. Sliding-window algorithms are commonly used to segment data in various applications because they are simple, intuitive, and run online. However, this approach is not suitable here: an entire human stepping motion may not be contained in the current window and can be split across two adjacent windows, which can produce poor results in some cases. Moreover, the algorithm has a complexity of O(nL), where L is the average length of a segment, which affects the system's real-time capability.</p>
<p>Hence, the relationship between gait cycle and acceleration signal is analyzed to derive a practical approach to segment data. Generally, a gait cycle can be divided into four phases [
<xref rid="B43-sensors-16-01752" ref-type="bibr">43</xref>
], namely: (1) Push-off, heel off the ground and toe on the ground; (2) Swing, both heel and toe off the ground; (3) Heel Strike, heel on the ground and toe off the ground; and (4) Foot stance phase, heel and toe on the ground at rest.
<xref ref-type="fig" rid="sensors-16-01752-f006">Figure 6</xref>
shows these four phases and their correlated acceleration signal. </p>
<p>As shown in
<xref ref-type="fig" rid="sensors-16-01752-f006">Figure 6</xref>
, the blue line is the norm of the three accelerations and the red line denotes the acceleration signal smoothed by a moving average algorithm in which, at each epoch, a window containing the previous N sample points is averaged to produce the acceleration value. The smoothing yields a cleaner form of the signal, reduces noise, and eliminates unexpected peak points. </p>
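The causal moving average described above might look like this in code (the window length N and the function name are illustrative):

```python
import numpy as np

def causal_moving_average(signal, N):
    # Each output sample is the mean of the previous N samples
    # (including the current one), smoothing noise and spurious peaks.
    signal = np.asarray(signal, dtype=float)
    out = np.empty_like(signal)
    for i in range(len(signal)):
        lo = max(0, i - N + 1)
        out[i] = signal[lo:i + 1].mean()
    return out
```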
<p>
<xref ref-type="fig" rid="sensors-16-01752-f006">Figure 6</xref>
illustrates that the smoothed acceleration signal during one walking cycle generally features two peak points: one in the push-off phase, when the foot leaves the ground, and another in the heel-strike phase, when the foot hits the ground. Although more than two peak points may occasionally appear in a cycle, depending on the user's habits and motion strength, these two points are always present in each gait cycle. Here, we propose using the peak point to trigger the data segmentation process: once a peak point is detected, the features in its vicinity are extracted and the foot motion type is identified.</p>
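A simple peak detector of the kind implied above, with an assumed amplitude threshold to reject noise peaks:

```python
def detect_peaks(smoothed, threshold):
    # Indices of local maxima above `threshold` in the smoothed
    # acceleration norm; each peak triggers segmentation of a new step.
    peaks = []
    for i in range(1, len(smoothed) - 1):
        if smoothed[i] > threshold and smoothed[i - 1] < smoothed[i] >= smoothed[i + 1]:
            peaks.append(i)
    return peaks
```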
<p>The reason for using the peak point is that one peak is always present in the push-off phase when the foot leaves the ground, regardless of user or stepping pattern. This point facilitates detection of the beginning phase of each step and ensures reliable real-time performance. Moreover, the foot motion detection algorithm runs with O(number of peak points) complexity: the classification process is performed only when a peak point is detected, which decreases the computational burden. In addition, the specific phase of each walking cycle does not need to be classified, which simplifies the identification process. </p>
<p>Additionally, the length of data used for feature extraction needs to be determined. There is a tradeoff between discrimination accuracy of motion types and real-time applicability: involving more data in the segmentation procedure helps identify human motion correctly and achieves more reliable results, but causes a lagged response, whereas less data allows a quicker, lower-delay judgment but may not contain enough information for classification. Hence, the distributions of the three separate axis acceleration signals of different motions are analyzed to determine the length of the data segment for feature extraction.</p>
<p>
<xref ref-type="fig" rid="sensors-16-01752-f007">Figure 7</xref>
draws the collected three axes acceleration signals in the vicinity of the peak points in the initial stage of a step, and
<xref ref-type="fig" rid="sensors-16-01752-f007">Figure 7</xref>
a–e, respectively, represent acceleration signals collected from forward, backward, left, right and jump motions. The blue, red and green solid lines denote the acceleration signals represented in the user frame. The green dashed line, drawn from top to bottom, marks the position of the peak points. The peak point line is shifted slightly to the right as a consequence of the moving average algorithm, but this does not negatively affect the identification process. The acceleration signals are used to investigate the data segment length because they behave differently as a human steps in various directions, and they provide an intuitive, direct, and easily understood way to recognize the moving directions. For example,
<xref ref-type="fig" rid="sensors-16-01752-f007">Figure 7</xref>
c illustrates the left motion and the acceleration (red line) in user’s right direction features an obvious difference compared with the other two axes. Similarly, for the forward and backward motions, the accelerations in forward or backward directions exhibit more diversity.</p>
<p>Additionally, each figure illustrates the acceleration distribution of 500 motion samples performed by different testers (500 data groups collected per motion). Acceleration data in the vicinity of the first peak point are extracted, and the mean and standard deviation of these segments are calculated. The solid lines and dashed lines represent the mean and standard deviation, respectively. The acceleration distribution shown in
<xref ref-type="fig" rid="sensors-16-01752-f007">Figure 7</xref>
provides an intuitive statistical result of acceleration in the initial phase of a step and helps confirm the data segment length. The segment length selected for feature extraction is 31 samples, shown as the orange rectangle in the figure: 20 samples before the peak point, the peak point itself, and 10 samples after it. The main justifications for this length are, first, that the features extracted within the selected interval provide enough discriminative information for motion identification and, second, that it ensures reliable real-time applicability. The data shown in
<xref ref-type="fig" rid="sensors-16-01752-f007">Figure 7</xref>
are sampled at 200 Hz and the first 31 samples of a gait cycle are utilized for classification, which means that the motion type can be decided approximately 0.15 s after the motion occurs.</p>
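The 31-sample window (20 samples before the peak, the peak itself, and 10 after) can be sliced as follows; the edge handling is our assumption:

```python
def extract_segment(samples, peak_idx, before=20, after=10):
    # Returns the 31-sample window around a detected peak, or None
    # if the peak sits too close to the buffer edges for a full window.
    if peak_idx < before or peak_idx + after >= len(samples):
        return None
    return samples[peak_idx - before : peak_idx + after + 1]
```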
</sec>
<sec id="sec4dot3-sensors-16-01752">
<title>4.3. Feature Extraction</title>
<p>Generally, features can be defined as abstractions of raw data. The objective of feature extraction is to find the main characteristics of a data segment that accurately represent the original data and identify valid, useful and understandable patterns. Features fall into various categories, of which time-domain and frequency-domain features are the most commonly used for recognition. Feature selection is an extremely important step: a good feature space leads to clear and easy classification, while a poor feature space can be time-consuming and computationally expensive without yielding good results. In our system, rather than selecting all of the features commonly used in the activity recognition field, we analyze the collected signal and consider the physics of foot motion to choose features that are not only effective for discriminating motion types but also computationally cheap. The selected features for foot motion classification are described as follows.</p>
<sec id="sec4dot3dot1-sensors-16-01752">
<title>4.3.1. Mean and Variance</title>
<p>The mean and variance of the three-axis accelerometer and gyroscope measurements are derived from the data segment and used as features, according to the following equations:
<disp-formula id="FD8-sensors-16-01752">
<label>(8)</label>
<mml:math id="mm47">
<mml:mrow>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mover accent="true">
<mml:mi>x</mml:mi>
<mml:mo>¯</mml:mo>
</mml:mover>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mstyle displaystyle="true">
<mml:munderover>
<mml:mo>∑</mml:mo>
<mml:mn>1</mml:mn>
<mml:mi>N</mml:mi>
</mml:munderover>
<mml:mrow>
<mml:msub>
<mml:mi>x</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:mstyle>
</mml:mrow>
<mml:mi>N</mml:mi>
</mml:mfrac>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msup>
<mml:mi>σ</mml:mi>
<mml:mn>2</mml:mn>
</mml:msup>
<mml:mo>=</mml:mo>
<mml:mstyle displaystyle="true">
<mml:munderover>
<mml:mo>∑</mml:mo>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mi>N</mml:mi>
</mml:munderover>
<mml:mrow>
<mml:msup>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi>x</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>−</mml:mo>
<mml:mover accent="true">
<mml:mi>x</mml:mi>
<mml:mo>¯</mml:mo>
</mml:mover>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mstyle>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
where
<inline-formula>
<mml:math id="mm48">
<mml:mrow>
<mml:msub>
<mml:mi>x</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
denotes the signal,
<italic>N</italic>
denotes the data length, and
<inline-formula>
<mml:math id="mm49">
<mml:mrow>
<mml:mover accent="true">
<mml:mi>x</mml:mi>
<mml:mo>¯</mml:mo>
</mml:mover>
<mml:mo>,</mml:mo>
<mml:msup>
<mml:mi>σ</mml:mi>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:math>
</inline-formula>
denote the mean and variance value of the data sequence. </p>
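As written, Equation (8) uses the unnormalized sum of squared deviations; a direct per-axis transcription for an N x 3 segment (the function name is ours):

```python
import numpy as np

def mean_variance_features(segment):
    # Per-axis mean and Equation (8)-style variance (sum of squared
    # deviations, without 1/N) of an (N x 3) sensor segment.
    segment = np.asarray(segment, dtype=float)
    mean = segment.mean(axis=0)
    var = ((segment - mean) ** 2).sum(axis=0)
    return mean, var
```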
</sec>
<sec id="sec4dot3dot2-sensors-16-01752">
<title>4.3.2. Signal Magnitude Area</title>
<p>The signal magnitude area (SMA) is a statistical measure of the magnitude of a varying quantity, computed as the integral of the absolute value of the signal. SMA is calculated according to Equation (9).
<disp-formula id="FD9-sensors-16-01752">
<label>(9)</label>
<mml:math id="mm50">
<mml:mrow>
<mml:msub>
<mml:mi>f</mml:mi>
<mml:mrow>
<mml:mi>S</mml:mi>
<mml:mi>M</mml:mi>
<mml:mi>A</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mstyle displaystyle="true">
<mml:mrow>
<mml:msubsup>
<mml:mo>∫</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mn>1</mml:mn>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mn>2</mml:mn>
</mml:msub>
</mml:mrow>
</mml:msubsup>
<mml:mrow>
<mml:mrow>
<mml:mo>|</mml:mo>
<mml:mi>x</mml:mi>
<mml:mo>|</mml:mo>
</mml:mrow>
<mml:mi>d</mml:mi>
<mml:mi>t</mml:mi>
</mml:mrow>
</mml:mrow>
</mml:mstyle>
</mml:mrow>
</mml:math>
</disp-formula>
where
<inline-formula>
<mml:math id="mm51">
<mml:mi>x</mml:mi>
</mml:math>
</inline-formula>
denotes the signal and
<inline-formula>
<mml:math id="mm52">
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mn>1</mml:mn>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mn>2</mml:mn>
</mml:msub>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:math>
</inline-formula>
denotes the integration time period. </p>
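A discrete approximation of Equation (9), assuming a fixed sample period dt:

```python
import numpy as np

def signal_magnitude_area(segment, dt):
    # Approximates the integral of |x| over (t1, t2) per axis,
    # with dt the sample period (1/200 s at 200 Hz).
    segment = np.asarray(segment, dtype=float)
    return np.abs(segment).sum(axis=0) * dt
```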
</sec>
<sec id="sec4dot3dot3-sensors-16-01752">
<title>4.3.3. Position Change</title>
<p>Position change is an intuitive feature for the foot direction identification because different foot moving directions cause various position changes. For example, jumping features a larger change in vertical direction, and stepping right and left lead to an obvious position change in horizontal plane. The Inertial Navigation System (INS) mechanization equation is able to provide the trajectory of a moving object in three dimensions with the measured rotations and accelerations [
<xref rid="B44-sensors-16-01752" ref-type="bibr">44</xref>
]. However, due to the double integration in the INS mechanization and the sensor noise, accumulated errors enter the trajectory estimation and cause the position to drift, especially with a MEMS sensor. </p>
<p>Hence, it is not feasible to calculate the position over the whole identification process; instead, the position is derived only within the data segment, with an initial velocity of (0,0,0), an initial position of (0,0,0), and a zero azimuth during the calculation. Inertial sensors remain accurate over short periods, so the position computed over the 31-sample interval is reliable and trustworthy. The position calculation is described as follows:
<disp-formula id="FD10-sensors-16-01752">
<label>(10)</label>
<mml:math id="mm53">
<mml:mrow>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msup>
<mml:mi>a</mml:mi>
<mml:mi>n</mml:mi>
</mml:msup>
<mml:mo>=</mml:mo>
<mml:msubsup>
<mml:mi mathvariant="normal">C</mml:mi>
<mml:mi>b</mml:mi>
<mml:mi>n</mml:mi>
</mml:msubsup>
<mml:msup>
<mml:mi>a</mml:mi>
<mml:mi>b</mml:mi>
</mml:msup>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mi>v</mml:mi>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi>v</mml:mi>
<mml:mn>0</mml:mn>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mstyle displaystyle="true">
<mml:mrow>
<mml:mo>∫</mml:mo>
<mml:mrow>
<mml:msup>
<mml:mi>a</mml:mi>
<mml:mi>n</mml:mi>
</mml:msup>
<mml:mi>d</mml:mi>
<mml:mi>t</mml:mi>
</mml:mrow>
</mml:mrow>
</mml:mstyle>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mi>p</mml:mi>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mn>0</mml:mn>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mstyle displaystyle="true">
<mml:mrow>
<mml:mo>∫</mml:mo>
<mml:mrow>
<mml:mi>v</mml:mi>
<mml:mtext></mml:mtext>
<mml:mi>d</mml:mi>
<mml:mi>t</mml:mi>
</mml:mrow>
</mml:mrow>
</mml:mstyle>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
where
<inline-formula>
<mml:math id="mm54">
<mml:mrow>
<mml:msup>
<mml:mi>a</mml:mi>
<mml:mi>b</mml:mi>
</mml:msup>
</mml:mrow>
</mml:math>
</inline-formula>
denotes the measured acceleration in body frame,
<inline-formula>
<mml:math id="mm55">
<mml:mrow>
<mml:msubsup>
<mml:mi>C</mml:mi>
<mml:mi>b</mml:mi>
<mml:mi>n</mml:mi>
</mml:msubsup>
</mml:mrow>
</mml:math>
</inline-formula>
is the rotation matrix that projects the acceleration from the body frame to the navigation frame (local-level frame), and
<inline-formula>
<mml:math id="mm56">
<mml:mrow>
<mml:msup>
<mml:mi>a</mml:mi>
<mml:mi>n</mml:mi>
</mml:msup>
</mml:mrow>
</mml:math>
</inline-formula>
denotes the projected acceleration in navigation frame.
<inline-formula>
<mml:math id="mm57">
<mml:mrow>
<mml:mi>v</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>p</mml:mi>
</mml:mrow>
</mml:math>
</inline-formula>
denote the computed velocity and position and
<inline-formula>
<mml:math id="mm58">
<mml:mrow>
<mml:msub>
<mml:mi>v</mml:mi>
<mml:mn>0</mml:mn>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>p</mml:mi>
<mml:mn>0</mml:mn>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>
denote the initial velocity and position.</p>
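Equation (10) amounts to a short double integration; a minimal Euler-step sketch, assuming gravity has already been removed from the body-frame accelerations:

```python
import numpy as np

def short_term_position(acc_body, C_b_n_seq, dt):
    # Rotate each body-frame acceleration into the navigation frame,
    # then integrate twice from v0 = p0 = (0, 0, 0), as in Equation (10).
    v = np.zeros(3)
    p = np.zeros(3)
    for a_b, C_b_n in zip(acc_body, C_b_n_seq):
        a_n = C_b_n @ a_b
        v = v + a_n * dt
        p = p + v * dt
    return p
```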
</sec>
<sec id="sec4dot3dot4-sensors-16-01752">
<title>4.3.4. Ratio</title>
<p>The ratio feature is the proportion of a single-axis feature relative to the norm of that feature across the three axes. Introducing this ratio normalizes the three-axis features, which helps handle motions performed with different strengths by different users. For example, in a jump motion the position change in the up direction (the jump height) exceeds that in the horizontal plane and dominates the overall position change; although the jump height differs among users, its proportion of the position change remains large. Specifically, the position-change feature derived from a vigorous jump may be (0.2, 0.2, 0.5) and that from a slight jump (0.05, 0.05, 0.2); although the jump height varies significantly with user habits, the ratio of jump height exceeds 50% of the whole position change in both cases. Hence, the ratio feature of position change in different directions is a good metric to distinguish and evaluate motion types performed with different strengths. The ratio feature is calculated as in Equation (11):
<disp-formula id="FD11-sensors-16-01752">
<label>(11)</label>
<mml:math id="mm59">
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mi>F</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>N</mml:mi>
<mml:mi>o</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>m</mml:mi>
<mml:mo>=</mml:mo>
<mml:msqrt>
<mml:mrow>
<mml:mi>F</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:msup>
<mml:mi>X</mml:mi>
<mml:mn>2</mml:mn>
</mml:msup>
<mml:mo>+</mml:mo>
<mml:mi>F</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:msup>
<mml:mi>Y</mml:mi>
<mml:mn>2</mml:mn>
</mml:msup>
<mml:mo>+</mml:mo>
<mml:mi>F</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:msup>
<mml:mi>Z</mml:mi>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:msqrt>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mi>r</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>o</mml:mi>
<mml:mi>F</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>X</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi>F</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>X</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>F</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>N</mml:mi>
<mml:mi>o</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>m</mml:mi>
</mml:mrow>
</mml:mfrac>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mi>r</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>o</mml:mi>
<mml:mi>F</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>Y</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi>F</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>Y</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>F</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>N</mml:mi>
<mml:mi>o</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>m</mml:mi>
</mml:mrow>
</mml:mfrac>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mi>r</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>o</mml:mi>
<mml:mi>F</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>Z</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi>F</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>Z</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>F</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>N</mml:mi>
<mml:mi>o</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>m</mml:mi>
</mml:mrow>
</mml:mfrac>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
where,
<inline-formula>
<mml:math id="mm60">
<mml:mrow>
<mml:mi>F</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>X</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>Y</mml:mi>
<mml:mtext></mml:mtext>
<mml:mo>,</mml:mo>
<mml:mi>Z</mml:mi>
</mml:mrow>
</mml:math>
</inline-formula>
denote the calculated features in different axes and
<inline-formula>
<mml:math id="mm61">
<mml:mrow>
<mml:mi>r</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>o</mml:mi>
<mml:mi>F</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>t</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
</mml:mrow>
</mml:math>
</inline-formula>
denotes the ratio. In our proposed system, the position, mean, variance, and SMA features calculated in three directions or axes are all considered to derive the ratio feature. </p>
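The ratio-feature computation above can be sketched as follows. This is a minimal illustration assuming FeatureNorm is the Euclidean norm of the three axis values, which makes the ratios independent of motion strength; the function name is ours, not from the paper.

```python
import math

def ratio_features(fx, fy, fz):
    """Normalize per-axis feature values by the overall feature norm.

    Assumes FeatureNorm = sqrt(fx^2 + fy^2 + fz^2); each returned ratio is
    the share of that axis in the combined feature magnitude.
    """
    norm = math.sqrt(fx * fx + fy * fy + fz * fz)
    if norm == 0.0:
        return (0.0, 0.0, 0.0)  # degenerate case: no signal on any axis
    return (fx / norm, fy / norm, fz / norm)
```

The same normalization would be applied to the position, mean, variance, and SMA features in turn.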
</sec>
</sec>
<sec id="sec4dot4-sensors-16-01752">
<title>4.4. Classification</title>
<p>Classification is the process of predicting or recognizing motions from the extracted features. To achieve a good motion classification performance, three popular supervised classification approaches are employed in this work for validation; these three classifiers are described as follows.</p>
<sec id="sec4dot4dot1-sensors-16-01752">
<title>4.4.1. Decision Tree</title>
<p>A decision tree is a decision support tool that uses a tree-like graph or model of decisions and their possible consequences. A decision tree classifier generally comprises internal nodes, branches, and leaf nodes, where an internal node represents a test on a selected feature, a branch denotes the outcome of the test, and a leaf node represents a class label (a moving direction).
<xref ref-type="fig" rid="sensors-16-01752-f008">Figure 8</xref>
graphically illustrates the decision tree model.</p>
<p>
<xref ref-type="fig" rid="sensors-16-01752-f008">Figure 8</xref>
shows the graphical model of the decision tree. The blue circles denote internal nodes that execute a test on a feature (a comparison of the feature with a trained parameter), the green arrows denote the test outcomes, and the rectangles denote the different labels or classes. The red dashed lines from the top node to a leaf node represent one decision process, or classification rule. </p>
<p>Tree generation is the training stage of this classifier and works recursively. For each feature of the samples, a metric (the splitting measure) is computed for a split on that feature. The feature that yields the optimal value (highest or lowest, depending on the measure) is then selected, and a decision node is created to split the data on that feature. The recursion stops when the samples in a node (or a majority of them) belong to the same class, or when there are no remaining features on which to split. Depending on the splitting measure, decision trees can be categorized as ID3 (Iterative Dichotomiser 3), QUEST (Quick, Unbiased, Efficient, Statistical Tree), CART (Classification And Regression Tree), C4.5, etc. [
<xref rid="B45-sensors-16-01752" ref-type="bibr">45</xref>
,
<xref rid="B46-sensors-16-01752" ref-type="bibr">46</xref>
]. </p>
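As an illustration of one splitting measure, CART's Gini impurity and the resulting impurity decrease for a candidate split can be sketched as follows (function names are ours; the paper does not specify which measure its tree uses):

```python
from collections import Counter

def gini(labels):
    """Gini impurity of a set of class labels (CART's splitting measure)."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def split_gain(labels, feature_values, threshold):
    """Impurity decrease from splitting on feature <= threshold.

    During training, the feature/threshold pair with the largest gain
    becomes a decision node, and the procedure recurses on each side.
    """
    left = [l for l, v in zip(labels, feature_values) if v <= threshold]
    right = [l for l, v in zip(labels, feature_values) if v > threshold]
    n = len(labels)
    weighted = (len(left) / n) * gini(left) + (len(right) / n) * gini(right)
    return gini(labels) - weighted
```

A perfect split drives both child impurities to zero, so the gain equals the parent impurity.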
</sec>
<sec id="sec4dot4dot2-sensors-16-01752">
<title>4.4.2. K-Nearest Neighbors</title>
<p>K-nearest neighbors algorithm (kNN) [
<xref rid="B47-sensors-16-01752" ref-type="bibr">47</xref>
] is an approach based on the closest training samples in the feature space, where k denotes the number of nearest neighbors considered. In the kNN approach, an object is classified by a majority vote of its neighbors and assigned to the most common class among its k nearest neighbors. Similarity measures are fundamental to this algorithm, and different distance metrics can be used to measure the distance between data points.
<xref ref-type="fig" rid="sensors-16-01752-f009">Figure 9</xref>
illustrates the main concept of kNN algorithm.</p>
<p>As shown in
<xref ref-type="fig" rid="sensors-16-01752-f009">Figure 9</xref>
, the test sample (blue circle) is classified into one of the two neighbor classes, red square or green triangle. If k = 3, the test sample is assigned to the red square class because two of its three nearest neighbors are red squares. In the same way, if k = 5, the test sample is assigned to the green triangle class. Hence, the main idea of kNN is that the category of the predicted object is decided by the majority label of its neighbors. Additionally, the votes of these neighbors can be weighted by distance to overcome the problem of non-uniform densities of the neighbor classes. </p>
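The voting scheme, including the flip between k = 3 and k = 5 illustrated in Figure 9, can be sketched as follows (a minimal illustration with Euclidean distance; not the paper's implementation):

```python
import math
from collections import defaultdict

def knn_classify(train, query, k, weighted=False):
    """Classify `query` by majority vote among its k nearest training samples.

    `train` is a list of (feature_vector, label) pairs. With weighted=True,
    each neighbor votes with weight 1/distance, which mitigates non-uniform
    class densities.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    neighbors = sorted(train, key=lambda s: dist(s[0], query))[:k]
    votes = defaultdict(float)
    for features, label in neighbors:
        d = dist(features, query)
        votes[label] += 1.0 / d if (weighted and d > 0) else 1.0
    return max(votes, key=votes.get)
```

With two nearby "square" samples and three farther "triangle" samples, k = 3 yields "square" while k = 5 yields "triangle", mirroring the figure.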
</sec>
<sec id="sec4dot4dot3-sensors-16-01752">
<title>4.4.3. Support Vector Machine</title>
<p>The support vector machine (SVM) constructs a hyperplane, or a set of hyperplanes, in a high- or infinite-dimensional space for classification, regression, or other tasks. Since many hyperplanes can separate the data, the SVM selects the one that yields the largest separation, or margin, between the two classes. The chosen hyperplane maximizes the distance between the plane and the nearest data point on each side.
<xref ref-type="fig" rid="sensors-16-01752-f010">Figure 10</xref>
illustrates the SVM classifier. </p>
<p>As shown in this figure, the optimal separating hyperplane (solid red line) places the samples with different labels (blue circles, label 1; red squares, label −1) on the two sides of the plane, and the distances of the closest samples on each side to the hyperplane are maximized. These samples are called support vectors, and this distance is the optimal margin. Detailed descriptions of the SVM classifier can be found in the literature [
<xref rid="B48-sensors-16-01752" ref-type="bibr">48</xref>
,
<xref rid="B49-sensors-16-01752" ref-type="bibr">49</xref>
,
<xref rid="B50-sensors-16-01752" ref-type="bibr">50</xref>
].</p>
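The decision rule of a trained linear SVM, and the margin quantity it maximizes, can be sketched as follows (a minimal illustration under the linear, separable case; in practice an off-the-shelf, possibly kernelized, SVM implementation would be used):

```python
import math

def svm_predict(w, b, x):
    """Linear SVM decision rule: the sign of w.x + b gives the class (+1/-1)."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score >= 0 else -1

def geometric_margin(w, b, x, y):
    """Signed distance of labeled sample (x, y) to the hyperplane w.x + b = 0.

    SVM training chooses w and b so that the smallest of these distances
    over the training set (attained by the support vectors) is maximized.
    """
    norm = math.sqrt(sum(wi * wi for wi in w))
    return y * (sum(wi * xi for wi, xi in zip(w, x)) + b) / norm
```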
</sec>
</sec>
</sec>
<sec id="sec5-sensors-16-01752">
<title>5. Experiments and Results</title>
<p>The experiment consists of two parts. In the first part, different testers are invited to perform the five foot motions in their own manners. We then collect the data, preprocess them to remove errors, divide them into segments, extract the features, and feed them into the training process of the introduced machine learning algorithms to derive the classifiers. The classifiers are then tested with two cross-validation approaches. In the second part, the data processing procedure is implemented in C++ on our software platform, and the program is connected to the game control interface to perform the practical game playing experiment.</p>
<sec id="sec5dot1-sensors-16-01752">
<title>5.1. Data Set</title>
<p>In order to obtain a sufficient amount of training data, ten testers—two females and eight males—were invited to participate in the experiments. All testers were in good health, without any abnormality in their gait cycles. The IMU sensor was attached to the testers’ shoes, and they were guided to perform the five stepping motions in their natural manners. In order to capture the diverse characteristics of each motion, some actions were performed at different strengths (heavy or slight), different frequencies (fast or slow), and different amplitudes (large or small), and some actions were performed by the same tester on different days. The data collected during this experiment were stored to form the training dataset.
<xref ref-type="fig" rid="sensors-16-01752-f011">Figure 11</xref>
shows the system hardware platform. In this platform, a 3.7 V lithium battery (shown in blue) provides the power supply. The IMU module is small and very convenient to mount on the user’s shoe.
<xref ref-type="table" rid="sensors-16-01752-t001">Table 1</xref>
summarizes the collected training dataset, listing the quantitative information of the collected human stepping motions. The second row lists the actual numbers of motions collected in the experiment: 895 jump, 954 stepping left, 901 stepping right, 510 moving forward, and 515 moving backward.</p>
</sec>
<sec id="sec5dot2-sensors-16-01752">
<title>5.2. Classification Results</title>
<p>In our proposed system, a separate classifier is trained for each motion instead of using a single classifier for all five motions. This training strategy improves the robustness and decreases the complexity of the system, since each classifier only needs to recognize two classes instead of five. Moreover, it offers the possibility of selecting typical features for each motion, based on the motion principle or data analysis, in future work. </p>
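The per-motion strategy amounts to a one-vs-rest dispatch over five binary classifiers. A minimal sketch follows (a hypothetical interface; the paper does not specify how ties or all-negative outcomes are handled):

```python
def classify_motion(features, binary_classifiers):
    """One-vs-rest dispatch: each motion has its own two-class classifier.

    `binary_classifiers` maps a motion name ("jump", "left", ...) to a
    predicate returning True when the feature vector matches that motion.
    Returns the first matching motion, or None when no classifier fires.
    """
    for motion, clf in binary_classifiers.items():
        if clf(features):
            return motion
    return None
```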
<p>In order to better evaluate the classification performance, two cross-validation approaches were chosen: k-fold cross validation and holdout validation. In the k-fold cross-validation approach, the original sample is randomly partitioned into k equal-sized subsamples. A single subsample is retained as the validation data for testing the model, and the remaining (k − 1) subsamples are used as training data. The cross-validation process is then repeated k times, with each of the k subsamples used exactly once as the validation data. The k results from these folds can then be averaged to produce a single estimate. The advantage of this method over repeated random sub-sampling is that all of the observations are used for both training and validation, and each observation is used for validation exactly once. Here, the commonly used 10-fold test is employed. In holdout validation, a subset of observations is chosen randomly from the initial samples to form a validation or testing set, and the remaining observations are retained as the training data. Twenty-five percent of the initial samples are chosen for testing and validation. The two cross-validation approaches are performed for all three classifiers, and the classification results are listed in
<xref ref-type="table" rid="sensors-16-01752-t002">Table 2</xref>
and
<xref ref-type="table" rid="sensors-16-01752-t003">Table 3</xref>
.</p>
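The k-fold partitioning described above can be sketched as follows (a minimal illustration with a fixed random seed for reproducibility; not the paper's implementation):

```python
import random

def k_fold_indices(n_samples, k=10, seed=0):
    """Partition sample indices into k roughly equal folds.

    Each fold serves once as the validation set while the remaining k - 1
    folds form the training set, so every sample is validated exactly once.
    Yields (train_indices, validation_indices) pairs.
    """
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i in range(k):
        val = folds[i]
        train = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        yield train, val
```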
<p>For each motion, the column tagged Class 1 shows the correct detections of the actual motion, and the column tagged Class 0 denotes the undesired detections of that motion from other motions. Specifically, for the jump motion detected by the decision tree classifier, 813 of the 895 jump motions are successfully identified (82 actual jump motions are missed or falsely detected), and 75 of the 2880 other motions (the sum of left, right, forward, and backward) are falsely classified as jump motions. </p>
<p>Additionally, in order to have a quantitative evaluation of the classifier performance, the Accuracy, Precision, and Recall metrics are also introduced. The definitions of these metrics and their calculation equations are described below.
<list list-type="bullet">
<list-item>
<p>
<bold>Accuracy:</bold>
The accuracy is the most standard metric to summarize the overall classification performance for all classes and it is defined as follows:
<disp-formula id="FD12-sensors-16-01752">
<label>(12)</label>
<mml:math id="mm62">
<mml:mrow>
<mml:mi>A</mml:mi>
<mml:mi>c</mml:mi>
<mml:mi>c</mml:mi>
<mml:mi>u</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>c</mml:mi>
<mml:mi>y</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi>T</mml:mi>
<mml:mi>P</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>T</mml:mi>
<mml:mi>N</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>T</mml:mi>
<mml:mi>P</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>T</mml:mi>
<mml:mi>N</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>F</mml:mi>
<mml:mi>P</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>F</mml:mi>
<mml:mi>N</mml:mi>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
</list-item>
<list-item>
<p>
<bold>Precision:</bold>
Often referred to as positive predictive value, it is the ratio of correctly classified positive instances to the total number of instances classified as positive:
<disp-formula id="FD13-sensors-16-01752">
<label>(13)</label>
<mml:math id="mm63">
<mml:mrow>
<mml:mi>P</mml:mi>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>c</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>s</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>o</mml:mi>
<mml:mi>n</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi>T</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>T</mml:mi>
<mml:mi>P</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>F</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
</list-item>
<list-item>
<p>
<bold>Recall:</bold>
Also called true positive rate, it is the ratio of correctly classified positive instances to the total number of positive instances:
<disp-formula id="FD14-sensors-16-01752">
<label>(14)</label>
<mml:math id="mm64">
<mml:mrow>
<mml:mi>R</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>c</mml:mi>
<mml:mi>a</mml:mi>
<mml:mi>l</mml:mi>
<mml:mi>l</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi>T</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>T</mml:mi>
<mml:mi>P</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>F</mml:mi>
<mml:mi>N</mml:mi>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
</disp-formula>
where
<inline-formula>
<mml:math id="mm65">
<mml:mrow>
<mml:mi>T</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:math>
</inline-formula>
(True Positive) indicates the number of true positive or correctly classified results,
<inline-formula>
<mml:math id="mm66">
<mml:mrow>
<mml:mi>T</mml:mi>
<mml:mi>N</mml:mi>
</mml:mrow>
</mml:math>
</inline-formula>
(True Negatives) is the number of negative instances that were classified as negative,
<inline-formula>
<mml:math id="mm67">
<mml:mrow>
<mml:mi>F</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:math>
</inline-formula>
(False Positives) is the number of negative instances that were classified as positive and
<inline-formula>
<mml:math id="mm68">
<mml:mrow>
<mml:mi>F</mml:mi>
<mml:mi>N</mml:mi>
</mml:mrow>
</mml:math>
</inline-formula>
(False Negatives) is the number of positive instances that were classified as negative. According to these evaluation metrics, the accuracy, precision, and recall for the test results of each motion are calculated and listed in
<xref ref-type="table" rid="sensors-16-01752-t004">Table 4</xref>
,
<xref ref-type="table" rid="sensors-16-01752-t005">Table 5</xref>
and
<xref ref-type="table" rid="sensors-16-01752-t006">Table 6</xref>
.</p>
</list-item>
</list>
</p>
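Equations (12)–(14) follow directly from the confusion counts. Using the decision tree jump example given earlier, and assuming the 2880 other motions split into 75 false positives and 2805 true negatives, a sketch is:

```python
def classification_metrics(tp, tn, fp, fn):
    """Accuracy, precision, and recall from binary confusion counts,
    following Equations (12)-(14)."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return accuracy, precision, recall

# Jump / decision tree example: TP = 813, FN = 82, FP = 75, TN = 2880 - 75
acc, prec, rec = classification_metrics(tp=813, tn=2805, fp=75, fn=82)
```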
<p>Based on the evaluation metrics listed in
<xref ref-type="table" rid="sensors-16-01752-t004">Table 4</xref>
,
<xref ref-type="table" rid="sensors-16-01752-t005">Table 5</xref>
and
<xref ref-type="table" rid="sensors-16-01752-t006">Table 6</xref>
, and according to the graphical comparison of accuracy and precision shown in
<xref ref-type="fig" rid="sensors-16-01752-f012">Figure 12</xref>
and
<xref ref-type="fig" rid="sensors-16-01752-f013">Figure 13</xref>
, the SVM classifier has an overall better performance than the other approaches. Moreover, the average time for each classifier to decide the motion type is: decision tree, 0.0056 ms; kNN, 0.53 ms; and SVM, 0.0632 ms. Although the decision tree classifier has the shortest response time, its identification performance is not satisfactory. The response time of the SVM is about 0.06 ms, which is acceptable because this level of lag will not cause an observable delay in the user experience. Hence, considering both the performance and the decision time of each classifier, the SVM classifier achieves the best result and is selected in our proposed system to classify the stepping motions. Additionally, we analyze the misclassified events of each motion to profile the errors, aiming to verify that no specific stepping motion consistently causes wrong recognition, which could result from unsuitable feature selection or data segmentation. The statistical result is listed in
<xref ref-type="table" rid="sensors-16-01752-t007">Table 7</xref>
.</p>
<p>
<xref ref-type="table" rid="sensors-16-01752-t007">Table 7</xref>
provides the false identifications of each motion under the two cross-validation approaches. For example, in 10-fold cross validation, 27 true jump motions are missed or mistakenly classified, which accounts for 42.86% of the misclassified events; meanwhile, eight left, seven right, nine forward, and 12 backward motions are wrongly treated as jump motions by the classifier, together contributing 57.14% of the misclassified events. In each classifier, the identification error of its corresponding motion type (i.e., the wrong categorization of jump motions by the jump classifier) accounts for approximately 33% to 48%, and the misclassified percentage of the other motions varies from 51% to 66%. Moreover, the error analysis also shows that the misclassified events are evenly distributed across the motions, demonstrating that no specific motion error is predominant during the motion determination process. </p>
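The percentage breakdown follows directly from the counts. A small helper (hypothetical, for illustration) reproduces the jump-classifier example, where 27 missed jumps and 8 + 7 + 9 + 12 = 36 false jumps give the 42.86%/57.14% split:

```python
def error_shares(missed_true, false_from_others):
    """Split the misclassified events of one motion's classifier into the
    share due to missed true motions and the share due to other motions
    wrongly accepted, as percentages of all misclassified events."""
    total = missed_true + sum(false_from_others)
    return (100.0 * missed_true / total,
            100.0 * sum(false_from_others) / total)
```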
</sec>
<sec id="sec5dot3-sensors-16-01752">
<title>5.3. Practical Experiment Result</title>
<p>A running game that we programmed in Unity is used to test the algorithm in practice. In this game, a man runs through a forest with numerous obstacles; in the traditional play manner, the user controls the character to jump, go left, go right, or duck to avoid the obstacles. Here, we use the foot movement direction to control the character, and the results are shown in the following figures.</p>
<p>As shown in
<xref ref-type="fig" rid="sensors-16-01752-f014">Figure 14</xref>
, the red rectangle marks the virtual player presented in the game, the arrow denotes the player’s moving direction, the green rectangle shows the step motion identification result, and the orange rectangle shows the person’s moving direction. </p>
<p>
<xref ref-type="fig" rid="sensors-16-01752-f015">Figure 15</xref>
shows the practical test results in the game Subway Surfers.
<xref ref-type="fig" rid="sensors-16-01752-f015">Figure 15</xref>
a illustrates that the person stepping forward corresponds to the jump of the kid in the game. On the left side of this figure, the person steps forward and the red arrow shows the stepping direction. The right side shows the game environment, where the kid, marked with a green circle, jumps up to avoid the obstacle ahead. In the same way,
<xref ref-type="fig" rid="sensors-16-01752-f015">Figure 15</xref>
b shows the person stepping left, which corresponds to the kid moving to the left.</p>
</sec>
</sec>
<sec id="sec6-sensors-16-01752">
<title>6. Conclusions</title>
<p>This paper introduces a novel application of foot-mounted inertial sensor based wearable electronic devices—game play. The main contributions of this paper can be summarized as follows: (1) It presents the first attempt to employ the user’s stepping direction to control the player in game play. (2) It proposes and implements a novel, computationally efficient, real-time algorithm for identifying the foot moving direction. (3) In the proposed system, the accelerometer and gyroscope measurements are fused to derive the attitude, which is used to correct the misalignment error; this makes the proposed algorithm compatible with various shoe styles and sensor placements. (4) The stepping motion type can be recognized in the beginning phase of one step cycle, which guarantees the system’s real-time applicability. (5) It designs a corresponding classifier for each motion, where each classifier only needs to identify two classes, instead of using one classifier to recognize all five motions; this yields a more precise and reliable identification result. (6) It compares three commonly used classifiers in terms of cross-validation performance and response time, and concludes that the SVM classifier achieves the best performance. (7) It extends the inertial sensor based game play scenario to a foot motion control mode, which makes it possible to play running games indoors or anywhere, and is potentially beneficial in encouraging users to exercise more for good health. Practical experiments with different users illustrate that the proposed system achieves highly accurate classification and an excellent user experience, and it effectively broadens the applications of currently available wearable electronic devices.</p>
</sec>
</body>
<back>
<ack>
<title>Acknowledgments</title>
<p>The work of Qifan Zhou was supported by the China Scholarship Council under Grant 201306020073. </p>
</ack>
<notes>
<title>Author Contributions</title>
<p>Hai Zhang and Naser El-Sheimy conceived the idea and supervised this research work. Qifan Zhou implemented the proposed system, performed the experiments, and wrote this paper. Zahra Lari helped review this paper and provided much valuable revision advice. Zhenbo Liu helped with the experimental data collection, tested the system, and proposed several suggestions. </p>
</notes>
<notes notes-type="COI-statement">
<title>Conflicts of Interest</title>
<p>The authors declare no conflict of interest.</p>
</notes>
<ref-list>
<title>References</title>
<ref id="B1-sensors-16-01752">
<label>1.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Susi</surname>
<given-names>M.</given-names>
</name>
</person-group>
<article-title>Gait Analysis for Pedestrian Navigation Using MEMS Handheld Devices</article-title>
<source>Master’s Thesis</source>
<publisher-name>Department of Geomatics Engineering, University of Calgary</publisher-name>
<publisher-loc>Calgary, AB, Canada</publisher-loc>
<year>2012</year>
</element-citation>
</ref>
<ref id="B2-sensors-16-01752">
<label>2.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Junker</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Amft</surname>
<given-names>O.</given-names>
</name>
<name>
<surname>Lukowicz</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Tröster</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>Gesture spotting with body-worn inertial sensors to detect user activities</article-title>
<source>Pattern Recognit.</source>
<year>2008</year>
<volume>41</volume>
<fpage>2010</fpage>
<lpage>2024</lpage>
<pub-id pub-id-type="doi">10.1016/j.patcog.2007.11.016</pub-id>
</element-citation>
</ref>
<ref id="B3-sensors-16-01752">
<label>3.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Li</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Georgy</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Niu</surname>
<given-names>X.</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>Q.</given-names>
</name>
<name>
<surname>El-Sheimy</surname>
<given-names>N.</given-names>
</name>
</person-group>
<article-title>Autonomous calibration of MEMS Gyros in consumer portable devices</article-title>
<source>IEEE Sens. J.</source>
<year>2015</year>
<volume>15</volume>
<fpage>4062</fpage>
<lpage>4072</lpage>
<pub-id pub-id-type="doi">10.1109/JSEN.2015.2410756</pub-id>
</element-citation>
</ref>
<ref id="B4-sensors-16-01752">
<label>4.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Park</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Jayaraman</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>Smart textiles: Wearable electronic systems</article-title>
<source>MRS Bull.</source>
<year>2003</year>
<volume>28</volume>
<fpage>585</fpage>
<lpage>591</lpage>
<pub-id pub-id-type="doi">10.1557/mrs2003.170</pub-id>
</element-citation>
</ref>
<ref id="B5-sensors-16-01752">
<label>5.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Nilsson</surname>
<given-names>J.O.</given-names>
</name>
<name>
<surname>Skog</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Händel</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Hari</surname>
<given-names>K.V.S.</given-names>
</name>
</person-group>
<article-title>Foot-mounted INS for everybody—An open-source embedded implementation</article-title>
<source>Proceedings of the IEEE Position Location and Navigation Symposium (PLANS)</source>
<conf-loc>Myrtle Beach, SC, USA</conf-loc>
<conf-date>23–26 April 2012</conf-date>
<fpage>140</fpage>
<lpage>145</lpage>
</element-citation>
</ref>
<ref id="B6-sensors-16-01752">
<label>6.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Ruppelt</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Kronenwett</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Scholz</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>High-precision and robust indoor localization based on foot-mounted inertial sensors</article-title>
<source>Proceedings of the IEEE/ION Position, Location and Navigation Symposium (PLANS)</source>
<conf-loc>Savannah, GA, USA</conf-loc>
<conf-date>11–16 April 2016</conf-date>
<fpage>67</fpage>
<lpage>75</lpage>
</element-citation>
</ref>
<ref id="B7-sensors-16-01752">
<label>7.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Abdulrahim</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Hide</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Moore</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Hill</surname>
<given-names>C.</given-names>
</name>
</person-group>
<article-title>Rotating a MEMS inertial measurement unit for a foot-mounted pedestrian navigation</article-title>
<source>J. Comput. Sci.</source>
<year>2014</year>
<volume>10</volume>
<fpage>2619</fpage>
<pub-id pub-id-type="doi">10.3844/jcssp.2014.2619.2627</pub-id>
</element-citation>
</ref>
<ref id="B8-sensors-16-01752">
<label>8.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fischer</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Gellersen</surname>
<given-names>H.</given-names>
</name>
</person-group>
<article-title>Location and navigation support for emergency responders: A survey</article-title>
<source>IEEE Pervasive Comput.</source>
<year>2010</year>
<volume>9</volume>
<fpage>38</fpage>
<lpage>47</lpage>
<pub-id pub-id-type="doi">10.1109/MPRV.2009.91</pub-id>
</element-citation>
</ref>
<ref id="B9-sensors-16-01752">
<label>9.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Skog</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Händel</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Nilsson</surname>
<given-names>J.-O.</given-names>
</name>
<name>
<surname>Rantakokko</surname>
<given-names>J.</given-names>
</name>
</person-group>
<article-title>Zero-velocity detection—An algorithm evaluation</article-title>
<source>IEEE Trans. Biomed. Eng.</source>
<year>2010</year>
<volume>57</volume>
<fpage>2657</fpage>
<lpage>2666</lpage>
<pub-id pub-id-type="doi">10.1109/TBME.2010.2060723</pub-id>
<pub-id pub-id-type="pmid">20667801</pub-id>
</element-citation>
</ref>
<ref id="B10-sensors-16-01752">
<label>10.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Norrdine</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Kasmi</surname>
<given-names>Z.</given-names>
</name>
</person-group>
<article-title>Step detection for ZUPT-aided inertial pedestrian navigation system using foot-mounted</article-title>
<source>IEEE Sens. J.</source>
<year>2016</year>
<volume>16</volume>
<fpage>6766</fpage>
<lpage>6773</lpage>
<pub-id pub-id-type="doi">10.1109/JSEN.2016.2585599</pub-id>
</element-citation>
</ref>
<ref id="B11-sensors-16-01752">
<label>11.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gu</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Song</surname>
<given-names>Q.</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Ma</surname>
<given-names>M.</given-names>
</name>
</person-group>
<article-title>Foot-mounted pedestrian navigation based on particle filter with an adaptive weight updating strategy</article-title>
<source>J. Navig.</source>
<year>2014</year>
<volume>68</volume>
<fpage>23</fpage>
<lpage>38</lpage>
<pub-id pub-id-type="doi">10.1017/S0373463314000496</pub-id>
</element-citation>
</ref>
<ref id="B12-sensors-16-01752">
<label>12.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ruiz</surname>
<given-names>A.R.J.</given-names>
</name>
<name>
<surname>Granja</surname>
<given-names>F.S.</given-names>
</name>
<name>
<surname>Honorato</surname>
<given-names>J.C.P.</given-names>
</name>
<name>
<surname>Guevara Rosas</surname>
<given-names>J.I.</given-names>
</name>
</person-group>
<article-title>Accurate pedestrian indoor navigation by tightly coupling foot-mounted IMU and RFID measurements</article-title>
<source>IEEE Trans. Instrum. Meas.</source>
<year>2012</year>
<volume>61</volume>
<fpage>178</fpage>
<lpage>189</lpage>
<pub-id pub-id-type="doi">10.1109/TIM.2011.2159317</pub-id>
</element-citation>
</ref>
<ref id="B13-sensors-16-01752">
<label>13.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Ascher</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Kessler</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Wankerl</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Trommer</surname>
<given-names>G.F.</given-names>
</name>
</person-group>
<article-title>Dual IMU indoor navigation with particle filter based map-matching on a smartphone</article-title>
<source>Proceedings of the International Conference on Indoor Positioning and Indoor Navigation (IPIN)</source>
<conf-loc>Zürich, Switzerland</conf-loc>
<conf-date>15–17 September 2010</conf-date>
<fpage>15</fpage>
<lpage>17</lpage>
</element-citation>
</ref>
<ref id="B14-sensors-16-01752">
<label>14.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Nilsson</surname>
<given-names>J.O.</given-names>
</name>
<name>
<surname>Gupta</surname>
<given-names>A.K.</given-names>
</name>
<name>
<surname>Handel</surname>
<given-names>P.</given-names>
</name>
</person-group>
<article-title>Foot-mounted inertial navigation made easy</article-title>
<source>Proceedings of the 5th International Conference on Indoor Positioning and Indoor Navigation</source>
<conf-loc>Busan, Korea</conf-loc>
<conf-date>27–30 October 2015</conf-date>
<fpage>24</fpage>
<lpage>29</lpage>
</element-citation>
</ref>
<ref id="B15-sensors-16-01752">
<label>15.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Harle</surname>
<given-names>R.</given-names>
</name>
</person-group>
<article-title>A survey of indoor inertial positioning systems for pedestrians</article-title>
<source>IEEE Commun. Surv. Tutor.</source>
<year>2013</year>
<volume>15</volume>
<fpage>1281</fpage>
<lpage>1293</lpage>
<pub-id pub-id-type="doi">10.1109/SURV.2012.121912.00075</pub-id>
</element-citation>
</ref>
<ref id="B16-sensors-16-01752">
<label>16.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Yun</surname>
<given-names>X.</given-names>
</name>
<name>
<surname>Calusdian</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Bachmann</surname>
<given-names>E.R.</given-names>
</name>
<name>
<surname>McGhee</surname>
<given-names>R.B.</given-names>
</name>
</person-group>
<article-title>Estimation of human foot motion during normal walking using inertial and magnetic sensor measurements</article-title>
<source>IEEE Trans. Instrum. Meas.</source>
<year>2012</year>
<volume>61</volume>
<fpage>2059</fpage>
<lpage>2072</lpage>
<pub-id pub-id-type="doi">10.1109/TIM.2011.2179830</pub-id>
</element-citation>
</ref>
<ref id="B17-sensors-16-01752">
<label>17.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Bancroft</surname>
<given-names>J.B.</given-names>
</name>
<name>
<surname>Garrett</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Lachapelle</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>Activity and environment classification using foot mounted navigation sensors</article-title>
<source>Proceedings of the International Conference on Indoor Positioning and Indoor Navigation (IPIN)</source>
<conf-loc>Sydney, Australia</conf-loc>
<conf-date>13–15 November 2012</conf-date>
</element-citation>
</ref>
<ref id="B18-sensors-16-01752">
<label>18.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Avci</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Bosch</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Marin-perianu</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Marin-perianu</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Havinga</surname>
<given-names>P.</given-names>
</name>
</person-group>
<article-title>Activity recognition using inertial sensing for healthcare, wellbeing and sports applications: A survey</article-title>
<source>Proceedings of the 2010 23rd International Conference on Architecture of Computing Systems (ARCS)</source>
<conf-loc>Hannover, Germany</conf-loc>
<conf-date>22–25 February 2010</conf-date>
</element-citation>
</ref>
<ref id="B19-sensors-16-01752">
<label>19.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Choi</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Ricci</surname>
<given-names>C.</given-names>
</name>
</person-group>
<article-title>Foot-mounted gesture detection and its application in virtual environments</article-title>
<source>
<italic>Computational Cybernetics and Simulation</italic>
, Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics</source>
<conf-loc>Orlando, FL, USA</conf-loc>
<conf-date>12–15 October 1997</conf-date>
<publisher-name>IEEE</publisher-name>
<publisher-loc>New York, NY, USA</publisher-loc>
<year>1997</year>
<volume>Volume 5</volume>
<fpage>4248</fpage>
<lpage>4253</lpage>
</element-citation>
</ref>
<ref id="B20-sensors-16-01752">
<label>20.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Mannini</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Sabatini</surname>
<given-names>A.M.</given-names>
</name>
</person-group>
<article-title>Gait phase detection and discrimination between walking-jogging activities using hidden Markov models applied to foot motion data from a gyroscope</article-title>
<source>Gait Posture</source>
<year>2012</year>
<volume>36</volume>
<fpage>657</fpage>
<lpage>661</lpage>
<pub-id pub-id-type="doi">10.1016/j.gaitpost.2012.06.017</pub-id>
<pub-id pub-id-type="pmid">22796244</pub-id>
</element-citation>
</ref>
<ref id="B21-sensors-16-01752">
<label>21.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Porta</surname>
<given-names>J.P.</given-names>
</name>
<name>
<surname>Acosta</surname>
<given-names>D.J.</given-names>
</name>
<name>
<surname>Lehker</surname>
<given-names>A.N.</given-names>
</name>
<name>
<surname>Miller</surname>
<given-names>S.T.</given-names>
</name>
<name>
<surname>Tomaka</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>King</surname>
<given-names>G.A.</given-names>
</name>
</person-group>
<article-title>Validating the adidas miCoach for estimating pace, distance, and energy expenditure during outdoor over-ground exercise accelerometer</article-title>
<source>Int. J. Exerc. Sci.</source>
<year>2012</year>
<volume>2</volume>
<fpage>23</fpage>
</element-citation>
</ref>
<ref id="B22-sensors-16-01752">
<label>22.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Tucker</surname>
<given-names>W.J.</given-names>
</name>
<name>
<surname>Bhammar</surname>
<given-names>D.M.</given-names>
</name>
<name>
<surname>Sawyer</surname>
<given-names>B.J.</given-names>
</name>
<name>
<surname>Buman</surname>
<given-names>M.P.</given-names>
</name>
<name>
<surname>Gaesser</surname>
<given-names>G.A.</given-names>
</name>
</person-group>
<article-title>Validity and reliability of Nike+ fuelband for estimating physical activity energy expenditure</article-title>
<source>BMC Sports Sci. Med. Rehabil.</source>
<year>2015</year>
<volume>7</volume>
<elocation-id>1752</elocation-id>
<pub-id pub-id-type="doi">10.1186/s13102-015-0008-7</pub-id>
<pub-id pub-id-type="pmid">26751385</pub-id>
</element-citation>
</ref>
<ref id="B23-sensors-16-01752">
<label>23.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fong</surname>
<given-names>D.T.</given-names>
</name>
<name>
<surname>Chan</surname>
<given-names>Y.</given-names>
</name>
</person-group>
<article-title>The use of wearable inertial motion sensors in human lower limb biomechanics studies: A systematic review</article-title>
<source>Sensors</source>
<year>2010</year>
<volume>10</volume>
<fpage>11556</fpage>
<lpage>11565</lpage>
<pub-id pub-id-type="doi">10.3390/s101211556</pub-id>
<pub-id pub-id-type="pmid">22163542</pub-id>
</element-citation>
</ref>
<ref id="B24-sensors-16-01752">
<label>24.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Casamassima</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Ferrari</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Milosevic</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Ginis</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Farella</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Rocchi</surname>
<given-names>L.</given-names>
</name>
</person-group>
<article-title>A wearable system for gait training in subjects with Parkinson’s disease</article-title>
<source>Sensors</source>
<year>2014</year>
<volume>14</volume>
<fpage>6229</fpage>
<lpage>6246</lpage>
<pub-id pub-id-type="doi">10.3390/s140406229</pub-id>
<pub-id pub-id-type="pmid">24686731</pub-id>
</element-citation>
</ref>
<ref id="B25-sensors-16-01752">
<label>25.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bae</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Kong</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Byl</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Tomizuka</surname>
<given-names>M.</given-names>
</name>
</person-group>
<article-title>A mobile gait monitoring system for abnormal gait diagnosis and rehabilitation: A pilot study for Parkinson disease patients</article-title>
<source>J. Biomech. Eng.</source>
<year>2011</year>
<volume>133</volume>
<fpage>041005</fpage>
</element-citation>
</ref>
<ref id="B26-sensors-16-01752">
<label>26.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bae</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Tomizuka</surname>
<given-names>M.</given-names>
</name>
</person-group>
<article-title>A tele-monitoring system for gait rehabilitation with an inertial measurement unit and a shoe-type ground reaction force sensor</article-title>
<source>Mechatronics</source>
<year>2013</year>
<volume>23</volume>
<fpage>646</fpage>
<lpage>651</lpage>
<pub-id pub-id-type="doi">10.1016/j.mechatronics.2013.06.007</pub-id>
</element-citation>
</ref>
<ref id="B27-sensors-16-01752">
<label>27.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Strohrmann</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Harms</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Tröster</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Hensler</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Müller</surname>
<given-names>R.</given-names>
</name>
</person-group>
<article-title>Out of the lab and into the woods: Kinematic analysis in running using wearable sensors</article-title>
<source>Proceedings of the 13th International Conference on Ubiquitous Computing</source>
<conf-loc>Beijing, China</conf-loc>
<conf-date>17–21 September 2011</conf-date>
<fpage>119</fpage>
<lpage>122</lpage>
</element-citation>
</ref>
<ref id="B28-sensors-16-01752">
<label>28.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Chung</surname>
<given-names>P.-C.</given-names>
</name>
<name>
<surname>Hsu</surname>
<given-names>Y.-L.</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>C.-Y.</given-names>
</name>
<name>
<surname>Lin</surname>
<given-names>C.-W.</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>J.-S.</given-names>
</name>
<name>
<surname>Pai</surname>
<given-names>M.-C.</given-names>
</name>
</person-group>
<article-title>Gait analysis for patients with Alzheimer’s disease using a triaxial accelerometer</article-title>
<source>Proceedings of the IEEE International Symposium on Circuits and Systems (ISCAS)</source>
<conf-loc>Seoul, Korea</conf-loc>
<conf-date>20–23 May 2012</conf-date>
<fpage>1323</fpage>
<lpage>1326</lpage>
</element-citation>
</ref>
<ref id="B29-sensors-16-01752">
<label>29.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Schou</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Gardner</surname>
<given-names>H.J.</given-names>
</name>
</person-group>
<article-title>A Wii remote, a game engine, five sensor bars and a virtual reality theatre</article-title>
<source>Proceedings of the 19th Australasian Conference on Computer-Human Interaction: Entertaining User Interfaces</source>
<conf-loc>Adelaide, Australia</conf-loc>
<conf-date>28–30 November 2007</conf-date>
<fpage>231</fpage>
<lpage>234</lpage>
</element-citation>
</ref>
<ref id="B30-sensors-16-01752">
<label>30.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Schlömer</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Poppinga</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Henze</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Boll</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>Gesture recognition with a Wii controller</article-title>
<source>Proceedings of the 2nd International Conference on Tangible and Embedded Interaction</source>
<conf-loc>Bonn, Germany</conf-loc>
<conf-date>18–21 February 2008</conf-date>
<fpage>11</fpage>
<lpage>14</lpage>
</element-citation>
</ref>
<ref id="B31-sensors-16-01752">
<label>31.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Shum</surname>
<given-names>H.P.H.</given-names>
</name>
<name>
<surname>Komura</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Takagi</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>Fast accelerometer-based motion recognition with a dual buffer framework</article-title>
<source>Int. J. Virtual Real.</source>
<year>2011</year>
<volume>10</volume>
<fpage>17</fpage>
<lpage>24</lpage>
</element-citation>
</ref>
<ref id="B32-sensors-16-01752">
<label>32.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Heinz</surname>
<given-names>E.A.</given-names>
</name>
<name>
<surname>Kunze</surname>
<given-names>K.S.</given-names>
</name>
<name>
<surname>Gruber</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Bannach</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Lukowicz</surname>
<given-names>P.</given-names>
</name>
</person-group>
<article-title>Using wearable sensors for real-time recognition tasks in games of martial arts—An initial experiment</article-title>
<source>Proceedings of the IEEE Symposium on Computational Intelligence and Games</source>
<conf-loc>Reno, NV, USA</conf-loc>
<conf-date>22–24 May 2006</conf-date>
<fpage>98</fpage>
<lpage>102</lpage>
</element-citation>
</ref>
<ref id="B33-sensors-16-01752">
<label>33.</label>
<element-citation publication-type="webpage">
<person-group person-group-type="author">
<collab>CC2540</collab>
</person-group>
<article-title>Bluetooth
<sup>®</sup>
Low Energy Software Developer’s Guide v1.4</article-title>
<comment>Available online:
<ext-link ext-link-type="uri" xlink:href="http://www.TI_BLE_Software_Developer’s_Guide.pdf">http://www.TI_BLE_Software_Developer’s_Guide.pdf</ext-link>
</comment>
<date-in-citation>(accessed on 10 February 2010)</date-in-citation>
</element-citation>
</ref>
<ref id="B34-sensors-16-01752">
<label>34.</label>
<element-citation publication-type="book">
<source>MPU-9150 9-Axis Evaluation Board User Guide</source>
<publisher-name>InvenSense</publisher-name>
<publisher-loc>Sunnyvale, CA, USA</publisher-loc>
<year>2011</year>
<volume>Volume 1</volume>
</element-citation>
</ref>
<ref id="B35-sensors-16-01752">
<label>35.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Noureldin</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Karamat</surname>
<given-names>T.B.</given-names>
</name>
<name>
<surname>Georgy</surname>
<given-names>J.</given-names>
</name>
</person-group>
<source>Fundamentals of Inertial Navigation, Satellite-Based Positioning and Their Integration</source>
<publisher-name>Springer Science & Business Media</publisher-name>
<publisher-loc>Berlin, Germany</publisher-loc>
<year>2012</year>
</element-citation>
</ref>
<ref id="B36-sensors-16-01752">
<label>36.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Syed</surname>
<given-names>Z.F.</given-names>
</name>
<name>
<surname>Aggarwal</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Goodall</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Niu</surname>
<given-names>X.</given-names>
</name>
<name>
<surname>El-Sheimy</surname>
<given-names>N.</given-names>
</name>
</person-group>
<article-title>A new multi-position calibration method for MEMS inertial navigation systems</article-title>
<source>Meas. Sci. Technol.</source>
<year>2007</year>
<volume>18</volume>
<fpage>1897</fpage>
<lpage>1907</lpage>
<pub-id pub-id-type="doi">10.1088/0957-0233/18/7/016</pub-id>
</element-citation>
</ref>
<ref id="B37-sensors-16-01752">
<label>37.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Shin</surname>
<given-names>E.H.</given-names>
</name>
<name>
<surname>El-Sheimy</surname>
<given-names>N.</given-names>
</name>
</person-group>
<article-title>A new calibration method for strapdown inertial navigation systems</article-title>
<source>Z. Vermess.</source>
<year>2002</year>
<volume>127</volume>
<fpage>41</fpage>
<lpage>50</lpage>
</element-citation>
</ref>
<ref id="B38-sensors-16-01752">
<label>38.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Li</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Niu</surname>
<given-names>X.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>Q.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Shi</surname>
<given-names>C.</given-names>
</name>
</person-group>
<article-title>An in situ hand calibration method using a pseudo-observation scheme for low-end inertial measurement units</article-title>
<source>Meas. Sci. Technol.</source>
<year>2012</year>
<volume>23</volume>
<fpage>105104</fpage>
<pub-id pub-id-type="doi">10.1088/0957-0233/23/10/105104</pub-id>
</element-citation>
</ref>
<ref id="B39-sensors-16-01752">
<label>39.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gravina</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Alinia</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Ghasemzadeh</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Fortino</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>Multi-sensor fusion in body sensor networks: State-of-the-art and research challenges</article-title>
<source>Inf. Fusion</source>
<year>2017</year>
<volume>35</volume>
<fpage>68</fpage>
<lpage>80</lpage>
<pub-id pub-id-type="doi">10.1016/j.inffus.2016.09.005</pub-id>
</element-citation>
</ref>
<ref id="B40-sensors-16-01752">
<label>40.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Foxlin</surname>
<given-names>E.</given-names>
</name>
</person-group>
<article-title>Inertial head-tracker sensor fusion by a complementary separate-bias Kalman filter</article-title>
<source>Proceedings of the IEEE Virtual Reality Annual International Symposium</source>
<conf-loc>Santa Clara, CA, USA</conf-loc>
<conf-date>3 March–3 April 1996</conf-date>
<fpage>185</fpage>
<lpage>195</lpage>
</element-citation>
</ref>
<ref id="B41-sensors-16-01752">
<label>41.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Li</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>J.</given-names>
</name>
</person-group>
<article-title>Effective adaptive Kalman filter for MEMS-IMU/magnetometers integrated attitude and heading reference systems</article-title>
<source>J. Navig.</source>
<year>2012</year>
<volume>66</volume>
<fpage>99</fpage>
<lpage>113</lpage>
<pub-id pub-id-type="doi">10.1017/S0373463312000331</pub-id>
</element-citation>
</ref>
<ref id="B42-sensors-16-01752">
<label>42.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Wang</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Yang</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Hatch</surname>
<given-names>R.R.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>Y.</given-names>
</name>
</person-group>
<article-title>Adaptive filter for a miniature MEMS based attitude and heading reference system</article-title>
<source>Proceedings of the IEEE Position Location and Navigation Symposium</source>
<conf-loc>Monterey, CA, USA</conf-loc>
<conf-date>26–29 April 2004</conf-date>
<fpage>193</fpage>
<lpage>200</lpage>
</element-citation>
</ref>
<ref id="B43-sensors-16-01752">
<label>43.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Godha</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Lachapelle</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>Foot mounted inertial system for pedestrian navigation</article-title>
<source>Meas. Sci. Technol.</source>
<year>2008</year>
<volume>19</volume>
<fpage>075202</fpage>
<pub-id pub-id-type="doi">10.1088/0957-0233/19/7/075202</pub-id>
</element-citation>
</ref>
<ref id="B44-sensors-16-01752">
<label>44.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>El-Sheimy</surname>
<given-names>N.</given-names>
</name>
</person-group>
<source>Inertial Techniques and INS/DGPS Integration</source>
<comment>Engo 623-Course Notes</comment>
<publisher-name>Department of Geomatics Engineering, University of Calgary</publisher-name>
<publisher-loc>Calgary, AB, Canada</publisher-loc>
<year>2003</year>
<fpage>170</fpage>
<lpage>182</lpage>
</element-citation>
</ref>
<ref id="B45-sensors-16-01752">
<label>45.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Safavian</surname>
<given-names>S.R.</given-names>
</name>
<name>
<surname>Landgrebe</surname>
<given-names>D.</given-names>
</name>
</person-group>
<article-title>A survey of decision tree classifier methodology</article-title>
<source>Proceedings of the International Conference on Machine Learning</source>
<conf-loc>Amsterdam, The Netherlands</conf-loc>
<conf-date>7–12 August 1990</conf-date>
</element-citation>
</ref>
<ref id="B46-sensors-16-01752">
<label>46.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Chawla</surname>
<given-names>N.V.</given-names>
</name>
</person-group>
<article-title>C4.5 and imbalanced data sets: Investigating the effect of sampling method, probabilistic estimate, and decision tree structure</article-title>
<source>Proceedings of the International Conference on Machine Learning</source>
<conf-loc>Washington, DC, USA</conf-loc>
<conf-date>21–24 August 2003</conf-date>
<volume>Volume 3</volume>
</element-citation>
</ref>
<ref id="B47-sensors-16-01752">
<label>47.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Larose</surname>
<given-names>D.T.</given-names>
</name>
</person-group>
<article-title>k-Nearest Neighbor Algorithm</article-title>
<source>Discovering Knowledge in Data: An Introduction to Data Mining</source>
<publisher-name>Wiley-Interscience</publisher-name>
<publisher-loc>Hoboken, NJ, USA</publisher-loc>
<year>2005</year>
<fpage>90</fpage>
<lpage>106</lpage>
</element-citation>
</ref>
<ref id="B48-sensors-16-01752">
<label>48.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Chang</surname>
<given-names>C.-C.</given-names>
</name>
<name>
<surname>Lin</surname>
<given-names>C.-J.</given-names>
</name>
</person-group>
<article-title>LIBSVM: A library for support vector machines</article-title>
<source>ACM Trans. Intell. Syst. Technol.</source>
<year>2011</year>
<volume>2</volume>
<fpage>27</fpage>
<pub-id pub-id-type="doi">10.1145/1961189.1961199</pub-id>
</element-citation>
</ref>
<ref id="B49-sensors-16-01752">
<label>49.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Hsu</surname>
<given-names>C.-W.</given-names>
</name>
<name>
<surname>Chang</surname>
<given-names>C.-C.</given-names>
</name>
<name>
<surname>Lin</surname>
<given-names>C.-J.</given-names>
</name>
</person-group>
<source>A Practical Guide to Support Vector Classification</source>
<publisher-name>Department of Computer Science, National Taiwan University</publisher-name>
<publisher-loc>Taipei, Taiwan</publisher-loc>
<year>2003</year>
</element-citation>
</ref>
<ref id="B50-sensors-16-01752">
<label>50.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Suykens</surname>
<given-names>J.A.K.</given-names>
</name>
<name>
<surname>Vandewalle</surname>
<given-names>J.</given-names>
</name>
</person-group>
<article-title>Recurrent least squares support vector machines</article-title>
<source>IEEE Trans. Circuits Syst. I Fundam. Theory Appl.</source>
<year>2000</year>
<volume>47</volume>
<fpage>1109</fpage>
<lpage>1114</lpage>
<pub-id pub-id-type="doi">10.1109/81.855471</pub-id>
</element-citation>
</ref>
</ref-list>
</back>
<floats-group>
<fig id="sensors-16-01752-f001" position="float">
<label>Figure 1</label>
<caption>
<p>The main concept of the proposed system.</p>
</caption>
<graphic xlink:href="sensors-16-01752-g001"></graphic>
</fig>
<fig id="sensors-16-01752-f002" position="float">
<label>Figure 2</label>
<caption>
<p>System architecture.</p>
</caption>
<graphic xlink:href="sensors-16-01752-g002"></graphic>
</fig>
<fig id="sensors-16-01752-f003" position="float">
<label>Figure 3</label>
<caption>
<p>System hardware platform.</p>
</caption>
<graphic xlink:href="sensors-16-01752-g003"></graphic>
</fig>
<fig id="sensors-16-01752-f004" position="float">
<label>Figure 4</label>
<caption>
<p>Overview of the foot motion identification process.</p>
</caption>
<graphic xlink:href="sensors-16-01752-g004"></graphic>
</fig>
<fig id="sensors-16-01752-f005" position="float">
<label>Figure 5</label>
<caption>
<p>Inertial sensor measurement alignment.</p>
</caption>
<graphic xlink:href="sensors-16-01752-g005"></graphic>
</fig>
<fig id="sensors-16-01752-f006" position="float">
<label>Figure 6</label>
<caption>
<p>Stepping acceleration signal and gait phases.</p>
</caption>
<graphic xlink:href="sensors-16-01752-g006"></graphic>
</fig>
<fig id="sensors-16-01752-f007" position="float">
<label>Figure 7</label>
<caption>
<p>Acceleration signal of different motions and data segmentation: (
<bold>a</bold>
) Forward; (
<bold>b</bold>
) Backward; (
<bold>c</bold>
) Left; (
<bold>d</bold>
) Right; and (
<bold>e</bold>
) Jump.</p>
</caption>
<graphic xlink:href="sensors-16-01752-g007a"></graphic>
<graphic xlink:href="sensors-16-01752-g007b"></graphic>
</fig>
<fig id="sensors-16-01752-f008" position="float">
<label>Figure 8</label>
<caption>
<p>Decision tree graphical model.</p>
</caption>
<graphic xlink:href="sensors-16-01752-g008"></graphic>
</fig>
<fig id="sensors-16-01752-f009" position="float">
<label>Figure 9</label>
<caption>
<p>K-nearest neighbors (kNN) algorithm concept.</p>
</caption>
<graphic xlink:href="sensors-16-01752-g009"></graphic>
</fig>
<fig id="sensors-16-01752-f010" position="float">
<label>Figure 10</label>
<caption>
<p>Support vector machine (SVM) classifier.</p>
</caption>
<graphic xlink:href="sensors-16-01752-g010"></graphic>
</fig>
<fig id="sensors-16-01752-f011" position="float">
<label>Figure 11</label>
<caption>
<p>Hardware platform and sensor placement on the shoe: (
<bold>a</bold>
) Hardware platform of the proposed system; (
<bold>b</bold>
–
<bold>e</bold>
) Sensor placement on different testers’ shoes.</p>
</caption>
<graphic xlink:href="sensors-16-01752-g011a"></graphic>
<graphic xlink:href="sensors-16-01752-g011b"></graphic>
</fig>
<fig id="sensors-16-01752-f012" position="float">
<label>Figure 12</label>
<caption>
<p>Accuracy comparison of three classifiers.</p>
</caption>
<graphic xlink:href="sensors-16-01752-g012"></graphic>
</fig>
<fig id="sensors-16-01752-f013" position="float">
<label>Figure 13</label>
<caption>
<p>Precision comparison of three classifiers.</p>
</caption>
<graphic xlink:href="sensors-16-01752-g013"></graphic>
</fig>
<fig id="sensors-16-01752-f014" position="float">
<label>Figure 14</label>
<caption>
<p>Practical game play test in a running game: (
<bold>a</bold>
) Jump motion; (
<bold>b</bold>
) Forward motion; (
<bold>c</bold>
) Backward motion; (
<bold>d</bold>
) Right motion; and (
<bold>e</bold>
) Left motion.</p>
</caption>
<graphic xlink:href="sensors-16-01752-g014"></graphic>
</fig>
<fig id="sensors-16-01752-f015" position="float">
<label>Figure 15</label>
<caption>
<p>Practical game play test in the game Subway Surfers: (
<bold>a</bold>
) Step forward; and (
<bold>b</bold>
) Step left.</p>
</caption>
<graphic xlink:href="sensors-16-01752-g015a"></graphic>
<graphic xlink:href="sensors-16-01752-g015b"></graphic>
</fig>
<table-wrap id="sensors-16-01752-t001" position="float">
<object-id pub-id-type="pii">sensors-16-01752-t001_Table 1</object-id>
<label>Table 1</label>
<caption>
<p>Stepping motion training set.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1" colspan="1"></th>
<th align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1" colspan="1">Jump</th>
<th align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1" colspan="1">Left</th>
<th align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1" colspan="1">Right</th>
<th align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1" colspan="1">Forward</th>
<th align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1" colspan="1">Backward</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Action numbers</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">895</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">954</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">901</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">510</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">515</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap id="sensors-16-01752-t002" position="float">
<object-id pub-id-type="pii">sensors-16-01752-t002_Table 2</object-id>
<label>Table 2</label>
<caption>
<p>Classification result of 10-fold cross validation.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th rowspan="2" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" colspan="1"></th>
<th colspan="2" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1">Decision Tree</th>
<th colspan="2" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1">kNN</th>
<th colspan="2" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1">SVM</th>
</tr>
<tr>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Class 1</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Class 0</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Class 1</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Class 0</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Class 1</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Class 0</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Jump</td>
<td align="center" valign="middle" rowspan="1" colspan="1">813/895</td>
<td align="center" valign="middle" rowspan="1" colspan="1">75/2880</td>
<td align="center" valign="middle" rowspan="1" colspan="1">870/895</td>
<td align="center" valign="middle" rowspan="1" colspan="1">38/2880</td>
<td align="center" valign="middle" rowspan="1" colspan="1">868/895</td>
<td align="center" valign="middle" rowspan="1" colspan="1">36/2880</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Left</td>
<td align="center" valign="middle" rowspan="1" colspan="1">914/954</td>
<td align="center" valign="middle" rowspan="1" colspan="1">36/2821</td>
<td align="center" valign="middle" rowspan="1" colspan="1">939/954</td>
<td align="center" valign="middle" rowspan="1" colspan="1">34/2821</td>
<td align="center" valign="middle" rowspan="1" colspan="1">936/954</td>
<td align="center" valign="middle" rowspan="1" colspan="1">19/2821</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Right</td>
<td align="center" valign="middle" rowspan="1" colspan="1">806/901</td>
<td align="center" valign="middle" rowspan="1" colspan="1">84/2874</td>
<td align="center" valign="middle" rowspan="1" colspan="1">881/901</td>
<td align="center" valign="middle" rowspan="1" colspan="1">77/2874</td>
<td align="center" valign="middle" rowspan="1" colspan="1">885/901</td>
<td align="center" valign="middle" rowspan="1" colspan="1">62/2874</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Forward</td>
<td align="center" valign="middle" rowspan="1" colspan="1">470/510</td>
<td align="center" valign="middle" rowspan="1" colspan="1">47/3265</td>
<td align="center" valign="middle" rowspan="1" colspan="1">498/510</td>
<td align="center" valign="middle" rowspan="1" colspan="1">39/3265</td>
<td align="center" valign="middle" rowspan="1" colspan="1">493/510</td>
<td align="center" valign="middle" rowspan="1" colspan="1">20/3265</td>
</tr>
<tr>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Backward</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">445/515</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">60/3260</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">502/515</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">27/3260</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">498/515</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">22/3260</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap id="sensors-16-01752-t003" position="float">
<object-id pub-id-type="pii">sensors-16-01752-t003_Table 3</object-id>
<label>Table 3</label>
<caption>
<p>Classification result of 25% held out validation.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th rowspan="2" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" colspan="1"></th>
<th colspan="2" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1">Decision Tree</th>
<th colspan="2" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1">kNN</th>
<th colspan="2" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1">SVM</th>
</tr>
<tr>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Class 1</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Class 0</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Class 1</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Class 0</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Class 1</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Class 0</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Jump</td>
<td align="center" valign="middle" rowspan="1" colspan="1">212/223</td>
<td align="center" valign="middle" rowspan="1" colspan="1">15/718</td>
<td align="center" valign="middle" rowspan="1" colspan="1">219/223</td>
<td align="center" valign="middle" rowspan="1" colspan="1">8/718</td>
<td align="center" valign="middle" rowspan="1" colspan="1">216/223</td>
<td align="center" valign="middle" rowspan="1" colspan="1">8/718</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Left</td>
<td align="center" valign="middle" rowspan="1" colspan="1">226/238</td>
<td align="center" valign="middle" rowspan="1" colspan="1">6/703</td>
<td align="center" valign="middle" rowspan="1" colspan="1">234/238</td>
<td align="center" valign="middle" rowspan="1" colspan="1">15/703</td>
<td align="center" valign="middle" rowspan="1" colspan="1">233/238</td>
<td align="center" valign="middle" rowspan="1" colspan="1">10/703</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Right</td>
<td align="center" valign="middle" rowspan="1" colspan="1">198/225</td>
<td align="center" valign="middle" rowspan="1" colspan="1">18/716</td>
<td align="center" valign="middle" rowspan="1" colspan="1">219/225</td>
<td align="center" valign="middle" rowspan="1" colspan="1">16/716</td>
<td align="center" valign="middle" rowspan="1" colspan="1">217/225</td>
<td align="center" valign="middle" rowspan="1" colspan="1">13/716</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Forward</td>
<td align="center" valign="middle" rowspan="1" colspan="1">111/127</td>
<td align="center" valign="middle" rowspan="1" colspan="1">9/814</td>
<td align="center" valign="middle" rowspan="1" colspan="1">125/127</td>
<td align="center" valign="middle" rowspan="1" colspan="1">13/814</td>
<td align="center" valign="middle" rowspan="1" colspan="1">123/127</td>
<td align="center" valign="middle" rowspan="1" colspan="1">7/814</td>
</tr>
<tr>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Backward</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">114/128</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">7/813</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">126/128</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">4/813</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">126/128</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">4/813</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap id="sensors-16-01752-t004" position="float">
<object-id pub-id-type="pii">sensors-16-01752-t004_Table 4</object-id>
<label>Table 4</label>
<caption>
<p>Evaluation of decision tree classifier.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th rowspan="2" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" colspan="1"></th>
<th colspan="3" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1">10-Fold Cross Validation</th>
<th colspan="3" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1">25% Hold Out Validation</th>
</tr>
<tr>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Accuracy</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Precision</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Recall</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Accuracy</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Precision</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Recall</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Jump</td>
<td align="center" valign="middle" rowspan="1" colspan="1">95.84%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">91.55%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">90.83%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">97.24%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">93.41%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">95.08%</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Left</td>
<td align="center" valign="middle" rowspan="1" colspan="1">97.98%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">96.21%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">95.80%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.09%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">97.41%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">94.96%</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Right</td>
<td align="center" valign="middle" rowspan="1" colspan="1">95.25%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">90.56%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">89.45%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">95.23%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">91.67%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">88.01%</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Forward</td>
<td align="center" valign="middle" rowspan="1" colspan="1">97.69%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">90.90%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">92.15%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">97.35%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">92.53%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">87.45%</td>
</tr>
<tr>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Backward</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">96.55%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">88.11%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">86.40%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">97.77%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">94.25%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">89.12%</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap id="sensors-16-01752-t005" position="float">
<object-id pub-id-type="pii">sensors-16-01752-t005_Table 5</object-id>
<label>Table 5</label>
<caption>
<p>Evaluation of kNN classifier.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th rowspan="2" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" colspan="1"></th>
<th colspan="3" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1">10-Fold Cross Validation</th>
<th colspan="3" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1">25% Hold Out Validation</th>
</tr>
<tr>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Accuracy</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Precision</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Recall</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Accuracy</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Precision</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Recall</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Jump</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.33%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">95.81%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">97.20%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.72%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">96.47%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.20%</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Left</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.70%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">96.50%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.42%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">97.98%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">93.97%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.31%</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Right</td>
<td align="center" valign="middle" rowspan="1" colspan="1">97.43%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">91.96%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">97.78%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">97.66%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">93.19%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">97.33%</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Forward</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.64%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">92.73%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">97.64%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.40%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">90.57%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.42%</td>
</tr>
<tr>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Backward</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">98.94%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">94.89%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">97.47%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">99.36%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">96.92%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">98.43%</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap id="sensors-16-01752-t006" position="float">
<object-id pub-id-type="pii">sensors-16-01752-t006_Table 6</object-id>
<label>Table 6</label>
<caption>
<p>Evaluation of SVM classifier.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th rowspan="2" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" colspan="1"></th>
<th colspan="3" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1">10-Fold Cross Validation</th>
<th colspan="3" align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1">25% Hold Out Validation</th>
</tr>
<tr>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Accuracy</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Precision</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Recall</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Accuracy</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Precision</th>
<th align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Recall</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Jump</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.33%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">96.01%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">96.98%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.40%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">96.42%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">96.86%</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Left</td>
<td align="center" valign="middle" rowspan="1" colspan="1">99.09%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.01%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.11%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.40%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">95.88%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">97.89%</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Right</td>
<td align="center" valign="middle" rowspan="1" colspan="1">97.93%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">93.45%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.22%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">97.76%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">94.34%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">96.44%</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Forward</td>
<td align="center" valign="middle" rowspan="1" colspan="1">99.09%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">96.10%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">96.66%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">98.83%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">94.61%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">96.85%</td>
</tr>
<tr>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Backward</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">98.96%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">95.76%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">96.69%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">99.36%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">96.92%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">98.43%</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap id="sensors-16-01752-t007" position="float">
<object-id pub-id-type="pii">sensors-16-01752-t007_Table 7</object-id>
<label>Table 7</label>
<caption>
<p>Foot motion identification error of SVM.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1" colspan="1"></th>
<th align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1" colspan="1"></th>
<th align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1" colspan="1">Jump</th>
<th align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1" colspan="1">Left</th>
<th align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1" colspan="1">Right</th>
<th align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1" colspan="1">Forward</th>
<th align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1" colspan="1">Backward</th>
<th align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1" colspan="1">Corresponding Motion Error</th>
<th align="center" valign="middle" style="border-top:solid thin;border-bottom:solid thin" rowspan="1" colspan="1">Other Motions Error</th>
</tr>
</thead>
<tbody>
<tr>
<td rowspan="5" align="center" valign="middle" style="border-bottom:solid thin" colspan="1">10-fold cross validation</td>
<td align="center" valign="middle" rowspan="1" colspan="1">Jump</td>
<td align="center" valign="middle" rowspan="1" colspan="1">27</td>
<td align="center" valign="middle" rowspan="1" colspan="1">8</td>
<td align="center" valign="middle" rowspan="1" colspan="1">7</td>
<td align="center" valign="middle" rowspan="1" colspan="1">9</td>
<td align="center" valign="middle" rowspan="1" colspan="1">12</td>
<td align="center" valign="middle" rowspan="1" colspan="1">42.86%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">57.14%</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Left</td>
<td align="center" valign="middle" rowspan="1" colspan="1">4</td>
<td align="center" valign="middle" rowspan="1" colspan="1">18</td>
<td align="center" valign="middle" rowspan="1" colspan="1">5</td>
<td align="center" valign="middle" rowspan="1" colspan="1">7</td>
<td align="center" valign="middle" rowspan="1" colspan="1">3</td>
<td align="center" valign="middle" rowspan="1" colspan="1">48.65%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">51.35%</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Right</td>
<td align="center" valign="middle" rowspan="1" colspan="1">13</td>
<td align="center" valign="middle" rowspan="1" colspan="1">15</td>
<td align="center" valign="middle" rowspan="1" colspan="1">16</td>
<td align="center" valign="middle" rowspan="1" colspan="1">16</td>
<td align="center" valign="middle" rowspan="1" colspan="1">18</td>
<td align="center" valign="middle" rowspan="1" colspan="1">20.51%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">79.49%</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Forward</td>
<td align="center" valign="middle" rowspan="1" colspan="1">6</td>
<td align="center" valign="middle" rowspan="1" colspan="1">5</td>
<td align="center" valign="middle" rowspan="1" colspan="1">4</td>
<td align="center" valign="middle" rowspan="1" colspan="1">17</td>
<td align="center" valign="middle" rowspan="1" colspan="1">5</td>
<td align="center" valign="middle" rowspan="1" colspan="1">45.95%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">54.05%</td>
</tr>
<tr>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Backward</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">7</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">6</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">5</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">2</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">17</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">43.59%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">56.41%</td>
</tr>
<tr>
<td rowspan="5" align="center" valign="middle" style="border-bottom:solid thin" colspan="1">25% hold out validation</td>
<td align="center" valign="middle" rowspan="1" colspan="1">Jump</td>
<td align="center" valign="middle" rowspan="1" colspan="1">7</td>
<td align="center" valign="middle" rowspan="1" colspan="1">1</td>
<td align="center" valign="middle" rowspan="1" colspan="1">4</td>
<td align="center" valign="middle" rowspan="1" colspan="1">2</td>
<td align="center" valign="middle" rowspan="1" colspan="1">1</td>
<td align="center" valign="middle" rowspan="1" colspan="1">46.67%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">53.33%</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Left</td>
<td align="center" valign="middle" rowspan="1" colspan="1">3</td>
<td align="center" valign="middle" rowspan="1" colspan="1">5</td>
<td align="center" valign="middle" rowspan="1" colspan="1">2</td>
<td align="center" valign="middle" rowspan="1" colspan="1">1</td>
<td align="center" valign="middle" rowspan="1" colspan="1">4</td>
<td align="center" valign="middle" rowspan="1" colspan="1">33.33%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">66.67%</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Right</td>
<td align="center" valign="middle" rowspan="1" colspan="1">2</td>
<td align="center" valign="middle" rowspan="1" colspan="1">5</td>
<td align="center" valign="middle" rowspan="1" colspan="1">8</td>
<td align="center" valign="middle" rowspan="1" colspan="1">4</td>
<td align="center" valign="middle" rowspan="1" colspan="1">2</td>
<td align="center" valign="middle" rowspan="1" colspan="1">38.10%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">61.90%</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Forward</td>
<td align="center" valign="middle" rowspan="1" colspan="1">1</td>
<td align="center" valign="middle" rowspan="1" colspan="1">3</td>
<td align="center" valign="middle" rowspan="1" colspan="1">2</td>
<td align="center" valign="middle" rowspan="1" colspan="1">4</td>
<td align="center" valign="middle" rowspan="1" colspan="1">1</td>
<td align="center" valign="middle" rowspan="1" colspan="1">36.36%</td>
<td align="center" valign="middle" rowspan="1" colspan="1">63.64%</td>
</tr>
<tr>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">Backward</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">2</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">1</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">0</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">1</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">2</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">33.33%</td>
<td align="center" valign="middle" style="border-bottom:solid thin" rowspan="1" colspan="1">66.67%</td>
</tr>
</tbody>
</table>
</table-wrap>
</floats-group>
</pmc>
</record>
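The accuracy, precision, and recall values reported in Tables 4-6 can be reproduced from the per-class counts in Tables 2-3, reading each row as a one-vs-rest confusion: the "Class 1" fraction gives correctly detected actions over all actual actions of that class, and the "Class 0" fraction gives misclassified other-class actions over all actual negatives. A minimal sketch of that derivation (function and variable names are illustrative, not from the paper):

```python
def prf(tp, pos_total, fp, neg_total):
    """One-vs-rest metrics for a single action class.

    tp        -- correctly detected actions ("Class 1" numerator)
    pos_total -- actual actions of this class ("Class 1" denominator)
    fp        -- other actions misclassified as this class ("Class 0" numerator)
    neg_total -- actual actions of all other classes ("Class 0" denominator)
    """
    fn = pos_total - tp            # missed actions of this class
    tn = neg_total - fp            # correctly rejected other actions
    accuracy = (tp + tn) / (pos_total + neg_total)
    precision = tp / (tp + fp)
    recall = tp / pos_total
    return accuracy, precision, recall

# Decision-tree "Jump" row of Table 2 (10-fold): 813/895 and 75/2880
acc, prec, rec = prf(813, 895, 75, 2880)
print(f"accuracy={acc:.2%} precision={prec:.2%} recall={rec:.2%}")
# prints: accuracy=95.84% precision=91.55% recall=90.84%
```

For this row the sketch yields the 95.84% accuracy and 91.55% precision of Table 4; the recall differs from the printed 90.83% only in the final rounding digit.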

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Wicri/Canada/explor/ParkinsonCanadaV1/Data/Pmc/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000B44  | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Corpus/biblio.hfd -nk 000B44  | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Wicri/Canada
   |area=    ParkinsonCanadaV1
   |flux=    Pmc
   |étape=   Corpus
   |type=    RBID
   |clé=     
   |texte=   
}}

Wicri

This area was generated with Dilib version V0.6.29.
Data generation: Thu May 4 22:20:19 2017. Site generation: Fri Dec 23 23:17:26 2022