Parkinson's disease in Canada (exploration server)

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information is therefore not validated.

Internal identifier: 0002549 (Pmc/Corpus); previous: 0002548; next: 0002550



The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Auto detection and segmentation of physical activities during a Timed-Up-and-Go (TUG) task in healthy older adults using multiple inertial sensors</title>
<author>
<name sortKey="Nguyen, Hung P" sort="Nguyen, Hung P" uniqKey="Nguyen H" first="Hung P" last="Nguyen">Hung P. Nguyen</name>
<affiliation>
<nlm:aff id="Aff1">Département de Kinanthropologie, Université du Québec à Montréal, C.P. 8888, succursale Centre-Ville, Montréal, H3C 3P8 Québec Canada</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Ayachi, Fouaz" sort="Ayachi, Fouaz" uniqKey="Ayachi F" first="Fouaz" last="Ayachi">Fouaz Ayachi</name>
<affiliation>
<nlm:aff id="Aff1">Département de Kinanthropologie, Université du Québec à Montréal, C.P. 8888, succursale Centre-Ville, Montréal, H3C 3P8 Québec Canada</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Lavigne Elletier, Catherine" sort="Lavigne Elletier, Catherine" uniqKey="Lavigne Elletier C" first="Catherine" last="Lavigne Elletier">Catherine Lavigne Elletier</name>
<affiliation>
<nlm:aff id="Aff1">Département de Kinanthropologie, Université du Québec à Montréal, C.P. 8888, succursale Centre-Ville, Montréal, H3C 3P8 Québec Canada</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Blamoutier, Margaux" sort="Blamoutier, Margaux" uniqKey="Blamoutier M" first="Margaux" last="Blamoutier">Margaux Blamoutier</name>
<affiliation>
<nlm:aff id="Aff2">Faculté des Sciences, Université du Québec à Montréal, Montreal, Canada</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Rahimi, Fariborz" sort="Rahimi, Fariborz" uniqKey="Rahimi F" first="Fariborz" last="Rahimi">Fariborz Rahimi</name>
<affiliation>
<nlm:aff id="Aff3">Electrical Engineering Department, University of Bonab, Bonab, East Azerbaijan Iran</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Boissy, Patrick" sort="Boissy, Patrick" uniqKey="Boissy P" first="Patrick" last="Boissy">Patrick Boissy</name>
<affiliation>
<nlm:aff id="Aff4">Department of Surgery, Orthopaedic Division, Faculty of Medicine and Health Sciences, Université of Sherbrooke, Quebec, Canada</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Jog, Mandar" sort="Jog, Mandar" uniqKey="Jog M" first="Mandar" last="Jog">Mandar Jog</name>
<affiliation>
<nlm:aff id="Aff5">London Health Sciences Center University Hospital, Ontario, Canada</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Duval, Christian" sort="Duval, Christian" uniqKey="Duval C" first="Christian" last="Duval">Christian Duval</name>
<affiliation>
<nlm:aff id="Aff1">Département de Kinanthropologie, Université du Québec à Montréal, C.P. 8888, succursale Centre-Ville, Montréal, H3C 3P8 Québec Canada</nlm:aff>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">25885438</idno>
<idno type="pmc">4403848</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4403848</idno>
<idno type="RBID">PMC:4403848</idno>
<idno type="doi">10.1186/s12984-015-0026-4</idno>
<date when="2015">2015</date>
<idno type="wicri:Area/Pmc/Corpus">000254</idno>
<idno type="wicri:explorRef" wicri:stream="Pmc" wicri:step="Corpus" wicri:corpus="PMC">000254</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Auto detection and segmentation of physical activities during a Timed-Up-and-Go (TUG) task in healthy older adults using multiple inertial sensors</title>
<author>
<name sortKey="Nguyen, Hung P" sort="Nguyen, Hung P" uniqKey="Nguyen H" first="Hung P" last="Nguyen">Hung P. Nguyen</name>
<affiliation>
<nlm:aff id="Aff1">Département de Kinanthropologie, Université du Québec à Montréal, C.P. 8888, succursale Centre-Ville, Montréal, H3C 3P8 Québec Canada</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Ayachi, Fouaz" sort="Ayachi, Fouaz" uniqKey="Ayachi F" first="Fouaz" last="Ayachi">Fouaz Ayachi</name>
<affiliation>
<nlm:aff id="Aff1">Département de Kinanthropologie, Université du Québec à Montréal, C.P. 8888, succursale Centre-Ville, Montréal, H3C 3P8 Québec Canada</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Lavigne Elletier, Catherine" sort="Lavigne Elletier, Catherine" uniqKey="Lavigne Elletier C" first="Catherine" last="Lavigne Elletier">Catherine Lavigne Elletier</name>
<affiliation>
<nlm:aff id="Aff1">Département de Kinanthropologie, Université du Québec à Montréal, C.P. 8888, succursale Centre-Ville, Montréal, H3C 3P8 Québec Canada</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Blamoutier, Margaux" sort="Blamoutier, Margaux" uniqKey="Blamoutier M" first="Margaux" last="Blamoutier">Margaux Blamoutier</name>
<affiliation>
<nlm:aff id="Aff2">Faculté des Sciences, Université du Québec à Montréal, Montreal, Canada</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Rahimi, Fariborz" sort="Rahimi, Fariborz" uniqKey="Rahimi F" first="Fariborz" last="Rahimi">Fariborz Rahimi</name>
<affiliation>
<nlm:aff id="Aff3">Electrical Engineering Department, University of Bonab, Bonab, East Azerbaijan Iran</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Boissy, Patrick" sort="Boissy, Patrick" uniqKey="Boissy P" first="Patrick" last="Boissy">Patrick Boissy</name>
<affiliation>
<nlm:aff id="Aff4">Department of Surgery, Orthopaedic Division, Faculty of Medicine and Health Sciences, Université of Sherbrooke, Quebec, Canada</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Jog, Mandar" sort="Jog, Mandar" uniqKey="Jog M" first="Mandar" last="Jog">Mandar Jog</name>
<affiliation>
<nlm:aff id="Aff5">London Health Sciences Center University Hospital, Ontario, Canada</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Duval, Christian" sort="Duval, Christian" uniqKey="Duval C" first="Christian" last="Duval">Christian Duval</name>
<affiliation>
<nlm:aff id="Aff1">Département de Kinanthropologie, Université du Québec à Montréal, C.P. 8888, succursale Centre-Ville, Montréal, H3C 3P8 Québec Canada</nlm:aff>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Journal of NeuroEngineering and Rehabilitation</title>
<idno type="eISSN">1743-0003</idno>
<imprint>
<date when="2015">2015</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<sec>
<title>Background</title>
<p>Recently, much attention has been given to the use of inertial sensors for remote monitoring of individuals with limited mobility. However, the focus has been mostly on the detection of symptoms, not specific activities. The objective of the present study was to develop an automated recognition and segmentation algorithm based on inertial sensor data to identify common gross motor patterns during activities of daily living.</p>
</sec>
<sec>
<title>Method</title>
<p>A modified Timed-Up-and-Go (TUG) task was used since it comprises four common daily living activities:
<italic>Standing</italic>
,
<italic>Walking</italic>
,
<italic>Turning</italic>
, and
<italic>Sitting</italic>
, all performed in a continuous fashion resulting in six different segments during the task. Sixteen healthy older adults performed two trials of a 5 and 10 meter TUG task. They were outfitted with 17 inertial motion sensors covering each body segment. Data from the 10 meter TUG were used to identify pertinent sensors on the trunk, head, hip, knee, and thigh that provided suitable data for detecting and segmenting activities associated with the TUG. Raw data from sensors were detrended to remove sensor drift, normalized, and band pass filtered with optimal frequencies to reveal kinematic peaks that corresponded to different activities. Segmentation was accomplished by identifying the time stamps of the first minimum or maximum to the right and the left of these peaks. Segmentation time stamps were compared to results from two examiners visually segmenting the activities of the TUG.</p>
</sec>
<sec>
<title>Results</title>
<p>We were able to detect these activities in a TUG with 100% sensitivity and specificity (n = 192) during the 10 meter TUG. The rate of success was subsequently confirmed in the 5 meter TUG (n = 192) without altering the parameters of the algorithm. When applying the segmentation algorithms to the 10 meter TUG, we were able to parse 100% of the transition points (n = 224) between different segments, and these were as reliable as, and less variable than, visual segmentation performed by two independent examiners.</p>
</sec>
<sec>
<title>Conclusions</title>
<p>The present study lays the foundation for the development of a comprehensive algorithm to detect and segment naturalistic activities using inertial sensors, in the hope of automatically evaluating motor performance within the detected tasks.</p>
</sec>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Wassink Vossen, S" uniqKey="Wassink Vossen S">S Wassink-Vossen</name>
</author>
<author>
<name sortKey="Collard, Rm" uniqKey="Collard R">RM Collard</name>
</author>
<author>
<name sortKey="Oude Voshaar, Rc" uniqKey="Oude Voshaar R">RC Oude Voshaar</name>
</author>
<author>
<name sortKey="Comijs, Hc" uniqKey="Comijs H">HC Comijs</name>
</author>
<author>
<name sortKey="De Vocht, Hm" uniqKey="De Vocht H">HM de Vocht</name>
</author>
<author>
<name sortKey="Naarding, P" uniqKey="Naarding P">P Naarding</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hassan, A" uniqKey="Hassan A">A Hassan</name>
</author>
<author>
<name sortKey="Vallabhajosula, S" uniqKey="Vallabhajosula S">S Vallabhajosula</name>
</author>
<author>
<name sortKey="Zahodne, Lb" uniqKey="Zahodne L">LB Zahodne</name>
</author>
<author>
<name sortKey="Bowers, D" uniqKey="Bowers D">D Bowers</name>
</author>
<author>
<name sortKey="Okun, Ms" uniqKey="Okun M">MS Okun</name>
</author>
<author>
<name sortKey="Fernandez, Hh" uniqKey="Fernandez H">HH Fernandez</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Maki, Be" uniqKey="Maki B">BE Maki</name>
</author>
<author>
<name sortKey="Holliday, Pj" uniqKey="Holliday P">PJ Holliday</name>
</author>
<author>
<name sortKey="Topper, Ak" uniqKey="Topper A">AK Topper</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Godfrey, A" uniqKey="Godfrey A">A Godfrey</name>
</author>
<author>
<name sortKey="Bourke, Ak" uniqKey="Bourke A">AK Bourke</name>
</author>
<author>
<name sortKey="Olaighin, Gm" uniqKey="Olaighin G">GM Olaighin</name>
</author>
<author>
<name sortKey="Van De Ven, P" uniqKey="Van De Ven P">P van de Ven</name>
</author>
<author>
<name sortKey="Nelson, J" uniqKey="Nelson J">J Nelson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Arif, M" uniqKey="Arif M">M Arif</name>
</author>
<author>
<name sortKey="Bilal, M" uniqKey="Bilal M">M Bilal</name>
</author>
<author>
<name sortKey="Kattan, A" uniqKey="Kattan A">A Kattan</name>
</author>
<author>
<name sortKey="Ahamed, Si" uniqKey="Ahamed S">SI Ahamed</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Culhane, Km" uniqKey="Culhane K">KM Culhane</name>
</author>
<author>
<name sortKey="Lyons, Gm" uniqKey="Lyons G">GM Lyons</name>
</author>
<author>
<name sortKey="Hilton, D" uniqKey="Hilton D">D Hilton</name>
</author>
<author>
<name sortKey="Grace, Pa" uniqKey="Grace P">PA Grace</name>
</author>
<author>
<name sortKey="Lyons, D" uniqKey="Lyons D">D Lyons</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lyons, Gm" uniqKey="Lyons G">GM Lyons</name>
</author>
<author>
<name sortKey="Culhane, Km" uniqKey="Culhane K">KM Culhane</name>
</author>
<author>
<name sortKey="Hilton, D" uniqKey="Hilton D">D Hilton</name>
</author>
<author>
<name sortKey="Grace, Pa" uniqKey="Grace P">PA Grace</name>
</author>
<author>
<name sortKey="Lyons, D" uniqKey="Lyons D">D Lyons</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bourke, Ak" uniqKey="Bourke A">AK Bourke</name>
</author>
<author>
<name sortKey="O Rien, Jv" uniqKey="O Rien J">JV O’Brien</name>
</author>
<author>
<name sortKey="Lyons, Gm" uniqKey="Lyons G">GM Lyons</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Dijkstra, B" uniqKey="Dijkstra B">B Dijkstra</name>
</author>
<author>
<name sortKey="Kamsma, Yp" uniqKey="Kamsma Y">YP Kamsma</name>
</author>
<author>
<name sortKey="Zijlstra, W" uniqKey="Zijlstra W">W Zijlstra</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Rahimi, F" uniqKey="Rahimi F">F Rahimi</name>
</author>
<author>
<name sortKey="Duval, C" uniqKey="Duval C">C Duval</name>
</author>
<author>
<name sortKey="Jog, M" uniqKey="Jog M">M Jog</name>
</author>
<author>
<name sortKey="Bee, C" uniqKey="Bee C">C Bee</name>
</author>
<author>
<name sortKey="South, A" uniqKey="South A">A South</name>
</author>
<author>
<name sortKey="Jog, M" uniqKey="Jog M">M Jog</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Moncada Torres, A" uniqKey="Moncada Torres A">A Moncada-Torres</name>
</author>
<author>
<name sortKey="Leuenberger, K" uniqKey="Leuenberger K">K Leuenberger</name>
</author>
<author>
<name sortKey="Gonzenbach, R" uniqKey="Gonzenbach R">R Gonzenbach</name>
</author>
<author>
<name sortKey="Luft, A" uniqKey="Luft A">A Luft</name>
</author>
<author>
<name sortKey="Gassert, R" uniqKey="Gassert R">R Gassert</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Najafi, B" uniqKey="Najafi B">B Najafi</name>
</author>
<author>
<name sortKey="Aminian, K" uniqKey="Aminian K">K Aminian</name>
</author>
<author>
<name sortKey="Loew, F" uniqKey="Loew F">F Loew</name>
</author>
<author>
<name sortKey="Blanc, Y" uniqKey="Blanc Y">Y Blanc</name>
</author>
<author>
<name sortKey="Robert, Pa" uniqKey="Robert P">PA Robert</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Verghese, J" uniqKey="Verghese J">J Verghese</name>
</author>
<author>
<name sortKey="Wang, C" uniqKey="Wang C">C Wang</name>
</author>
<author>
<name sortKey="Lipton, Rb" uniqKey="Lipton R">RB Lipton</name>
</author>
<author>
<name sortKey="Holtzer, R" uniqKey="Holtzer R">R Holtzer</name>
</author>
<author>
<name sortKey="Xue, X" uniqKey="Xue X">X Xue</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cheng, Pt" uniqKey="Cheng P">PT Cheng</name>
</author>
<author>
<name sortKey="Liaw, My" uniqKey="Liaw M">MY Liaw</name>
</author>
<author>
<name sortKey="Wong, Mk" uniqKey="Wong M">MK Wong</name>
</author>
<author>
<name sortKey="Tang, Ft" uniqKey="Tang F">FT Tang</name>
</author>
<author>
<name sortKey="Lee, My" uniqKey="Lee M">MY Lee</name>
</author>
<author>
<name sortKey="Lin, Ps" uniqKey="Lin P">PS Lin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Janssen, W" uniqKey="Janssen W">W Janssen</name>
</author>
<author>
<name sortKey="Bussmann, J" uniqKey="Bussmann J">J Bussmann</name>
</author>
<author>
<name sortKey="Selles, R" uniqKey="Selles R">R Selles</name>
</author>
<author>
<name sortKey="Koudstaal, P" uniqKey="Koudstaal P">P Koudstaal</name>
</author>
<author>
<name sortKey="Ribbers, G" uniqKey="Ribbers G">G Ribbers</name>
</author>
<author>
<name sortKey="Stam, H" uniqKey="Stam H">H Stam</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Brooks, Ris" uniqKey="Brooks R">RIS Brooks</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Roetenberg, D" uniqKey="Roetenberg D">D Roetenberg</name>
</author>
<author>
<name sortKey="Luinge, Hj" uniqKey="Luinge H">HJ Luinge</name>
</author>
<author>
<name sortKey="Baten, Ctm" uniqKey="Baten C">CTM Baten</name>
</author>
<author>
<name sortKey="Veltink, Ph" uniqKey="Veltink P">PH Veltink</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sabatini, Am" uniqKey="Sabatini A">AM Sabatini</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Chang, E" uniqKey="Chang E">E Chang</name>
</author>
<author>
<name sortKey="Zak, S" uniqKey="Zak S">S Zak</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Simon Rogers, Mg" uniqKey="Simon Rogers M">MG Simon Rogers</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cheng, Fy" uniqKey="Cheng F">FY Cheng</name>
</author>
<author>
<name sortKey="Yang, Yr" uniqKey="Yang Y">YR Yang</name>
</author>
<author>
<name sortKey="Wang, Cj" uniqKey="Wang C">CJ Wang</name>
</author>
<author>
<name sortKey="Wu, Yr" uniqKey="Wu Y">YR Wu</name>
</author>
<author>
<name sortKey="Cheng, Sj" uniqKey="Cheng S">SJ Cheng</name>
</author>
<author>
<name sortKey="Wang, Hc" uniqKey="Wang H">HC Wang</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Stack, El" uniqKey="Stack E">EL Stack</name>
</author>
<author>
<name sortKey="Ashburn, Am" uniqKey="Ashburn A">AM Ashburn</name>
</author>
<author>
<name sortKey="Jupp, Ke" uniqKey="Jupp K">KE Jupp</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">J Neuroeng Rehabil</journal-id>
<journal-id journal-id-type="iso-abbrev">J Neuroeng Rehabil</journal-id>
<journal-title-group>
<journal-title>Journal of NeuroEngineering and Rehabilitation</journal-title>
</journal-title-group>
<issn pub-type="epub">1743-0003</issn>
<publisher>
<publisher-name>BioMed Central</publisher-name>
<publisher-loc>London</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">25885438</article-id>
<article-id pub-id-type="pmc">4403848</article-id>
<article-id pub-id-type="publisher-id">26</article-id>
<article-id pub-id-type="doi">10.1186/s12984-015-0026-4</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Research</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Auto detection and segmentation of physical activities during a Timed-Up-and-Go (TUG) task in healthy older adults using multiple inertial sensors</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Nguyen</surname>
<given-names>Hung P</given-names>
</name>
<address>
<email>hpnguyen@utexas.edu</email>
</address>
<xref ref-type="aff" rid="Aff1"></xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Ayachi</surname>
<given-names>Fouaz</given-names>
</name>
<address>
<email>fouazayachi@gmail.com</email>
</address>
<xref ref-type="aff" rid="Aff1"></xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Lavigne-Pelletier</surname>
<given-names>Catherine</given-names>
</name>
<address>
<email>lavigne-pelletier.catherine@courrier.uqam.ca</email>
</address>
<xref ref-type="aff" rid="Aff1"></xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Blamoutier</surname>
<given-names>Margaux</given-names>
</name>
<address>
<email>margaux.blamoutier@gmail.com</email>
</address>
<xref ref-type="aff" rid="Aff2"></xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Rahimi</surname>
<given-names>Fariborz</given-names>
</name>
<address>
<email>fariborz.rahimi@gmail.com</email>
</address>
<xref ref-type="aff" rid="Aff3"></xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Boissy</surname>
<given-names>Patrick</given-names>
</name>
<address>
<email>Patrick.Boissy@USherbrooke.ca</email>
</address>
<xref ref-type="aff" rid="Aff4"></xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Jog</surname>
<given-names>Mandar</given-names>
</name>
<address>
<email>Mandar.Jog@lhsc.on.ca</email>
</address>
<xref ref-type="aff" rid="Aff5"></xref>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Duval</surname>
<given-names>Christian</given-names>
</name>
<address>
<email>duval.christian@uqam.ca</email>
</address>
<xref ref-type="aff" rid="Aff1"></xref>
</contrib>
<aff id="Aff1">
<label></label>
Département de Kinanthropologie, Université du Québec à Montréal, C.P. 8888, succursale Centre-Ville, Montréal, H3C 3P8 Québec Canada</aff>
<aff id="Aff2">
<label></label>
Faculté des Sciences, Université du Québec à Montréal, Montreal, Canada</aff>
<aff id="Aff3">
<label></label>
Electrical Engineering Department, University of Bonab, Bonab, East Azerbaijan Iran</aff>
<aff id="Aff4">
<label></label>
Department of Surgery, Orthopaedic Division, Faculty of Medicine and Health Sciences, Université de Sherbrooke, Quebec, Canada</aff>
<aff id="Aff5">
<label></label>
London Health Sciences Center University Hospital, Ontario, Canada</aff>
</contrib-group>
<pub-date pub-type="epub">
<day>11</day>
<month>4</month>
<year>2015</year>
</pub-date>
<pub-date pub-type="pmc-release">
<day>11</day>
<month>4</month>
<year>2015</year>
</pub-date>
<pub-date pub-type="collection">
<year>2015</year>
</pub-date>
<volume>12</volume>
<elocation-id>36</elocation-id>
<history>
<date date-type="received">
<day>2</day>
<month>12</month>
<year>2014</year>
</date>
<date date-type="accepted">
<day>20</day>
<month>3</month>
<year>2015</year>
</date>
</history>
<permissions>
<copyright-statement>© Nguyen et al.; licensee BioMed Central. 2015</copyright-statement>
<license license-type="open-access">
<license-p>This is an Open Access article distributed under the terms of the Creative Commons Attribution License (
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by/4.0">http://creativecommons.org/licenses/by/4.0</ext-link>
), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. The Creative Commons Public Domain Dedication waiver (
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/publicdomain/zero/1.0/">http://creativecommons.org/publicdomain/zero/1.0/</ext-link>
) applies to the data made available in this article, unless otherwise stated.</license-p>
</license>
</permissions>
<abstract id="Abs1">
<sec>
<title>Background</title>
<p>Recently, much attention has been given to the use of inertial sensors for remote monitoring of individuals with limited mobility. However, the focus has been mostly on the detection of symptoms, not specific activities. The objective of the present study was to develop an automated recognition and segmentation algorithm based on inertial sensor data to identify common gross motor patterns during activities of daily living.</p>
</sec>
<sec>
<title>Method</title>
<p>A modified Timed-Up-and-Go (TUG) task was used since it comprises four common daily living activities:
<italic>Standing</italic>
,
<italic>Walking</italic>
,
<italic>Turning</italic>
, and
<italic>Sitting</italic>
, all performed in a continuous fashion resulting in six different segments during the task. Sixteen healthy older adults performed two trials of a 5 and 10 meter TUG task. They were outfitted with 17 inertial motion sensors covering each body segment. Data from the 10 meter TUG were used to identify pertinent sensors on the trunk, head, hip, knee, and thigh that provided suitable data for detecting and segmenting activities associated with the TUG. Raw data from sensors were detrended to remove sensor drift, normalized, and band pass filtered with optimal frequencies to reveal kinematic peaks that corresponded to different activities. Segmentation was accomplished by identifying the time stamps of the first minimum or maximum to the right and the left of these peaks. Segmentation time stamps were compared to results from two examiners visually segmenting the activities of the TUG.</p>
</sec>
<sec>
<title>Results</title>
<p>We were able to detect these activities in a TUG with 100% sensitivity and specificity (n = 192) during the 10 meter TUG. The rate of success was subsequently confirmed in the 5 meter TUG (n = 192) without altering the parameters of the algorithm. When applying the segmentation algorithms to the 10 meter TUG, we were able to parse 100% of the transition points (n = 224) between different segments, and these were as reliable as, and less variable than, visual segmentation performed by two independent examiners.</p>
</sec>
<sec>
<title>Conclusions</title>
<p>The present study lays the foundation for the development of a comprehensive algorithm to detect and segment naturalistic activities using inertial sensors, in the hope of automatically evaluating motor performance within the detected tasks.</p>
</sec>
</abstract>
<kwd-group xml:lang="en">
<title>Keywords</title>
<kwd>Walk</kwd>
<kwd>Turn</kwd>
<kwd>Auto</kwd>
<kwd>Elderly</kwd>
<kwd>Activities of daily living</kwd>
<kwd>Optimization</kwd>
<kwd>Sit</kwd>
<kwd>Stand</kwd>
<kwd>Segment</kwd>
</kwd-group>
<custom-meta-group>
<custom-meta>
<meta-name>issue-copyright-statement</meta-name>
<meta-value>© The Author(s) 2015</meta-value>
</custom-meta>
</custom-meta-group>
</article-meta>
</front>
<body>
<sec id="Sec1">
<title>Background</title>
<p>With an increasingly aging population, promoting and maintaining a healthy mental and physical lifestyle is crucial for the quality of life of older adults. People suffering from degenerative motor diseases often experience limited mobility, which can lead to physical and mental deterioration that further compounds the effects of aging [
<xref ref-type="bibr" rid="CR1">1</xref>
,
<xref ref-type="bibr" rid="CR2">2</xref>
]. Loss of mobility manifests itself in activities of daily living (ADLs) through altered gait and an increased risk of falling [
<xref ref-type="bibr" rid="CR3">3</xref>
]. Since these limitations are felt during daily life activities, there is a need for a more systematic method of monitoring and evaluating the loss of mobility in order to improve the quality of life of older adults and people suffering from degenerative motor diseases.</p>
<p>Recently, inertial sensors have been used to detect human physical activities such as walking [
<xref ref-type="bibr" rid="CR4">4</xref>
,
<xref ref-type="bibr" rid="CR5">5</xref>
], lying [
<xref ref-type="bibr" rid="CR6">6</xref>
,
<xref ref-type="bibr" rid="CR7">7</xref>
] and falling in the elderly population [
<xref ref-type="bibr" rid="CR8">8</xref>
], as well as in people with Parkinson’s disease [
<xref ref-type="bibr" rid="CR9">9</xref>
-
<xref ref-type="bibr" rid="CR11">11</xref>
]. The emphasis has been on the detection of activity to evaluate mobility both in clinical settings and in the home [
<xref ref-type="bibr" rid="CR6">6</xref>
]. Sensors such as accelerometers have been widely adopted to detect physical activities due to their availability, compact size and low power consumption [
<xref ref-type="bibr" rid="CR4">4</xref>
]. These sensors have been used to detect walking, sitting, and standing during the course of daily living [
<xref ref-type="bibr" rid="CR7">7</xref>
,
<xref ref-type="bibr" rid="CR12">12</xref>
], allowing measurement of performance parameters such as stride speed and stride length. A system of inertial and barometric sensors placed at different anatomical locations has also been used to detect activities such as drinking and writing [
<xref ref-type="bibr" rid="CR12">12</xref>
]. In addition to activity detection, postural transitions, especially during
<italic>sit-to-stand</italic>
and
<italic>stand-to-sit</italic>
have been detected with high accuracy using a single chest mounted gyroscope [
<xref ref-type="bibr" rid="CR13">13</xref>
] and tri-axial accelerometer [
<xref ref-type="bibr" rid="CR4">4</xref>
]. However, the scope of these postural transition detections has been limited to static transitions, and the range of activities that can be detected is constrained by the amount of sensory information available.</p>
<p>These sensors have the potential to provide continuous mobility monitoring in the home environment, and are therefore more practical to deploy than laboratory-based optical motion capture systems. The ultimate goal is to provide information that could be used to identify performance parameters to monitor disease or rehabilitation progress [
<xref ref-type="bibr" rid="CR14">14</xref>
-
<xref ref-type="bibr" rid="CR16">16</xref>
]. However, in order to remotely monitor performance, one must be able to segment, i.e., identify the subsets of movement within an individual task. Automatically segmenting or isolating activities could then provide time stamps within which mobility parameters can be analyzed.</p>
<p>The objective of the present study was to develop and test an automated recognition and segmentation algorithm based on inertial sensor data to identify gross motor activity patterns in daily living tasks during a continuous trial. We used a modified Timed-Up-and-Go (TUG) task as a model of simple activities that included four common activities:
<italic>Standing</italic>
,
<italic>Walking</italic>
,
<italic>Turning</italic>
, and
<italic>Sitting</italic>
performed in a continuous fashion.</p>
</sec>
<sec id="Sec2" sec-type="methods">
<title>Methods</title>
<sec id="Sec3">
<title>Participants</title>
<p>Sixteen healthy, community-dwelling older adults (9 females: 68.7 ± 9.3 years old, height = 1.6 ± 0.1 m, weight = 62.8 ± 8.4 kg, BMI = 25.4 ± 3.5 kg/m
<sup>2</sup>
; 7 males: 67.3 ± 5.8 years old, height = 1.7 ± 0.1 m, weight = 67.8 ± 9.5 kg, BMI = 23.4 ± 3.1 kg/m
<sup>2</sup>
) were recruited through the Centre de Recherche de l’Institut Universitaire de Gériatrie de Montreal (CRIUGM). Participants were screened for comorbidities and cognitive deficits. None of the participants exhibited any physical limitations or pain that could affect their ability to perform the task. The institutional research ethics review board of the CRIUGM approved this research and each participant read and signed an informed consent form.</p>
</sec>
<sec id="Sec4">
<title>Experiment protocol</title>
<p>In this study, participants performed two TUG tasks in random order, one with a length of 10 meters and the other of 5 meters. Participants performed two trials of each TUG task. The algorithm was based on the 10 meter TUG because it provided more walking strides as well as a more gradual transition between
<italic>Walking</italic>
and
<italic>Turning</italic>
. The 5 meter TUG was used to evaluate the extensibility of the algorithm to a shorter distance TUG task. The TUG was used simply because it contains key activities (
<italic>Standing</italic>
,
<italic>Walking</italic>
,
<italic>Turning</italic>
and
<italic>Sitting</italic>
) that are performed in a continuous fashion. Data recording started with participants in a standing position to align the sensors with the motion capture system; they then sat down in a plastic armchair to perform the TUG task. Participants then stood up from the sitting position with their arms on the chair, walked to a distance marker on the floor, turned around, walked back to the chair, turned around again, and finally sat down (Figure 
<xref rid="Fig1" ref-type="fig">1</xref>
A). Participants were asked to perform these tasks at their own pace and no instructions were given on how to stand, sit, walk, or turn.
<fig id="Fig1">
<label>Figure 1</label>
<caption>
<p>Schematic of the TUG task and the inertial sensor motion capture system.
<bold>A)</bold>
Spatial schematic of a TUG path and different transition points. Seven transitions were identified among the activities performed during a TUG. These transitions are: 1)
<italic>sit-to-stand</italic>
2)
<italic>stand-to-walk-out</italic>
3)
<italic>walk-out-to-turn</italic>
4)
<italic>turn-to-walk-in</italic>
5)
<italic>walk-in-to-turn</italic>
6)
<italic>turn-to-sit</italic>
7)
<italic>stand-to-sit</italic>
.
<bold>B)</bold>
Diagram of the 17 sensors and their location on the Animazoo suit.
<bold>C)</bold>
A close-up view of the sensors on the shoulders, trunk and hip.
<bold>D)</bold>
The orientation of the axes on the sensor. Using the right-hand Cartesian coordinate system, the y-axis is aligned along the length of the inertial sensor while the x-axis is aligned along the width of the sensor.
<bold>E)</bold>
Global workflow of the algorithm to detect the activities and transitions between activities using an inertial sensor motion capture system.</p>
</caption>
<graphic xlink:href="12984_2015_26_Fig1_HTML" id="MO1"></graphic>
</fig>
</p>
<p>Participants performed these TUG tests while wearing the Animazoo IGS-180 motion capture suit (Synertial UK Ltd, Brighton, UK). The IGS-180 (Figure 
<xref rid="Fig1" ref-type="fig">1</xref>
B, C) is equipped with 17 inertial sensing modules (OS3D, Inertial Lab, VA, USA) positioned on each body segment in order to capture full-body 3D movement. Each sensor module comprises 3-axis sensing of linear acceleration (accelerometer), angular velocity (gyroscope) and magnetic north heading (magnetometer). Raw data (acceleration, angular velocity) and fused data (3D orientation in degrees, estimated from a fusion algorithm [
<xref ref-type="bibr" rid="CR17">17</xref>
-
<xref ref-type="bibr" rid="CR20">20</xref>
] developed by Animazoo) from each sensor were acquired at 60 Hz. Since there was no
<italic>a priori</italic>
expectation as to which sensors were suitable markers for detection and segmentation, all 17 inertial sensors were active during the recording.</p>
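To make the data layout concrete, here is a minimal Python sketch (not part of the original study; names and container types are illustrative assumptions) of one way such a recording could be organized: 17 sensor tracks, each holding raw 3-axis acceleration and angular velocity plus the fused orientation quaternion, sampled at 60 Hz.

```python
# A minimal, illustrative layout for a recording from the 17-sensor suit described
# above: raw 3-axis acceleration and angular velocity plus the fused orientation
# quaternion, all sampled at 60 Hz. Sensor names are assumed labels.
from dataclasses import dataclass
import numpy as np

FS_HZ = 60          # sampling rate reported in the text
N_SENSORS = 17      # one inertial module per body segment

@dataclass
class SensorTrack:
    name: str            # e.g. "trunk", "hip", "head" (labels assumed here)
    acc: np.ndarray      # shape (n_samples, 3), linear acceleration
    gyro: np.ndarray     # shape (n_samples, 3), angular velocity
    quat: np.ndarray     # shape (n_samples, 4), fused orientation [w, x, y, z]

def make_empty_recording(n_samples: int, names: list[str]) -> dict[str, SensorTrack]:
    """Allocate an empty recording keyed by sensor name."""
    return {
        n: SensorTrack(n,
                       np.zeros((n_samples, 3)),
                       np.zeros((n_samples, 3)),
                       np.zeros((n_samples, 4)))
        for n in names
    }
```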
</sec>
<sec id="Sec5">
<title>Sensor location</title>
<p>The head sensor was attached to a cap worn by the participants, which positioned it on the right side of the head. The trunk sensor was located on the midline over T1, while the hip sensor was positioned at the level of L5. For upper extremities, shoulder sensors were positioned over the scapula; upper arm sensors were positioned between the shoulder and elbow while forearm sensors were positioned between the elbow and wrist joint. Hand sensors were attached to an open-finger glove, and positioned on the dorsum surface of the hands. In the lower extremity, thigh sensors were positioned on the outer side of the limb segment between the hip and knee. Shin sensors were positioned between the knee and ankle. Foot sensors were attached to the dorsum of the shoes worn by the participants.</p>
</sec>
<sec id="Sec6">
<title>Signal conditioning</title>
<p>The signals from the inertial sensors were detrended to remove sensor drift and normalized against the absolute maximum amplitude of each signal (unitless) to ensure uniformity in the analysis across all participants. Ideal band pass filters in the frequency domain were applied. An ideal frequency-selective filter is a system that passes a pre-specified range of frequency components without any attenuation, but completely rejects the remaining frequency components. The transfer function of the ideal band pass filter is defined as follows:
<disp-formula id="Equ1">
<label>1</label>
<alternatives>
<tex-math id="M1">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ {H}_{BP}(jw)=\left\{\begin{array}{cc}\hfill 1\hfill & \hfill {w}_1\le w\le {w}_2\hfill \\ {}\hfill 0\hfill & \hfill elsewhere\hfill \end{array}\right. $$\end{document}</tex-math>
<mml:math id="M2">
<mml:msub>
<mml:mi>H</mml:mi>
<mml:mrow>
<mml:mi>B</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mfenced close=")" open="(">
<mml:mrow>
<mml:mi>j</mml:mi>
<mml:mi>w</mml:mi>
</mml:mrow>
</mml:mfenced>
<mml:mo>=</mml:mo>
<mml:mfenced close="" open="{">
<mml:mtable columnalign="center">
<mml:mtr columnalign="center">
<mml:mtd columnalign="center">
<mml:mn>1</mml:mn>
</mml:mtd>
<mml:mtd columnalign="center">
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mn>1</mml:mn>
</mml:msub>
<mml:mo></mml:mo>
<mml:mi>w</mml:mi>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi>w</mml:mi>
<mml:mn>2</mml:mn>
</mml:msub>
</mml:mtd>
</mml:mtr>
<mml:mtr columnalign="center">
<mml:mtd columnalign="center">
<mml:mn>0</mml:mn>
</mml:mtd>
<mml:mtd columnalign="center">
<mml:mi mathvariant="italic">elsewhere</mml:mi>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mfenced>
</mml:math>
<graphic xlink:href="12984_2015_26_Article_Equ1.gif" position="anchor"></graphic>
</alternatives>
</disp-formula>
</p>
<p>Where
<italic>w</italic>
<sub>
<italic>1</italic>
</sub>
and
<italic>w</italic>
<sub>
<italic>2</italic>
</sub>
are referred to as the low and high cutoff frequencies, respectively. A band pass filter was chosen and constructed as a generalized filter for the different sensors in the motion capture system. The band pass filter has a finite bandwidth, as it only allows a range of frequencies (
<italic>w</italic>
<sub>
<italic>1</italic>
</sub>
 ≤ 
<italic>w</italic>
 ≤ 
<italic>w</italic>
<sub>
<italic>2</italic>
</sub>
) to be passed through the filter. The dominant frequencies in these inertial sensors during a TUG (sampled at 60 Hz) were less than 10 Hz. The low cutoff frequency was set at
<italic>w</italic>
<sub>
<italic>1</italic>
</sub>
 = 0.0025 Hz to capture all the low frequency dynamics and to condition the data in the frequency domain by removing the fundamental frequency and centralizing the data. The high cut frequency was optimized for each sensor (
<italic>w</italic>
<sub>
<italic>2</italic>
</sub>
 < 10 Hz) with an exhaustive search optimization method using the time stamps from the inertial sensors and the visual segmentation (see below). However, the cutoff frequency of the hip angular velocity used for the detection of
<italic>Walking</italic>
was set at the Nyquist frequency (30 Hz) to capture the stride information during walking. At a lower cutoff frequency, the stride features during
<italic>Walking</italic>
would not be detectable.</p>
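As an illustration of the conditioning pipeline just described, the following Python sketch (one possible implementation under stated assumptions, not the authors' code) detrends a signal, normalizes it to its absolute maximum, and applies an ideal band pass filter by zeroing FFT bins outside [w1, w2]; the default cutoffs follow the values given in the text, while w2 would be optimized per sensor.

```python
# Illustrative signal conditioning: detrend, normalize to the absolute maximum,
# then apply an ideal (brick-wall) band pass filter in the frequency domain.
import numpy as np
from scipy.signal import detrend

def ideal_bandpass(x: np.ndarray, fs: float, w1: float, w2: float) -> np.ndarray:
    """Ideal band pass filter: H = 1 for w1 <= f <= w2, 0 elsewhere."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    X[(freqs < w1) | (freqs > w2)] = 0.0          # zero bins outside the pass band
    return np.fft.irfft(X, n=x.size)

def condition_signal(raw: np.ndarray, fs: float = 60.0,
                     w1: float = 0.0025, w2: float = 10.0) -> np.ndarray:
    x = detrend(raw)                               # remove sensor drift
    x = x / np.max(np.abs(x))                      # unitless normalization
    return ideal_bandpass(x, fs, w1, w2)           # w2 is sensor-specific in the paper
```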
</sec>
<sec id="Sec7">
<title>Sensor selection</title>
<sec id="Sec8">
<title>Activity detection</title>
<p>The sensors selected for activity detection were chosen based on how their signals corresponded to the biomechanics of movement during the performance of these activities.
<italic>Standing</italic>
, which denotes when participants stand up from the chair, was detected using the acceleration of the trunk (a
<sub>z, Trunk</sub>
).
<italic>Sitting</italic>
, which denotes when participants sit down on the chair, was also detected using the same sensor data. Sensors on the trunk or chest have been used to identify
<italic>Standing</italic>
and
<italic>Sitting</italic>
during physical activities [
<xref ref-type="bibr" rid="CR4">4</xref>
]. However, in this study, the time derivative of the acceleration
<inline-formula id="IEq1">
<alternatives>
<tex-math id="M3">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ \left({\dot{a}}_{y, Hip}\right) $$\end{document}</tex-math>
<mml:math id="M4">
<mml:mfenced close=")" open="(">
<mml:msub>
<mml:mover accent="true">
<mml:mi>a</mml:mi>
<mml:mo stretchy="true">˙</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi mathvariant="italic">Hip</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mfenced>
</mml:math>
<inline-graphic xlink:href="12984_2015_26_Article_IEq1.gif"></inline-graphic>
</alternatives>
</inline-formula>
of the thigh was also used to differentiate between
<italic>Standing</italic>
and
<italic>Sitting</italic>
. During
<italic>Standing</italic>
,
<inline-formula id="IEq2">
<alternatives>
<tex-math id="M5">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ {\dot{a}}_{y, Hip}\kern0.5em >0 $$\end{document}</tex-math>
<mml:math id="M6">
<mml:msub>
<mml:mover accent="true">
<mml:mi>a</mml:mi>
<mml:mo stretchy="true">˙</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi mathvariant="italic">Hip</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mspace width="0.5em"></mml:mspace>
<mml:mo>></mml:mo>
<mml:mn>0</mml:mn>
</mml:math>
<inline-graphic xlink:href="12984_2015_26_Article_IEq2.gif"></inline-graphic>
</alternatives>
</inline-formula>
and during
<italic>Sitting</italic>
,
<inline-formula id="IEq3">
<alternatives>
<tex-math id="M7">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ {\dot{a}}_{y, Hip}\kern0.5em < 0 $$\end{document}</tex-math>
<mml:math id="M8">
<mml:msub>
<mml:mover accent="true">
<mml:mi>a</mml:mi>
<mml:mo stretchy="true">˙</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi mathvariant="italic">Hip</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mspace width="0.5em"></mml:mspace>
<mml:mo><</mml:mo>
<mml:mspace width="0.25em"></mml:mspace>
<mml:mn>0</mml:mn>
</mml:math>
<inline-graphic xlink:href="12984_2015_26_Article_IEq3.gif"></inline-graphic>
</alternatives>
</inline-formula>
. The angular velocity (ω
<sub>y, Trunk</sub>
) of the trunk was used to detect
<italic>Turning</italic>
. The angular velocity of the head (ω
<sub>y, Head</sub>
) was also used to verify that
<italic>Turning</italic>
has occurred and the direction of
<italic>Turning</italic>. <italic>Walking</italic>
was detected by using a 500-millisecond window to detect the oscillation in the angular velocity (ω
<sub>x, Hip</sub>
) of the hip.
<italic>Walking</italic>
was also detected during
<italic>Turning</italic>
; however priority was given to classify this as
<italic>Turning</italic>
. The detections of
<italic>Standing</italic>
,
<italic>Turning</italic>
,
<italic>Sitting</italic>
, and
<italic>Walking</italic>
are shown in Figure 
<xref rid="Fig2" ref-type="fig">2</xref>
. The activities were detected by finding the maximal or minimal peaks of the selected sensor signals that corresponded to different activities. The square signals were generated by setting the threshold at 30% of peak amplitude to provide a visual indication that an event was detected. The algorithm and sensors used to detect the activities during a TUG are shown in Figure 
<xref rid="Fig3" ref-type="fig">3</xref>
.
<fig id="Fig2">
<label>Figure 2</label>
<caption>
<p>Activity detection during a TUG. Detection of different activities during a TUG for one participant. Raw signals were detrended and normalized for uniformity across all participants. The detection algorithm relied on detecting the large kinematic peaks in the inertial sensor signals (T
<sub>Max</sub>
and T
<sub>Min</sub>
) that corresponded to different activities. The square signals were generated at 30% of the peaks to visually indicate that activities were detected during the TUG.
<bold>A)</bold>
<italic>Standing,</italic>
when participants stand up from the chair, was detected using the trunk a
<sub>z</sub>
and the time derivative of the hip (
<inline-formula id="IEq4">
<alternatives>
<tex-math id="M9">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ {\dot{a}}_y\kern0.5em >0 $$\end{document}</tex-math>
<mml:math id="M10">
<mml:msub>
<mml:mover accent="true">
<mml:mi>a</mml:mi>
<mml:mo stretchy="true">˙</mml:mo>
</mml:mover>
<mml:mi>y</mml:mi>
</mml:msub>
<mml:mspace width="0.5em"></mml:mspace>
<mml:mo>></mml:mo>
<mml:mn>0</mml:mn>
</mml:math>
<inline-graphic xlink:href="12984_2015_26_Article_IEq4.gif"></inline-graphic>
</alternatives>
</inline-formula>
).
<bold>B)</bold>
<italic>Turning</italic>
was identified using the trunk angular velocity (ω
<sub>y</sub>
)
<bold>C)</bold>
<italic>Sitting</italic>
, when participants sit down on the chair, was detected using the trunk a
<sub>z</sub>
and the time derivative of the hip (
<inline-formula id="IEq5">
<alternatives>
<tex-math id="M11">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ {\dot{a}}_y\kern0.5em >0 $$\end{document}</tex-math>
<mml:math id="M12">
<mml:msub>
<mml:mover accent="true">
<mml:mi>a</mml:mi>
<mml:mo stretchy="true">˙</mml:mo>
</mml:mover>
<mml:mi>y</mml:mi>
</mml:msub>
<mml:mspace width="0.5em"></mml:mspace>
<mml:mo>></mml:mo>
<mml:mn>0</mml:mn>
</mml:math>
<inline-graphic xlink:href="12984_2015_26_Article_IEq5.gif"></inline-graphic>
</alternatives>
</inline-formula>
).
<bold>D)</bold>
<italic>Walking</italic>
was identified by using a 500 ms window to detect the oscillation in the angular velocity of the hip (ω
<sub>y</sub>
).</p>
</caption>
<graphic xlink:href="12984_2015_26_Fig2_HTML" id="MO2"></graphic>
</fig>
<fig id="Fig3">
<label>Figure 3</label>
<caption>
<p>Flow chart of the detection algorithm used to identify the scripted activities during a TUG. These sensor signals were normalized and detrended for uniformity across all participants. The high cutoff frequencies of the band pass filter were determined by minimizing the difference between the transition times obtained from the inertial sensors and those obtained by visual inspection. T
<sub>Max</sub>
or T
<sub>Min</sub>
denotes the large peaks that correspond to different activities while t
<sub>min</sub>
or t
<sub>max</sub>
represents the first minimum or maximum to the left or right of T
<sub>Max</sub>
or T
<sub>Min</sub>
.</p>
</caption>
<graphic xlink:href="12984_2015_26_Fig3_HTML" id="MO3"></graphic>
</fig>
</p>
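A rough Python sketch of this peak-based detection scheme is given below; it is an illustrative reconstruction under stated assumptions, not the published implementation. In particular, the oscillation threshold used to flag Walking within the 500 ms window is assumed, since the text does not specify one.

```python
# Illustrative peak-based activity detection: find the large kinematic peaks in a
# conditioned sensor signal, emit a square indicator at 30% of the peak amplitude,
# and flag Walking when the hip angular velocity oscillates within a 500 ms window.
import numpy as np
from scipy.signal import find_peaks

FS_HZ = 60

def detect_events(signal: np.ndarray, prominence: float = 0.3):
    """Return indices of the dominant positive and negative peaks (T_Max / T_Min)."""
    pos, _ = find_peaks(signal, prominence=prominence)
    neg, _ = find_peaks(-signal, prominence=prominence)
    return pos, neg

def square_indicator(signal: np.ndarray, peak_idx: np.ndarray) -> np.ndarray:
    """Square signal equal to 1 wherever |signal| exceeds 30% of its peak amplitude."""
    if peak_idx.size == 0:
        return np.zeros_like(signal)
    threshold = 0.3 * np.max(np.abs(signal[peak_idx]))
    return (np.abs(signal) >= threshold).astype(float)

def detect_walking(hip_gyro: np.ndarray, window_ms: float = 500.0,
                   min_std: float = 0.05) -> np.ndarray:
    """Mark samples as Walking when the hip angular velocity oscillates within a
    sliding 500 ms window (min_std is an assumed oscillation threshold)."""
    win = int(window_ms / 1000.0 * FS_HZ)
    walking = np.zeros_like(hip_gyro, dtype=bool)
    for i in range(hip_gyro.size - win):
        seg = hip_gyro[i:i + win]
        # spread plus zero crossings serve as a simple oscillation proxy
        if np.std(seg) > min_std and np.count_nonzero(np.diff(np.sign(seg))) >= 2:
            walking[i:i + win] = True
    return walking
```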
</sec>
<sec id="Sec9">
<title>Segmentation</title>
<p>Four common daily living activities were featured during a TUG task. These activities are:
<italic>Standing</italic>
,
<italic>Walking</italic>
,
<italic>Turning</italic>
, and
<italic>Sitting</italic>
. The sequence of these activities generated six different
<italic>segments</italic>
during a TUG task. These segments were:
<italic>Stand up</italic>
,
<italic>Walk-out</italic>
,
<italic>Turn 180</italic>
,
<italic>Walk-in</italic>
,
<italic>Turn 180</italic>
and
<italic>Sit down</italic>
. The transition point was defined as the separation point between two consecutive segments. The transition points between these segments were identified by detecting the time stamp of the first minimum or maximum to the left and right of the segment peak, which marked the beginning and ending of each segment in the TUG. The seven transitions identified during a TUG are:
<italic>sit-to-stand</italic>
,
<italic>stand-to-walk-out</italic>
,
<italic>walk-out-to-turn</italic>
,
<italic>turn-to-walk-in</italic>
,
<italic>walk-in-to-turn</italic>
,
<italic>turn-to-stand</italic>
and
<italic>stand-to-sit</italic>
.</p>
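For reference, the activity, segment and transition labels enumerated above can be collected in a simple structure; the listing below is purely illustrative and only restates the labels from the text.

```python
# Labels taken from the text above; the container structure itself is assumed.
ACTIVITIES = ["Standing", "Walking", "Turning", "Sitting"]

SEGMENTS = ["Stand up", "Walk-out", "Turn 180", "Walk-in", "Turn 180", "Sit down"]

TRANSITIONS = [
    "sit-to-stand",
    "stand-to-walk-out",
    "walk-out-to-turn",
    "turn-to-walk-in",
    "walk-in-to-turn",
    "turn-to-stand",
    "stand-to-sit",
]
```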
<p>The kinematic patterns of the joints and limbs during the performance of these activities were used to identify a set of sensors that marked the transition point for each segment. For example, the patterns of the trunk angular velocity for all participants during
<italic>walk-in-to-turn</italic>
are shown in Figure 
<xref rid="Fig4" ref-type="fig">4</xref>
A. While there was variability between participants in the duration and amplitude of these signals, there was a similar pattern that indicated the beginning and ending of
<italic>Turning.</italic>
While the maximal peak in trunk angular velocity (ω
<sub>y, Trunk</sub>
) was used to detect
<italic>Turning</italic>
, the time stamps of the first minimum to the left and right (t
<sub>min</sub>
) of these peaks were used to approximate the transition between
<italic>Walking</italic>
and
<italic>Turning</italic>
(Figure 
<xref rid="Fig4" ref-type="fig">4</xref>
B). Similar patterns were also exhibited in the hip and head sensors. However, these sensors were not always in phase with each other; some might have lagged while others led. Therefore, an average of the sensor information from the head, trunk and hip was used as a surrogate approximation of the
<italic>walk-to-turn</italic>
and
<italic>turn-to-walk</italic>
transitions. The transition times for a few selected sensors were compared individually and collectively (using the mean) with the visual segmentation times, and the sensor combinations that yielded the smallest differences across all participants were used to estimate the transitions between these activities. The selected sensors and the algorithm used to detect these transitions are presented in Figure 
<xref rid="Fig5" ref-type="fig">5</xref>
.
<fig id="Fig4">
<label>Figure 4</label>
<caption>
<p>Temporal schematic of segment transitions during a TUG and kinematic patterns during turning transitions.
<bold>A)</bold>
Selected inertial sensors are identified on the Y-axis (trunk ω
<sub>y</sub>
) of the graphs and the kinematics pattern during a
<italic>walk-in-to-turn</italic>
transition for all participants (n = 16). These patterns showed a consistent kinematic behavior of this sensor during
<italic>Turning</italic>
; therefore, it was used to identify
<italic>Turning</italic>
as well as the transition to the activities before and after
<italic>Turning</italic>
.
<bold>B)</bold>
The raw and filtered signals of the trunk ω
<sub>y</sub>
with two different maximum peaks that indicated two different turns during the TUG task. The time stamps of the first minimum peaks to the left and right of these peaks were used to approximate the transition points before and after
<italic>Turning</italic>
.</p>
</caption>
<graphic xlink:href="12984_2015_26_Fig4_HTML" id="MO4"></graphic>
</fig>
<fig id="Fig5">
<label>Figure 5</label>
<caption>
<p>The sensors and algorithm used to segment different transition points during a TUG. These sensors were selected by optimizing the agreement between the transition time from each sensor and the visually segmented time.
<italic>Sit-to-stand</italic>
transition was detected using mean time from the acceleration of the trunk (a
<sub>z</sub>
) and the knee angle (
<italic>θ</italic>
) while
<italic>stand-to-sit</italic>
and
<italic>stand-to-walk-out</italic>
were estimated using only the trunk a
<sub>z</sub>
. Transitions before and after
<italic>Turning</italic>
were detected using the mean time estimated using the ω
<sub>y</sub>
of the hip and the trunk. However, during the
<italic>turn-to-walk-in</italic>
transition, only the ω
<sub>y</sub>
of the hip was used since it yielded the best approximation as compared to the visual segmentation.</p>
</caption>
<graphic xlink:href="12984_2015_26_Fig5_HTML" id="MO5"></graphic>
</fig>
</p>
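The transition rule described above (first local minimum to the left and right of the activity peak, with time stamps averaged over several sensors such as the head, trunk and hip when they are combined) can be sketched as follows; this is an illustrative reconstruction, not the authors' code.

```python
# Illustrative segmentation rule: given the large peak marking an activity (e.g.
# Turning in the trunk angular velocity), take the first local minimum to the left
# and to the right of that peak as the start and end transition time stamps, and
# average the time stamps obtained from the selected sensors.
import numpy as np

def first_minimum_left_right(signal: np.ndarray, peak_idx: int) -> tuple[int, int]:
    """Indices of the first local minima to the left and right of peak_idx."""
    left = peak_idx
    while left > 0 and signal[left - 1] < signal[left]:
        left -= 1
    right = peak_idx
    while right < signal.size - 1 and signal[right + 1] < signal[right]:
        right += 1
    return left, right

def transition_times(signals: list[np.ndarray], peak_indices: list[int],
                     fs: float = 60.0) -> tuple[float, float]:
    """Average start/end time stamps (in seconds) over several sensor signals."""
    starts, ends = [], []
    for sig, pk in zip(signals, peak_indices):
        l, r = first_minimum_left_right(sig, pk)
        starts.append(l / fs)
        ends.append(r / fs)
    return float(np.mean(starts)), float(np.mean(ends))
```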
</sec>
</sec>
<sec id="Sec10">
<title>Visual segmentation</title>
<p>In addition to the development of an algorithm for the automatic detection and segmentation of activities performed during a TUG, we were also interested in determining whether the transition points detected by the segmentation algorithm were at least as accurate as visual segmentation done by simply looking at the avatar. To do so, two examiners were asked to independently segment these activities during the 10 meter TUG using the avatar generated by the software during the performance of the task, to provide validation for the automated algorithm. These two examiners were provided with general transition segmentation guidelines and were instructed to mark the time stamp at which the participants began to transition into different activities during the TUG. The variability of the marked times between the two examiners for all participants (n = 16) and all transitions (n = 224) is shown in Figure 
<xref rid="Fig6" ref-type="fig">6</xref>
. The variability between the two examiners was then used to evaluate the performance of the algorithm in estimating the transition points using the information from the sensors for all participants. The examiners were most variable when marking the transition time during
<italic>stand-to-walk-out</italic>
while they were less variable during the
<italic>sit-to-stand</italic>
transition. The
<italic>sit-to-stand</italic>
was easier to identify because it started from a static sitting position while other transitions were dynamically blended from one movement to the next. This was also evident during
<italic>stand-to-sit</italic>
transition. During dynamic transitions between different activities within a TUG, it was more difficult for the two examiners to agree on exactly when the activities started and ended because of how differently participants performed the TUG. Furthermore, these variations between the examiners were also evident within individual participants: the examiners were more variable for some participants than for others when segmenting the same transition. This indicates that participants did not perform the task in exactly the same way, which might affect the examiners' judgment as to when a transition started and ended and might thus contribute to the variability between the examiners. Since participants were performing the TUG at their own pace, the time stamp
<inline-formula id="IEq6">
<alternatives>
<tex-math id="M13">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ \left(\overline{T}\right) $$\end{document}</tex-math>
<mml:math id="M14">
<mml:mfenced close=")" open="(">
<mml:mover accent="true">
<mml:mi>T</mml:mi>
<mml:mo stretchy="true">¯</mml:mo>
</mml:mover>
</mml:mfenced>
</mml:math>
<inline-graphic xlink:href="12984_2015_26_Article_IEq6.gif"></inline-graphic>
</alternatives>
</inline-formula>
was shifted to zero to normalize the marked times across all participants. It took approximately 7 minutes to visually segment one 30-second trial.
<fig id="Fig6">
<label>Figure 6</label>
<caption>
<p>Variance of the visual segmentation of different transition points during a 10 m TUG between two examiners. The variance (mean ± std) from the visual segmentation of the different activities in a TUG for all participants (n = 16) at all transitions by two independent examiners. The segmentation was marked using the avatars provided by Animazoo. Participants performed two trials of the 10 meter TUG.</p>
</caption>
<graphic xlink:href="12984_2015_26_Fig6_HTML" id="MO6"></graphic>
</fig>
</p>
</sec>
<sec id="Sec11">
<title>Range of Motion (ROM) calculation</title>
<p>In addition to using the velocity and acceleration profiles of each sensor for the automatic detection and segmentation of activities performed during a TUG, we also considered the orientation data originating from the fusion algorithm of the sensors. To do so, quaternions were used to calculate the angles between limb segments, for instance, the angle between the hip and the thigh (hip ROM) or the angle between the thigh and the tibia (knee ROM). To calculate these angles, we used the quaternion output, a four-dimensional representation of hypercomplex numbers with a scalar part and a vector part,
<disp-formula id="Equ2">
<label>2</label>
<alternatives>
<tex-math id="M15">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ q=\left[w\ x\ y\ z\right] $$\end{document}</tex-math>
<mml:math id="M16">
<mml:mi>q</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfenced close="]" open="[">
<mml:mrow>
<mml:mi>w</mml:mi>
<mml:mspace width="0.25em"></mml:mspace>
<mml:mi>x</mml:mi>
<mml:mspace width="0.25em"></mml:mspace>
<mml:mi>y</mml:mi>
<mml:mspace width="0.25em"></mml:mspace>
<mml:mi>z</mml:mi>
</mml:mrow>
</mml:mfenced>
</mml:math>
<graphic xlink:href="12984_2015_26_Article_Equ2.gif" position="anchor"></graphic>
</alternatives>
</disp-formula>
</p>
<p>Where
<italic>w</italic>
is a real number and v = [
<italic>x y z</italic>
] is a vector.</p>
<p>For example, let
<italic>q</italic>
<sub>1</sub>
and
<italic>q</italic>
<sub>2</sub>
represent the quaternions of the hip and thigh respectively and
<italic>q</italic>
<sub>rel</sub>
defines the relative quaternion between these two segments, then
<disp-formula id="Equ3">
<label>3</label>
<alternatives>
<tex-math id="M17">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ {q}_{rel}={q}_1^{-1}*{q}_2 $$\end{document}</tex-math>
<mml:math id="M18">
<mml:msub>
<mml:mi>q</mml:mi>
<mml:mrow>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>l</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msubsup>
<mml:mi>q</mml:mi>
<mml:mn>1</mml:mn>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msubsup>
<mml:mo>*</mml:mo>
<mml:msub>
<mml:mi>q</mml:mi>
<mml:mn>2</mml:mn>
</mml:msub>
</mml:math>
<graphic xlink:href="12984_2015_26_Article_Equ3.gif" position="anchor"></graphic>
</alternatives>
</disp-formula>
</p>
<p>To track the relative changes of a quaternion during the TUG, a reference quaternion,
<italic>q</italic>
<sub>
<italic>ref</italic>
</sub>
, was recorded at the start of each trial while participants stood in a standard pose with their arms along their sides. The change in the quaternion was defined as
<disp-formula id="Equ4">
<label>4</label>
<alternatives>
<tex-math id="M19">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ {q}_{\Delta}={q}_{ref}^{-1}*{q}_{rel} $$\end{document}</tex-math>
<mml:math id="M20">
<mml:msub>
<mml:mi>q</mml:mi>
<mml:mi mathvariant="normal">Δ</mml:mi>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msubsup>
<mml:mi>q</mml:mi>
<mml:mrow>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>f</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msubsup>
<mml:mo>*</mml:mo>
<mml:msub>
<mml:mi>q</mml:mi>
<mml:mrow>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>l</mml:mi>
</mml:mrow>
</mml:msub>
</mml:math>
<graphic xlink:href="12984_2015_26_Article_Equ4.gif" position="anchor"></graphic>
</alternatives>
</disp-formula>
</p>
<p>Post-processing algorithms were applied to
<italic>q</italic>
<sub>Δ</sub>
to ensure a small-angle representation (less than 180°) and continuity in the signal. The range of motion of the hip and knee was calculated by taking the real part of twice the inverse cosine of the relative quaternion, as shown in eq. 5.
<disp-formula id="Equ5">
<label>5</label>
<alternatives>
<tex-math id="M21">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ ROM= real\left(2{ \cos}^{-1}\left({q}_{\Delta}\right)\right) $$\end{document}</tex-math>
<mml:math id="M22">
<mml:mi mathvariant="italic">ROM</mml:mi>
<mml:mo>=</mml:mo>
<mml:mi mathvariant="italic">real</mml:mi>
<mml:mfenced close=")" open="(">
<mml:mrow>
<mml:mn>2</mml:mn>
<mml:msup>
<mml:mo>cos</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msup>
<mml:mfenced close=")" open="(">
<mml:msub>
<mml:mi>q</mml:mi>
<mml:mi mathvariant="normal">Δ</mml:mi>
</mml:msub>
</mml:mfenced>
</mml:mrow>
</mml:mfenced>
</mml:math>
<graphic xlink:href="12984_2015_26_Article_Equ5.gif" position="anchor"></graphic>
</alternatives>
</disp-formula>
</p>
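<p>To make equations 2-5 concrete, the following Python sketch computes the relative rotation between two segment quaternions and the resulting joint angle, including the small-angle wrap described above. This is an illustrative sketch only: the helper names (q_conj, q_mult, rom_angle) and the scalar-first quaternion layout are assumptions, not part of the published pipeline.</p>
<preformat>
import numpy as np

def q_conj(q):
    """Conjugate; for a unit quaternion this equals the inverse used in eq. 3."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def q_mult(a, b):
    """Hamilton product a * b of two quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def rom_angle(q1, q2, q_ref):
    """Joint angle (degrees) from two segment quaternions and the relative
    quaternion q_ref recorded in the standard pose."""
    q_rel = q_mult(q_conj(q1), q2)          # eq. 3: relative rotation between segments
    q_delta = q_mult(q_conj(q_ref), q_rel)  # eq. 4: change relative to the reference pose
    w = np.clip(q_delta[0], -1.0, 1.0)      # scalar part; clip guards the acos domain
    angle = 2.0 * np.arccos(w)              # eq. 5
    if angle > np.pi:                       # keep the small-angle representation (under 180 deg)
        angle = 2.0 * np.pi - angle
    return np.degrees(angle)
</preformat>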
</sec>
<sec id="Sec12">
<title>High cut frequency optimization</title>
<p>The high cut frequency for the band pass filter (
<italic>w</italic>
<sub>
<italic>2</italic>
</sub>
) was found by minimizing the sum of the squared differences between the transition time stamps acquired using the inertial sensors and those marked visually by two examiners, across all participants (n = 16).
<disp-formula id="Equ6">
<label>6</label>
<alternatives>
<tex-math id="M23">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ \mathrm{minimize}\left\{{\displaystyle \sum_{k=1}^{16}{\left({T}_{manual,k}-{T}_{Sensor,k}\right)}^2}\right. $$\end{document}</tex-math>
<mml:math id="M24">
<mml:mi mathvariant="normal">minimize</mml:mi>
<mml:mfenced close="" open="{">
<mml:mstyle displaystyle="true">
<mml:munderover>
<mml:mo stretchy="true"></mml:mo>
<mml:mrow>
<mml:mi>k</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mn>16</mml:mn>
</mml:munderover>
<mml:mrow>
<mml:msup>
<mml:mfenced close=")" open="(">
<mml:mrow>
<mml:msub>
<mml:mi>T</mml:mi>
<mml:mrow>
<mml:mi mathvariant="italic">manual</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi>T</mml:mi>
<mml:mrow>
<mml:mi mathvariant="italic">Sensor</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfenced>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mstyle>
</mml:mfenced>
</mml:math>
<graphic xlink:href="12984_2015_26_Article_Equ6.gif" position="anchor"></graphic>
</alternatives>
</disp-formula>
</p>
<p>The objective function of the optimization problem to find the high cutoff frequency is shown in eq. 
<xref rid="Equ6" ref-type="">6</xref>
, where T
<sub>visual,
<italic>k</italic>
</sub>
was the mean time estimated by both examiners and T
<sub>sensor,
<italic>k</italic>
</sub>
was the time estimated by the inertial sensor(s) and
<italic>k</italic>
was an index over the participants. This was done for all selected sensors and all seven transitions. The sensor, or combination of sensors (mean), that yielded the lowest cost across all participants was selected to approximate the transition time (an illustrative sketch of this cost evaluation is given after Figure 7). An example of the cost function of the trunk acceleration (a
<sub>z</sub>
) as a function of the high cut-off frequency is shown in Figure 
<xref rid="Fig7" ref-type="fig">7</xref>
A.
<fig id="Fig7">
<label>Figure 7</label>
<caption>
<p>Optimization of
<italic>w</italic>
<sub>
<italic>2</italic>
</sub>
of trunk a
<sub>z</sub>
during
<italic>sit-to-stand</italic>
transition.
<bold>A)</bold>
The cost function of the trunk a
<sub>z</sub>
as a function of the high cut frequency (
<italic>w</italic>
<sub>
<italic>2</italic>
</sub>
) during
<italic>sit-to-stand</italic>
transition across all participants (n = 16).
<bold>B)</bold>
The convergence of
<italic>w</italic>
<sub>
<italic>2</italic>
</sub>
as it was optimized across more participants. This result indicates that the kinematic patterns captured by these inertial sensors were stable during the performance of these activities. While there was variability between participants, the optimal frequency quickly converged as more participants were factored into the cost function. Similar behavior was also observed in the other sensors for all seven transitions.</p>
</caption>
<graphic xlink:href="12984_2015_26_Fig7_HTML" id="MO7"></graphic>
</fig>
</p>
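<p>For illustration, the sketch below evaluates the cost of eq. 6 for one sensor channel, one transition, and one candidate high cutoff. The sampling rate FS, the low cutoff w1, and the detect_transition_time helper are hypothetical placeholders standing in for the filtering and transition-detection steps described earlier; this is a minimal sketch, not the published implementation.</p>
<preformat>
import numpy as np
from scipy.signal import butter, filtfilt

FS = 128.0  # Hz; assumed sampling rate, for illustration only

def band_pass(signal, w1, w2, fs=FS, order=2):
    """Zero-phase Butterworth band-pass filter between w1 and w2 (Hz)."""
    b, a = butter(order, [w1 / (fs / 2.0), w2 / (fs / 2.0)], btype="band")
    return filtfilt(b, a, signal)

def transition_cost(w2, signals, t_visual, detect_transition_time, w1=0.5):
    """Eq. 6: sum of squared differences between the sensor-detected and the
    examiner-marked transition times, for one candidate high cutoff w2."""
    cost = 0.0
    for signal, t_ref in zip(signals, t_visual):  # one entry per participant (k = 1..16)
        t_sensor = detect_transition_time(band_pass(signal, w1, w2))
        cost += (t_ref - t_sensor) ** 2
    return cost
</preformat>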
<p>An exhaustive search optimization method [
<xref ref-type="bibr" rid="CR21">21</xref>
] was used to find the high cutoff frequency for the inertial sensor (0.5 Hz ≤ 
<italic>w</italic>
<sub>
<italic>2</italic>
</sub>
≤ 10 Hz). This frequency band corresponded to the dominant frequencies of the activities performed during a TUG. The band was sampled in 2000 steps to find the optimal high cutoff frequency (a sketch of this grid search is given after Table 1). The optimal high cutoff frequencies for each sensor during each transition are summarized in Table 
<xref rid="Tab1" ref-type="table">1</xref>
.
<table-wrap id="Tab1">
<label>Table 1</label>
<caption>
<p>
<bold>Optimal high cutoff frequency for each sensor at different transitions during a TUG task</bold>
</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr valign="top">
<th colspan="2">
<bold>Transition</bold>
</th>
<th colspan="6">
<bold>Sensor</bold>
</th>
</tr>
<tr valign="top">
<th colspan="2"></th>
<th>
<bold>1</bold>
</th>
<th>
<bold>Freq (Hz)</bold>
</th>
<th>
<bold>2</bold>
</th>
<th>
<bold>Freq (Hz)</bold>
</th>
<th>
<bold>3</bold>
</th>
<th>
<bold>Freq (Hz)</bold>
</th>
</tr>
</thead>
<tbody>
<tr valign="top">
<td>1.</td>
<td>Sit-to-Stand</td>
<td>Trunk a
<sub>z</sub>
</td>
<td>1.57</td>
<td>Hip ω
<sub>x</sub>
</td>
<td>0.69</td>
<td></td>
<td></td>
</tr>
<tr valign="top">
<td>2.</td>
<td>Stand-to-Walk</td>
<td>Trunk a
<sub>z</sub>
</td>
<td>2.44</td>
<td>L Knee ROM</td>
<td>8.30</td>
<td></td>
<td></td>
</tr>
<tr valign="top">
<td>3.</td>
<td>Walk-to-Turn</td>
<td>Trunk v
<sub>y</sub>
</td>
<td>1.32</td>
<td>Head ω
<sub>x</sub>
</td>
<td>0.79</td>
<td>Hip v
<sub>y</sub>
</td>
<td>0.98</td>
</tr>
<tr valign="top">
<td>4.</td>
<td>Turn-to-Walk</td>
<td>Hip v
<sub>y</sub>
</td>
<td>0.53</td>
<td>Head ω
<sub>x</sub>
</td>
<td>0.41</td>
<td></td>
<td></td>
</tr>
<tr valign="top">
<td>5.</td>
<td>Walk-to-Turn</td>
<td>Trunk v
<sub>y</sub>
</td>
<td>1.00</td>
<td>Hip ω
<sub>y</sub>
</td>
<td>0.59</td>
<td></td>
<td></td>
</tr>
<tr valign="top">
<td>6.</td>
<td>Turn-to-Stand</td>
<td>Trunk v
<sub>y</sub>
</td>
<td>1.00</td>
<td>Hip ω
<sub>y</sub>
</td>
<td>0.81</td>
<td></td>
<td></td>
</tr>
<tr valign="top">
<td>7.</td>
<td>Stand-to-Sit</td>
<td>Hip a
<sub>z</sub>
</td>
<td>1.07</td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
</tbody>
</table>
</table-wrap>
</p>
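<p>A minimal sketch of this exhaustive grid search, assuming a cost function of the candidate cutoff such as the transition_cost() helper sketched after Figure 7:</p>
<preformat>
import numpy as np

def optimize_w2(cost_fn, f_lo=0.5, f_hi=10.0, n_steps=2000):
    """Exhaustive search: evaluate cost_fn(w2), i.e. the summed squared error of
    eq. 6, on a 2000-point grid between 0.5 and 10 Hz and keep the minimizer."""
    grid = np.linspace(f_lo, f_hi, n_steps)
    costs = np.array([cost_fn(w2) for w2 in grid])
    return grid[int(np.argmin(costs))]

# Usage sketch (hypothetical bindings of the participant signals and examiner marks):
# w2_opt = optimize_w2(lambda w2: transition_cost(w2, signals, t_visual, detect))
</preformat>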
</sec>
<sec id="Sec13">
<title>Independent measures</title>
<p>Sensitivity and specificity [
<xref ref-type="bibr" rid="CR22">22</xref>
] were used to evaluate the performance of the algorithm in automatically detecting the activities performed during a TUG. Sensitivity measures the proportion of actual activity instances that were correctly detected (true positives), while specificity measures the proportion of non-activity instances that were correctly identified as negative (true negatives).</p>
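<p>For reference, a minimal sketch of these two measures; the counts in the example are hypothetical.</p>
<preformat>
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: every activity instance detected and no false detections.
print(sensitivity_specificity(tp=192, fn=0, tn=192, fp=0))  # (1.0, 1.0)
</preformat>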
<p>The mean of the absolute differences (ΔT) and the variance (σ) between the transition times segmented visually by two examiners and those detected automatically using the inertial sensors were used to evaluate the performance of the algorithm across the sixteen participants at each transition. The time difference at each transition was defined as
<disp-formula id="Equ7">
<label>7</label>
<alternatives>
<tex-math id="M25">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ \Delta T=\frac{1}{16}{\displaystyle \sum_{k=1}^{16}\left|{T}_{visual,k}-{T}_{Sesnor,k}\right|} $$\end{document}</tex-math>
<mml:math id="M26">
<mml:mi mathvariant="normal">Δ</mml:mi>
<mml:mi>T</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mn>1</mml:mn>
<mml:mn>16</mml:mn>
</mml:mfrac>
<mml:mstyle displaystyle="true">
<mml:munderover>
<mml:mo stretchy="true"></mml:mo>
<mml:mrow>
<mml:mi>k</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mn>16</mml:mn>
</mml:munderover>
<mml:mrow>
<mml:mfenced close="|" open="|">
<mml:mrow>
<mml:msub>
<mml:mi>T</mml:mi>
<mml:mrow>
<mml:mi mathvariant="italic">visual</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi>T</mml:mi>
<mml:mrow>
<mml:mi mathvariant="italic">Sesnor</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfenced>
</mml:mrow>
</mml:mstyle>
</mml:math>
<graphic xlink:href="12984_2015_26_Article_Equ7.gif" position="anchor"></graphic>
</alternatives>
</disp-formula>
</p>
<p>Where
<italic>k</italic>
was an index over the sixteen participants.</p>
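<p>A minimal sketch of eq. 7 with hypothetical values:</p>
<preformat>
import numpy as np

# Hypothetical transition times (seconds) for one transition; in the study each
# array would hold the values of the 16 participants.
t_visual = np.array([1.52, 1.48, 1.61, 1.55])   # mean of the two examiners' marks
t_sensor = np.array([1.50, 1.50, 1.58, 1.57])   # detected from the inertial sensors

delta_t = np.mean(np.abs(t_visual - t_sensor))  # eq. 7
sigma = np.std(t_visual - t_sensor)             # spread reported alongside delta_t
print(delta_t, sigma)
</preformat>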
</sec>
</sec>
<sec id="Sec14" sec-type="results">
<title>Results</title>
<p>The aims of this work were to develop an algorithm that utilizes data from inertial sensors to detect activities such as
<italic>Standing</italic>
,
<italic>Sitting</italic>
,
<italic>Walking</italic>
and
<italic>Turning</italic>
as well as to isolate these activities for subsequent analysis of performance.</p>
<sec id="Sec15">
<title>Segment detection</title>
<p>The data analysis of 16 participants performing two trials each of the 5 and 10 meter TUG tasks yielded 384 (16 participants × 6 segments × 2 trials × 2 tasks) instances of activities such as
<italic>Standing</italic>
,
<italic>Sitting</italic>
,
<italic>Walking</italic>
, and
<italic>Turning</italic>
. Using the selected sensors, the proposed algorithms were able to detect the activities with 100% sensitivity and specificity during the 10 meter TUG (n = 192). To validate the generality of the algorithms in detecting these activities, we then tested the
<italic>same</italic>
algorithms and parameters on the data recorded during the 5 meter TUG (n = 192). Participants performing the 5 meter TUG displayed similar kinematic patterns in roughly half the duration (~15 vs. ~28 seconds). Again, without changing any of the parameters, the algorithm detected these activities with 100% sensitivity and specificity.</p>
</sec>
<sec id="Sec16">
<title>Transition detection</title>
<p>When applying the segmentation algorithms to the 10 meter TUG, we were able to parse 100% of the transition points (n = 224) between different segments. The differences and variances between the visual and auto segmentation of a 10 meter TUG across all transition points for all participants (n = 16) are shown in Figure 
<xref rid="Fig8" ref-type="fig">8</xref>
A-G. The smallest variability across all participants using inertial sensors was during the
<italic>sit-to-stand</italic>
transition (σ = 25 ms, Figure 
<xref rid="Fig8" ref-type="fig">8</xref>
A) while the largest variability was during the
<italic>turn-to-stand</italic>
transition (σ = 174 ms, Figure 
<xref rid="Fig8" ref-type="fig">8</xref>
F). In comparison, the smallest variability during visual segmentation between the two examiners was also during
<italic>sit-to-stand</italic>
; however, the variance was larger (σ = 60 ms). During
<italic>stand-to-walk-out</italic>
transition, the estimated transition time was more variable when marked visually by two examiners (σ = 253 ms) than using the inertial sensors (σ = 66 ms, Figure 
<xref rid="Fig8" ref-type="fig">8</xref>
B). On average, the automated segmentation
<inline-formula id="IEq7">
<alternatives>
<tex-math id="M27">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ \left(\overline{\sigma}{}_{Auto}\kern0.5em =\kern0.5em 86\kern0.5em ms\right) $$\end{document}</tex-math>
<mml:math id="M28">
<mml:mfenced close=")" open="(">
<mml:mrow>
<mml:mover accent="true">
<mml:mi>σ</mml:mi>
<mml:mo stretchy="true">¯</mml:mo>
</mml:mover>
<mml:msub>
<mml:malignmark></mml:malignmark>
<mml:mi mathvariant="italic">Auto</mml:mi>
</mml:msub>
<mml:mspace width="0.5em"></mml:mspace>
<mml:mo>=</mml:mo>
<mml:mspace width="0.5em"></mml:mspace>
<mml:mn>86</mml:mn>
<mml:mspace width="0.5em"></mml:mspace>
<mml:mi mathvariant="italic">ms</mml:mi>
</mml:mrow>
</mml:mfenced>
</mml:math>
<inline-graphic xlink:href="12984_2015_26_Article_IEq7.gif"></inline-graphic>
</alternatives>
</inline-formula>
was less variable than the visual segmentation
<inline-formula id="IEq8">
<alternatives>
<tex-math id="M29">\documentclass[12pt]{minimal} \usepackage{amsmath} \usepackage{wasysym} \usepackage{amsfonts} \usepackage{amssymb} \usepackage{amsbsy} \usepackage{mathrsfs} \usepackage{upgreek} \setlength{\oddsidemargin}{-69pt} \begin{document}$$ \left(\overline{\sigma}{}_{Auto}\kern0.5em =\kern0.5em 86\kern0.5em ms\right) $$\end{document}</tex-math>
<mml:math id="M30">
<mml:mfenced close=")" open="(">
<mml:mrow>
<mml:mover accent="true">
<mml:mi>σ</mml:mi>
<mml:mo stretchy="true">¯</mml:mo>
</mml:mover>
<mml:msub>
<mml:malignmark></mml:malignmark>
<mml:mi mathvariant="italic">Auto</mml:mi>
</mml:msub>
<mml:mspace width="0.5em"></mml:mspace>
<mml:mo>=</mml:mo>
<mml:mspace width="0.5em"></mml:mspace>
<mml:mn>86</mml:mn>
<mml:mspace width="0.5em"></mml:mspace>
<mml:mi mathvariant="italic">ms</mml:mi>
</mml:mrow>
</mml:mfenced>
</mml:math>
<inline-graphic xlink:href="12984_2015_26_Article_IEq8.gif"></inline-graphic>
</alternatives>
</inline-formula>
across all participants and transitions.
<fig id="Fig8">
<label>Figure 8</label>
<caption>
<p>Differences between visual and auto segmentation of transition time during a 10 m TUG.
<bold>A-G)</bold>
The time stamp differences (mean ± std) for individual participants at each transition (ΔT). This is the difference between the transition time marked visually by two examiners and the time detected using the inertial sensors.
<bold>H)</bold>
On average (n = 16), the absolute differences between the times marked visually and those detected using the inertial sensors were within one standard deviation of each other for all transitions.</p>
</caption>
<graphic xlink:href="12984_2015_26_Fig8_HTML" id="MO8"></graphic>
</fig>
</p>
<p>The smallest difference between the visual and auto segmentation was during the
<italic>sit-to-stand</italic>
transition (ΔT = 25 ms) while the largest was during the
<italic>turn-to-stand</italic>
transition (ΔT = 180 ms). Across 7 transitions, the average difference between the visual and auto segmentation was approximately ΔT
<sub>ave</sub>
 = 93 ms. The transition times estimated using the inertial sensors were within one standard deviation of the transition times marked visually by the two examiners for all participants.</p>
</sec>
</sec>
<sec id="Sec17" sec-type="discussion">
<title>Discussion</title>
<p>The aims of this work were to develop an algorithm that utilizes the information from an inertial sensor-based motion capture system to identify gross physical activities performed during daily living and to automatically isolate these activities for later performance analysis. This was accomplished by using an optimization method to filter the signals and identify the body-worn sensors in the system that were most strongly associated with the different activities and yielded the best performance. The results of the optimization led to an efficient detection and segmentation algorithm that minimized the effect of movement variability between participants while robustly detecting and segmenting the activities during a TUG. This study also demonstrated that, using a set of inertial sensors and applying the detection algorithm, it was possible to identify and segment these activities with 100% accuracy during continuous execution of daily activity tasks in a healthy older adult population.</p>
<p>Attempting to detect specific activities is not a new concept. For instance, Moncada-Torres
<italic>et al.</italic>
[
<xref ref-type="bibr" rid="CR12">12</xref>
], using a module of inertial and barometric sensors placed at different locations on the body, were able to detect walking with 100% accuracy, and sitting and standing with 86% and 89% accuracy, respectively. Godfrey
<italic>et al.</italic>
[
<xref ref-type="bibr" rid="CR4">4</xref>
] used a chest mounted accelerometer to detect
<italic>Standing</italic>
with a sensitivity and specificity of 83% (±11) (mean ± SD, n = 10), while Najafi
<italic>et al.</italic>
[
<xref ref-type="bibr" rid="CR13">13</xref>
] detected the same activity with more than 95% sensitivity and specificity in healthy elderly adults using a miniature gyroscope during long-term monitoring. In the present study, using a combination of pertinent information from specific sensors, we were able to detect these activities with 100% specificity and sensitivity.</p>
<p>Since participants were told to perform the task at their own pace, there was variability in how they performed the TUG task. In fact, an older population was specifically selected because of its inherent variability in performing tasks, in addition to being the type of population that is most often the subject of mobility assessment. Using the optimization approach to find the cutoff frequencies, we minimized the effect of variability between participants by generating a single set of parameters (cutoff frequencies) that could be applied to all participants. The global convergence of these cutoff frequencies indicated that the kinematic patterns generated by the participants during the performance of the TUG were very similar (Figure 
<xref rid="Fig7" ref-type="fig">7</xref>
B).</p>
<p>Segmentation of these activities during daily living will become crucial when this type of sensor is deployed remotely in homes and free-living environments for long-term monitoring of patients’ mobility. Moreover, segmentation using the avatar is hugely time consuming. As a case in point, we asked an examiner to segment a five-minute free-living mobility recording of a person moving in an environment where multiple tasks could be performed; it took the examiner 5 hours to visually identify the different segments and time-stamp the transition points. While general guidelines were given to the examiners on how to segment this task, there was still large variability between the examiners in determining the onset and end of segments, especially during dynamic transitions (Figure 
<xref rid="Fig6" ref-type="fig">6</xref>
). In general, the transition points detected by the algorithm were less variable than visual segmentation across all transition points and participants (Figure 
<xref rid="Fig8" ref-type="fig">8</xref>
). If we assume that the times marked by visual inspection are the gold standard
<italic>,</italic>
then the largest time difference between the visual and automatic segmentation was approximately 180 
<italic>ms</italic>
during the most challenging transition,
<italic>turn-to-stand</italic>
. Given that the average variance between the examiners was 175 
<italic>ms</italic>
during this transition, the difference between visual and automatic segmentation would not be significant; yet, the algorithm was significantly faster in detecting these activities and segmenting the TUG task.</p>
<p>Detecting whether a person stands up or sits down is critical for monitoring and evaluating
<italic>how</italic>
well the person has performed that task. For example, the time it takes for patients to perform a
<italic>sit-to-stand</italic>
task has been correlated with the risk of falling as well as with functional recovery in community-dwelling elderly [
<xref ref-type="bibr" rid="CR15">15</xref>
,
<xref ref-type="bibr" rid="CR16">16</xref>
]. Cheng [
<xref ref-type="bibr" rid="CR23">23</xref>
] showed that the time needed to complete a 180° turn was a good index for differentiating between fallers and non-fallers in individuals with Parkinson’s disease (PD). Stack [
<xref ref-type="bibr" rid="CR24">24</xref>
] showed that, on average, people with PD took more steps during turning to compensate for the difficulties they experienced while turning. The present study provides an automated method to quickly isolate these activities using inertial sensors. Such segmentation will be used in the future to assess the quality of mobility for the detected tasks.</p>
</sec>
<sec id="Sec18" sec-type="conclusions">
<title>Conclusion</title>
<p>The present study lays the foundation for the development of a comprehensive algorithm to detect and segment activities performed during daily living using inertial sensors. The current study is limited in scope by the relatively simple tasks that were segmented, the environment in which the tasks were performed, and the relatively healthy population that performed them. We are currently applying the detection and segmentation principles to less scripted tasks in more unstructured environments, with longer trial durations. We are also testing our algorithms on populations with altered mobility. We expect that introducing more complex tasks, more variable environments, and more heterogeneous populations will probably require more sensors (redundancy) to detect and segment the tasks. This is why we always record the tasks with 17 sensors, in the hope of identifying the optimal sensor set for specific conditions. We also suspect that further optimization will be required when populations with altered mobility are studied. Nonetheless, the current results lay the foundation for future research and could be used to develop a fully automated TUG capture and analysis system. Ultimately, the detection and segmentation of these activities is needed to develop performance metrics to evaluate and monitor people with mobility impairment due to disease and old age.</p>
</sec>
</body>
<back>
<glossary>
<title>Abbreviations</title>
<def-list>
<def-item>
<term>TUG</term>
<def>
<p>Timed-up-and-go test</p>
</def>
</def-item>
<def-item>
<term>ADL</term>
<def>
<p>Activities of daily living</p>
</def>
</def-item>
</def-list>
</glossary>
<fn-group>
<fn>
<p>
<bold>Competing interests</bold>
</p>
<p>The authors declare that they have no competing interests.</p>
</fn>
<fn>
<p>
<bold>Authors’ contributions</bold>
</p>
<p>HN developed the algorithm, designed the analysis, and drafted the manuscript. FA was involved in the development of the algorithm as well as the analysis. CLP was responsible for the development of the protocol, participant recruitment, data collection, and aided in the interpretation of the results. MB was involved in the data collection as well as the interpretation of the results. FR aided in the development of the protocol and the algorithm. PB and MJ aided in the design of the study, the development of the algorithm as well as providing significant revisions to the manuscript. CD, the lead scientist, helped in all facets of the project. All authors read and approved the final manuscript.</p>
</fn>
</fn-group>
<ack>
<title>Acknowledgement</title>
<p>We would like to thank the volunteers from the Geriatric Institute at Montreal for their participation in the study. This project was conducted as part of the research program of the EMAP group, whose members include Mark Speechley (Department of Epidemiology and Biostatistics, University of Western Ontario), Anthony Karelis (Department of Kinesiology, UQAM), Claude Vincent (Department of Rehabilitation, Université Laval), James Frank (Faculty of Human Kinetics, University of Windsor) and Roderick Edwards (Department of Mathematics and Statistics, University of Victoria).</p>
</ack>
<ref-list id="Bib1">
<title>References</title>
<ref id="CR1">
<label>1.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wassink-Vossen</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Collard</surname>
<given-names>RM</given-names>
</name>
<name>
<surname>Oude Voshaar</surname>
<given-names>RC</given-names>
</name>
<name>
<surname>Comijs</surname>
<given-names>HC</given-names>
</name>
<name>
<surname>de Vocht</surname>
<given-names>HM</given-names>
</name>
<name>
<surname>Naarding</surname>
<given-names>P</given-names>
</name>
</person-group>
<article-title>Physical (in)activity and depression in older people</article-title>
<source>J Affect Disord</source>
<year>2014</year>
<volume>161</volume>
<fpage>65</fpage>
<lpage>72</lpage>
<pub-id pub-id-type="doi">10.1016/j.jad.2014.03.001</pub-id>
<pub-id pub-id-type="pmid">24751309</pub-id>
</element-citation>
</ref>
<ref id="CR2">
<label>2.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hassan</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Vallabhajosula</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Zahodne</surname>
<given-names>LB</given-names>
</name>
<name>
<surname>Bowers</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Okun</surname>
<given-names>MS</given-names>
</name>
<name>
<surname>Fernandez</surname>
<given-names>HH</given-names>
</name>
<etal></etal>
</person-group>
<article-title>Correlations of apathy and depression with postural instability in Parkinson disease</article-title>
<source>J Neurol Sci</source>
<year>2014</year>
<volume>338</volume>
<issue>1–2</issue>
<fpage>162</fpage>
<lpage>5</lpage>
<pub-id pub-id-type="doi">10.1016/j.jns.2013.12.040</pub-id>
<pub-id pub-id-type="pmid">24461565</pub-id>
</element-citation>
</ref>
<ref id="CR3">
<label>3.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Maki</surname>
<given-names>BE</given-names>
</name>
<name>
<surname>Holliday</surname>
<given-names>PJ</given-names>
</name>
<name>
<surname>Topper</surname>
<given-names>AK</given-names>
</name>
</person-group>
<article-title>A prospective study of postural balance and risk of falling in an ambulatory and independent elderly population</article-title>
<source>J Gerontol</source>
<year>1994</year>
<volume>49</volume>
<issue>2</issue>
<fpage>M72</fpage>
<lpage>84</lpage>
<pub-id pub-id-type="doi">10.1093/geronj/49.2.M72</pub-id>
<pub-id pub-id-type="pmid">8126355</pub-id>
</element-citation>
</ref>
<ref id="CR4">
<label>4.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Godfrey</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Bourke</surname>
<given-names>AK</given-names>
</name>
<name>
<surname>Olaighin</surname>
<given-names>GM</given-names>
</name>
<name>
<surname>van de Ven</surname>
<given-names>P</given-names>
</name>
<name>
<surname>Nelson</surname>
<given-names>J</given-names>
</name>
</person-group>
<article-title>Activity classification using a single chest mounted tri-axial accelerometer</article-title>
<source>Med Eng Phys</source>
<year>2011</year>
<volume>33</volume>
<issue>9</issue>
<fpage>1127</fpage>
<lpage>35</lpage>
<pub-id pub-id-type="doi">10.1016/j.medengphy.2011.05.002</pub-id>
<pub-id pub-id-type="pmid">21636308</pub-id>
</element-citation>
</ref>
<ref id="CR5">
<label>5.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Arif</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Bilal</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Kattan</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Ahamed</surname>
<given-names>SI</given-names>
</name>
</person-group>
<article-title>Better Physical Activity Classification using Smartphone Acceleration Sensor</article-title>
<source>J Med Syst</source>
<year>2014</year>
<volume>38</volume>
<issue>9</issue>
<fpage>95</fpage>
<pub-id pub-id-type="doi">10.1007/s10916-014-0095-0</pub-id>
<pub-id pub-id-type="pmid">25000988</pub-id>
</element-citation>
</ref>
<ref id="CR6">
<label>6.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Culhane</surname>
<given-names>KM</given-names>
</name>
<name>
<surname>Lyons</surname>
<given-names>GM</given-names>
</name>
<name>
<surname>Hilton</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Grace</surname>
<given-names>PA</given-names>
</name>
<name>
<surname>Lyons</surname>
<given-names>D</given-names>
</name>
</person-group>
<article-title>Long-term mobility monitoring of older adults using accelerometers in a clinical environment</article-title>
<source>Clin Rehabil</source>
<year>2004</year>
<volume>18</volume>
<issue>3</issue>
<fpage>335</fpage>
<lpage>43</lpage>
<pub-id pub-id-type="doi">10.1191/0269215504cr734oa</pub-id>
<pub-id pub-id-type="pmid">15137565</pub-id>
</element-citation>
</ref>
<ref id="CR7">
<label>7.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lyons</surname>
<given-names>GM</given-names>
</name>
<name>
<surname>Culhane</surname>
<given-names>KM</given-names>
</name>
<name>
<surname>Hilton</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Grace</surname>
<given-names>PA</given-names>
</name>
<name>
<surname>Lyons</surname>
<given-names>D</given-names>
</name>
</person-group>
<article-title>A description of an accelerometer-based mobility monitoring technique</article-title>
<source>Med Eng Phys</source>
<year>2005</year>
<volume>27</volume>
<issue>6</issue>
<fpage>497</fpage>
<lpage>504</lpage>
<pub-id pub-id-type="doi">10.1016/j.medengphy.2004.11.006</pub-id>
<pub-id pub-id-type="pmid">15990066</pub-id>
</element-citation>
</ref>
<ref id="CR8">
<label>8.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bourke</surname>
<given-names>AK</given-names>
</name>
<name>
<surname>O’Brien</surname>
<given-names>JV</given-names>
</name>
<name>
<surname>Lyons</surname>
<given-names>GM</given-names>
</name>
</person-group>
<article-title>Evaluation of a threshold-based tri-axial accelerometer fall detection algorithm</article-title>
<source>Gait Posture</source>
<year>2007</year>
<volume>26</volume>
<issue>2</issue>
<fpage>194</fpage>
<lpage>9</lpage>
<pub-id pub-id-type="doi">10.1016/j.gaitpost.2006.09.012</pub-id>
<pub-id pub-id-type="pmid">17101272</pub-id>
</element-citation>
</ref>
<ref id="CR9">
<label>9.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Dijkstra</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Kamsma</surname>
<given-names>YP</given-names>
</name>
<name>
<surname>Zijlstra</surname>
<given-names>W</given-names>
</name>
</person-group>
<article-title>Detection of gait and postures using a miniaturized triaxial accelerometer-based system: accuracy in patients with mild to moderate Parkinson’s disease</article-title>
<source>Arch Phys Med Rehabil</source>
<year>2010</year>
<volume>91</volume>
<issue>8</issue>
<fpage>1272</fpage>
<lpage>7</lpage>
<pub-id pub-id-type="doi">10.1016/j.apmr.2010.05.004</pub-id>
<pub-id pub-id-type="pmid">20684910</pub-id>
</element-citation>
</ref>
<ref id="CR10">
<label>10.</label>
<mixed-citation publication-type="other">Rahimi F, Bee C, Duval C, Boissy P, Edwards R, Jog M. Using ecological whole body kinematics to evaluate effects of medication adjustment in Parkinson Disease. J Parkinson Dis. 2014. doi:10.3233/JPD-140370.</mixed-citation>
</ref>
<ref id="CR11">
<label>11.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Rahimi</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Duval</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Jog</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Bee</surname>
<given-names>C</given-names>
</name>
<name>
<surname>South</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Jog</surname>
<given-names>M</given-names>
</name>
<etal></etal>
</person-group>
<article-title>Capturing whole-body mobility of patients with Parkinson disease using inertial motion sensors: expected challenges and rewards</article-title>
<source>Conf Proc IEEE Eng Med Biol Soc</source>
<year>2011</year>
<volume>2011</volume>
<fpage>5833</fpage>
<lpage>8</lpage>
<pub-id pub-id-type="pmid">22255666</pub-id>
</element-citation>
</ref>
<ref id="CR12">
<label>12.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Moncada-Torres</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Leuenberger</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Gonzenbach</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Luft</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Gassert</surname>
<given-names>R</given-names>
</name>
</person-group>
<article-title>Activity classification based on inertial and barometric pressure sensors at different anatomical locations</article-title>
<source>Physiol Meas</source>
<year>2014</year>
<volume>35</volume>
<issue>7</issue>
<fpage>1245</fpage>
<lpage>63</lpage>
<pub-id pub-id-type="doi">10.1088/0967-3334/35/7/1245</pub-id>
<pub-id pub-id-type="pmid">24853451</pub-id>
</element-citation>
</ref>
<ref id="CR13">
<label>13.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Najafi</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Aminian</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Loew</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Blanc</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Robert</surname>
<given-names>PA</given-names>
</name>
</person-group>
<article-title>Measurement of stand-sit and sit-stand transitions using a miniature gyroscope and its application in fall risk evaluation in the elderly</article-title>
<source>IEEE Trans Bio-med Eng</source>
<year>2002</year>
<volume>49</volume>
<issue>8</issue>
<fpage>843</fpage>
<lpage>51</lpage>
<pub-id pub-id-type="doi">10.1109/TBME.2002.800763</pub-id>
</element-citation>
</ref>
<ref id="CR14">
<label>14.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Verghese</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Lipton</surname>
<given-names>RB</given-names>
</name>
<name>
<surname>Holtzer</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Xue</surname>
<given-names>X</given-names>
</name>
</person-group>
<article-title>Quantitative gait dysfunction and risk of cognitive decline and dementia</article-title>
<source>J Neurol Neurosurg Psychiatry</source>
<year>2007</year>
<volume>78</volume>
<issue>9</issue>
<fpage>929</fpage>
<lpage>35</lpage>
<pub-id pub-id-type="doi">10.1136/jnnp.2006.106914</pub-id>
<pub-id pub-id-type="pmid">17237140</pub-id>
</element-citation>
</ref>
<ref id="CR15">
<label>15.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cheng</surname>
<given-names>PT</given-names>
</name>
<name>
<surname>Liaw</surname>
<given-names>MY</given-names>
</name>
<name>
<surname>Wong</surname>
<given-names>MK</given-names>
</name>
<name>
<surname>Tang</surname>
<given-names>FT</given-names>
</name>
<name>
<surname>Lee</surname>
<given-names>MY</given-names>
</name>
<name>
<surname>Lin</surname>
<given-names>PS</given-names>
</name>
</person-group>
<article-title>The sit-to-stand movement in stroke patients and its correlation with falling</article-title>
<source>Arch Phys Med Rehabil</source>
<year>1998</year>
<volume>79</volume>
<issue>9</issue>
<fpage>1043</fpage>
<lpage>6</lpage>
<pub-id pub-id-type="doi">10.1016/S0003-9993(98)90168-X</pub-id>
<pub-id pub-id-type="pmid">9749681</pub-id>
</element-citation>
</ref>
<ref id="CR16">
<label>16.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Janssen</surname>
<given-names>W</given-names>
</name>
<name>
<surname>Bussmann</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Selles</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Koudstaal</surname>
<given-names>P</given-names>
</name>
<name>
<surname>Ribbers</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Stam</surname>
<given-names>H</given-names>
</name>
</person-group>
<article-title>Recovery of the sit-to-stand movement after stroke: a longitudinal cohort study</article-title>
<source>Neurorehabil Neural Repair</source>
<year>2010</year>
<volume>24</volume>
<issue>8</issue>
<fpage>763</fpage>
<lpage>9</lpage>
<pub-id pub-id-type="doi">10.1177/1545968310363584</pub-id>
<pub-id pub-id-type="pmid">20702392</pub-id>
</element-citation>
</ref>
<ref id="CR17">
<label>17.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Brooks</surname>
<given-names>RIS</given-names>
</name>
</person-group>
<source>Multi-sensor fusion: fundamentals and application with software</source>
<year>1998</year>
<publisher-loc>Upper Saddle River, NJ</publisher-loc>
<publisher-name>Prentice-Hall Inc</publisher-name>
</element-citation>
</ref>
<ref id="CR18">
<label>18.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Roetenberg</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Luinge</surname>
<given-names>HJ</given-names>
</name>
<name>
<surname>Baten</surname>
<given-names>CTM</given-names>
</name>
<name>
<surname>Veltink</surname>
<given-names>PH</given-names>
</name>
</person-group>
<article-title>Compensation of magnetic disturbances improves inertial and magnetic sensing of human body segment orientation</article-title>
<source>IEEE Trans Neural Syst Rehabil Eng</source>
<year>2005</year>
<volume>13</volume>
<issue>3</issue>
<fpage>395</fpage>
<lpage>405</lpage>
<pub-id pub-id-type="doi">10.1109/TNSRE.2005.847353</pub-id>
<pub-id pub-id-type="pmid">16200762</pub-id>
</element-citation>
</ref>
<ref id="CR19">
<label>19.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sabatini</surname>
<given-names>AM</given-names>
</name>
</person-group>
<article-title>Quaternion-based extended Kalman filter for determining orientation by inertial and magnetic sensing</article-title>
<source>IEEE Trans Biomed Eng</source>
<year>2006</year>
<volume>53</volume>
<issue>7</issue>
<fpage>1346</fpage>
<lpage>56</lpage>
<pub-id pub-id-type="doi">10.1109/TBME.2006.875664</pub-id>
<pub-id pub-id-type="pmid">16830938</pub-id>
</element-citation>
</ref>
<ref id="CR20">
<label>20.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<collab>Innovations and Advances in Computing, Informatics, System Sciences, Networking and Engineering</collab>
</person-group>
<source>Lecture Notes in Electrical Engineering</source>
<year>2015</year>
<publisher-loc>New York</publisher-loc>
<publisher-name>Springer International Publishing</publisher-name>
</element-citation>
</ref>
<ref id="CR21">
<label>21.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Chong</surname>
<given-names>E</given-names>
</name>
<name>
<surname>Zak</surname>
<given-names>S</given-names>
</name>
</person-group>
<source>An introduction to optimization</source>
<year>2013</year>
<edition>1</edition>
<publisher-loc>Hoboken, New Jersey</publisher-loc>
<publisher-name>Wiley & Sons</publisher-name>
</element-citation>
</ref>
<ref id="CR22">
<label>22.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Simon Rogers</surname>
<given-names>MG</given-names>
</name>
</person-group>
<source>A first course in machine learning. Machine learning & pattern recognition</source>
<year>2012</year>
<publisher-loc>Cambridge UK</publisher-loc>
<publisher-name>Chapman & Hall / CRC</publisher-name>
</element-citation>
</ref>
<ref id="CR23">
<label>23.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cheng</surname>
<given-names>FY</given-names>
</name>
<name>
<surname>Yang</surname>
<given-names>YR</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>CJ</given-names>
</name>
<name>
<surname>Wu</surname>
<given-names>YR</given-names>
</name>
<name>
<surname>Cheng</surname>
<given-names>SJ</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>HC</given-names>
</name>
<etal></etal>
</person-group>
<article-title>Factors influencing turning and its relationship with falls in individuals with Parkinson’s disease</article-title>
<source>PLoS One</source>
<year>2014</year>
<volume>9</volume>
<issue>4</issue>
<fpage>e93572</fpage>
<pub-id pub-id-type="doi">10.1371/journal.pone.0093572</pub-id>
<pub-id pub-id-type="pmid">24699675</pub-id>
</element-citation>
</ref>
<ref id="CR24">
<label>24.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Stack</surname>
<given-names>EL</given-names>
</name>
<name>
<surname>Ashburn</surname>
<given-names>AM</given-names>
</name>
<name>
<surname>Jupp</surname>
<given-names>KE</given-names>
</name>
</person-group>
<article-title>Strategies used by people with Parkinson’s disease who report difficulty turning</article-title>
<source>Parkinsonism Relat Disord</source>
<year>2006</year>
<volume>12</volume>
<issue>2</issue>
<fpage>87</fpage>
<lpage>92</lpage>
<pub-id pub-id-type="doi">10.1016/j.parkreldis.2005.08.008</pub-id>
<pub-id pub-id-type="pmid">16338159</pub-id>
</element-citation>
</ref>
</ref-list>
</back>
</pmc>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Wicri/Canada/explor/ParkinsonCanadaV1/Data/Pmc/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 0002549 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Corpus/biblio.hfd -nk 0002549 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Wicri/Canada
   |area=    ParkinsonCanadaV1
   |flux=    Pmc
   |étape=   Corpus
   |type=    RBID
   |clé=     
   |texte=   
}}

Wicri

This area was generated with Dilib version V0.6.29.
Data generation: Thu May 4 22:20:19 2017. Site generation: Fri Dec 23 23:17:26 2022