Exploration server on haptic devices


Robust Human Machine Interface Based on Head Movements Applied to Assistive Robotics

Internal identifier: 002461 (Pmc/Curation); previous: 002460; next: 002462


Authors: Elisa Perez [Argentina]; Natalia López [Argentina]; Eugenio Orosco [Argentina]; Carlos Soria [Argentina]; Vicente Mut [Argentina]; Teodiano Freire-Bastos [Brazil]

Source:

RBID : PMC:3888755

Abstract

This paper presents an interface that uses two different sensing techniques and combines both results through a fusion process to obtain the minimum-variance estimator of the orientation of the user's head. The sensing techniques of the interface are based on an inertial sensor and artificial vision. The orientation of the user's head is used to steer the navigation of a robotic wheelchair. A control algorithm for an assistive technology system is also presented. The system was evaluated by four individuals with severe motor disabilities, and a quantitative index was developed in order to objectively evaluate the performance. The results obtained are promising, since most users could perform the proposed tasks with the robotic wheelchair.


Url:
DOI: 10.1155/2013/589636
PubMed: 24453877
PubMed Central: 3888755

Links toward previous steps (curation, corpus...)


Links to Exploration step

PMC:3888755

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Robust Human Machine Interface Based on Head Movements Applied to Assistive Robotics</title>
<author>
<name sortKey="Perez, Elisa" sort="Perez, Elisa" uniqKey="Perez E" first="Elisa" last="Perez">Elisa Perez</name>
<affiliation wicri:level="1">
<nlm:aff id="I1">Gabinete de Tecnología Médica, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan, Argentina</nlm:aff>
<country xml:lang="fr">Argentine</country>
<wicri:regionArea>Gabinete de Tecnología Médica, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Lopez, Natalia" sort="Lopez, Natalia" uniqKey="Lopez N" first="Natalia" last="López">Natalia López</name>
<affiliation wicri:level="1">
<nlm:aff id="I1">Gabinete de Tecnología Médica, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan, Argentina</nlm:aff>
<country xml:lang="fr">Argentine</country>
<wicri:regionArea>Gabinete de Tecnología Médica, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Orosco, Eugenio" sort="Orosco, Eugenio" uniqKey="Orosco E" first="Eugenio" last="Orosco">Eugenio Orosco</name>
<affiliation wicri:level="1">
<nlm:aff id="I2">Instituto de Automática, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan, Argentina</nlm:aff>
<country xml:lang="fr">Argentine</country>
<wicri:regionArea>Instituto de Automática, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Soria, Carlos" sort="Soria, Carlos" uniqKey="Soria C" first="Carlos" last="Soria">Carlos Soria</name>
<affiliation wicri:level="1">
<nlm:aff id="I2">Instituto de Automática, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan, Argentina</nlm:aff>
<country xml:lang="fr">Argentine</country>
<wicri:regionArea>Instituto de Automática, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Mut, Vicente" sort="Mut, Vicente" uniqKey="Mut V" first="Vicente" last="Mut">Vicente Mut</name>
<affiliation wicri:level="1">
<nlm:aff id="I2">Instituto de Automática, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan, Argentina</nlm:aff>
<country xml:lang="fr">Argentine</country>
<wicri:regionArea>Instituto de Automática, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Freire Bastos, Teodiano" sort="Freire Bastos, Teodiano" uniqKey="Freire Bastos T" first="Teodiano" last="Freire-Bastos">Teodiano Freire-Bastos</name>
<affiliation wicri:level="1">
<nlm:aff id="I3">Departamento de Engenharia Elétrica, Universidade Federal do Espírito Santo, 29075910 Vitoria, ES, Brazil</nlm:aff>
<country xml:lang="fr">Brésil</country>
<wicri:regionArea>Departamento de Engenharia Elétrica, Universidade Federal do Espírito Santo, 29075910 Vitoria, ES</wicri:regionArea>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">24453877</idno>
<idno type="pmc">3888755</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3888755</idno>
<idno type="RBID">PMC:3888755</idno>
<idno type="doi">10.1155/2013/589636</idno>
<date when="2013">2013</date>
<idno type="wicri:Area/Pmc/Corpus">002461</idno>
<idno type="wicri:Area/Pmc/Curation">002461</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Robust Human Machine Interface Based on Head Movements Applied to Assistive Robotics</title>
<author>
<name sortKey="Perez, Elisa" sort="Perez, Elisa" uniqKey="Perez E" first="Elisa" last="Perez">Elisa Perez</name>
<affiliation wicri:level="1">
<nlm:aff id="I1">Gabinete de Tecnología Médica, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan, Argentina</nlm:aff>
<country xml:lang="fr">Argentine</country>
<wicri:regionArea>Gabinete de Tecnología Médica, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Lopez, Natalia" sort="Lopez, Natalia" uniqKey="Lopez N" first="Natalia" last="López">Natalia López</name>
<affiliation wicri:level="1">
<nlm:aff id="I1">Gabinete de Tecnología Médica, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan, Argentina</nlm:aff>
<country xml:lang="fr">Argentine</country>
<wicri:regionArea>Gabinete de Tecnología Médica, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Orosco, Eugenio" sort="Orosco, Eugenio" uniqKey="Orosco E" first="Eugenio" last="Orosco">Eugenio Orosco</name>
<affiliation wicri:level="1">
<nlm:aff id="I2">Instituto de Automática, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan, Argentina</nlm:aff>
<country xml:lang="fr">Argentine</country>
<wicri:regionArea>Instituto de Automática, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Soria, Carlos" sort="Soria, Carlos" uniqKey="Soria C" first="Carlos" last="Soria">Carlos Soria</name>
<affiliation wicri:level="1">
<nlm:aff id="I2">Instituto de Automática, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan, Argentina</nlm:aff>
<country xml:lang="fr">Argentine</country>
<wicri:regionArea>Instituto de Automática, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Mut, Vicente" sort="Mut, Vicente" uniqKey="Mut V" first="Vicente" last="Mut">Vicente Mut</name>
<affiliation wicri:level="1">
<nlm:aff id="I2">Instituto de Automática, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan, Argentina</nlm:aff>
<country xml:lang="fr">Argentine</country>
<wicri:regionArea>Instituto de Automática, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Freire Bastos, Teodiano" sort="Freire Bastos, Teodiano" uniqKey="Freire Bastos T" first="Teodiano" last="Freire-Bastos">Teodiano Freire-Bastos</name>
<affiliation wicri:level="1">
<nlm:aff id="I3">Departamento de Engenharia Elétrica, Universidade Federal do Espírito Santo, 29075910 Vitoria, ES, Brazil</nlm:aff>
<country xml:lang="fr">Brésil</country>
<wicri:regionArea>Departamento de Engenharia Elétrica, Universidade Federal do Espírito Santo, 29075910 Vitoria, ES</wicri:regionArea>
</affiliation>
</author>
</analytic>
<series>
<title level="j">The Scientific World Journal</title>
<idno type="eISSN">1537-744X</idno>
<imprint>
<date when="2013">2013</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>This paper presents an interface that uses two different sensing techniques and combines both results through a fusion process to obtain the minimum-variance estimator of the orientation of the user's head. The sensing techniques of the interface are based on an inertial sensor and artificial vision. The orientation of the user's head is used to steer the navigation of a robotic wheelchair. A control algorithm for an assistive technology system is also presented. The system was evaluated by four individuals with severe motor disabilities, and a quantitative index was developed in order to objectively evaluate the performance. The results obtained are promising, since most users could perform the proposed tasks with the robotic wheelchair.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Bruemmer, Dj" uniqKey="Bruemmer D">DJ Bruemmer</name>
</author>
<author>
<name sortKey="Few, Da" uniqKey="Few D">DA Few</name>
</author>
<author>
<name sortKey="Boring, Rl" uniqKey="Boring R">RL Boring</name>
</author>
<author>
<name sortKey="Marble, Jl" uniqKey="Marble J">JL Marble</name>
</author>
<author>
<name sortKey="Walton, Mc" uniqKey="Walton M">MC Walton</name>
</author>
<author>
<name sortKey="Nielsen, Cw" uniqKey="Nielsen C">CW Nielsen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Galindo, C" uniqKey="Galindo C">C Galindo</name>
</author>
<author>
<name sortKey="Gonzalez, J" uniqKey="Gonzalez J">J Gonzalez</name>
</author>
<author>
<name sortKey="Fernandez Madrigal, J A" uniqKey="Fernandez Madrigal J">J-A Fernández-Madrigal</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Perez, E" uniqKey="Perez E">E Perez</name>
</author>
<author>
<name sortKey="Soria, C" uniqKey="Soria C">C Soria</name>
</author>
<author>
<name sortKey="Nasisi, O" uniqKey="Nasisi O">O Nasisi</name>
</author>
<author>
<name sortKey="Freire Bastos, T" uniqKey="Freire Bastos T">T Freire-Bastos</name>
</author>
<author>
<name sortKey="Mut, V" uniqKey="Mut V">V Mut</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Freire Bastos, T" uniqKey="Freire Bastos T">T Freire-Bastos</name>
</author>
<author>
<name sortKey="Sarcinelli, M" uniqKey="Sarcinelli M">M Sarcinelli</name>
</author>
<author>
<name sortKey="Ferreira, A" uniqKey="Ferreira A">A Ferreira</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ferreira, A" uniqKey="Ferreira A">A Ferreira</name>
</author>
<author>
<name sortKey="Celeste, Wc" uniqKey="Celeste W">WC Celeste</name>
</author>
<author>
<name sortKey="Cheein, Fa" uniqKey="Cheein F">FA Cheein</name>
</author>
<author>
<name sortKey="Freire Bastos, T" uniqKey="Freire Bastos T">T Freire-Bastos</name>
</author>
<author>
<name sortKey="Sarcinelli Filho, M" uniqKey="Sarcinelli Filho M">M Sarcinelli-Filho</name>
</author>
<author>
<name sortKey="Carelli, R" uniqKey="Carelli R">R Carelli</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Auat Cheein, F" uniqKey="Auat Cheein F">F Auat Cheein</name>
</author>
<author>
<name sortKey="Lopez, N" uniqKey="Lopez N">N Lopez</name>
</author>
<author>
<name sortKey="Soria, Cm" uniqKey="Soria C">CM Soria</name>
</author>
<author>
<name sortKey="Di Sciascio, Fa" uniqKey="Di Sciascio F">FA di Sciascio</name>
</author>
<author>
<name sortKey="Lobo Pereira, F" uniqKey="Lobo Pereira F">F Lobo Pereira</name>
</author>
<author>
<name sortKey="Carelli, R" uniqKey="Carelli R">R Carelli</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Barea, R" uniqKey="Barea R">R Barea</name>
</author>
<author>
<name sortKey="Boquete, L" uniqKey="Boquete L">L Boquete</name>
</author>
<author>
<name sortKey="Mazo, M" uniqKey="Mazo M">M Mazo</name>
</author>
<author>
<name sortKey="Lopez, E" uniqKey="Lopez E">E López</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Vie Christensen, H" uniqKey="Vie Christensen H">H Vie Christensen</name>
</author>
<author>
<name sortKey="Garcia, Jc" uniqKey="Garcia J">JC Garcia</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Manogna, S" uniqKey="Manogna S">S Manogna</name>
</author>
<author>
<name sortKey="Vaishnavi, S" uniqKey="Vaishnavi S">S Vaishnavi</name>
</author>
<author>
<name sortKey="Geethanjali, B" uniqKey="Geethanjali B">B Geethanjali</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ju, Js" uniqKey="Ju J">JS Ju</name>
</author>
<author>
<name sortKey="Shin, Y" uniqKey="Shin Y">Y Shin</name>
</author>
<author>
<name sortKey="Kim, Ey" uniqKey="Kim E">EY Kim</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Montesano, L" uniqKey="Montesano L">L Montesano</name>
</author>
<author>
<name sortKey="Minguez, J" uniqKey="Minguez J">J Minguez</name>
</author>
<author>
<name sortKey="Alcubierre, Jm" uniqKey="Alcubierre J">JM Alcubierre</name>
</author>
<author>
<name sortKey="Montano, L" uniqKey="Montano L">L Montano</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bauckhage, C" uniqKey="Bauckhage C">C Bauckhage</name>
</author>
<author>
<name sortKey="Kaster, T" uniqKey="Kaster T">T Käster</name>
</author>
<author>
<name sortKey="Rotenstein, Am" uniqKey="Rotenstein A">AM Rotenstein</name>
</author>
<author>
<name sortKey="Tsotsos, Jk" uniqKey="Tsotsos J">JK Tsotsos</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kobayashi, Y" uniqKey="Kobayashi Y">Y Kobayashi</name>
</author>
<author>
<name sortKey="Kinpara, Y" uniqKey="Kinpara Y">Y Kinpara</name>
</author>
<author>
<name sortKey="Shibusawa, T" uniqKey="Shibusawa T">T Shibusawa</name>
</author>
<author>
<name sortKey="Kuno, Y" uniqKey="Kuno Y">Y Kuno</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Rebsamen, B" uniqKey="Rebsamen B">B Rebsamen</name>
</author>
<author>
<name sortKey="Burdet, E" uniqKey="Burdet E">E Burdet</name>
</author>
<author>
<name sortKey="Guan, C" uniqKey="Guan C">C Guan</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Iturrate, I" uniqKey="Iturrate I">I Iturrate</name>
</author>
<author>
<name sortKey="Antelis, Jm" uniqKey="Antelis J">JM Antelis</name>
</author>
<author>
<name sortKey="Kubler, A" uniqKey="Kubler A">A Kübler</name>
</author>
<author>
<name sortKey="Minguez, J" uniqKey="Minguez J">J Minguez</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="De La Cruz, C" uniqKey="De La Cruz C">C de la Cruz</name>
</author>
<author>
<name sortKey="Freire Bastos, T" uniqKey="Freire Bastos T">T Freire-Bastos</name>
</author>
<author>
<name sortKey="Cheein, Faa" uniqKey="Cheein F">FAA Cheein</name>
</author>
<author>
<name sortKey="Carelli, R" uniqKey="Carelli R">R Carelli</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Zhou, L" uniqKey="Zhou L">L Zhou</name>
</author>
<author>
<name sortKey="Teo, Cl" uniqKey="Teo C">CL Teo</name>
</author>
<author>
<name sortKey="Burdet, E" uniqKey="Burdet E">E Burdet</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="De La Cruz, C" uniqKey="De La Cruz C">C de la Cruz</name>
</author>
<author>
<name sortKey="Freire Bastos, T" uniqKey="Freire Bastos T">T Freire-Bastos</name>
</author>
<author>
<name sortKey="Carelli, R" uniqKey="Carelli R">R Carelli</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mutambara, Ag" uniqKey="Mutambara A">AG Mutambara</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ferreira, A" uniqKey="Ferreira A">A Ferreira</name>
</author>
<author>
<name sortKey="Cavalieri, Dc" uniqKey="Cavalieri D">DC Cavalieri</name>
</author>
<author>
<name sortKey="Silva, Rl" uniqKey="Silva R">RL Silva</name>
</author>
<author>
<name sortKey="Freire Bastos, T" uniqKey="Freire Bastos T">T Freire-Bastos</name>
</author>
<author>
<name sortKey="Sarcinelli Filho, M" uniqKey="Sarcinelli Filho M">M Sarcinelli-Filho</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gong, S" uniqKey="Gong S">S Gong</name>
</author>
<author>
<name sortKey="Mackenna, S" uniqKey="Mackenna S">S MacKenna</name>
</author>
<author>
<name sortKey="Psarrou, A" uniqKey="Psarrou A">A Psarrou</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cho, K M" uniqKey="Cho K">K-M Cho</name>
</author>
<author>
<name sortKey="Jang, J H" uniqKey="Jang J">J-H Jang</name>
</author>
<author>
<name sortKey="Hong, K S" uniqKey="Hong K">K-S Hong</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jones, Mj" uniqKey="Jones M">MJ Jones</name>
</author>
<author>
<name sortKey="Rehg, Jm" uniqKey="Rehg J">JM Rehg</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hsu, R L" uniqKey="Hsu R">R-L Hsu</name>
</author>
<author>
<name sortKey="Abdel Mottaleb, M" uniqKey="Abdel Mottaleb M">M Abdel-Mottaleb</name>
</author>
<author>
<name sortKey="Jain, Ak" uniqKey="Jain A">AK Jain</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Berbar, Ma" uniqKey="Berbar M">MA Berbar</name>
</author>
<author>
<name sortKey="Kelash, Hm" uniqKey="Kelash H">HM Kelash</name>
</author>
<author>
<name sortKey="Kandeel, Aa" uniqKey="Kandeel A">AA Kandeel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Perez, E" uniqKey="Perez E">E Perez</name>
</author>
<author>
<name sortKey="Soria, C" uniqKey="Soria C">C Soria</name>
</author>
<author>
<name sortKey="Lopez, Nm" uniqKey="Lopez N">NM López</name>
</author>
<author>
<name sortKey="Nasisi, O" uniqKey="Nasisi O">O Nasisi</name>
</author>
<author>
<name sortKey="Mut, V" uniqKey="Mut V">V Mut</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Trucco, E" uniqKey="Trucco E">E Trucco</name>
</author>
<author>
<name sortKey="Verri, A" uniqKey="Verri A">A Verri</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Zhang, Y" uniqKey="Zhang Y">Y Zhang</name>
</author>
<author>
<name sortKey="Hong, D" uniqKey="Hong D">D Hong</name>
</author>
<author>
<name sortKey="Chung, J" uniqKey="Chung J">J Chung</name>
</author>
<author>
<name sortKey="Velinsky, S" uniqKey="Velinsky S">S Velinsky</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Soria, Cm" uniqKey="Soria C">CM Soria</name>
</author>
<author>
<name sortKey="Carelli, R" uniqKey="Carelli R">R Carelli</name>
</author>
<author>
<name sortKey="Sarcinelli Filho, M" uniqKey="Sarcinelli Filho M">M Sarcinelli-Filho</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cook, Am" uniqKey="Cook A">AM Cook</name>
</author>
<author>
<name sortKey="Hussey, Sm" uniqKey="Hussey S">SM Hussey</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">ScientificWorldJournal</journal-id>
<journal-id journal-id-type="iso-abbrev">ScientificWorldJournal</journal-id>
<journal-id journal-id-type="publisher-id">TSWJ</journal-id>
<journal-title-group>
<journal-title>The Scientific World Journal</journal-title>
</journal-title-group>
<issn pub-type="epub">1537-744X</issn>
<publisher>
<publisher-name>Hindawi Publishing Corporation</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">24453877</article-id>
<article-id pub-id-type="pmc">3888755</article-id>
<article-id pub-id-type="doi">10.1155/2013/589636</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Research Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Robust Human Machine Interface Based on Head Movements Applied to Assistive Robotics</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Perez</surname>
<given-names>Elisa</given-names>
</name>
<xref ref-type="aff" rid="I1">
<sup>1</sup>
</xref>
<xref ref-type="corresp" rid="cor1">*</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>López</surname>
<given-names>Natalia</given-names>
</name>
<xref ref-type="aff" rid="I1">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Orosco</surname>
<given-names>Eugenio</given-names>
</name>
<xref ref-type="aff" rid="I2">
<sup>2</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Soria</surname>
<given-names>Carlos</given-names>
</name>
<xref ref-type="aff" rid="I2">
<sup>2</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Mut</surname>
<given-names>Vicente</given-names>
</name>
<xref ref-type="aff" rid="I2">
<sup>2</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Freire-Bastos</surname>
<given-names>Teodiano</given-names>
</name>
<xref ref-type="aff" rid="I3">
<sup>3</sup>
</xref>
</contrib>
</contrib-group>
<aff id="I1">
<sup>1</sup>
Gabinete de Tecnología Médica, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan, Argentina</aff>
<aff id="I2">
<sup>2</sup>
Instituto de Automática, Facultad de Ingeniería, Universidad Nacional de San Juan, 5400 San Juan, Argentina</aff>
<aff id="I3">
<sup>3</sup>
Departamento de Engenharia Elétrica, Universidade Federal do Espírito Santo, 29075910 Vitoria, ES, Brazil</aff>
<author-notes>
<corresp id="cor1">*Elisa Perez:
<email>eperez@gateme.unsj.edu.ar</email>
</corresp>
<fn fn-type="other">
<p>Academic Editors: Z. Wang and H.-W. Wu</p>
</fn>
</author-notes>
<pub-date pub-type="collection">
<year>2013</year>
</pub-date>
<pub-date pub-type="epub">
<day>26</day>
<month>12</month>
<year>2013</year>
</pub-date>
<volume>2013</volume>
<elocation-id>589636</elocation-id>
<history>
<date date-type="received">
<day>8</day>
<month>8</month>
<year>2013</year>
</date>
<date date-type="accepted">
<day>2</day>
<month>10</month>
<year>2013</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright © 2013 Elisa Perez et al.</copyright-statement>
<copyright-year>2013</copyright-year>
<license license-type="open-access">
<license-p>This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.</license-p>
</license>
</permissions>
<abstract>
<p>This paper presents an interface that uses two different sensing techniques and combines both results through a fusion process to obtain the minimum-variance estimator of the orientation of the user's head. The sensing techniques of the interface are based on an inertial sensor and artificial vision. The orientation of the user's head is used to steer the navigation of a robotic wheelchair. A control algorithm for an assistive technology system is also presented. The system was evaluated by four individuals with severe motor disabilities, and a quantitative index was developed in order to objectively evaluate the performance. The results obtained are promising, since most users could perform the proposed tasks with the robotic wheelchair.</p>
</abstract>
</article-meta>
</front>
<body>
<sec id="sec1">
<title>1. Introduction</title>
<p>In robotics, navigation is one of the major challenges in control schemes, in both structured and unstructured environments. It involves the robot's ability to determine its own position and plan a path towards a specific location, based on the information available from its sensors, the structure of the environment, and the restrictions imposed by the control scheme. As a special case, assisted navigation is a hybrid scheme in which control is supervised or commanded by a human operator [
<xref ref-type="bibr" rid="B1">1</xref>
]. This area of application grows every year, owing to the prevalence of chronic neurological conditions (such as poststroke conditions) and population ageing.</p>
<p>Assistive robots are used as locomotion devices [
<xref ref-type="bibr" rid="B2">2</xref>
,
<xref ref-type="bibr" rid="B3">3</xref>
] such as robotic wheelchairs or mobile robots, for object transportation, and for other kinds of assistance. Mobile robots and robotic wheelchairs are automatic transportation devices, capable of navigating through a particular environment with some degree of autonomy and performing specific tasks. Robotic wheelchairs are equipped with adaptable interfaces that allow their use as mobility aids for disabled people. These devices attract the attention of many researchers and help people reach greater autonomy in their daily lives.</p>
<p>People with severe motor disabilities (such as quadriplegia, cerebral palsy, or multiple sclerosis) require specific and complex devices to satisfy their needs. Robotic wheelchairs, assistive robots, and human-computer interfaces (HCIs) are alternative tools that improve their quality of life and independence. HCIs are developed by taking advantage of the user's residual capabilities, which are used as inputs to the control scheme.</p>
<p>There is a wide range of biological signals and voluntary commands that can be used in an HCI, such as electroencephalographic (EEG), electromyographic (EMG), and electrooculographic (EOG) signals [
<xref ref-type="bibr" rid="B4">4</xref>
<xref ref-type="bibr" rid="B7">7</xref>
], inertial sensors [
<xref ref-type="bibr" rid="B8">8</xref>
,
<xref ref-type="bibr" rid="B9">9</xref>
], and Vision Based Interfaces (VBIs) [
<xref ref-type="bibr" rid="B10">10</xref>
]. VBIs and inertial measurement units (IMUs) are the preferred interfaces, because they are noninvasive and can be adapted to head movements, which are often the only residual capabilities in severe physical disabilities. For these reasons, considerable effort is devoted to developing better HCIs.</p>
<p>Moreover, the control strategy used to command the wheelchair is a decisive factor in the performance of the overall system, as it ensures velocity control and stability of the wheelchair. Many researchers have developed and reported different control strategies for assistive mobile devices. As an example, the work of Ju et al. [
<xref ref-type="bibr" rid="B10">10</xref>
] describes the design and implementation of a hands-free system for an intelligent wheelchair. An interface based on head gestures is developed by fusing a face detection algorithm with an object tracking algorithm. The control architecture has two control modes, called intelligent control and manual control. However, that work does not present a control law guaranteeing that the wheelchair reaches the desired velocities. In [
<xref ref-type="bibr" rid="B11">11</xref>
] a robotic wheelchair for cognitively disabled children is presented. Three HCIs are provided: (i) one based on speech recognition, (ii) a motion interpreter, and (iii) one based on visual feedback. The results are very promising, since the children were able to guide the wheelchair from the very first trial. However, none of the above works provides a control system that ensures that the wheelchair achieves the desired velocity. Furthermore, such a system is not suitable for people with severe brain and spinal injuries who lack recognizable speech and haptic sensing abilities.</p>
<p>The main objective of this work is to develop an alternative mobility and assistance tool for people with severe motor disabilities. We propose a system designed to command the navigation of a robotic wheelchair. The system combines the head-orientation information provided by a VBI and an inertial sensor; these variables are translated into reference signals for the robotic wheelchair. The proposed system includes the HCI, which allows the user to control the assistive device, and control algorithms based on the kinematic and dynamic models of the robotic wheelchair that regulate the linear and angular velocities. By using specific control algorithms, navigation becomes safer for the user. In fact, the approach of a complete assistance system is an improvement in the area, because many works in the literature focus on only one of two main areas: the interface with the user [
<xref ref-type="bibr" rid="B12">12</xref>
<xref ref-type="bibr" rid="B15">15</xref>
] or the control system for autonomous navigation [
<xref ref-type="bibr" rid="B16">16</xref>
<xref ref-type="bibr" rid="B18">18</xref>
]. Works of the first kind present different HCIs without ensuring comfortable and safe navigation for the user, because the systems do not include control algorithms that avoid abrupt changes of velocity. On the other hand, papers that emphasize autonomous navigation do not take the user's intention regarding the chosen path into account, undermining the sense of controllability of the wheelchair. For these reasons, this paper proposes a complete system, including an HCI that is robust against measurement outliers and a control scheme designed for wheelchair navigation that considers its kinematic and dynamic models.</p>
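As a generic illustration of the kinematic side only, the unicycle model commonly used for differential-drive wheelchairs can be sketched as follows; this is not the paper's actual control law or dynamic model, and the velocities and sampling time are arbitrary choices:

```python
import math

def unicycle_step(x, y, theta, v, omega, dt):
    """One Euler-integration step of the unicycle kinematic model:
    x' = v*cos(theta), y' = v*sin(theta), theta' = omega."""
    x_next = x + v * math.cos(theta) * dt
    y_next = y + v * math.sin(theta) * dt
    theta_next = theta + omega * dt
    return x_next, y_next, theta_next

# Constant forward velocity and no rotation: the chair moves straight ahead.
x, y, theta = 0.0, 0.0, 0.0
for _ in range(100):                     # simulate 1 s at dt = 0.01 s
    x, y, theta = unicycle_step(x, y, theta, v=0.5, omega=0.0, dt=0.01)
# x is approximately 0.5 m; y and theta remain 0
```

The control scheme referred to above would additionally account for the dynamic model (masses, inertias, motor dynamics), which this kinematic sketch omits.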
<p>Moreover, the robustness of the HCI is also improved, because two different interfaces (VBI and IMU) determine the rotation angles of the user's head and their information is combined into a single estimate with lower sensing error. This estimation is carried out by a decentralized Kalman filter [
<xref ref-type="bibr" rid="B19">19</xref>
]. Luminosity variations in the VBI, together with uncertainties and outliers in the IMU sensors, constitute the most common problems of such HCIs. In this approach, the fusion algorithm provides a single angle that is translated into velocity references for the robotic wheelchair. The results show adequate performance and smoother path tracking.</p>
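The fusion step can be sketched as inverse-variance weighting, the minimum-variance combination of two independent estimates of the same angle (the static form of the Kalman fusion cited above); the angle and variance values below are illustrative assumptions, not the paper's:

```python
def fuse_angles(gamma_vbi, var_vbi, gamma_imu, var_imu):
    """Minimum-variance fusion of two independent estimates of the same
    angle: each estimate is weighted by the inverse of its variance."""
    w_vbi = 1.0 / var_vbi
    w_imu = 1.0 / var_imu
    gamma = (w_vbi * gamma_vbi + w_imu * gamma_imu) / (w_vbi + w_imu)
    var = 1.0 / (w_vbi + w_imu)   # always smaller than either input variance
    return gamma, var

# Vision reports 10 deg (noisier), the IMU reports 12 deg (more reliable):
angle, var = fuse_angles(10.0, 4.0, 12.0, 1.0)
# the fused angle lies closer to the IMU estimate, with reduced variance
```

The decentralized Kalman filter used in the paper performs this combination recursively over time, but the weighting principle is the same.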
<p>In the special case of disabilities, there is a wide range of tools to assess the need for assistive technologies, the level of disability or functional capabilities of the user, and the cognitive state of the patient. However, there is a lack of criteria for the quantitative evaluation of the performance of assistive devices. In this paper, an index was developed in order to objectively evaluate the performance of the system in experimental sessions with a group of volunteers with severe physical disabilities. This index constitutes a contribution to the qualitative and quantitative evaluation of results in assistive technologies.</p>
<p>The paper is organized as follows. In
<xref ref-type="sec" rid="sec2">Section 2</xref>
the system is described, including the image processing algorithm, the inertial sensor, and the fusion and control algorithms, as well as the proposed evaluation index.
<xref ref-type="sec" rid="sec3"> Section 3</xref>
presents the experimental results, and finally in
<xref ref-type="sec" rid="sec4">Section 4</xref>
we discuss the conclusions of this work.</p>
</sec>
<sec id="sec2">
<title>2. Materials and Methods</title>
<sec id="sec2.1">
<title>2.1. System Overview</title>
<p>The proposed system has three main parts: a VBI interface, an IMU interface, and fusion and control algorithms of the wheelchair (
<xref ref-type="fig" rid="fig1">Figure 1</xref>
). More details about the wheelchair used can be found in [
<xref ref-type="bibr" rid="B5">5</xref>
,
<xref ref-type="bibr" rid="B18">18</xref>
,
<xref ref-type="bibr" rid="B20">20</xref>
].</p>
<p>The interfaces are noninvasive and low cost, since only a webcam (resolution: 320 × 240 pixels; frame rate: 10 fps) and an IMU sensor are used. Both HCIs estimate the rotation angles of the user's head, because they are designed for people with severe disabilities who cannot move their hands or fingers.</p>
<p>Then, the values of the angles obtained by these two techniques are combined in order to estimate the head's orientation via the Kalman fusion algorithm. It is important to note that the HCI obtains two parameters from the user's head, namely, the head orientation angles
<italic>α</italic>
and
<italic>γ</italic>
in space, relative to the
<italic>X</italic>
and
<italic>Z</italic>
axes, respectively (
<xref ref-type="fig" rid="fig1">Figure 1</xref>
). Angle
<italic>γ</italic>
is estimated by both techniques and the results are combined in the fusion process that improves the estimation. This angle is used to generate angular velocities to command the wheelchair. The angle
<italic>α</italic>
is estimated only by the accelerometer and is used in the calculation of the linear velocity command.
<xref ref-type="fig" rid="fig1">Figure 1</xref>
shows the overall system and the reference system of the rotation angles of the head.</p>
</sec>
<sec id="sec2.2">
<title>2.2. Vision Based Interface</title>
<p>Artificial vision is used to estimate head movements through face detection [
<xref ref-type="bibr" rid="B21">21</xref>
]. One family of face detection methods is feature based: it extracts image features and tracks their movement across subsequent images. The most commonly used image features are based on regions, skin colour, contours, and landmarks. In this work, skin colour was chosen for detecting and tracking the face, using the YCbCr colour space, because skin colour is grouped in a compact region of this space [
<xref ref-type="bibr" rid="B21">21</xref>
<xref ref-type="bibr" rid="B23">23</xref>
].</p>
<p>The first step of the detection process is the compensation of the illumination. Some authors [
<xref ref-type="bibr" rid="B22">22</xref>
<xref ref-type="bibr" rid="B25">25</xref>
] detect skin colour with different techniques and propose different preprocessing algorithms for obtaining a stable image under illumination changes. In this work, a light compensation technique is employed: histogram equalization of the input image. This method improves the image quality by increasing the dynamic range of the pixels and enhancing the image contrast.</p>
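To illustrate this step, the equalization can be sketched as follows; this is a minimal Python sketch for an 8-bit grayscale image stored as a list of rows (the paper does not specify an implementation, so all names are illustrative):

```python
def equalize_histogram(image, levels=256):
    """Histogram equalization for an 8-bit grayscale image (list of rows)."""
    flat = [p for row in image for p in row]
    n = len(flat)
    # Histogram and cumulative distribution function
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    # Map each grey level so the output spans the full dynamic range
    lut = [round((c - cdf_min) / max(n - cdf_min, 1) * (levels - 1)) for c in cdf]
    return [[lut[p] for p in row] for row in image]
```

A low-contrast image whose pixels occupy only a few grey levels is stretched so that its values span the whole 0-255 range, which is the contrast enhancement described above.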
<p>The second step is the detection of the skin by segmentation in the YCbCr colour space. The original image is first transformed from RGB to YCbCr space by
<disp-formula id="EEq1">
<label>(1)</label>
<mml:math id="M1">
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mtext>Y</mml:mtext>
<mml:mo>=</mml:mo>
<mml:mn mathvariant="normal">0.3</mml:mn>
<mml:mtext>R</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mn mathvariant="normal">0.6</mml:mn>
<mml:mtext>G</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mn mathvariant="normal">0.1</mml:mn>
<mml:mtext>B</mml:mtext>
<mml:mo>,</mml:mo>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mtext>Cr</mml:mtext>
<mml:mo>=</mml:mo>
<mml:mn mathvariant="normal">0.5</mml:mn>
<mml:mo>+</mml:mo>
<mml:mn mathvariant="normal">0.4375</mml:mn>
<mml:mtext>R</mml:mtext>
<mml:mo></mml:mo>
<mml:mn mathvariant="normal">0.375</mml:mn>
<mml:mtext>G</mml:mtext>
<mml:mo></mml:mo>
<mml:mn mathvariant="normal">0.0625</mml:mn>
<mml:mtext>B</mml:mtext>
<mml:mo>,</mml:mo>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mtext>Cb</mml:mtext>
<mml:mo>=</mml:mo>
<mml:mn mathvariant="normal">0.5</mml:mn>
<mml:mo></mml:mo>
<mml:mn mathvariant="normal">0.15</mml:mn>
<mml:mtext>R</mml:mtext>
<mml:mo></mml:mo>
<mml:mn mathvariant="normal">0.3</mml:mn>
<mml:mtext>G</mml:mtext>
<mml:mo>+</mml:mo>
<mml:mn mathvariant="normal">0.45</mml:mn>
<mml:mtext>B</mml:mtext>
<mml:mo>.</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
</p>
<p>The YCbCr space is selected because its components are less sensitive to different skin tones [
<xref ref-type="bibr" rid="B23">23</xref>
]. Segmentation is performed by thresholding Cr and Cb components [
<xref ref-type="bibr" rid="B27">26</xref>
]. The threshold values were empirically selected as 130 < Cr < 170 and 70 < Cb < 127.</p>
<p>The third step is to determine the largest connected skin region, in order to eliminate objects in the background. The final step is to fit an ellipse that localises the face, whose centre matches the mass centre of the segmented image. This ellipse defines the region of interest (ROI) for extracting the facial features in the luminance component Y of the YCbCr space (
<xref ref-type="fig" rid="fig2">Figure 2</xref>
).</p>
<p>Different techniques for obtaining facial features have been presented in recent years, each with its own advantages and disadvantages. In this work we integrate two methods for facial feature extraction and tracking in order to improve performance: (a) the classic K-means algorithm and (b) classic normalized correlation (NC). The K-means technique is used to determine the centre of a set of feature points, while NC is used to identify the eye region based on a template of the eyes.</p>
<p>The proposed algorithm can be subdivided into three well-defined steps: (1) extraction of feature points and computation of the centroid of the facial features in the image; (2) normalized correlation; (3) tracking of the features by combining the results of the previous steps.</p>
<p>
<italic>(1) Extraction of the Characteristic Points</italic>
. The facial features shown in
<xref ref-type="fig" rid="fig3">Figure 3</xref>
present stronger and brighter contours than the surrounding regions. This characteristic allows the use of the characteristic-points method [
<xref ref-type="bibr" rid="B28">27</xref>
], a fast and simple method for detection and localization. The selected feature points correspond to contours or corners.</p>
<p>Considering a point
<italic>p</italic>
and a surrounding region of 5 × 5 pixels, denoted
<italic>Q</italic>
, a matrix
<bold>C</bold>
that represents the gradients in the region can be determined as
<disp-formula id="EEq2">
<label>(2)</label>
<mml:math id="M2">
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mi mathvariant="bold">C</mml:mi>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:munder>
<mml:mstyle displaystyle="true">
<mml:mo stretchy="false"></mml:mo>
</mml:mstyle>
<mml:mrow>
<mml:mi>Q</mml:mi>
</mml:mrow>
</mml:munder>
<mml:mrow>
<mml:msubsup>
<mml:mrow>
<mml:mi>E</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>x</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msubsup>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:munder>
<mml:mstyle displaystyle="true">
<mml:mo stretchy="false"></mml:mo>
</mml:mstyle>
<mml:mrow>
<mml:mi>Q</mml:mi>
</mml:mrow>
</mml:munder>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>E</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>x</mml:mi>
</mml:mrow>
</mml:msub>
<mml:msub>
<mml:mrow>
<mml:mi>E</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>y</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:munder>
<mml:mstyle displaystyle="true">
<mml:mo stretchy="false"></mml:mo>
</mml:mstyle>
<mml:mrow>
<mml:mi>Q</mml:mi>
</mml:mrow>
</mml:munder>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>E</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>x</mml:mi>
</mml:mrow>
</mml:msub>
<mml:msub>
<mml:mrow>
<mml:mi>E</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>y</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:munder>
<mml:mstyle displaystyle="true">
<mml:mo stretchy="false"></mml:mo>
</mml:mstyle>
<mml:mrow>
<mml:mi>Q</mml:mi>
</mml:mrow>
</mml:munder>
<mml:mrow>
<mml:msubsup>
<mml:mrow>
<mml:mi>E</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>y</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msubsup>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>,</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
where
<italic>E</italic>
<sub>
<italic>x</italic>
</sub>
and
<italic>E</italic>
<sub>
<italic>y</italic>
</sub>
are the gradients of each point in the region
<italic>Q</italic>
along the
<italic>x</italic>
and
<italic>y</italic>
axes, respectively. As
<bold>C</bold>
is symmetric, it can be diagonalized as
<disp-formula id="EEq3">
<label>(3)</label>
<mml:math id="M3">
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mi mathvariant="bold">C</mml:mi>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>λ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>λ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>,</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
where  
<italic>λ</italic>
<sub>1</sub>
and
<italic>λ</italic>
<sub>2</sub>
are the eigenvalues of the matrix
<bold>C</bold>
. Through the geometric interpretation of the eigenvalues of
<bold>C</bold>
, it can be determined whether a pixel represents a corner. Since a corner results from two strong contours, pixels whose smaller eigenvalue exceeds a threshold are taken as corners. In this way, it is possible to obtain the characteristic points of the image, which correspond to the regions of the face with strong contours (eyes, mouth, eyebrows, etc.).</p>
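A sketch of this criterion for a single 5 × 5 window, using central differences for the gradients and the closed-form eigenvalues of the symmetric 2 × 2 matrix C (the gradient operator and any threshold applied afterwards are illustrative assumptions):

```python
import math

def min_eigenvalue(window):
    """Smaller eigenvalue of the gradient matrix C for one 5x5 window
    (list of rows of intensities); a large value indicates a corner."""
    sxx = sxy = syy = 0.0
    for i in range(1, len(window) - 1):
        for j in range(1, len(window[0]) - 1):
            ex = (window[i][j + 1] - window[i][j - 1]) / 2.0  # central differences
            ey = (window[i + 1][j] - window[i - 1][j]) / 2.0
            sxx += ex * ex
            sxy += ex * ey
            syy += ey * ey
    # Closed-form eigenvalues of the symmetric matrix [[sxx, sxy], [sxy, syy]]
    trace, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(trace * trace / 4.0 - det, 0.0))
    return trace / 2.0 - disc
```

A flat window yields a value near zero, while a window containing two strong contours (a corner) yields a large value, matching the interpretation given above.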
<p>Locating the facial features requires computing the mass centres of the two regions of interest (the eyes). These mass centres are obtained by clustering the feature points and discarding points that are not associated with facial feature regions. For this purpose, a simple and efficient K-means algorithm is used.</p>
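A minimal two-cluster K-means sketch for grouping the feature points into the two eye regions (the initialization and iteration count are illustrative choices, not the paper's):

```python
def kmeans_2(points, iters=10):
    """Cluster 2-D feature points into two groups (the two eye regions)
    with plain K-means; returns the two centroids."""
    c = [points[0], points[-1]]  # illustrative initialization
    for _ in range(iters):
        groups = ([], [])
        for p in points:
            # Assign each point to the nearest centroid (squared distance)
            d = [(p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 for q in c]
            groups[d.index(min(d))].append(p)
        # Recompute each centroid as the mean of its assigned points
        c = [(sum(x for x, _ in g) / len(g), sum(y for _, y in g) / len(g))
             if g else c[i] for i, g in enumerate(groups)]
    return c
```

The returned centroids play the role of the mass centres of the two regions of interest.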
<p>
<italic>(2) Normalized Correlation</italic>
. The second step refines the feature extraction through a correlation process. For this purpose, a 50 × 36 pixel subimage, taken from a known frontal face image, is compared with the eye region of the luminance image using fast correlation to identify the eye location.</p>
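The zero-mean normalized correlation score used to compare the template with a candidate eye region can be sketched as follows (a minimal version; the actual fast correlation implementation is not specified in the paper):

```python
import math

def normalized_correlation(patch, template):
    """Zero-mean normalized cross-correlation between two equally sized
    grayscale patches (lists of rows); returns a score in [-1, 1]."""
    a = [p for row in patch for p in row]
    b = [p for row in template for p in row]
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0
```

Sliding the 50 × 36 template over the ROI and keeping the location of the highest score identifies the eye region.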
<p>
<italic>(3) Tracking of the Features</italic>
. The average mass centre values of the regions associated with each eye, obtained by the previous techniques, are filtered using a Kalman filter to obtain the location of the eyes. The filter considers a first-order kinematic model whose states correspond to the measurements of the centroids. The values of the covariance matrices of the Kalman filter
<italic>Q</italic>
<sub>Kalman</sub>
and
<italic>R</italic>
<sub>Kalman</sub>
are 1 ×
<italic>I</italic>
<sub>2×2</sub>
and 5 ×
<italic>I</italic>
<sub>2×2</sub>
, respectively. Examples of the image with the correlation and the characteristic points estimated can be observed in
<xref ref-type="fig" rid="fig3">Figure 3</xref>
.</p>
<p>The value of the rotation angle
<italic>γ</italic>
<sub>
<italic>c</italic>
</sub>
is obtained using the eyes centroid as
<disp-formula id="EEq4">
<label>(4)</label>
<mml:math id="M4">
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>γ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>c</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mrow>
<mml:msup>
<mml:mrow>
<mml:mi>tan</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:msup>
</mml:mrow>
<mml:mo></mml:mo>
<mml:mrow>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>y</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi>y</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>x</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi>x</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:mrow>
<mml:mo>,</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
where (
<italic>y</italic>
<sub>1</sub>
,
<italic>x</italic>
<sub>1</sub>
) are the centroid coordinates of the right eye and (
<italic>y</italic>
<sub>2</sub>
,
<italic>x</italic>
<sub>2</sub>
) are the centroid coordinates of the left eye.</p>
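Equation (4) can be computed directly from the two centroids; the sketch below uses atan2 rather than a plain tan⁻¹ to avoid division by zero when the eyes are vertically aligned (an implementation choice, not stated in the paper):

```python
import math

def head_roll_angle(right_eye, left_eye):
    """Rotation angle gamma_c of equation (4), given the (y, x) centroids
    of the right and left eye in image coordinates; result in degrees."""
    (y1, x1), (y2, x2) = right_eye, left_eye
    return math.degrees(math.atan2(y1 - y2, x1 - x2))
```

Level eyes give an angle of zero, and a head tilt shows up as a proportional positive or negative angle.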
</sec>
<sec id="sec2.3">
<title>2.3. IMU Interface</title>
<p>As mentioned in the preceding section, an accelerometer is used to estimate two angles of the head orientation in space. These angles are
<italic>γ</italic>
<sub>
<italic>a</italic>
</sub>
(related to the
<italic>Z</italic>
-axis) and the angle
<italic>α</italic>
<sub>
<italic>a</italic>
</sub>
(related to the
<italic>X</italic>
-axis).</p>
<p>The accelerometer used is the ADXL322j from Analog Devices. It requires a supply voltage between 2.4 and 6.0 V and has low power consumption, with an average current of 340 
<italic>μ</italic>
A in operation; its dimensions are 4 × 4 mm, allowing its use in a discreet and portable assembly. This sensor has two output signals: one for the deviation with respect to the
<italic>X</italic>
-axis and the other for the deviation relative to the
<italic>Z</italic>
-axis, each varying linearly with the inclination of the sensor. The accelerometer and a filter stage are implemented on a small board (
<xref ref-type="fig" rid="fig4">Figure 4</xref>
). The microcontroller used is a PIC16F876A manufactured by Microchip. It reads the sensed data and transmits them to an OEMSPA311i Bluetooth communication module (ConnecBlue). The data are transmitted via the Bluetooth protocol at 2.4 GHz to the computer, where the fusion with the angle estimated by image processing is carried out. The resulting system is small, simple, low cost, and has low power consumption (
<xref ref-type="fig" rid="fig4">Figure 4</xref>
). The device is mounted on a classical cap or a headband.</p>
<p>Once the inertial sensor is placed on the user's head, the software takes a first reading of the head inclination angles and saves these values, so that all subsequent measurements are expressed relative to them. In this way, an offset correction is carried out when the assistive device is initialized.</p>
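This offset correction can be sketched as follows (a minimal version of the described initialization; names are illustrative):

```python
class OffsetCorrector:
    """Stores the first reading of the head angles and expresses every
    subsequent measurement relative to it, as done at initialization."""
    def __init__(self):
        self.offset = None

    def correct(self, alpha, gamma):
        if self.offset is None:  # first reading defines the neutral pose
            self.offset = (alpha, gamma)
        a0, g0 = self.offset
        return alpha - a0, gamma - g0
```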
</sec>
<sec id="sec2.4">
<title>2.4. Fusion Algorithm</title>
<p>Estimation tools such as the Kalman filter can be used to combine or fuse information from different sources or sensors for hybrid systems. The Decentralized Kalman Filter (DKF) generates the overall signal estimate by minimizing the variances [
<xref ref-type="bibr" rid="B19">19</xref>
]. The DKF can be considered an algebraic equivalent of the Centralized Kalman Filter (CKF). Theoretically, there is no performance loss in the decentralized system; it delivers the same results as the CKF, but the DKF offers a modular concept that allows more sensors to be added to the system as needed, and an easier parallel implementation. In this work, fusion is used to decrease the variance of the angle estimates in an optimal way, improving the interface performance [
<xref ref-type="bibr" rid="B19">19</xref>
].</p>
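The static core of this fusion, an inverse-variance weighted combination of the two angle estimates, can be sketched as follows (the full DKF also propagates a dynamic model, which is omitted in this sketch):

```python
def fuse_estimates(gamma_c, var_c, gamma_a, var_a):
    """Minimum-variance (inverse-variance weighted) fusion of the vision
    and accelerometer angle estimates; returns the fused angle and its
    variance, which is never larger than either input variance."""
    w_c, w_a = 1.0 / var_c, 1.0 / var_a
    gamma = (w_c * gamma_c + w_a * gamma_a) / (w_c + w_a)
    var = 1.0 / (w_c + w_a)
    return gamma, var
```

With equal variances, the fused angle is the plain average and the variance is halved, which is the "less sensing error" effect described above.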
<p>The angular values
<italic>γ</italic>
<sub>
<italic>a</italic>
</sub>
and
<italic>γ</italic>
<sub>
<italic>c</italic>
</sub>
, together with their variances obtained by both techniques, are introduced into a DKF, where the angle fusion is carried out, because the angle
<italic>γ</italic>
<sub>
<italic>a</italic>
</sub>
estimated by the accelerometer presents abrupt changes when the user moves the head. These changes could produce undesired movements of the wheelchair when the user is driving it. On the other hand, this sensing technique is more stable than the image processing technique, because the artificial vision technique depends on the eye centroids, which are not always detected, due to abrupt changes in illumination or wide movements of the head. These problems could produce errors in the calculation of the
<italic>γ</italic>
<sub>
<italic>c</italic>
</sub>
angle. For this reason, the fusion of the two angles is implemented, providing an interface with better performance and stability for navigation of the chair than either technique alone.</p>
</sec>
<sec id="sec2.5">
<title>2.5. Wheelchair Model</title>
<p>The control laws used in this work consider the dynamic model developed by de la Cruz et al. [
<xref ref-type="bibr" rid="B18">18</xref>
]. This model is based on the contributions of [
<xref ref-type="bibr" rid="B29">28</xref>
], considering velocity references as inputs. The model of the wheelchair is presented in
<xref ref-type="fig" rid="fig5">Figure 5</xref>
. This figure depicts the wheelchair with the parameters and variables of interest. In the figure,
<italic>u</italic>
and
<italic>ω</italic>
are the linear and angular velocities of the wheelchair, respectively,
<italic>G</italic>
is the center of mass of the wheelchair,
<italic>c</italic>
is the position of the middle point between the front wheels,
<italic>E</italic>
is the location of the user's center of mass,
<italic>h</italic>
is the point of interest with coordinates
<italic>x</italic>
,
<italic>y</italic>
in the
<italic>XY</italic>
plane,
<italic>ψ</italic>
is the robot orientation, and
<italic>a</italic>
is the distance between the point of interest and the central point of the virtual axis linking the traction wheels.</p>
<p>The mathematical representation of the complete model takes the same form as for mobile robots and is given by the following.</p>
<p>Kinematic Model:
<disp-formula id="EEq5">
<label>(5)</label>
<mml:math id="M5">
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>x</mml:mi>
</mml:mrow>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>y</mml:mi>
</mml:mrow>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>ψ</mml:mi>
</mml:mrow>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mi>cos</mml:mi>
<mml:mo></mml:mo>
<mml:mi>ψ</mml:mi>
</mml:mtd>
<mml:mtd>
<mml:mo></mml:mo>
<mml:mi>a</mml:mi>
<mml:mi>sin</mml:mi>
<mml:mi>ψ</mml:mi>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mi>sin</mml:mi>
<mml:mi>ψ</mml:mi>
</mml:mtd>
<mml:mtd>
<mml:mi>a</mml:mi>
<mml:mi>cos</mml:mi>
<mml:mo></mml:mo>
<mml:mi>ψ</mml:mi>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mi>u</mml:mi>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mi>ω</mml:mi>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>δ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>x</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>δ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>y</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>.</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
</p>
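For simulation purposes, the kinematic model (5) can be integrated with a simple Euler step, neglecting the slip terms δ<italic>x</italic>, δ<italic>y</italic>; the distance <italic>a</italic> and the time step below are illustrative values, not the identified ones:

```python
import math

def kinematic_step(x, y, psi, u, w, a=0.3, dt=0.01):
    """One Euler step of kinematic model (5) for the point of interest h,
    neglecting the slip terms delta_x, delta_y (an assumption);
    a = 0.3 m and dt = 0.01 s are illustrative values."""
    dx = u * math.cos(psi) - a * w * math.sin(psi)
    dy = u * math.sin(psi) + a * w * math.cos(psi)
    return x + dt * dx, y + dt * dy, psi + dt * w
```

Driving straight ahead (ω = 0) from the origin moves the point of interest along the X-axis, as expected from the model.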
<p>Dynamic Model:
<disp-formula id="EEq6">
<label>(6)</label>
<mml:math id="M6">
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">3</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
<mml:msup>
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msup>
</mml:mtd>
<mml:mtd>
<mml:mo></mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">4</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
<mml:mi>u</mml:mi>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mo></mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">5</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
<mml:mi>u</mml:mi>
<mml:mi>ω</mml:mi>
</mml:mtd>
<mml:mtd>
<mml:mo></mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">6</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
<mml:mi>ω</mml:mi>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mfrac>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mfrac>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msubsup>
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
<mml:mrow>
<mml:mi>d</mml:mi>
</mml:mrow>
</mml:msubsup>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msubsup>
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
<mml:mrow>
<mml:mi>d</mml:mi>
</mml:mrow>
</mml:msubsup>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>δ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>δ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>.</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
</p>
<p>The vector of model parameters and the vector of uncertainties parameters are, respectively,
<disp-formula id="eq7">
<label>(7)</label>
<mml:math id="M7">
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mi mathvariant="bold">θ</mml:mi>
<mml:mo>=</mml:mo>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mtd>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mtd>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">3</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mtd>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">4</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mtd>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">5</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mtd>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">6</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mi>T</mml:mi>
</mml:mrow>
</mml:msup>
<mml:mo>,</mml:mo>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mi mathvariant="bold-italic">δ</mml:mi>
<mml:mo>=</mml:mo>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>δ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>x</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mtd>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>δ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>y</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>δ</mml:mi>
</mml:mrow>
<mml:mo></mml:mo>
</mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mtd>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>δ</mml:mi>
</mml:mrow>
<mml:mo></mml:mo>
</mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mi>T</mml:mi>
</mml:mrow>
</mml:msup>
<mml:mo>.</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
</p>
<p>The vector
<bold>
<italic>θ</italic>
</bold>
was obtained through an identification experiment, which can be found in [
<xref ref-type="bibr" rid="B18">18</xref>
], and the values obtained were
<disp-formula id="eq8">
<label>(8)</label>
<mml:math id="M8">
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn mathvariant="normal">0.4087</mml:mn>
<mml:mo>,</mml:mo>
<mml:mo></mml:mo>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn mathvariant="normal">0.1925</mml:mn>
<mml:mo>,</mml:mo>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">3</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn mathvariant="normal">0.0047</mml:mn>
<mml:mo>,</mml:mo>
<mml:mo></mml:mo>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">4</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn mathvariant="normal">1.0042</mml:mn>
<mml:mo>,</mml:mo>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">5</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn mathvariant="normal">0.0044</mml:mn>
<mml:mo>,</mml:mo>
<mml:mo></mml:mo>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">6</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn mathvariant="normal">0.8744</mml:mn>
<mml:mo>.</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
</p>
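With these parameters, the dynamic model (6) can be simulated with an Euler step, neglecting the uncertainty terms δ<italic>u</italic>, δ<italic>ω</italic> (an assumption made for the sketch; the time step is illustrative):

```python
# Identified parameters theta_1..theta_6 from equation (8)
TH = [0.4087, 0.1925, 0.0047, 1.0042, 0.0044, 0.8744]

def dynamic_step(u, w, u_ref, w_ref, dt=0.01, th=TH):
    """One Euler step of dynamic model (6), neglecting the uncertainty
    terms delta_u, delta_w; dt = 0.01 s is an illustrative time step."""
    t1, t2, t3, t4, t5, t6 = th
    du = (t3 / t1) * w * w - (t4 / t1) * u + u_ref / t1
    dw = -(t5 / t2) * u * w - (t6 / t2) * w + w_ref / t2
    return u + dt * du, w + dt * dw
```

From rest, a constant linear velocity reference drives <italic>u</italic> toward the steady-state value <italic>u</italic><sub>ref</sub>/θ<sub>4</sub>, illustrating how the dynamics filter the commanded references.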
</sec>
<sec id="sec2.6">
<title>2.6. Control Scheme</title>
<p>The control system here implemented has two well-differentiated stages of control. The first stage is based on the kinematic model of the wheelchair. In this stage, reference velocities are computed as functions of the orientation angles of the head, obtained from the interface.</p>
<p>These reference velocities are the inputs of the second stage, designed according to the dynamic model that generates the control actions to be sent to the robotic wheelchair.</p>
<sec id="sec2.6.1">
<title>2.6.1. Design of the Kinematic Controller</title>
<p>As was mentioned above, the kinematic controller uses the orientation angles
<italic>γ</italic>
(estimated by DKF) and
<italic>α</italic>
. The angle
<italic>γ</italic>
is used in the angular velocity control law, while both angles
<italic>α</italic>
and
<italic>γ</italic>
are used in the control law for linear velocity.</p>
<p>The nonlinear control law for the angular velocity used is
<disp-formula id="EEq7">
<label>(9)</label>
<mml:math id="M9">
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi>k</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>ω</mml:mi>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mrow>
<mml:mi>tanh</mml:mi>
</mml:mrow>
<mml:mo></mml:mo>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>ψ</mml:mi>
</mml:mrow>
<mml:mo>~</mml:mo>
</mml:mover>
</mml:mrow>
</mml:mrow>
<mml:mo>,</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
where
<italic>k</italic>
<sub>
<italic>ω</italic>
1</sub>
is a positive design constant and
<inline-formula>
<mml:math id="M10">
<mml:mover accent="true">
<mml:mrow>
<mml:mi>ψ</mml:mi>
</mml:mrow>
<mml:mo>~</mml:mo>
</mml:mover>
<mml:mo>=</mml:mo>
<mml:mi>γ</mml:mi>
<mml:mo>-</mml:mo>
<mml:mi>ψ</mml:mi>
</mml:math>
</inline-formula>
is the heading error of the robot. The function tanh(·) is used to prevent saturation of the angular velocity command when large orientation errors exist. The stability analysis of this control law is developed in [
<xref ref-type="bibr" rid="B30">29</xref>
].</p>
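Equation (9) can be sketched directly; the gain <italic>k</italic><sub><italic>ω</italic>1</sub> used below is an illustrative value, not the paper's:

```python
import math

def angular_velocity_ref(gamma, psi, k_w1=1.0):
    """Angular velocity command of equation (9): -k_w1 * tanh(psi_err),
    with psi_err = gamma - psi; k_w1 = 1.0 is an assumed gain."""
    psi_err = gamma - psi
    return -k_w1 * math.tanh(psi_err)
```

The tanh saturation keeps the command bounded by <italic>k</italic><sub><italic>ω</italic>1</sub> even for very large heading errors.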
<p>The control law for the linear velocity was developed in such a way that the velocity is reduced when the robotic wheelchair is manoeuvring; that is, when an orientation error
<inline-formula>
<mml:math id="M11">
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>ψ</mml:mi>
</mml:mrow>
<mml:mo>~</mml:mo>
</mml:mover>
</mml:mrow>
</mml:math>
</inline-formula>
exists. Therefore, the control law for the linear velocity is
<disp-formula id="EEq8">
<label>(10)</label>
<mml:math id="M12">
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi>V</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>max</mml:mi>
<mml:mo></mml:mo>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mrow>
<mml:mi>cos</mml:mi>
<mml:mo></mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>ψ</mml:mi>
</mml:mrow>
<mml:mo>~</mml:mo>
</mml:mover>
</mml:mrow>
</mml:mrow>
<mml:mo></mml:mo>
<mml:mtext>if</mml:mtext>
<mml:mo></mml:mo>
<mml:mo></mml:mo>
<mml:mi>α</mml:mi>
<mml:mo>></mml:mo>
<mml:mn>0</mml:mn>
<mml:mo>,</mml:mo>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mn mathvariant="normal">0</mml:mn>
<mml:mo></mml:mo>
<mml:mtext>if</mml:mtext>
<mml:mo></mml:mo>
<mml:mo></mml:mo>
<mml:mi>α</mml:mi>
<mml:mo><</mml:mo>
<mml:mn>0</mml:mn>
<mml:mo>.</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
</p>
<p>This way, the maximum linear velocity is
<italic>u</italic>
<sub>ref</sub>
=
<italic>V</italic>
<sub>max⁡</sub>
. The maximum velocity
<italic>V</italic>
<sub>max⁡</sub>
should be defined taking into account both the physical limits of the wheelchair (avoiding actuator saturation) and the comfort and safety of the user.</p>
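The two kinematic control laws (9) and (10) can be sketched together. The following is a minimal illustration, not the authors' implementation: the gain value, the sign of the tanh term (chosen here so that the heading is driven toward γ), and the default V_max of 70 mm/s are assumptions.

```python
import math

def kinematic_controller(gamma, alpha, psi, k_w1=1.0, v_max=0.07):
    """Sketch of the kinematic control laws (9) and (10).

    gamma : fused head pan angle (rad), taken as the desired heading
    alpha : head pitch angle (rad); alpha > 0 commands forward motion
    psi   : current wheelchair heading (rad)
    k_w1, v_max : assumed gain and maximum linear velocity (m/s)
    """
    psi_err = gamma - psi                       # heading error (psi tilde)
    # (9): tanh saturates the angular command for large heading errors
    w_ref = k_w1 * math.tanh(psi_err)
    # (10): linear velocity is reduced while manoeuvring, zero when alpha < 0
    u_ref = v_max * math.cos(psi_err) if alpha > 0 else 0.0
    return u_ref, w_ref
```

With a zero heading error and a positive α the chair moves straight at V_max; as the error grows, cos(ψ̃) shrinks the linear command while tanh(ψ̃) bounds the turn command.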
</sec>
<sec id="sec2.6.2">
<title>2.6.2. Design of the Dynamic Controller</title>
<p>This controller compensates for the wheelchair dynamics, improving the performance of the proposed system. It receives the velocity references from the kinematic controller and generates the linear and angular velocities sent to the wheelchair.</p>
<p>The dynamic controller is based on the nominal dynamics of the wheelchair, which represent the estimated mean dynamics, disregarding uncertainties. These nominal dynamics can be represented as
<disp-formula id="EEq9">
<label>(11)</label>
<mml:math id="M13">
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">3</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
<mml:msup>
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msup>
<mml:mo></mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">4</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
<mml:mi>u</mml:mi>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mo></mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">5</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
<mml:mi>u</mml:mi>
<mml:mi>ω</mml:mi>
<mml:mo></mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">6</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
<mml:mi>ω</mml:mi>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mfrac>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mfrac>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msubsup>
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
<mml:mrow>
<mml:mi>d</mml:mi>
</mml:mrow>
</mml:msubsup>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msubsup>
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
<mml:mrow>
<mml:mi>d</mml:mi>
</mml:mrow>
</mml:msubsup>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>.</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
</p>
<p>From (
<xref ref-type="disp-formula" rid="EEq9">11</xref>
) and without considering the uncertainties, the inverse dynamics of the robotic wheelchair can be parameterized as follows:
<disp-formula id="EEq10">
<label>(12)</label>
<mml:math id="M14">
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msubsup>
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
<mml:mrow>
<mml:mi>d</mml:mi>
</mml:mrow>
</mml:msubsup>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msubsup>
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
<mml:mrow>
<mml:mi>d</mml:mi>
</mml:mrow>
</mml:msubsup>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mo></mml:mo>
<mml:msup>
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msup>
</mml:mtd>
<mml:mtd>
<mml:mi>u</mml:mi>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mi>u</mml:mi>
<mml:mi>ω</mml:mi>
</mml:mtd>
<mml:mtd>
<mml:mi>ω</mml:mi>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mi mathvariant="bold">θ</mml:mi>
<mml:mo>,</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
which can be rewritten as
<disp-formula id="EEq11">
<label>(13)</label>
<mml:math id="M15">
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msubsup>
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
<mml:mrow>
<mml:mi>d</mml:mi>
</mml:mrow>
</mml:msubsup>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msubsup>
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
<mml:mrow>
<mml:mi>d</mml:mi>
</mml:mrow>
</mml:msubsup>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mo></mml:mo>
<mml:msup>
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msup>
</mml:mtd>
<mml:mtd>
<mml:mi>u</mml:mi>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mi>u</mml:mi>
<mml:mi>ω</mml:mi>
</mml:mtd>
<mml:mtd>
<mml:mi>ω</mml:mi>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mi mathvariant="bold">θ</mml:mi>
<mml:mo>.</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
The proposed inverse dynamics control law is</p>
<p>
<disp-formula id="EEq12a">
<label>(14a)</label>
<mml:math id="M16">
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msubsup>
<mml:mrow>
<mml:mi>ν</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
<mml:mrow>
<mml:mi>d</mml:mi>
</mml:mrow>
</mml:msubsup>
<mml:mo>=</mml:mo>
<mml:mi mathvariant="bold">G</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>u</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>ω</mml:mi>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mi mathvariant="bold">θ</mml:mi>
<mml:mo>,</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
where
<disp-formula>
<mml:math id="M17">
<mml:mtable>
<mml:mlabeledtr id="EEq12b">
<mml:mtd>
<mml:mtext>(14b)</mml:mtext>
</mml:mtd>
<mml:mtd>
<mml:mi mathvariant="bold">G</mml:mi>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>σ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mo></mml:mo>
<mml:msup>
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msup>
</mml:mtd>
<mml:mtd>
<mml:mi>u</mml:mi>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>σ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mn mathvariant="normal">0</mml:mn>
</mml:mtd>
<mml:mtd>
<mml:mi>u</mml:mi>
<mml:mi>ω</mml:mi>
</mml:mtd>
<mml:mtd>
<mml:mi>ω</mml:mi>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>,</mml:mo>
</mml:mtd>
</mml:mlabeledtr>
<mml:mlabeledtr id="EEq12c">
<mml:mtd rowspan="2">
<mml:mtext>(14c)</mml:mtext>
</mml:mtd>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>σ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi>k</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>u</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:mi>u</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>,</mml:mo>
</mml:mtd>
</mml:mlabeledtr>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mrow>
<mml:mi>σ</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mn mathvariant="normal">2</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mi>k</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mtext>ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:mi>ω</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>.</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
</p>
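As a worked sketch of the inverse dynamics law above: the regressor matrix G of (14b) multiplies the parameter vector θ, with σ<sub>1</sub> and σ<sub>2</sub> computed as in (14c). The gains and the θ values used in any test are illustrative assumptions, not identified wheelchair parameters.

```python
import numpy as np

def dynamic_compensation(u, w, u_ref, w_ref, du_ref, dw_ref,
                         theta, k_u=0.5, k_w=0.5):
    """Sketch of the inverse dynamics control law (14a)-(14c).

    u, w           : measured linear and angular velocities
    u_ref, w_ref   : references from the kinematic controller
    du_ref, dw_ref : their time derivatives
    theta          : 6-vector of identified dynamic parameters (assumed known)
    Returns [u_ref^d, w_ref^d], the commands sent to the wheelchair.
    """
    s1 = du_ref + k_u * (u_ref - u)   # sigma_1, (14c)
    s2 = dw_ref + k_w * (w_ref - w)   # sigma_2, (14c)
    # Regressor G of (14b); the w**2 and u*w entries mirror the
    # velocity-dependent terms of the nominal model (11).
    G = np.array([[s1,  0.0, -w**2, u,   0.0,   0.0],
                  [0.0, s2,   0.0,  0.0, u * w, w  ]])
    return G @ theta                  # (14a)
```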
</sec>
</sec>
<sec id="sec2.7">
<title>2.7. Experimental Protocol and Evaluation Index</title>
<p>The performance evaluation of the proposed system follows the Human Activity Assistive Technology (HAAT) model [
<xref ref-type="bibr" rid="B31">30</xref>
]. According to this model, the system to be evaluated comprises not only the assistive device but also the user, the activity carried out by the user, and the context in which the activity takes place. Therefore, the system is effective if it helps the user achieve the objectives of the proposed tasks.</p>
<p>The objective stated for this assistive technology system is that an individual with motor disabilities can drive the wheelchair to a precise location. The evaluation of the activity is carried out in two stages. In the first stage the user becomes familiar with the human-computer interface and the navigation of the wheelchair. In this training stage the user performs free navigation, no longer than three minutes, without any predetermined task. Each user completes four or five free navigations. This first stage is useful not only for the user's training but also for establishing the maximum wheelchair velocity, taking into account the comfort and safety of each user.</p>
<p>The second stage consists of reaching a final position in the structured environment while avoiding static obstacles. The described path is subdivided into three steps:
<italic>step 1</italic>
, the path from the beginning to the obstacles,
<italic>step 2</italic>
, avoiding the obstacles without colliding, and
<italic>step 3</italic>
, reaching the final destination. Each user carries out this experiment five times, and the information extracted comprises the time taken to perform the task and which stages the user could complete. Once the proposed task is finished, the user answers a questionnaire, so as to obtain his or her opinion about the assistive technology.</p>
<p>After the experiments with the wheelchair are completed, and with the aim of obtaining a quantitative assessment of each experiment, we propose the performance index Λ, such that 0 < Λ < 1. This index is calculated as
<disp-formula id="EEq13">
<label>(15)</label>
<mml:math id="M18">
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mi mathvariant="normal">Λ</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:mi>n</mml:mi>
<mml:mo>+</mml:mo>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:mfrac>
<mml:mrow>
<mml:munderover>
<mml:mstyle displaystyle="true">
<mml:mo stretchy="false"></mml:mo>
</mml:mstyle>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:mi>n</mml:mi>
</mml:mrow>
</mml:munderover>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mi>g</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mi>i</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
<mml:mrow>
<mml:mi>n</mml:mi>
<mml:mo>+</mml:mo>
<mml:mn mathvariant="normal">1</mml:mn>
</mml:mrow>
</mml:mfrac>
<mml:msup>
<mml:mrow>
<mml:mi>e</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mi>  </mml:mi>
<mml:mrow>
<mml:mrow>
<mml:mi>t</mml:mi>
</mml:mrow>
<mml:mo>/</mml:mo>
<mml:mrow>
<mml:mi>h</mml:mi>
</mml:mrow>
</mml:mrow>
</mml:mrow>
</mml:msup>
<mml:mo>,</mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:math>
</disp-formula>
where
<italic>n</italic>
is the number of stages in the specific task,
<italic>t</italic>
is the time consumed in seconds,
<italic>h</italic>
= 150 is a constant set as a function of the expected time for the task, and
<italic>g</italic>
<sub>
<italic>i</italic>
</sub>
indicates whether the stage is complete:
<italic>g</italic>
<sub>
<italic>i</italic>
</sub>
= 1 if the stage's objective is reached, and
<italic>g</italic>
<sub>
<italic>i</italic>
</sub>
= 0 otherwise. This index is calculated when at least one stage is completed.</p>
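The index (15) is straightforward to compute; a minimal sketch, using h = 150 as in the experiments:

```python
import math

def performance_index(g, t, h=150.0):
    """Performance index Lambda of (15), bounded in (0, 1).

    g : sequence of stage flags g_i (1 if the stage objective was reached)
    t : task time in seconds
    h : expected-time constant (150 in the experiments)
    """
    n = len(g)
    return sum(g) / (n + 1) + math.exp(-t / h) / (n + 1)
```

For example, completing all three stages in exactly 150 s gives Λ = 3/4 + e⁻¹/4 ≈ 0.84.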
</sec>
</sec>
<sec id="sec3">
<title>3. Results</title>
<p>The results are presented in two sections: the fusion results and the experimental results obtained in several tasks with a group of volunteers. The proposed quantitative index is also applied in this section.</p>
<sec id="sec3.1">
<title>3.1. Fusion Results</title>
<p>The fusion results were evaluated in normal conditions and also under variations in luminosity and abrupt head movements, with the aim of introducing measurement errors and evaluating the performance of the fusion algorithm.</p>
<p>
<xref ref-type="fig" rid="fig6">Figure 6</xref>
shows the time evolution of the angle
<italic>γ</italic>
in the estimated signals and the fusion result in the presence of a measurement outlier. It is important to note the interval between 200 and 300 ms, where abrupt head movements introduce outliers and measurement errors that could translate into an inadequate control signal. The fusion algorithm filtered and smoothed the control signal.</p>
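The fusion used in the paper is a decentralized Kalman filter; the smoothing effect visible in Figure 6 can be illustrated with the simpler static minimum-variance combination of two independent estimates, where each source is weighted by its inverse variance (a stand-in sketch, not the DKF itself):

```python
def fuse_estimates(gamma_vision, var_vision, gamma_imu, var_imu):
    """Minimum-variance fusion of two independent estimates of gamma.

    An outlier reported with a large variance receives a small weight,
    so it barely perturbs the fused estimate -- the smoothing effect
    observed in the fusion results.
    """
    w_v = 1.0 / var_vision
    w_i = 1.0 / var_imu
    gamma = (w_v * gamma_vision + w_i * gamma_imu) / (w_v + w_i)
    var = 1.0 / (w_v + w_i)   # fused variance is below either input
    return gamma, var
```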
</sec>
<sec id="sec3.2">
<title>3.2. Experimental Results</title>
<p>Images of 230 × 240 pixels are captured during the experiments at 10 fps, using a conventional webcam with a focal length of 565 pixels. The maximum velocity for the wheelchair, established during the training stage, was 70 mm/s. As stated previously, the robotic wheelchair used in the experiments was developed by [
<xref ref-type="bibr" rid="B4">4</xref>
,
<xref ref-type="bibr" rid="B20">20</xref>
]. The wheelchair is programmed to move forward when the angle
<italic>α</italic>
is positive (when the head moves ahead), and the wheelchair should stop when the angle
<italic>α</italic>
is negative (when the head moves back), according to the kinematic control law (
<xref ref-type="disp-formula" rid="EEq8">10</xref>
). On the other hand, the wheelchair turns to the right when the angle
<italic>γ</italic>
is positive (head movements to the right) and it turns left when the angle
<italic>γ</italic>
is negative (head movements to the left), also according to the kinematic control law (
<xref ref-type="disp-formula" rid="EEq7">9</xref>
).</p>
<p>The system was evaluated by four individuals with severe motor disabilities. The users or their parents (in the case of minors) signed the informed consent form of the Ethics Committee of the Universidade Federal do Espirito Santo (UFES). Individual A is a fourteen-year-old boy who has cerebral palsy. Individual B is an eight-year-old girl who has motor anomalies due to a tumour. Individual C is an eleven-year-old boy with Duchenne muscular dystrophy. Individual D is a thirty-five-year-old quadriplegic woman. Some pictures of the volunteers using the robotic wheelchair are shown in
<xref ref-type="fig" rid="fig7">Figure 7</xref>
.</p>
<p>The data obtained in the experiments are shown in Tables
<xref ref-type="table" rid="tab1">1</xref>
,
<xref ref-type="table" rid="tab2">2</xref>
,
<xref ref-type="table" rid="tab3">3</xref>
, and
<xref ref-type="table" rid="tab4">4</xref>
. Columns for each stage are filled with a check symbol (
<italic>✓</italic>
) if the user completes the stage or with a cross (
<italic>✗</italic>
) if not. The fifth column shows the total time in the experiment (
<xref ref-type="fig" rid="fig8">Figure 8</xref>
summarizes these times) and the sixth column shows the index Λ.</p>
<p>In these experimental sessions three stages were accomplished (
<italic>n</italic>
= 3). Therefore, if the user completes, for example, only the first stage, the first term of the index Λ will be equal to 0.25, and the second term will provide a value between 0 and 0.25 according to the time taken to carry out the task. Thus, if 0.25 < Λ < 0.5 the user has completed only one stage (Λ being closer to 0.5 when the time spent decreases); if 0.5 < Λ < 0.75 the user has completed two stages (Λ being closer to 0.75 when the time spent decreases); and if 0.75 < Λ < 1 the user has completed three stages (Λ being closer to 1 when the time spent decreases).</p>
<p>The results obtained from the questionnaire can be summarized as follows.
<list list-type="simple">
<list-item>
<label></label>
<p>Individual A: he expressed that commanding the wheelchair is easy and comfortable.</p>
</list-item>
<list-item>
<label></label>
<p>Individual B: although navigating the robotic wheelchair with the interface was easy, she felt unsafe during the first experiments.</p>
</list-item>
<list-item>
<label></label>
<p>Individual C: he found the experiments with the robotic wheelchair difficult. It was hard for him to learn the head movements needed to command the wheelchair.</p>
</list-item>
<list-item>
<label></label>
<p>Individual D: regarding the robotic wheelchair, she expressed that the system is comfortable, reliable, and easy to use.</p>
</list-item>
</list>
</p>
<p>It is important to note that individuals B and C are children and may have been intimidated by the new interface.</p>
<p>In Figures
<xref ref-type="fig" rid="fig9">9</xref>
and
<xref ref-type="fig" rid="fig10">10</xref>
, the results of the fourth experiment of one individual (D in this case) are shown. The evolution of the angle
<italic>γ</italic>
is observed at the top of
<xref ref-type="fig" rid="fig9">Figure 9</xref>
, together with the angular and linear velocity commands for the wheelchair. Finally, the path followed by the wheelchair is shown in
<xref ref-type="fig" rid="fig10">Figure 10</xref>
.</p>
</sec>
</sec>
<sec id="sec4">
<title>4. Conclusions </title>
<p>In this work an assistive technology system for people with severe motor disabilities was presented, as an alternative tool for locomotion and personal assistance. This system has been evaluated by people with severe disabilities, with acceptable performance quantified by the proposed index.</p>
<p>The assistive system uses a combination of VBI and IMU sensors to estimate the pose of the user's head. The pose parameters are combined by a Kalman filter fusion algorithm in order to mitigate uncertainties, outliers, and measurement errors. The fusion process implemented for the
<italic>γ</italic>
angle improves its performance, obtaining a better estimation of this angle. Additionally, it provides redundancy to the system, which increases the safety for the users. These parameters are used as reference inputs for controlling the navigation of a robotic wheelchair.</p>
<p>Several experiments have been performed in indoor environments with people with severe disabilities. Most users expressed through the inquiry that the control of the robotic wheelchair through the movements of the head is easy and intuitive. Moreover, they pointed out that the navigation is smooth and comfortable. This characteristic of the proposed assistive system is due to the kinematic controller along with the dynamic compensation implemented on-board the wheelchair.</p>
<p>From the results obtained in the experiments, it can be seen that all users, in at least three attempts, reached the objective, leading the wheelchair to the established final position while avoiding static obstacles. These results are promising because all users could command the wheelchair by using the interface to generate velocity commands, with little training. The time spent to carry out the task decreases as the user acquires more skill and familiarity with the system, which is an important characteristic in the evaluation of this assistive technology. Since the experiments were carried out in an environment similar to a work office and the illumination was not controlled, it is possible to infer that the developed technology can be used to provide autonomy in the locomotion of disabled people. Regarding the performance of the interface itself, it was observed that all users were able to guide the wheelchair in a way that was smooth and safe for them, without abrupt changes in speed or rotation. This desirable performance was obtained for two reasons: the fusion process implemented for the angle
<italic>γ</italic>
and the implementation of the control algorithm with dynamic compensation.</p>
<p>Finally, a quantitative index of performance was proposed, with the aim of standardizing the evaluation of the assistive technologies.</p>
</sec>
</body>
<back>
<ack>
<title>Conflict of Interests</title>
<p>The authors declare that there is no conflict of interests regarding the publication of this paper.</p>
</ack>
<ref-list>
<ref id="B1">
<label>1</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bruemmer</surname>
<given-names>DJ</given-names>
</name>
<name>
<surname>Few</surname>
<given-names>DA</given-names>
</name>
<name>
<surname>Boring</surname>
<given-names>RL</given-names>
</name>
<name>
<surname>Marble</surname>
<given-names>JL</given-names>
</name>
<name>
<surname>Walton</surname>
<given-names>MC</given-names>
</name>
<name>
<surname>Nielsen</surname>
<given-names>CW</given-names>
</name>
</person-group>
<article-title>Shared understanding for collaborative control</article-title>
<source>
<italic>IEEE Transactions on Systems, Man, and Cybernetics A</italic>
</source>
<year>2005</year>
<volume>35</volume>
<issue>4</issue>
<fpage>494</fpage>
<lpage>504</lpage>
<pub-id pub-id-type="other">2-s2.0-22244436276</pub-id>
</element-citation>
</ref>
<ref id="B2">
<label>2</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Galindo</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Gonzalez</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Fernández-Madrigal</surname>
<given-names>J-A</given-names>
</name>
</person-group>
<article-title>Control architecture for human-robot integration: application to a robotic wheelchair</article-title>
<source>
<italic>IEEE Transactions on Systems, Man, and Cybernetics B</italic>
</source>
<year>2006</year>
<volume>36</volume>
<issue>5</issue>
<fpage>1053</fpage>
<lpage>1067</lpage>
<pub-id pub-id-type="other">2-s2.0-33749401776</pub-id>
</element-citation>
</ref>
<ref id="B3">
<label>3</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Perez</surname>
<given-names>E</given-names>
</name>
<name>
<surname>Soria</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Nasisi</surname>
<given-names>O</given-names>
</name>
<name>
<surname>Freire-Bastos</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Mut</surname>
<given-names>V</given-names>
</name>
</person-group>
<article-title>Robotic wheelchair controlled through a vision­based interface</article-title>
<source>
<italic>Robotica</italic>
</source>
<year>2012</year>
<volume>30</volume>
<issue>5</issue>
<fpage>691</fpage>
<lpage>708</lpage>
</element-citation>
</ref>
<ref id="B4">
<label>4</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Freire-Bastos</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Sarcinelli</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Ferreira</surname>
<given-names>A</given-names>
</name>
<etal></etal>
</person-group>
<person-group person-group-type="editor">
<name>
<surname>Pons</surname>
<given-names>JL</given-names>
</name>
</person-group>
<article-title>Case study: cognitive control of a robotic wheelchair</article-title>
<source>
<italic>Wearable Robots: Biomechatronic Exoskeletons</italic>
</source>
<year>2008</year>
<publisher-name>John Wiley & Sons</publisher-name>
<fpage>315</fpage>
<lpage>319</lpage>
</element-citation>
</ref>
<ref id="B5">
<label>5</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ferreira</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Celeste</surname>
<given-names>WC</given-names>
</name>
<name>
<surname>Cheein</surname>
<given-names>FA</given-names>
</name>
<name>
<surname>Freire-Bastos</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Sarcinelli-Filho</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Carelli</surname>
<given-names>R</given-names>
</name>
</person-group>
<article-title>Human-machine interfaces based on EMG and EEG applied to robotic systems</article-title>
<source>
<italic>Journal of NeuroEngineering and Rehabilitation</italic>
</source>
<year>2008</year>
<volume>5, article 10</volume>
<pub-id pub-id-type="other">2-s2.0-42149171996</pub-id>
</element-citation>
</ref>
<ref id="B6">
<label>6</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Auat Cheein</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Lopez</surname>
<given-names>N</given-names>
</name>
<name>
<surname>Soria</surname>
<given-names>CM</given-names>
</name>
<name>
<surname>di Sciascio</surname>
<given-names>FA</given-names>
</name>
<name>
<surname>Lobo Pereira</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Carelli</surname>
<given-names>R</given-names>
</name>
</person-group>
<article-title>SLAM algorithm applied to robotics assistance for navigation in unknown environments</article-title>
<source>
<italic>Journal of NeuroEngineering and Rehabilitation</italic>
</source>
<year>2010</year>
<volume>7</volume>
<issue>1, article 10</issue>
<pub-id pub-id-type="other">2-s2.0-77949474636</pub-id>
</element-citation>
</ref>
<ref id="B7">
<label>7</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Barea</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Boquete</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Mazo</surname>
<given-names>M</given-names>
</name>
<name>
<surname>López</surname>
<given-names>E</given-names>
</name>
</person-group>
<article-title>System for assisted mobility using eye movements based on electrooculography</article-title>
<source>
<italic>IEEE Transactions on Neural Systems and Rehabilitation Engineering</italic>
</source>
<year>2002</year>
<volume>10</volume>
<issue>4</issue>
<fpage>209</fpage>
<lpage>218</lpage>
<pub-id pub-id-type="other">2-s2.0-0036991652</pub-id>
<pub-id pub-id-type="pmid">12611358</pub-id>
</element-citation>
</ref>
<ref id="B8">
<label>8</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Vie Christensen</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Garcia</surname>
<given-names>JC</given-names>
</name>
</person-group>
<source>
<italic>Infrared Non-Contact Head Sensor, for Control of Wheelchair Movements</italic>
</source>
<year>2003</year>
<publisher-name>IOS Press</publisher-name>
</element-citation>
</ref>
<ref id="B9">
<label>9</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Manogna</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Vaishnavi</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Geethanjali</surname>
<given-names>B</given-names>
</name>
</person-group>
<article-title>Head movement based assist system for physically challenged</article-title>
<conf-name>Proceedings of the 4th International Conference on Bioinformatics and Biomedical Engineering (iCBBE '10)</conf-name>
<conf-date>June 2010</conf-date>
<pub-id pub-id-type="other">2-s2.0-77956149942</pub-id>
</element-citation>
</ref>
<ref id="B10">
<label>10</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ju</surname>
<given-names>JS</given-names>
</name>
<name>
<surname>Shin</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Kim</surname>
<given-names>EY</given-names>
</name>
</person-group>
<article-title>Vision based interface system for hands free control of an intelligent wheelchair</article-title>
<source>
<italic>Journal of NeuroEngineering and Rehabilitation</italic>
</source>
<year>2009</year>
<volume>6</volume>
<issue>1, article 33</issue>
<pub-id pub-id-type="other">2-s2.0-69449101233</pub-id>
</element-citation>
</ref>
<ref id="B11">
<label>11</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Montesano</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Minguez</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Alcubierre</surname>
<given-names>JM</given-names>
</name>
<name>
<surname>Montano</surname>
<given-names>L</given-names>
</name>
</person-group>
<article-title>Towards the adaptation of a robotic wheelchair for cognitive disabled children</article-title>
<conf-name>Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '06)</conf-name>
<conf-date>October 2006</conf-date>
<conf-loc>Beijing, China</conf-loc>
<fpage>710</fpage>
<lpage>716</lpage>
<pub-id pub-id-type="other">2-s2.0-34250686180</pub-id>
</element-citation>
</ref>
<ref id="B12">
<label>12</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Bauckhage</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Käster</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Rotenstein</surname>
<given-names>AM</given-names>
</name>
<name>
<surname>Tsotsos</surname>
<given-names>JK</given-names>
</name>
</person-group>
<article-title>Fast learning for customizable head pose recognition in robotic wheelchair control</article-title>
<conf-name>Proceedings of the 7th International Conference on Automatic Face and Gesture Recognition (FGR '06)</conf-name>
<conf-date>April 2006</conf-date>
<conf-loc>Southampton, UK</conf-loc>
<fpage>311</fpage>
<lpage>316</lpage>
<pub-id pub-id-type="other">2-s2.0-33750832536</pub-id>
</element-citation>
</ref>
<ref id="B13">
<label>13</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Kobayashi</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Kinpara</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Shibusawa</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Kuno</surname>
<given-names>Y</given-names>
</name>
</person-group>
<article-title>Robotic wheelchair based on observations of people using integrated sensors</article-title>
<conf-name>Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '09)</conf-name>
<conf-date>October 2009</conf-date>
<conf-loc>St. Louis, Mo, USA</conf-loc>
<fpage>2013</fpage>
<lpage>2018</lpage>
<pub-id pub-id-type="other">2-s2.0-76249103833</pub-id>
</element-citation>
</ref>
<ref id="B14">
<label>14</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Rebsamen</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Burdet</surname>
<given-names>E</given-names>
</name>
<name>
<surname>Guan</surname>
<given-names>C</given-names>
</name>
<etal></etal>
</person-group>
<article-title>Controlling a wheelchair using a BCI with low information transfer rate</article-title>
<conf-name>Proceedings of the 10th IEEE International Conference on Rehabilitation Robotics (ICORR '07)</conf-name>
<conf-date>June 2007</conf-date>
<conf-loc>Noordwijk, The Netherlands</conf-loc>
<fpage>1003</fpage>
<lpage>1008</lpage>
<pub-id pub-id-type="other">2-s2.0-48349132875</pub-id>
</element-citation>
</ref>
<ref id="B15">
<label>15</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Iturrate</surname>
<given-names>I</given-names>
</name>
<name>
<surname>Antelis</surname>
<given-names>JM</given-names>
</name>
<name>
<surname>Kübler</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Minguez</surname>
<given-names>J</given-names>
</name>
</person-group>
<article-title>Non-invasive brain-actuated wheelchair based on a P300 neurophysiological protocol and automated navigation</article-title>
<conf-name>Proceedings of the IEEE International Conference on Robotics and Automation</conf-name>
<conf-date>2009</conf-date>
<conf-loc>Kobe, Japan</conf-loc>
</element-citation>
</ref>
<ref id="B16">
<label>16</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>de la Cruz</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Freire-Bastos</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Cheein</surname>
<given-names>FAA</given-names>
</name>
<name>
<surname>Carelli</surname>
<given-names>R</given-names>
</name>
</person-group>
<article-title>SLAM-based robotic wheelchair navigation system designed for confined spaces</article-title>
<conf-name>Proceedings of the IEEE International Symposium on Industrial Electronics (ISIE '10)</conf-name>
<conf-date>July 2010</conf-date>
<conf-loc>Bari, Italy</conf-loc>
<fpage>2331</fpage>
<lpage>2336</lpage>
<pub-id pub-id-type="other">2-s2.0-78650375234</pub-id>
</element-citation>
</ref>
<ref id="B17">
<label>17</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Zhou</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Teo</surname>
<given-names>CL</given-names>
</name>
<name>
<surname>Burdet</surname>
<given-names>E</given-names>
</name>
</person-group>
<article-title>A nonlinear elastic path controller for a robotic wheelchair</article-title>
<conf-name>Proceedings of the 3rd IEEE Conference on Industrial Electronics and Applications (ICIEA '08)</conf-name>
<conf-date>June 2008</conf-date>
<conf-loc>Singapore</conf-loc>
<fpage>142</fpage>
<lpage>147</lpage>
<pub-id pub-id-type="other">2-s2.0-51949110305</pub-id>
</element-citation>
</ref>
<ref id="B18">
<label>18</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>de la Cruz</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Freire-Bastos</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Carelli</surname>
<given-names>R</given-names>
</name>
</person-group>
<article-title>Adaptive motion control law of a robotic wheelchair</article-title>
<source>
<italic>Control Engineering Practice</italic>
</source>
<year>2011</year>
<volume>19</volume>
<issue>2</issue>
<fpage>113</fpage>
<lpage>125</lpage>
<pub-id pub-id-type="other">2-s2.0-78651472331</pub-id>
</element-citation>
</ref>
<ref id="B19">
<label>19</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Mutambara</surname>
<given-names>AG</given-names>
</name>
</person-group>
<source>
<italic>Decentralized Estimation and Control for Multi-Sensor Systems</italic>
</source>
<year>1998</year>
<publisher-loc>Boca Raton, Fla, USA</publisher-loc>
<publisher-name>CRC Press</publisher-name>
</element-citation>
</ref>
<ref id="B20">
<label>20</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Ferreira</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Cavalieri</surname>
<given-names>DC</given-names>
</name>
<name>
<surname>Silva</surname>
<given-names>RL</given-names>
</name>
<name>
<surname>Freire-Bastos</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Sarcinelli-Filho</surname>
<given-names>M</given-names>
</name>
</person-group>
<article-title>A versatile robotic wheelchair commanded by brain signals or eye blinks</article-title>
<volume>2</volume>
<conf-name>Proceedings of the 1st International Conference on Biomedical Electronics and Devices (BIODEVICES '08)</conf-name>
<conf-date>January 2008</conf-date>
<conf-loc>Funchal, Portugal</conf-loc>
<fpage>62</fpage>
<lpage>67</lpage>
<pub-id pub-id-type="other">2-s2.0-55649095718</pub-id>
</element-citation>
</ref>
<ref id="B21">
<label>21</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Gong</surname>
<given-names>S</given-names>
</name>
<name>
<surname>MacKenna</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Psarrou</surname>
<given-names>A</given-names>
</name>
</person-group>
<source>
<italic>Dynamic Vision from Images to Face Recognition</italic>
</source>
<year>2000</year>
<publisher-loc>London, UK</publisher-loc>
<publisher-name>Imperial College Press</publisher-name>
</element-citation>
</ref>
<ref id="B22">
<label>22</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cho</surname>
<given-names>K-M</given-names>
</name>
<name>
<surname>Jang</surname>
<given-names>J-H</given-names>
</name>
<name>
<surname>Hong</surname>
<given-names>K-S</given-names>
</name>
</person-group>
<article-title>Adaptive skin-color filter</article-title>
<source>
<italic>Pattern Recognition</italic>
</source>
<year>2001</year>
<volume>34</volume>
<issue>5</issue>
<fpage>1067</fpage>
<lpage>1073</lpage>
<pub-id pub-id-type="other">2-s2.0-0035342487</pub-id>
</element-citation>
</ref>
<ref id="B23">
<label>23</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Jones</surname>
<given-names>MJ</given-names>
</name>
<name>
<surname>Rehg</surname>
<given-names>JM</given-names>
</name>
</person-group>
<article-title>Statistical color models with application to skin detection</article-title>
<source>
<italic>International Journal of Computer Vision</italic>
</source>
<year>2002</year>
<volume>46</volume>
<issue>1</issue>
<fpage>81</fpage>
<lpage>96</lpage>
<pub-id pub-id-type="other">2-s2.0-0036165170</pub-id>
</element-citation>
</ref>
<ref id="B24">
<label>24</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hsu</surname>
<given-names>R-L</given-names>
</name>
<name>
<surname>Abdel-Mottaleb</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Jain</surname>
<given-names>AK</given-names>
</name>
</person-group>
<article-title>Face detection in color images</article-title>
<source>
<italic>IEEE Transactions on Pattern Analysis and Machine Intelligence</italic>
</source>
<year>2002</year>
<volume>24</volume>
<issue>5</issue>
<fpage>696</fpage>
<lpage>706</lpage>
<pub-id pub-id-type="other">2-s2.0-0036566509</pub-id>
</element-citation>
</ref>
<ref id="B25">
<label>25</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Berbar</surname>
<given-names>MA</given-names>
</name>
<name>
<surname>Kelash</surname>
<given-names>HM</given-names>
</name>
<name>
<surname>Kandeel</surname>
<given-names>AA</given-names>
</name>
</person-group>
<article-title>Faces and facial features detection in color images</article-title>
<conf-name>Proceedings of the Conference on Geometric Modeling and Imaging New Trends</conf-name>
<conf-date>July 2006</conf-date>
<publisher-name>IEEE</publisher-name>
<fpage>209</fpage>
<lpage>214</lpage>
<pub-id pub-id-type="other">2-s2.0-33947643658</pub-id>
</element-citation>
</ref>
<ref id="B27">
<label>26</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Perez</surname>
<given-names>E</given-names>
</name>
<name>
<surname>Soria</surname>
<given-names>C</given-names>
</name>
<name>
<surname>López</surname>
<given-names>NM</given-names>
</name>
<name>
<surname>Nasisi</surname>
<given-names>O</given-names>
</name>
<name>
<surname>Mut</surname>
<given-names>V</given-names>
</name>
</person-group>
<article-title>Vision based interface applied to assistive robots</article-title>
<source>
<italic>International Journal of Advanced Robotic Systems</italic>
</source>
<year>2013</year>
<volume>10</volume>
</element-citation>
</ref>
<ref id="B28">
<label>27</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Trucco</surname>
<given-names>E</given-names>
</name>
<name>
<surname>Verri</surname>
<given-names>A</given-names>
</name>
</person-group>
<source>
<italic>Introductory Techniques for 3-D Computer Vision</italic>
</source>
<year>1998</year>
<publisher-loc>Upper Saddle River, NJ, USA</publisher-loc>
<publisher-name>Prentice Hall</publisher-name>
</element-citation>
</ref>
<ref id="B29">
<label>28</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Zhang</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Hong</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Chung</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Velinsky</surname>
<given-names>S</given-names>
</name>
</person-group>
<article-title>Dynamic model based robust tracking control of a differentially steered wheeled mobile robot</article-title>
<conf-name>Proceedings of the American Control Conference</conf-name>
<conf-date>1998</conf-date>
<conf-loc>Philadelphia, Pa, USA</conf-loc>
</element-citation>
</ref>
<ref id="B30">
<label>29</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Soria</surname>
<given-names>CM</given-names>
</name>
<name>
<surname>Carelli</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Sarcinelli-Filho</surname>
<given-names>M</given-names>
</name>
</person-group>
<article-title>Using panoramic images and optical flow to avoid obstacles in mobile robot navigation</article-title>
<conf-name>Proceedings of the International Symposium on Industrial Electronics (ISIE '06)</conf-name>
<conf-date>July 2006</conf-date>
<conf-loc>Montreal, Canada</conf-loc>
<fpage>2902</fpage>
<lpage>2907</lpage>
<pub-id pub-id-type="other">2-s2.0-53849099293</pub-id>
</element-citation>
</ref>
<ref id="B31">
<label>30</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Cook</surname>
<given-names>AM</given-names>
</name>
<name>
<surname>Hussey</surname>
<given-names>SM</given-names>
</name>
</person-group>
<source>
<italic>Assistive Technologies: Principles and Practice</italic>
</source>
<year>2002</year>
<publisher-loc>St Louis, Mo, USA</publisher-loc>
<publisher-name>Mosby</publisher-name>
</element-citation>
</ref>
</ref-list>
</back>
<floats-group>
<fig id="fig1" orientation="portrait" position="float">
<label>Figure 1</label>
<caption>
<p>(a) System overview. The DKF fuses the head orientation angles obtained through VBI (
<italic>γ</italic>
<sub>
<italic>c</italic>
</sub>
) and the IMU sensor (
<italic>γ</italic>
<sub>
<italic>a</italic>
</sub>
). The estimated angle
<italic>γ</italic>
is used in the control algorithms. (b) Reference framework associated with the user's head and rotation angles of the head in the 3D space related to the coordinate system.</p>
</caption>
<graphic xlink:href="TSWJ2013-589636.001"></graphic>
</fig>
<fig id="fig2" orientation="portrait" position="float">
<label>Figure 2</label>
<caption>
<p>Vision-based interface. (a) Image obtained by the webcam interface at a resolution of 320 × 240 pixels. (b) Ellipse ROI image obtained after skin detection (in YCbCr space) and the ellipse calculation.</p>
</caption>
<graphic xlink:href="TSWJ2013-589636.002"></graphic>
</fig>
<fig id="fig3" orientation="portrait" position="float">
<label>Figure 3</label>
<caption>
<p>Examples of facial characteristics detection. Three cases of eye detection are shown: (a) centred head, (b) head turned left, and (c) head turned right.</p>
</caption>
<graphic xlink:href="TSWJ2013-589636.003"></graphic>
</fig>
<fig id="fig4" orientation="portrait" position="float">
<label>Figure 4</label>
<caption>
<p>Inertial sensor mounted on a cap or headband. These two accessories were proposed so that users can choose whichever is more comfortable.</p>
</caption>
<graphic xlink:href="TSWJ2013-589636.004"></graphic>
</fig>
<fig id="fig5" orientation="portrait" position="float">
<label>Figure 5</label>
<caption>
<p>Parameters of the dynamic model of the robotic wheelchair. The relevant parameters are:
<italic>u</italic>
, linear velocity;
<italic>ω</italic>
, angular velocity;
<italic>G</italic>
, center of mass;
<italic>c</italic>
, middle point between the front wheels;
<italic>E</italic>
, mass center of the user location;
<italic>h</italic>
: (
<italic>x</italic>
,
<italic>y</italic>
), point of interest;
<italic>ψ</italic>
, robot orientation; and
<italic>a</italic>
, distance between
<italic>h</italic>
and the central point of the virtual axis of the traction wheels.</p>
</caption>
<graphic xlink:href="TSWJ2013-589636.005"></graphic>
</fig>
<fig id="fig6" orientation="portrait" position="float">
<label>Figure 6</label>
<caption>
<p>Fusion algorithm and results under laboratory conditions. The angles estimated from the IMU and the VBI, together with the fused estimate, are tested under nonabrupt and abrupt head movements. Between 200 and 300 ms, a measurement outlier is filtered out by the fusion technique.</p>
</caption>
<graphic xlink:href="TSWJ2013-589636.006"></graphic>
</fig>
<fig id="fig7" orientation="portrait" position="float">
<label>Figure 7</label>
<caption>
<p>Volunteers with motor disabilities while using the proposed assistive system.</p>
</caption>
<graphic xlink:href="TSWJ2013-589636.007"></graphic>
</fig>
<fig id="fig8" orientation="portrait" position="float">
<label>Figure 8</label>
<caption>
<p>Execution time of the experiments. Graphics show the total time ([min:sec]) in experiments 2–4 and the downward trend of the times, denoting the relevance of training. (a) Adults A and D; (b) children B and C.</p>
</caption>
<graphic xlink:href="TSWJ2013-589636.008"></graphic>
</fig>
<fig id="fig9" orientation="portrait" position="float">
<label>Figure 9</label>
<caption>
<p>Time evolution of the linear and angular velocities of the wheelchair. The perturbations in the estimated angle are compensated by the control law.</p>
</caption>
<graphic xlink:href="TSWJ2013-589636.009"></graphic>
</fig>
<fig id="fig10" orientation="portrait" position="float">
<label>Figure 10</label>
<caption>
<p>Path described by the wheelchair commanded by the linear and angular velocities of
<xref ref-type="fig" rid="fig9">Figure 9</xref>
.</p>
</caption>
<graphic xlink:href="TSWJ2013-589636.010"></graphic>
</fig>
<table-wrap id="tab1" orientation="portrait" position="float">
<label>Table 1</label>
<caption>
<p>Results of the experiment. User A.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" rowspan="1" colspan="1">User A</th>
<th align="center" rowspan="1" colspan="1">Stage 1</th>
<th align="center" rowspan="1" colspan="1">Stage 2</th>
<th align="center" rowspan="1" colspan="1">Stage 3</th>
<th align="center" rowspan="1" colspan="1">Time (min:sec)</th>
<th align="center" rowspan="1" colspan="1">Λ</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 1</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">2:30</td>
<td align="center" rowspan="1" colspan="1">0,592</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 2</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">3:00</td>
<td align="center" rowspan="1" colspan="1">0,573</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 3</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">1:25</td>
<td align="center" rowspan="1" colspan="1">0,892</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 4</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">1:45</td>
<td align="center" rowspan="1" colspan="1">0,874</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 5</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">1:00</td>
<td align="center" rowspan="1" colspan="1">0,918</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap id="tab2" orientation="portrait" position="float">
<label>Table 2</label>
<caption>
<p>Results of the experiment. User B.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" rowspan="1" colspan="1">User B</th>
<th align="center" rowspan="1" colspan="1">Stage 1</th>
<th align="center" rowspan="1" colspan="1">Stage 2</th>
<th align="center" rowspan="1" colspan="1">Stage 3</th>
<th align="center" rowspan="1" colspan="1">Time (min:sec)</th>
<th align="center" rowspan="1" colspan="1">Λ</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 1</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1"></td>
<td align="center" rowspan="1" colspan="1"></td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 2</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">3:15</td>
<td align="center" rowspan="1" colspan="1">0,318</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 3</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">2:45</td>
<td align="center" rowspan="1" colspan="1">0,833</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 4</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">3:15</td>
<td align="center" rowspan="1" colspan="1">0,568</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 5</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">2:30</td>
<td align="center" rowspan="1" colspan="1">0,842</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap id="tab3" orientation="portrait" position="float">
<label>Table 3</label>
<caption>
<p>Results of the experiment. User C.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" rowspan="1" colspan="1">User C</th>
<th align="center" rowspan="1" colspan="1">Stage 1</th>
<th align="center" rowspan="1" colspan="1">Stage 2</th>
<th align="center" rowspan="1" colspan="1">Stage 3</th>
<th align="center" rowspan="1" colspan="1">Time (min:sec)</th>
<th align="center" rowspan="1" colspan="1">Λ</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 1</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">3:45</td>
<td align="center" rowspan="1" colspan="1">0,556</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 2</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">3:30</td>
<td align="center" rowspan="1" colspan="1">0,312</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 3</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">1:10</td>
<td align="center" rowspan="1" colspan="1">0,907</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 4</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">3:10</td>
<td align="center" rowspan="1" colspan="1">0,820</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 5</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">1:20</td>
<td align="center" rowspan="1" colspan="1">0,897</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap id="tab4" orientation="portrait" position="float">
<label>Table 4</label>
<caption>
<p>Results of the experiment. User D.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" rowspan="1" colspan="1">User D</th>
<th align="center" rowspan="1" colspan="1">Stage 1</th>
<th align="center" rowspan="1" colspan="1">Stage 2</th>
<th align="center" rowspan="1" colspan="1">Stage 3</th>
<th align="center" rowspan="1" colspan="1">Time (min:sec)</th>
<th align="center" rowspan="1" colspan="1">Λ</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 1</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1"></td>
<td align="center" rowspan="1" colspan="1"></td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 2</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">4:00</td>
<td align="center" rowspan="1" colspan="1">0,550</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 3</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">2:15</td>
<td align="center" rowspan="1" colspan="1">0,852</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 4</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">1:00</td>
<td align="center" rowspan="1" colspan="1">0,918</td>
</tr>
<tr>
<td align="left" rowspan="1" colspan="1">Exp. 5</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">
<italic></italic>
</td>
<td align="center" rowspan="1" colspan="1">1:40</td>
<td align="center" rowspan="1" colspan="1">0,878</td>
</tr>
</tbody>
</table>
</table-wrap>
</floats-group>
</pmc>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Pmc/Curation
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 002461 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd -nk 002461 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Pmc
   |étape=   Curation
   |type=    RBID
   |clé=     PMC:3888755
   |texte=   Robust Human Machine Interface Based on Head Movements Applied to Assistive Robotics
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Curation/RBID.i   -Sk "pubmed:24453877" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024