Exploration server on haptic devices

Warning: this site is under development!
Warning: this site was generated automatically from raw corpora.
The information has therefore not been validated.

A haptic and auditory assistive user interface: helping the blinds on their computer operations.

Internal identifier: 000D03 (PubMed/Corpus); previous: 000D02; next: 000D04

A haptic and auditory assistive user interface: helping the blinds on their computer operations.

Authors: V-Ris Jaijongrak; Itsuo Kumazawa; Surapa Thiemjarus

Source:

RBID : pubmed:22275546

English descriptors

Abstract

In this paper, a study of assistive devices with multi-modal feedback is conducted to evaluate the efficiency of haptic and auditory information for the users' mouse operations. Haptic feedback, generated by a combination of motor-driven wheels, is provided through the haptic mouse. Meanwhile, audio feedback is provided either as synthesized directional speech or as an audio signal. Based on these interfaces, a set of experiments is conducted to compare their efficiencies. The measurement criteria used in these experiments are the distance to the target circle in pixels, the operational time for the task in milliseconds, and the users' opinions, recorded as discrete indices, of the understandability and comfort of each modality of the tested user interfaces. The experimental results show that with the proper feedback modalities for the user, efficiency can be improved either by reducing the operational time or by increasing the accuracy of pointing at the target. Furthermore, the evaluation also takes into account the user's satisfaction with using the device to carry out the predefined cursor-movement task, which is occasionally difficult for the user to understand and interpret. As an example of an application adopting the proposed interface system, a web browser application is implemented and described in this paper.

DOI: 10.1109/ICORR.2011.5975341
PubMed: 22275546

Links to Exploration step

pubmed:22275546

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">A haptic and auditory assistive user interface: helping the blinds on their computer operations.</title>
<author>
<name sortKey="Jaijongrak, V Ris" sort="Jaijongrak, V Ris" uniqKey="Jaijongrak V" first="V-Ris" last="Jaijongrak">V-Ris Jaijongrak</name>
<affiliation>
<nlm:affiliation>Imaging Science and Engineering Laboratory, Tokyo Institute of Technology, Yokohama, Japan. jaijongrak.v.aa@m.titech.ac.jp</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Kumazawa, Itsuo" sort="Kumazawa, Itsuo" uniqKey="Kumazawa I" first="Itsuo" last="Kumazawa">Itsuo Kumazawa</name>
</author>
<author>
<name sortKey="Thiemjarus, Surapa" sort="Thiemjarus, Surapa" uniqKey="Thiemjarus S" first="Surapa" last="Thiemjarus">Surapa Thiemjarus</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="2011">2011</date>
<idno type="doi">10.1109/ICORR.2011.5975341</idno>
<idno type="RBID">pubmed:22275546</idno>
<idno type="pmid">22275546</idno>
<idno type="wicri:Area/PubMed/Corpus">000D03</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">A haptic and auditory assistive user interface: helping the blinds on their computer operations.</title>
<author>
<name sortKey="Jaijongrak, V Ris" sort="Jaijongrak, V Ris" uniqKey="Jaijongrak V" first="V-Ris" last="Jaijongrak">V-Ris Jaijongrak</name>
<affiliation>
<nlm:affiliation>Imaging Science and Engineering Laboratory, Tokyo Institute of Technology, Yokohama, Japan. jaijongrak.v.aa@m.titech.ac.jp</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Kumazawa, Itsuo" sort="Kumazawa, Itsuo" uniqKey="Kumazawa I" first="Itsuo" last="Kumazawa">Itsuo Kumazawa</name>
</author>
<author>
<name sortKey="Thiemjarus, Surapa" sort="Thiemjarus, Surapa" uniqKey="Thiemjarus S" first="Surapa" last="Thiemjarus">Surapa Thiemjarus</name>
</author>
</analytic>
<series>
<title level="j">IEEE ... International Conference on Rehabilitation Robotics : [proceedings]</title>
<idno type="eISSN">1945-7901</idno>
<imprint>
<date when="2011" type="published">2011</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Adult</term>
<term>Blindness (physiopathology)</term>
<term>Computers</term>
<term>Female</term>
<term>Humans</term>
<term>Male</term>
<term>Self-Help Devices</term>
<term>User-Computer Interface</term>
<term>Young Adult</term>
</keywords>
<keywords scheme="MESH" qualifier="physiopathology" xml:lang="en">
<term>Blindness</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Adult</term>
<term>Computers</term>
<term>Female</term>
<term>Humans</term>
<term>Male</term>
<term>Self-Help Devices</term>
<term>User-Computer Interface</term>
<term>Young Adult</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">In this paper, a study of assistive devices with multi-modal feedback is conducted to evaluate the efficiency of haptic and auditory information towards the users' mouse operations. Haptic feedback, generated by a combination of wheels driven by motors, is provided through the use of the haptic mouse. Meanwhile, audio feedback either in the form of synthesized directional speech or audio signal. Based on these interfaces, a set of experiments are conducted to compare their efficiencies. The measurement criteria used in this experiment are the distance regarding to the target circle in pixels, the operational time for the task in milliseconds, and opinion in term of understandability and comfortability towards each modal of the tested user interfaces in discrete indices. The experimental results show that with the proper modalities of feedback interfaces for the user, the efficiency can be improved by either the reduction in operational time or the increase of accuracy in pointing the target. Furthermore, the justification is also based on the user's satisfaction towards using the device to conduct the predefined cursor movement task, which occasionally is difficult to understand and interpret by the user. For example of the application adopting the proposed interface system, a web browser application is implemented and explained in this paper.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Owner="NLM" Status="MEDLINE">
<PMID Version="1">22275546</PMID>
<DateCreated>
<Year>2012</Year>
<Month>01</Month>
<Day>25</Day>
</DateCreated>
<DateCompleted>
<Year>2012</Year>
<Month>07</Month>
<Day>19</Day>
</DateCompleted>
<Article PubModel="Print">
<Journal>
<ISSN IssnType="Electronic">1945-7901</ISSN>
<JournalIssue CitedMedium="Internet">
<Volume>2011</Volume>
<PubDate>
<Year>2011</Year>
</PubDate>
</JournalIssue>
<Title>IEEE ... International Conference on Rehabilitation Robotics : [proceedings]</Title>
<ISOAbbreviation>IEEE Int Conf Rehabil Robot</ISOAbbreviation>
</Journal>
<ArticleTitle>A haptic and auditory assistive user interface: helping the blinds on their computer operations.</ArticleTitle>
<Pagination>
<MedlinePgn>5975341</MedlinePgn>
</Pagination>
<ELocationID EIdType="doi" ValidYN="Y">10.1109/ICORR.2011.5975341</ELocationID>
<Abstract>
<AbstractText>In this paper, a study of assistive devices with multi-modal feedback is conducted to evaluate the efficiency of haptic and auditory information towards the users' mouse operations. Haptic feedback, generated by a combination of wheels driven by motors, is provided through the use of the haptic mouse. Meanwhile, audio feedback either in the form of synthesized directional speech or audio signal. Based on these interfaces, a set of experiments are conducted to compare their efficiencies. The measurement criteria used in this experiment are the distance regarding to the target circle in pixels, the operational time for the task in milliseconds, and opinion in term of understandability and comfortability towards each modal of the tested user interfaces in discrete indices. The experimental results show that with the proper modalities of feedback interfaces for the user, the efficiency can be improved by either the reduction in operational time or the increase of accuracy in pointing the target. Furthermore, the justification is also based on the user's satisfaction towards using the device to conduct the predefined cursor movement task, which occasionally is difficult to understand and interpret by the user. For example of the application adopting the proposed interface system, a web browser application is implemented and explained in this paper.</AbstractText>
<CopyrightInformation>© 2011 IEEE</CopyrightInformation>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Jaijongrak</LastName>
<ForeName>V-ris</ForeName>
<Initials>VR</Initials>
<AffiliationInfo>
<Affiliation>Imaging Science and Engineering Laboratory, Tokyo Institute of Technology, Yokohama, Japan. jaijongrak.v.aa@m.titech.ac.jp</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Kumazawa</LastName>
<ForeName>Itsuo</ForeName>
<Initials>I</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Thiemjarus</LastName>
<ForeName>Surapa</ForeName>
<Initials>S</Initials>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016430">Clinical Trial</PublicationType>
<PublicationType UI="D016428">Journal Article</PublicationType>
<PublicationType UI="D013485">Research Support, Non-U.S. Gov't</PublicationType>
</PublicationTypeList>
</Article>
<MedlineJournalInfo>
<Country>United States</Country>
<MedlineTA>IEEE Int Conf Rehabil Robot</MedlineTA>
<NlmUniqueID>101260913</NlmUniqueID>
<ISSNLinking>1945-7898</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>IM</CitationSubset>
<MeshHeadingList>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D000328">Adult</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D001766">Blindness</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000503">physiopathology</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D003201">Computers</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D005260">Female</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D006801">Humans</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D008297">Male</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D012656">Self-Help Devices</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D014584">User-Computer Interface</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D055815">Young Adult</DescriptorName>
</MeshHeading>
</MeshHeadingList>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="entrez">
<Year>2012</Year>
<Month>1</Month>
<Day>26</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed">
<Year>2012</Year>
<Month>1</Month>
<Day>26</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2012</Year>
<Month>7</Month>
<Day>20</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>ppublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="doi">10.1109/ICORR.2011.5975341</ArticleId>
<ArticleId IdType="pubmed">22275546</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PubMed/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000D03 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 000D03 | SxmlIndent | more
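
The commands above assume a working Dilib installation in which the Wicri environment variables are already set. As a minimal sketch (the installation root $HOME/wicri is an assumption to adapt to the local setup), the variables can be defined as follows before running either command:

# Assumed installation root (adjust to the local Wicri/Dilib installation)
export WICRI_ROOT=$HOME/wicri
# Exploration area for the HapticV1 corpus and its PubMed/Corpus step
export EXPLOR_AREA=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1
export EXPLOR_STEP=$EXPLOR_AREA/Data/PubMed/Corpus
# Extract record 000D03 from the bibliographic store and pretty-print it
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000D03 | SxmlIndent | more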

To place a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PubMed
   |étape=   Corpus
   |type=    RBID
   |clé=     pubmed:22275546
   |texte=   A haptic and auditory assistive user interface: helping the blinds on their computer operations.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i   -Sk "pubmed:22275546" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 
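
As written, this pipeline prints the generated wiki markup to standard output. A minimal variant (the output file name 000D03.wiki is an assumption) redirects it to a file for later import into the wiki:

HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i   -Sk "pubmed:22275546" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 > 000D03.wiki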

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024