Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information it contains has therefore not been validated.

Identification of Vibrotactile Patterns Encoding Obstacle Distance Information.

Internal identifier: 000386 (PubMed/Corpus); previous: 000385; next: 000387


Authors: Yeongmi Kim; Matthias Harders; Roger Gassert

Source:

RBID: pubmed:25807569


Abstract

Delivering distance information of nearby obstacles from sensors embedded in a white cane (in addition to the intrinsic mechanical feedback from the cane) can aid the visually impaired in ambulating independently. Haptics is a common modality for conveying such information to cane users, typically in the form of vibrotactile signals. In this context, we investigated the effect of tactile rendering methods, tactile feedback configurations and directions of tactile flow on the identification of obstacle distance. Three tactile rendering methods with temporal variation only, spatio-temporal variation and spatial/temporal/intensity variation were investigated for two vibration feedback configurations. Results showed a significant interaction between tactile rendering method and feedback configuration. Spatio-temporal variation generally resulted in high correct identification rates for both feedback configurations. In the case of the four-finger vibration, tactile rendering with spatial/temporal/intensity variation also resulted in high distance identification rate. Further, participants expressed their preference for the four-finger vibration over the single-finger vibration in a survey. Both preferred rendering methods with spatio-temporal variation and spatial/temporal/intensity variation for the four-finger vibration could convey obstacle distance information with low workload. Overall, the presented findings provide valuable insights and guidance for the design of haptic displays for electronic travel aids for the visually impaired.

DOI: 10.1109/TOH.2015.2415213
PubMed: 25807569

Links to Exploration step

pubmed:25807569

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Identification of Vibrotactile Patterns Encoding Obstacle Distance Information.</title>
<author>
<name sortKey="Kim, Yeongmi" sort="Kim, Yeongmi" uniqKey="Kim Y" first="Yeongmi" last="Kim">Yeongmi Kim</name>
</author>
<author>
<name sortKey="Harders, Matthias" sort="Harders, Matthias" uniqKey="Harders M" first="Matthias" last="Harders">Matthias Harders</name>
</author>
<author>
<name sortKey="Gassert, Roger" sort="Gassert, Roger" uniqKey="Gassert R" first="Roger" last="Gassert">Roger Gassert</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="????">
<PubDate>
<MedlineDate>2015 Jul-Sep</MedlineDate>
</PubDate>
</date>
<idno type="doi">10.1109/TOH.2015.2415213</idno>
<idno type="RBID">pubmed:25807569</idno>
<idno type="pmid">25807569</idno>
<idno type="wicri:Area/PubMed/Corpus">000386</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Identification of Vibrotactile Patterns Encoding Obstacle Distance Information.</title>
<author>
<name sortKey="Kim, Yeongmi" sort="Kim, Yeongmi" uniqKey="Kim Y" first="Yeongmi" last="Kim">Yeongmi Kim</name>
</author>
<author>
<name sortKey="Harders, Matthias" sort="Harders, Matthias" uniqKey="Harders M" first="Matthias" last="Harders">Matthias Harders</name>
</author>
<author>
<name sortKey="Gassert, Roger" sort="Gassert, Roger" uniqKey="Gassert R" first="Roger" last="Gassert">Roger Gassert</name>
</author>
</analytic>
<series>
<title level="j">IEEE transactions on haptics</title>
<idno type="eISSN">2329-4051</idno>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Adult</term>
<term>Algorithms</term>
<term>Canes</term>
<term>Distance Perception (physiology)</term>
<term>Equipment Design</term>
<term>Feedback</term>
<term>Female</term>
<term>Fingers</term>
<term>Humans</term>
<term>Male</term>
<term>Mental Processes (physiology)</term>
<term>Middle Aged</term>
<term>Pattern Recognition, Physiological (physiology)</term>
<term>Psychophysics</term>
<term>Self-Help Devices</term>
<term>Touch (physiology)</term>
<term>Touch Perception (physiology)</term>
<term>User-Computer Interface</term>
<term>Vibration</term>
</keywords>
<keywords scheme="MESH" qualifier="physiology" xml:lang="en">
<term>Distance Perception</term>
<term>Mental Processes</term>
<term>Pattern Recognition, Physiological</term>
<term>Touch</term>
<term>Touch Perception</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Adult</term>
<term>Algorithms</term>
<term>Canes</term>
<term>Equipment Design</term>
<term>Feedback</term>
<term>Female</term>
<term>Fingers</term>
<term>Humans</term>
<term>Male</term>
<term>Middle Aged</term>
<term>Psychophysics</term>
<term>Self-Help Devices</term>
<term>User-Computer Interface</term>
<term>Vibration</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">Delivering distance information of nearby obstacles from sensors embedded in a white cane-in addition to the intrinsic mechanical feedback from the cane-can aid the visually impaired in ambulating independently. Haptics is a common modality for conveying such information to cane users, typically in the form of vibrotactile signals. In this context, we investigated the effect of tactile rendering methods, tactile feedback configurations and directions of tactile flow on the identification of obstacle distance. Three tactile rendering methods with temporal variation only, spatio-temporal variation and spatial/temporal/intensity variation were investigated for two vibration feedback configurations. Results showed a significant interaction between tactile rendering method and feedback configuration. Spatio-temporal variation generally resulted in high correct identification rates for both feedback configurations. In the case of the four-finger vibration, tactile rendering with spatial/temporal/intensity variation also resulted in high distance identification rate. Further, participants expressed their preference for the four-finger vibration over the single-finger vibration in a survey. Both preferred rendering methods with spatio-temporal variation and spatial/temporal/intensity variation for the four-finger vibration could convey obstacle distance information with low workload. Overall, the presented findings provide valuable insights and guidance for the design of haptic displays for electronic travel aids for the visually impaired.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Owner="NLM" Status="MEDLINE">
<PMID Version="1">25807569</PMID>
<DateCreated>
<Year>2015</Year>
<Month>09</Month>
<Day>23</Day>
</DateCreated>
<DateCompleted>
<Year>2016</Year>
<Month>05</Month>
<Day>20</Day>
</DateCompleted>
<Article PubModel="Print-Electronic">
<Journal>
<ISSN IssnType="Electronic">2329-4051</ISSN>
<JournalIssue CitedMedium="Internet">
<Volume>8</Volume>
<Issue>3</Issue>
<PubDate>
<MedlineDate>2015 Jul-Sep</MedlineDate>
</PubDate>
</JournalIssue>
<Title>IEEE transactions on haptics</Title>
<ISOAbbreviation>IEEE Trans Haptics</ISOAbbreviation>
</Journal>
<ArticleTitle>Identification of Vibrotactile Patterns Encoding Obstacle Distance Information.</ArticleTitle>
<Pagination>
<MedlinePgn>298-305</MedlinePgn>
</Pagination>
<ELocationID EIdType="doi" ValidYN="Y">10.1109/TOH.2015.2415213</ELocationID>
<Abstract>
<AbstractText>Delivering distance information of nearby obstacles from sensors embedded in a white cane-in addition to the intrinsic mechanical feedback from the cane-can aid the visually impaired in ambulating independently. Haptics is a common modality for conveying such information to cane users, typically in the form of vibrotactile signals. In this context, we investigated the effect of tactile rendering methods, tactile feedback configurations and directions of tactile flow on the identification of obstacle distance. Three tactile rendering methods with temporal variation only, spatio-temporal variation and spatial/temporal/intensity variation were investigated for two vibration feedback configurations. Results showed a significant interaction between tactile rendering method and feedback configuration. Spatio-temporal variation generally resulted in high correct identification rates for both feedback configurations. In the case of the four-finger vibration, tactile rendering with spatial/temporal/intensity variation also resulted in high distance identification rate. Further, participants expressed their preference for the four-finger vibration over the single-finger vibration in a survey. Both preferred rendering methods with spatio-temporal variation and spatial/temporal/intensity variation for the four-finger vibration could convey obstacle distance information with low workload. Overall, the presented findings provide valuable insights and guidance for the design of haptic displays for electronic travel aids for the visually impaired.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Kim</LastName>
<ForeName>Yeongmi</ForeName>
<Initials>Y</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Harders</LastName>
<ForeName>Matthias</ForeName>
<Initials>M</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Gassert</LastName>
<ForeName>Roger</ForeName>
<Initials>R</Initials>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
<PublicationType UI="D013485">Research Support, Non-U.S. Gov't</PublicationType>
</PublicationTypeList>
<ArticleDate DateType="Electronic">
<Year>2015</Year>
<Month>03</Month>
<Day>20</Day>
</ArticleDate>
</Article>
<MedlineJournalInfo>
<Country>United States</Country>
<MedlineTA>IEEE Trans Haptics</MedlineTA>
<NlmUniqueID>101491191</NlmUniqueID>
<ISSNLinking>1939-1412</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>IM</CitationSubset>
<MeshHeadingList>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D000328">Adult</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D000465">Algorithms</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D002183">Canes</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D004215">Distance Perception</DescriptorName>
<QualifierName MajorTopicYN="N" UI="Q000502">physiology</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D004867">Equipment Design</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D005246">Feedback</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D005260">Female</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D005385">Fingers</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D006801">Humans</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D008297">Male</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D008606">Mental Processes</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000502">physiology</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D008875">Middle Aged</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D046709">Pattern Recognition, Physiological</DescriptorName>
<QualifierName MajorTopicYN="N" UI="Q000502">physiology</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D011601">Psychophysics</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D012656">Self-Help Devices</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D014110">Touch</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000502">physiology</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D055698">Touch Perception</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000502">physiology</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D014584">User-Computer Interface</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D014732">Vibration</DescriptorName>
</MeshHeading>
</MeshHeadingList>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="aheadofprint">
<Year>2015</Year>
<Month>3</Month>
<Day>20</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez">
<Year>2015</Year>
<Month>3</Month>
<Day>26</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed">
<Year>2015</Year>
<Month>3</Month>
<Day>26</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2016</Year>
<Month>5</Month>
<Day>21</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>ppublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="doi">10.1109/TOH.2015.2415213</ArticleId>
<ArticleId IdType="pubmed">25807569</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PubMed/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000386 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 000386 | SxmlIndent | more
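
The same selection can also be written to a file or queried with standard XML tooling. A minimal sketch, assuming HfdSelect emits the <record> shown above and that xmllint (from libxml2, not part of Dilib) is installed; the output filename 000386.xml is only illustrative:

# Save the indented record to a local file (filename is illustrative)
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000386 | SxmlIndent > 000386.xml

# Pull a single field, here the DOI, straight from the record
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000386 | xmllint --xpath 'string(//ArticleId[@IdType="doi"])' -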

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PubMed
   |étape=   Corpus
   |type=    RBID
   |clé=     pubmed:25807569
   |texte=   Identification of Vibrotactile Patterns Encoding Obstacle Distance Information.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i   -Sk "pubmed:25807569" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 
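
When the internal key of the record is already known (000386 for this document), the index lookup can presumably be skipped. This shortcut is only a sketch under that assumption, not a documented Dilib idiom:

# Hypothetical shortcut: select the record by its internal key and convert it directly
HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 000386 \
       | NlmPubMed2Wicri -a HapticV1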

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024