Serveur d'exploration sur les dispositifs haptiques

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Mobile Navigation Using Haptic, Audio, and Visual Direction Cues with a Handheld Test Platform.

Internal identifier: 000D47 ( PubMed/Corpus ); previous: 000D46; next: 000D48

Authors: R L Koslover ; B T Gleeson ; J T De Bever ; W R Provancher

Source:

RBID : pubmed:26963827

Abstract

This paper reports on a series of user experiments evaluating the design of a multimodal test platform capable of rendering visual, audio, vibrotactile, and directional skin-stretch stimuli. The test platform is a handheld, wirelessly controlled device that will facilitate experiments with mobile users in realistic environments. Stimuli rendered by the device are fully characterized, and have little variance in stimulus onset timing. A series of user experiments utilizing navigational cues validates the function of the device and investigates the user response to all stimulus modes. Results show users are capable of interpreting all stimuli with high accuracy and can use the direction cues for mobile navigation. Tests included both stationary (seated) and mobile (walking a simple obstacle course) tasks. Accuracy and response time patterns are similar in both seated and mobile conditions. This device provides a means of designing and evaluating multimodal communication methods for handheld devices and will facilitate experiments investigating the effects of stimulus mode on device usability and situation awareness.

DOI: 10.1109/TOH.2011.58
PubMed: 26963827

Links to Exploration step

pubmed:26963827

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Mobile Navigation Using Haptic, Audio, and Visual Direction Cues with a Handheld Test Platform.</title>
<author>
<name sortKey="Koslover, R L" sort="Koslover, R L" uniqKey="Koslover R" first="R L" last="Koslover">R L Koslover</name>
</author>
<author>
<name sortKey="Gleeson, B T" sort="Gleeson, B T" uniqKey="Gleeson B" first="B T" last="Gleeson">B T Gleeson</name>
</author>
<author>
<name sortKey="De Bever, J T" sort="De Bever, J T" uniqKey="De Bever J" first="J T" last="De Bever">J T De Bever</name>
</author>
<author>
<name sortKey="Provancher, W R" sort="Provancher, W R" uniqKey="Provancher W" first="W R" last="Provancher">W R Provancher</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="????">
<PubDate>
<MedlineDate>2012 Jan-Mar</MedlineDate>
</PubDate>
</date>
<idno type="doi">10.1109/TOH.2011.58</idno>
<idno type="RBID">pubmed:26963827</idno>
<idno type="pmid">26963827</idno>
<idno type="wicri:Area/PubMed/Corpus">000D47</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Mobile Navigation Using Haptic, Audio, and Visual Direction Cues with a Handheld Test Platform.</title>
<author>
<name sortKey="Koslover, R L" sort="Koslover, R L" uniqKey="Koslover R" first="R L" last="Koslover">R L Koslover</name>
</author>
<author>
<name sortKey="Gleeson, B T" sort="Gleeson, B T" uniqKey="Gleeson B" first="B T" last="Gleeson">B T Gleeson</name>
</author>
<author>
<name sortKey="De Bever, J T" sort="De Bever, J T" uniqKey="De Bever J" first="J T" last="De Bever">J T De Bever</name>
</author>
<author>
<name sortKey="Provancher, W R" sort="Provancher, W R" uniqKey="Provancher W" first="W R" last="Provancher">W R Provancher</name>
</author>
</analytic>
<series>
<title level="j">IEEE transactions on haptics</title>
<idno type="ISSN">1939-1412</idno>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">This paper reports on a series of user experiments evaluating the design of a multimodal test platform capable of rendering visual, audio, vibrotactile, and directional skin-stretch stimuli. The test platform is a handheld, wirelessly controlled device that will facilitate experiments with mobile users in realistic environments. Stimuli rendered by the device are fully characterized, and have little variance in stimulus onset timing. A series of user experiments utilizing navigational cues validates the function of the device and investigates the user response to all stimulus modes. Results show users are capable of interpreting all stimuli with high accuracy and can use the direction cues for mobile navigation. Tests included both stationary (seated) and mobile (walking a simple obstacle course) tasks. Accuracy and response time patterns are similar in both seated and mobile conditions. This device provides a means of designing and evaluating multimodal communication methods for handheld devices and will facilitate experiments investigating the effects of stimulus mode on device usability and situation awareness.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Owner="NLM" Status="PubMed-not-MEDLINE">
<PMID Version="1">26963827</PMID>
<DateCreated>
<Year>2016</Year>
<Month>03</Month>
<Day>11</Day>
</DateCreated>
<DateCompleted>
<Year>2016</Year>
<Month>03</Month>
<Day>12</Day>
</DateCompleted>
<Article PubModel="Print">
<Journal>
<ISSN IssnType="Print">1939-1412</ISSN>
<JournalIssue CitedMedium="Print">
<Volume>5</Volume>
<Issue>1</Issue>
<PubDate>
<MedlineDate>2012 Jan-Mar</MedlineDate>
</PubDate>
</JournalIssue>
<Title>IEEE transactions on haptics</Title>
<ISOAbbreviation>IEEE Trans Haptics</ISOAbbreviation>
</Journal>
<ArticleTitle>Mobile Navigation Using Haptic, Audio, and Visual Direction Cues with a Handheld Test Platform.</ArticleTitle>
<Pagination>
<MedlinePgn>33-8</MedlinePgn>
</Pagination>
<ELocationID EIdType="doi" ValidYN="Y">10.1109/TOH.2011.58</ELocationID>
<Abstract>
<AbstractText>This paper reports on a series of user experiments evaluating the design of a multimodal test platform capable of rendering visual, audio, vibrotactile, and directional skin-stretch stimuli. The test platform is a handheld, wirelessly controlled device that will facilitate experiments with mobile users in realistic environments. Stimuli rendered by the device are fully characterized, and have little variance in stimulus onset timing. A series of user experiments utilizing navigational cues validates the function of the device and investigates the user response to all stimulus modes. Results show users are capable of interpreting all stimuli with high accuracy and can use the direction cues for mobile navigation. Tests included both stationary (seated) and mobile (walking a simple obstacle course) tasks. Accuracy and response time patterns are similar in both seated and mobile conditions. This device provides a means of designing and evaluating multimodal communication methods for handheld devices and will facilitate experiments investigating the effects of stimulus mode on device usability and situation awareness.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Koslover</LastName>
<ForeName>R L</ForeName>
<Initials>RL</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Gleeson</LastName>
<ForeName>B T</ForeName>
<Initials>BT</Initials>
</Author>
<Author ValidYN="Y">
<LastName>de Bever</LastName>
<ForeName>J T</ForeName>
<Initials>JT</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Provancher</LastName>
<ForeName>W R</ForeName>
<Initials>WR</Initials>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
</PublicationTypeList>
</Article>
<MedlineJournalInfo>
<Country>United States</Country>
<MedlineTA>IEEE Trans Haptics</MedlineTA>
<NlmUniqueID>101491191</NlmUniqueID>
<ISSNLinking>1939-1412</ISSNLinking>
</MedlineJournalInfo>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="entrez">
<Year>2016</Year>
<Month>3</Month>
<Day>11</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed">
<Year>2012</Year>
<Month>1</Month>
<Day>1</Day>
<Hour>0</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2012</Year>
<Month>1</Month>
<Day>1</Day>
<Hour>0</Hour>
<Minute>1</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>ppublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="doi">10.1109/TOH.2011.58</ArticleId>
<ArticleId IdType="pubmed">26963827</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
</record>
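For illustration, the record above can be queried with standard XML tooling instead of the Dilib pipeline. The sketch below (plain Python, standard library only; the embedded string is a minimal excerpt of the record's <pubmed> element) extracts the PMID, DOI, and article title:

```python
import xml.etree.ElementTree as ET

# Minimal excerpt of the <pubmed> portion of the record above.
RECORD = """<record>
  <pubmed>
    <MedlineCitation Owner="NLM" Status="PubMed-not-MEDLINE">
      <PMID Version="1">26963827</PMID>
      <Article PubModel="Print">
        <ArticleTitle>Mobile Navigation Using Haptic, Audio, and Visual \
Direction Cues with a Handheld Test Platform.</ArticleTitle>
        <ELocationID EIdType="doi" ValidYN="Y">10.1109/TOH.2011.58</ELocationID>
      </Article>
    </MedlineCitation>
  </pubmed>
</record>"""

root = ET.fromstring(RECORD)

# ElementTree supports a small XPath subset, including attribute predicates.
pmid = root.findtext(".//PMID")
doi = root.findtext(".//ELocationID[@EIdType='doi']")
title = root.findtext(".//ArticleTitle")

print(pmid)   # 26963827
print(doi)    # 10.1109/TOH.2011.58
print(title)
```

The same `findtext` queries work on the full record, since element names such as `PMID`, `ArticleTitle`, and `ELocationID` are unique within the `<pubmed>` subtree.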

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PubMed/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000D47 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 000D47 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PubMed
   |étape=   Corpus
   |type=    RBID
   |clé=     pubmed:26963827
   |texte=   Mobile Navigation Using Haptic, Audio, and Visual Direction Cues with a Handheld Test Platform.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i   -Sk "pubmed:26963827" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024