Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated by computational means from raw corpora.
The information is therefore not validated.

Blind persons navigate in virtual reality (VR); hearing and feeling communicates "reality".

Internal identifier: 002047 (PubMed/Corpus); previous: 002046; next: 002048

Authors: M L Max; J R Gonzalez

Source:

RBID: pubmed:10168947

English descriptors

Abstract

Can Virtual Reality (VR) developments in audio navigation for blind persons support therapies for all? Working with Crystal River Engineering, we are developing navigable Virtual Reality worlds for blind users, using spatialized audio [1], [2]. All persons, however, use specialized channels, such as the visual, aural, and kinetic learning senses. Predominantly visual VR worlds and health informatics models from the World Wide Web may be downloaded, tailored, augmented, and delivered to each of these learning senses using VR. We are also testing a proof-of-concept system with Boston Dynamics that downloads 3-dimensional, satellite-derived map models from the World Wide Web and makes them navigable by "feeling" the terrain through haptic (tactual or force feedback to the hand) robotic interfaces. Ultimately, these multi-sensory VR access methods (sight, localization by audio, and "feeling" of data sets) could open up the World Wide Web to individuals with sight impairments. They could also benefit government, businesses, universities, and (elementary) education, and could contribute more powerful communications, education, and medical simulation applications on the World Wide Web. This work is part of government technology transfer to telemedicine, (elementary) education, disability access to the Web, and new Internet access and productivity efforts under Vice President Gore's National Performance Review.

PubMed: 10168947

Links to the Exploration step

pubmed:10168947

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Blind persons navigate in virtual reality (VR); hearing and feeling communicates "reality".</title>
<author>
<name sortKey="Max, M L" sort="Max, M L" uniqKey="Max M" first="M L" last="Max">M L Max</name>
</author>
<author>
<name sortKey="Gonzalez, J R" sort="Gonzalez, J R" uniqKey="Gonzalez J" first="J R" last="Gonzalez">J R Gonzalez</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="1997">1997</date>
<idno type="RBID">pubmed:10168947</idno>
<idno type="pmid">10168947</idno>
<idno type="wicri:Area/PubMed/Corpus">002047</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Blind persons navigate in virtual reality (VR); hearing and feeling communicates "reality".</title>
<author>
<name sortKey="Max, M L" sort="Max, M L" uniqKey="Max M" first="M L" last="Max">M L Max</name>
</author>
<author>
<name sortKey="Gonzalez, J R" sort="Gonzalez, J R" uniqKey="Gonzalez J" first="J R" last="Gonzalez">J R Gonzalez</name>
</author>
</analytic>
<series>
<title level="j">Studies in health technology and informatics</title>
<idno type="ISSN">0926-9630</idno>
<imprint>
<date when="1997" type="published">1997</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Adult</term>
<term>Blindness (rehabilitation)</term>
<term>Child, Preschool</term>
<term>Female</term>
<term>Humans</term>
<term>Male</term>
<term>Pilot Projects</term>
<term>Sensory Aids</term>
<term>Sound Localization</term>
<term>Space Perception</term>
<term>User-Computer Interface</term>
</keywords>
<keywords scheme="MESH" qualifier="rehabilitation" xml:lang="en">
<term>Blindness</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Adult</term>
<term>Child, Preschool</term>
<term>Female</term>
<term>Humans</term>
<term>Male</term>
<term>Pilot Projects</term>
<term>Sensory Aids</term>
<term>Sound Localization</term>
<term>Space Perception</term>
<term>User-Computer Interface</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">Can Virtual Reality (VR) developments in audio navigation for blind persons support therapies for all? Working with Crystal River Engineering we are developing navigable Virtual Reality worlds for blind users, using spatialized audio [1], [2]. All persons, however, use specialized channels, such as: visual, aural, and kinetic learning senses. Predominantly visual VR worlds and health informatics models from World Wide Webs, may be downloaded, tailored, augmented, and delivered to each of these learning senses using VR. We are also testing a proof of concept system with Boston Dynamics which downloads 3-dimensional, satellite-derived map models from the World Wide Web, and makes them navigable by "feeling" the terrain using haptic (tactual or force feedback to your hand) robotic interfaces. Ultimately, these multi-sensory VR access methods: sight, localization by audio, and "feeling" of data sets could open up the World Wide Web to individuals with sight impairments. This could also, however, benefit government, businesses, universities, and (elementary) education. It could contribute more powerful communications, education, and medical simulation applications on the World Wide Web. This work is part of government technology transfer to telemedicine, (elementary) education, disabilities access to the Web, and new Internet access and productivity efforts under Vice President Gore's National Performance Review.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Owner="NLM" Status="MEDLINE">
<PMID Version="1">10168947</PMID>
<DateCreated>
<Year>1997</Year>
<Month>09</Month>
<Day>04</Day>
</DateCreated>
<DateCompleted>
<Year>1997</Year>
<Month>09</Month>
<Day>04</Day>
</DateCompleted>
<DateRevised>
<Year>2004</Year>
<Month>11</Month>
<Day>17</Day>
</DateRevised>
<Article PubModel="Print">
<Journal>
<ISSN IssnType="Print">0926-9630</ISSN>
<JournalIssue CitedMedium="Print">
<Volume>39</Volume>
<PubDate>
<Year>1997</Year>
</PubDate>
</JournalIssue>
<Title>Studies in health technology and informatics</Title>
<ISOAbbreviation>Stud Health Technol Inform</ISOAbbreviation>
</Journal>
<ArticleTitle>Blind persons navigate in virtual reality (VR); hearing and feeling communicates "reality".</ArticleTitle>
<Pagination>
<MedlinePgn>54-9</MedlinePgn>
</Pagination>
<Abstract>
<AbstractText>Can Virtual Reality (VR) developments in audio navigation for blind persons support therapies for all? Working with Crystal River Engineering we are developing navigable Virtual Reality worlds for blind users, using spatialized audio [1], [2]. All persons, however, use specialized channels, such as: visual, aural, and kinetic learning senses. Predominantly visual VR worlds and health informatics models from World Wide Webs, may be downloaded, tailored, augmented, and delivered to each of these learning senses using VR. We are also testing a proof of concept system with Boston Dynamics which downloads 3-dimensional, satellite-derived map models from the World Wide Web, and makes them navigable by "feeling" the terrain using haptic (tactual or force feedback to your hand) robotic interfaces. Ultimately, these multi-sensory VR access methods: sight, localization by audio, and "feeling" of data sets could open up the World Wide Web to individuals with sight impairments. This could also, however, benefit government, businesses, universities, and (elementary) education. It could contribute more powerful communications, education, and medical simulation applications on the World Wide Web. This work is part of government technology transfer to telemedicine, (elementary) education, disabilities access to the Web, and new Internet access and productivity efforts under Vice President Gore's National Performance Review.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Max</LastName>
<ForeName>M L</ForeName>
<Initials>ML</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Gonzalez</LastName>
<ForeName>J R</ForeName>
<Initials>JR</Initials>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
</PublicationTypeList>
</Article>
<MedlineJournalInfo>
<Country>NETHERLANDS</Country>
<MedlineTA>Stud Health Technol Inform</MedlineTA>
<NlmUniqueID>9214582</NlmUniqueID>
<ISSNLinking>0926-9630</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>T</CitationSubset>
<MeshHeadingList>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D000328">Adult</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D001766">Blindness</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000534">rehabilitation</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D002675">Child, Preschool</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D005260">Female</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D006801">Humans</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D008297">Male</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D010865">Pilot Projects</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D012682">Sensory Aids</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D013017">Sound Localization</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D013028">Space Perception</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D014584">User-Computer Interface</DescriptorName>
</MeshHeading>
</MeshHeadingList>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="pubmed">
<Year>1996</Year>
<Month>12</Month>
<Day>8</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>1996</Year>
<Month>12</Month>
<Day>8</Day>
<Hour>0</Hour>
<Minute>1</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez">
<Year>1996</Year>
<Month>12</Month>
<Day>8</Day>
<Hour>0</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>ppublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="pubmed">10168947</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
</record>

To manipulate this document under Unix (Dilib)

# Point at the PubMed corpus step of the HapticV1 exploration area
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PubMed/Corpus
# Select record 002047 from the bibliographic base, indent the XML, and page through it
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 002047 | SxmlIndent | more
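
The same pipeline can be combined with standard Unix tools to extract a single field from the record. A minimal sketch, assuming SxmlIndent emits one element per line (as in the indented record above); listing the MeSH descriptor names is just an illustrative choice:

# Sketch: print only the MeSH descriptor names of record 002047
# (relies on the pipeline shown above plus standard sed)
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 002047 | SxmlIndent \
       | sed -n 's/.*<DescriptorName[^>]*>\(.*\)<\/DescriptorName>.*/\1/p'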

Or

# Same selection, addressing the bibliographic base via $EXPLOR_AREA
HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 002047 | SxmlIndent | more

To put a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PubMed
   |étape=   Corpus
   |type=    RBID
   |clé=     pubmed:10168947
   |texte=   Blind persons navigate in virtual reality (VR); hearing and feeling communicates "reality".
}}

To generate wiki pages

# Look up the record by its RBID in the index, fetch the full record from the
# bibliographic base, and convert the PubMed XML into wiki pages for HapticV1
HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i   -Sk "pubmed:10168947" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1
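
For repeated use, the pipeline above can be wrapped in a small shell function so that the record key and target area become parameters. This is a sketch, not part of the Dilib distribution; the name explor2wiki is hypothetical:

# Hypothetical wrapper around the pipeline above.
# Usage: explor2wiki "pubmed:10168947" HapticV1
explor2wiki () {
    rbid="$1"    # RBID key, e.g. "pubmed:10168947"
    area="$2"    # exploration area, e.g. HapticV1
    HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i -Sk "$rbid" \
        | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd \
        | NlmPubMed2Wicri -a "$area"
}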

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024