Blind persons navigate in virtual reality (VR); hearing and feeling communicates "reality".
Internal identifier: 008892 (Main/Exploration); previous: 008891; next: 008893
Authors: M L Max; J R Gonzalez
Source:
- Studies in health technology and informatics [ 0926-9630 ] ; 1997.
English descriptors
- KwdEn: Adult; Blindness (rehabilitation); Child, Preschool; Female; Humans; Male; Pilot Projects; Sensory Aids; Sound Localization; Space Perception; User-Computer Interface.
- MESH:
  - rehabilitation: Blindness.
  - Adult; Child, Preschool; Female; Humans; Male; Pilot Projects; Sensory Aids; Sound Localization; Space Perception; User-Computer Interface.
Abstract
Can Virtual Reality (VR) developments in audio navigation for blind persons support therapies for all? Working with Crystal River Engineering we are developing navigable Virtual Reality worlds for blind users, using spatialized audio [1], [2]. All persons, however, use specialized channels, such as: visual, aural, and kinetic learning senses. Predominantly visual VR worlds and health informatics models from World Wide Webs, may be downloaded, tailored, augmented, and delivered to each of these learning senses using VR. We are also testing a proof of concept system with Boston Dynamics which downloads 3-dimensional, satellite-derived map models from the World Wide Web, and makes them navigable by "feeling" the terrain using haptic (tactual or force feedback to your hand) robotic interfaces. Ultimately, these multi-sensory VR access methods: sight, localization by audio, and "feeling" of data sets could open up the World Wide Web to individuals with sight impairments. This could also, however, benefit government, businesses, universities, and (elementary) education. It could contribute more powerful communications, education, and medical simulation applications on the World Wide Web. This work is part of government technology transfer to telemedicine, (elementary) education, disabilities access to the Web, and new Internet access and productivity efforts under Vice President Gore's National Performance Review.
PubMed: 10168947
Affiliations:
Links to previous steps (curation, corpus...)
- to stream PubMed, to step Corpus: 002047
- to stream PubMed, to step Curation: 002047
- to stream PubMed, to step Checkpoint: 001D79
- to stream Ncbi, to step Merge: 000017
- to stream Ncbi, to step Curation: 000017
- to stream Ncbi, to step Checkpoint: 000017
- to stream Main, to step Merge: 009166
- to stream Main, to step Curation: 008892
The document in XML format
<record><TEI><teiHeader><fileDesc><titleStmt><title xml:lang="en">Blind persons navigate in virtual reality (VR); hearing and feeling communicates "reality".</title>
<author><name sortKey="Max, M L" sort="Max, M L" uniqKey="Max M" first="M L" last="Max">M L Max</name>
</author>
<author><name sortKey="Gonzalez, J R" sort="Gonzalez, J R" uniqKey="Gonzalez J" first="J R" last="Gonzalez">J R Gonzalez</name>
</author>
</titleStmt>
<publicationStmt><idno type="wicri:source">PubMed</idno>
<date when="1997">1997</date>
<idno type="RBID">pubmed:10168947</idno>
<idno type="pmid">10168947</idno>
<idno type="wicri:Area/PubMed/Corpus">002047</idno>
<idno type="wicri:Area/PubMed/Curation">002047</idno>
<idno type="wicri:Area/PubMed/Checkpoint">001D79</idno>
<idno type="wicri:Area/Ncbi/Merge">000017</idno>
<idno type="wicri:Area/Ncbi/Curation">000017</idno>
<idno type="wicri:Area/Ncbi/Checkpoint">000017</idno>
<idno type="wicri:doubleKey">0926-9630:1997:Max M:blind:persons:navigate</idno>
<idno type="wicri:Area/Main/Merge">009166</idno>
<idno type="wicri:Area/Main/Curation">008892</idno>
<idno type="wicri:Area/Main/Exploration">008892</idno>
</publicationStmt>
<sourceDesc><biblStruct><analytic><title xml:lang="en">Blind persons navigate in virtual reality (VR); hearing and feeling communicates "reality".</title>
<author><name sortKey="Max, M L" sort="Max, M L" uniqKey="Max M" first="M L" last="Max">M L Max</name>
</author>
<author><name sortKey="Gonzalez, J R" sort="Gonzalez, J R" uniqKey="Gonzalez J" first="J R" last="Gonzalez">J R Gonzalez</name>
</author>
</analytic>
<series><title level="j">Studies in health technology and informatics</title>
<idno type="ISSN">0926-9630</idno>
<imprint><date when="1997" type="published">1997</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc><textClass><keywords scheme="KwdEn" xml:lang="en"><term>Adult</term>
<term>Blindness (rehabilitation)</term>
<term>Child, Preschool</term>
<term>Female</term>
<term>Humans</term>
<term>Male</term>
<term>Pilot Projects</term>
<term>Sensory Aids</term>
<term>Sound Localization</term>
<term>Space Perception</term>
<term>User-Computer Interface</term>
</keywords>
<keywords scheme="MESH" qualifier="rehabilitation" xml:lang="en"><term>Blindness</term>
</keywords>
<keywords scheme="MESH" xml:lang="en"><term>Adult</term>
<term>Child, Preschool</term>
<term>Female</term>
<term>Humans</term>
<term>Male</term>
<term>Pilot Projects</term>
<term>Sensory Aids</term>
<term>Sound Localization</term>
<term>Space Perception</term>
<term>User-Computer Interface</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front><div type="abstract" xml:lang="en">Can Virtual Reality (VR) developments in audio navigation for blind persons support therapies for all? Working with Crystal River Engineering we are developing navigable Virtual Reality worlds for blind users, using spatialized audio [1], [2]. All persons, however, use specialized channels, such as: visual, aural, and kinetic learning senses. Predominantly visual VR worlds and health informatics models from World Wide Webs, may be downloaded, tailored, augmented, and delivered to each of these learning senses using VR. We are also testing a proof of concept system with Boston Dynamics which downloads 3-dimensional, satellite-derived map models from the World Wide Web, and makes them navigable by "feeling" the terrain using haptic (tactual or force feedback to your hand) robotic interfaces. Ultimately, these multi-sensory VR access methods: sight, localization by audio, and "feeling" of data sets could open up the World Wide Web to individuals with sight impairments. This could also, however, benefit government, businesses, universities, and (elementary) education. It could contribute more powerful communications, education, and medical simulation applications on the World Wide Web. This work is part of government technology transfer to telemedicine, (elementary) education, disabilities access to the Web, and new Internet access and productivity efforts under Vice President Gore's National Performance Review.</div>
</front>
</TEI>
<affiliations><list></list>
<tree><noCountry><name sortKey="Gonzalez, J R" sort="Gonzalez, J R" uniqKey="Gonzalez J" first="J R" last="Gonzalez">J R Gonzalez</name>
<name sortKey="Max, M L" sort="Max, M L" uniqKey="Max M" first="M L" last="Max">M L Max</name>
</noCountry>
</tree>
</affiliations>
</record>
To manipulate this document under Unix (Dilib):
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Main/Exploration
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 008892 | SxmlIndent | more
Or:
HfdSelect -h $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd -nk 008892 | SxmlIndent | more
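The two invocations above differ only in how the exploration path is spelled. As a minimal sketch, the command line can be assembled from a record key before running it; this dry-run helper only prints the command (it assumes the Dilib tools `HfdSelect` and `SxmlIndent` shown above are on the PATH when the printed line is actually executed, and `record_cmd` is a hypothetical name, not part of Dilib):

```shell
#!/bin/sh
# Dry-run helper: assemble and print the Dilib selection command
# for a given record key, without executing it.
# Falls back to the exploration path documented above if EXPLOR_STEP is unset.
record_cmd() {
    key="$1"
    step="${EXPLOR_STEP:-$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Main/Exploration}"
    printf 'HfdSelect -h %s/biblio.hfd -nk %s | SxmlIndent\n' "$step" "$key"
}

# Print the command for this record:
record_cmd 008892
```

Inspecting the printed line before piping it to `sh` avoids running a selection against the wrong exploration area.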
To add a link to this page in the Wicri network:
{{Explor lien |wiki= Ticri/CIDE |area= HapticV1 |flux= Main |étape= Exploration |type= RBID |clé= pubmed:10168947 |texte= Blind persons navigate in virtual reality (VR); hearing and feeling communicates "reality". }}
To generate wiki pages:
HfdIndexSelect -h $EXPLOR_AREA/Data/Main/Exploration/RBID.i -Sk "pubmed:10168947" \
  | HfdSelect -Kh $EXPLOR_AREA/Data/Main/Exploration/biblio.hfd \
  | NlmPubMed2Wicri -a HapticV1
This area was generated with Dilib version V0.6.23.