Exploration server on haptic devices

Warning: this site is under development!
Warning: this site was generated by computational means from raw corpora.
The information it contains has therefore not been validated.

Telerobotic Haptic Exploration in Art Galleries and Museums for Individuals with Visual Impairments.

Internal identifier: 000287 (PubMed/Corpus); previous: 000286; next: 000288


Authors: Chung Hyuk Park; Eun-Seok Ryu; Ayanna M. Howard

Source: IEEE transactions on haptics, 2015 Jul-Sep; 8(3): 327-38

RBID: pubmed:26219098


Abstract

This paper presents a haptic telepresence system that enables visually impaired users to explore visually rich locations such as art galleries and museums by using a telepresence robot, an RGB-D sensor (color and depth camera), and a haptic interface. Recent improvements in RGB-D sensors have enabled real-time access to 3D spatial information in the form of point clouds. However, the real-time representation of this data as a tangible haptic experience has not been sufficiently explored, especially in the case of telepresence for individuals with visual impairments. The proposed system therefore addresses the real-time haptic exploration of remote 3D information through video encoding and real-time 3D haptic rendering of the remote real-world environment. This paper investigates two scenarios in haptic telepresence: mobile navigation and object exploration in a remote environment. Participants with and without visual impairments took part in experiments based on these two scenarios, and the system's performance was validated. In conclusion, the proposed framework provides a new methodology of haptic telepresence for individuals with visual impairments, offering an enhanced interactive experience in which they can remotely access public places (art galleries and museums) with the aid of the haptic modality and robotic telepresence.
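
The processing chain the abstract describes (RGB-D capture on the robot, transmission, 3D haptic rendering at the user's side) can be sketched in a few lines. The snippet below is an illustration only, not the authors' implementation: it back-projects a depth image into a point cloud with a pinhole camera model and computes a simple penalty-based contact force, a common stand-in for full proxy-based haptic rendering. The camera intrinsics, contact radius, and stiffness are hypothetical values.

import numpy as np

# Hypothetical pinhole intrinsics for a Kinect-class RGB-D sensor.
FX, FY = 525.0, 525.0          # focal lengths in pixels
CX, CY = 319.5, 239.5          # principal point

def depth_to_point_cloud(depth_m):
    """Back-project an HxW depth image (meters) into an Nx3 point cloud."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]         # drop pixels with no depth reading

def contact_force(probe, cloud, radius=0.01, stiffness=400.0):
    """Penalty-based force: if the haptic probe penetrates the 'radius'
    shell around the nearest point, push it out with F = k * penetration."""
    d = np.linalg.norm(cloud - probe, axis=1)
    i = np.argmin(d)
    if d[i] >= radius:                # no contact
        return np.zeros(3)
    n = (probe - cloud[i]) / (d[i] + 1e-9)   # outward normal estimate
    return stiffness * (radius - d[i]) * n

# Toy usage: a flat wall 1 m away, probe just touching it.
depth = np.full((480, 640), 1.0)
cloud = depth_to_point_cloud(depth)
print(contact_force(np.array([0.0, 0.0, 0.995]), cloud))

In a real telepresence loop, the cloud would be refreshed from the encoded color/depth stream while the force is sent to the haptic device at a much higher rate (typically around 1 kHz).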

DOI: 10.1109/TOH.2015.2460253
PubMed: 26219098

Links to the exploration step

pubmed:26219098

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Telerobotic Haptic Exploration in Art Galleries and Museums for Individuals with Visual Impairments.</title>
<author>
<name sortKey="Park, Chung Hyuk" sort="Park, Chung Hyuk" uniqKey="Park C" first="Chung Hyuk" last="Park">Chung Hyuk Park</name>
</author>
<author>
<name sortKey="Ryu, Eun Seok" sort="Ryu, Eun Seok" uniqKey="Ryu E" first="Eun-Seok" last="Ryu">Eun-Seok Ryu</name>
</author>
<author>
<name sortKey="Howard, Ayanna M" sort="Howard, Ayanna M" uniqKey="Howard A" first="Ayanna M" last="Howard">Ayanna M. Howard</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="????">
<PubDate>
<MedlineDate>2015 Jul-Sep</MedlineDate>
</PubDate>
</date>
<idno type="doi">10.1109/TOH.2015.2460253</idno>
<idno type="RBID">pubmed:26219098</idno>
<idno type="pmid">26219098</idno>
<idno type="wicri:Area/PubMed/Corpus">000287</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Telerobotic Haptic Exploration in Art Galleries and Museums for Individuals with Visual Impairments.</title>
<author>
<name sortKey="Park, Chung Hyuk" sort="Park, Chung Hyuk" uniqKey="Park C" first="Chung Hyuk" last="Park">Chung Hyuk Park</name>
</author>
<author>
<name sortKey="Ryu, Eun Seok" sort="Ryu, Eun Seok" uniqKey="Ryu E" first="Eun-Seok" last="Ryu">Eun-Seok Ryu</name>
</author>
<author>
<name sortKey="Howard, Ayanna M" sort="Howard, Ayanna M" uniqKey="Howard A" first="Ayanna M" last="Howard">Ayanna M. Howard</name>
</author>
</analytic>
<series>
<title level="j">IEEE transactions on haptics</title>
<idno type="eISSN">2329-4051</idno>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Algorithms</term>
<term>Computer Graphics</term>
<term>Computer Simulation</term>
<term>Data Display</term>
<term>Female</term>
<term>Humans</term>
<term>Male</term>
<term>Multimedia</term>
<term>Museums</term>
<term>Robotics (methods)</term>
<term>Self-Help Devices</term>
<term>Sensory Aids</term>
<term>Task Performance and Analysis</term>
<term>Touch (physiology)</term>
<term>User-Computer Interface</term>
<term>Vision Disorders (rehabilitation)</term>
</keywords>
<keywords scheme="MESH" qualifier="methods" xml:lang="en">
<term>Robotics</term>
</keywords>
<keywords scheme="MESH" qualifier="physiology" xml:lang="en">
<term>Touch</term>
</keywords>
<keywords scheme="MESH" qualifier="rehabilitation" xml:lang="en">
<term>Vision Disorders</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Algorithms</term>
<term>Computer Graphics</term>
<term>Computer Simulation</term>
<term>Data Display</term>
<term>Female</term>
<term>Humans</term>
<term>Male</term>
<term>Multimedia</term>
<term>Museums</term>
<term>Self-Help Devices</term>
<term>Sensory Aids</term>
<term>Task Performance and Analysis</term>
<term>User-Computer Interface</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">This paper presents a haptic telepresence system that enables visually impaired users to explore locations with rich visual observation such as art galleries and museums by using a telepresence robot, a RGB-D sensor (color and depth camera), and a haptic interface. The recent improvement on RGB-D sensors has enabled real-time access to 3D spatial information in the form of point clouds. However, the real-time representation of this data in the form of tangible haptic experience has not been challenged enough, especially in the case of telepresence for individuals with visual impairments. Thus, the proposed system addresses the real-time haptic exploration of remote 3D information through video encoding and real-time 3D haptic rendering of the remote real-world environment. This paper investigates two scenarios in haptic telepresence, i.e., mobile navigation and object exploration in a remote environment. Participants with and without visual impairments participated in our experiments based on the two scenarios, and the system performance was validated. In conclusion, the proposed framework provides a new methodology of haptic telepresence for individuals with visual impairments by providing an enhanced interactive experience where they can remotely access public places (art galleries and museums) with the aid of haptic modality and robotic telepresence.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Owner="NLM" Status="MEDLINE">
<PMID Version="1">26219098</PMID>
<DateCreated>
<Year>2015</Year>
<Month>09</Month>
<Day>23</Day>
</DateCreated>
<DateCompleted>
<Year>2016</Year>
<Month>05</Month>
<Day>20</Day>
</DateCompleted>
<Article PubModel="Print-Electronic">
<Journal>
<ISSN IssnType="Electronic">2329-4051</ISSN>
<JournalIssue CitedMedium="Internet">
<Volume>8</Volume>
<Issue>3</Issue>
<PubDate>
<MedlineDate>2015 Jul-Sep</MedlineDate>
</PubDate>
</JournalIssue>
<Title>IEEE transactions on haptics</Title>
<ISOAbbreviation>IEEE Trans Haptics</ISOAbbreviation>
</Journal>
<ArticleTitle>Telerobotic Haptic Exploration in Art Galleries and Museums for Individuals with Visual Impairments.</ArticleTitle>
<Pagination>
<MedlinePgn>327-38</MedlinePgn>
</Pagination>
<ELocationID EIdType="doi" ValidYN="Y">10.1109/TOH.2015.2460253</ELocationID>
<Abstract>
<AbstractText>This paper presents a haptic telepresence system that enables visually impaired users to explore locations with rich visual observation such as art galleries and museums by using a telepresence robot, a RGB-D sensor (color and depth camera), and a haptic interface. The recent improvement on RGB-D sensors has enabled real-time access to 3D spatial information in the form of point clouds. However, the real-time representation of this data in the form of tangible haptic experience has not been challenged enough, especially in the case of telepresence for individuals with visual impairments. Thus, the proposed system addresses the real-time haptic exploration of remote 3D information through video encoding and real-time 3D haptic rendering of the remote real-world environment. This paper investigates two scenarios in haptic telepresence, i.e., mobile navigation and object exploration in a remote environment. Participants with and without visual impairments participated in our experiments based on the two scenarios, and the system performance was validated. In conclusion, the proposed framework provides a new methodology of haptic telepresence for individuals with visual impairments by providing an enhanced interactive experience where they can remotely access public places (art galleries and museums) with the aid of haptic modality and robotic telepresence.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Park</LastName>
<ForeName>Chung Hyuk</ForeName>
<Initials>CH</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Ryu</LastName>
<ForeName>Eun-Seok</ForeName>
<Initials>ES</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Howard</LastName>
<ForeName>Ayanna M</ForeName>
<Initials>AM</Initials>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
<PublicationType UI="D013486">Research Support, U.S. Gov't, Non-P.H.S.</PublicationType>
</PublicationTypeList>
<ArticleDate DateType="Electronic">
<Year>2015</Year>
<Month>07</Month>
<Day>23</Day>
</ArticleDate>
</Article>
<MedlineJournalInfo>
<Country>United States</Country>
<MedlineTA>IEEE Trans Haptics</MedlineTA>
<NlmUniqueID>101491191</NlmUniqueID>
<ISSNLinking>1939-1412</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>IM</CitationSubset>
<MeshHeadingList>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D000465">Algorithms</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D003196">Computer Graphics</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D003198">Computer Simulation</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D003626">Data Display</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D005260">Female</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D006801">Humans</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D008297">Male</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D019212">Multimedia</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D009144">Museums</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D012371">Robotics</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000379">methods</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D012656">Self-Help Devices</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D012682">Sensory Aids</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D013647">Task Performance and Analysis</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D014110">Touch</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000502">physiology</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D014584">User-Computer Interface</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D014786">Vision Disorders</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000534">rehabilitation</QualifierName>
</MeshHeading>
</MeshHeadingList>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="aheadofprint">
<Year>2015</Year>
<Month>7</Month>
<Day>23</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez">
<Year>2015</Year>
<Month>7</Month>
<Day>29</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed">
<Year>2015</Year>
<Month>7</Month>
<Day>29</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2016</Year>
<Month>5</Month>
<Day>21</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>ppublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="doi">10.1109/TOH.2015.2460253</ArticleId>
<ArticleId IdType="pubmed">26219098</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
</record>

To manipulate this document under Unix (Dilib)

# Set EXPLOR_STEP to the corpus directory, then extract record 000287
# from the bibliographic store and pretty-print its XML:
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PubMed/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000287 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 000287 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PubMed
   |étape=   Corpus
   |type=    RBID
   |clé=     pubmed:26219098
   |texte=   Telerobotic Haptic Exploration in Art Galleries and Museums for Individuals with Visual Impairments.
}}

To generate wiki pages

# Find the record by its RBID in the index, extract it from the
# bibliographic store, then convert it into wiki pages for the HapticV1 area:
HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i   -Sk "pubmed:26219098" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024