Exploration server on haptic devices

Warning: this site is still under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Stereo camera based virtual cane system with identifiable distance tactile feedback for the blind.

Internal identifier: 000623 (PubMed/Corpus); previous: 000622; next: 000624


Authors: Donghun Kim; Kwangtaek Kim; Sangyoun Lee

Source:

RBID: pubmed:24932864

English descriptors

Abstract

In this paper, we propose a new haptic-assisted virtual cane system operated by a simple finger-pointing gesture. The system is developed in two stages: development of a visual information delivery assistant (VIDA) with a stereo camera, and addition of a tactile feedback interface with dual actuators for guidance and distance feedback. In the first stage, the user's pointing finger is automatically detected using color and disparity data from stereo images, and the 3D pointing direction of the finger is then estimated from its geometric and textural features. Finally, any object within the estimated pointing trajectory in 3D space is detected and its distance is estimated in real time. In the second stage, identifiable tactile signals are designed through a series of identification experiments, and an identifiable tactile feedback interface is developed and integrated into the VIDA system. Our approach differs in that navigation guidance is provided by a simple finger-pointing gesture and the tactile distance feedback is fully identifiable to the blind.
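
As a rough illustration of the two-stage pipeline summarized above, the sketch below combines a depth-from-disparity estimate with a distance-to-vibration mapping. It is a minimal sketch based on standard pinhole stereo geometry and the staging described in the abstract, not the authors' VIDA implementation: the focal length, baseline, distance bands and the function names (estimate_distance_m, distance_to_tactile_code) are illustrative assumptions.

    # Minimal sketch (not the authors' VIDA code): estimate the distance to the
    # object a user points at from stereo disparity, then map it to one of a few
    # distinguishable tactile codes. Camera parameters and thresholds are assumed.

    FOCAL_PX = 700.0      # assumed focal length in pixels
    BASELINE_M = 0.12     # assumed stereo baseline in meters

    def estimate_distance_m(disparity_px: float) -> float:
        """Standard pinhole stereo relation: Z = f * B / d."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive")
        return FOCAL_PX * BASELINE_M / disparity_px

    def distance_to_tactile_code(distance_m: float) -> str:
        """Map a distance to one of a small set of vibration codes.
        The paper designs such identifiable signals experimentally;
        these bands are placeholders."""
        if distance_m < 1.0:
            return "fast-pulse"    # very close obstacle
        if distance_m < 2.5:
            return "medium-pulse"
        if distance_m < 5.0:
            return "slow-pulse"
        return "no-vibration"      # nothing within useful range

    if __name__ == "__main__":
        # Example: an object along the pointing direction with 60 px disparity.
        d = estimate_distance_m(60.0)
        print(f"estimated distance: {d:.2f} m -> {distance_to_tactile_code(d)}")

In the actual system, the disparity would be read at the point selected along the estimated 3D pointing direction, and the resulting code would drive the dual actuators described in the abstract.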

DOI: 10.3390/s140610412
PubMed: 24932864

Links to Exploration step

pubmed:24932864

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Stereo camera based virtual cane system with identifiable distance tactile feedback for the blind.</title>
<author>
<name sortKey="Kim, Donghun" sort="Kim, Donghun" uniqKey="Kim D" first="Donghun" last="Kim">Donghun Kim</name>
<affiliation>
<nlm:affiliation>School of Electrical and Computer Engineering, Purdue University, West Lafayette, IN 47906, USA. zava@purdue.edu.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Kim, Kwangtaek" sort="Kim, Kwangtaek" uniqKey="Kim K" first="Kwangtaek" last="Kim">Kwangtaek Kim</name>
<affiliation>
<nlm:affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-ITTechnology (Best), Yonsei University, Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. kwangtaekkim@yonsei.ac.kr.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Lee, Sangyoun" sort="Lee, Sangyoun" uniqKey="Lee S" first="Sangyoun" last="Lee">Sangyoun Lee</name>
<affiliation>
<nlm:affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-ITTechnology (Best), Yonsei University, Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. syleee@yonsei.ac.kr.</nlm:affiliation>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="2014">2014</date>
<idno type="doi">10.3390/s140610412</idno>
<idno type="RBID">pubmed:24932864</idno>
<idno type="pmid">24932864</idno>
<idno type="wicri:Area/PubMed/Corpus">000623</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Stereo camera based virtual cane system with identifiable distance tactile feedback for the blind.</title>
<author>
<name sortKey="Kim, Donghun" sort="Kim, Donghun" uniqKey="Kim D" first="Donghun" last="Kim">Donghun Kim</name>
<affiliation>
<nlm:affiliation>School of Electrical and Computer Engineering, Purdue University, West Lafayette, IN 47906, USA. zava@purdue.edu.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Kim, Kwangtaek" sort="Kim, Kwangtaek" uniqKey="Kim K" first="Kwangtaek" last="Kim">Kwangtaek Kim</name>
<affiliation>
<nlm:affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-ITTechnology (Best), Yonsei University, Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. kwangtaekkim@yonsei.ac.kr.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Lee, Sangyoun" sort="Lee, Sangyoun" uniqKey="Lee S" first="Sangyoun" last="Lee">Sangyoun Lee</name>
<affiliation>
<nlm:affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-ITTechnology (Best), Yonsei University, Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. syleee@yonsei.ac.kr.</nlm:affiliation>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Sensors (Basel, Switzerland)</title>
<idno type="eISSN">1424-8220</idno>
<imprint>
<date when="2014" type="published">2014</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Blindness (rehabilitation)</term>
<term>Canes</term>
<term>Equipment Design</term>
<term>Equipment Failure Analysis</term>
<term>Feedback, Physiological</term>
<term>Humans</term>
<term>Imaging, Three-Dimensional (instrumentation)</term>
<term>Orientation</term>
<term>Pattern Recognition, Automated (methods)</term>
<term>Physical Stimulation (instrumentation)</term>
<term>Self-Help Devices</term>
<term>Signal Processing, Computer-Assisted (instrumentation)</term>
<term>Touch</term>
<term>Transducers</term>
<term>User-Computer Interface</term>
</keywords>
<keywords scheme="MESH" qualifier="instrumentation" xml:lang="en">
<term>Imaging, Three-Dimensional</term>
<term>Physical Stimulation</term>
<term>Signal Processing, Computer-Assisted</term>
</keywords>
<keywords scheme="MESH" qualifier="methods" xml:lang="en">
<term>Pattern Recognition, Automated</term>
</keywords>
<keywords scheme="MESH" qualifier="rehabilitation" xml:lang="en">
<term>Blindness</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Canes</term>
<term>Equipment Design</term>
<term>Equipment Failure Analysis</term>
<term>Feedback, Physiological</term>
<term>Humans</term>
<term>Orientation</term>
<term>Self-Help Devices</term>
<term>Touch</term>
<term>Transducers</term>
<term>User-Computer Interface</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">In this paper, we propose a new haptic-assisted virtual cane system operated by a simple finger pointing gesture. The system is developed by two stages: development of visual information delivery assistant (VIDA) with a stereo camera and adding a tactile feedback interface with dual actuators for guidance and distance feedbacks. In the first stage, user's pointing finger is automatically detected using color and disparity data from stereo images and then a 3D pointing direction of the finger is estimated with its geometric and textural features. Finally, any object within the estimated pointing trajectory in 3D space is detected and the distance is then estimated in real time. For the second stage, identifiable tactile signals are designed through a series of identification experiments, and an identifiable tactile feedback interface is developed and integrated into the VIDA system. Our approach differs in that navigation guidance is provided by a simple finger pointing gesture and tactile distance feedbacks are perfectly identifiable to the blind.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Owner="NLM" Status="MEDLINE">
<PMID Version="1">24932864</PMID>
<DateCreated>
<Year>2014</Year>
<Month>06</Month>
<Day>17</Day>
</DateCreated>
<DateCompleted>
<Year>2015</Year>
<Month>06</Month>
<Day>22</Day>
</DateCompleted>
<Article PubModel="Electronic">
<Journal>
<ISSN IssnType="Electronic">1424-8220</ISSN>
<JournalIssue CitedMedium="Internet">
<Volume>14</Volume>
<Issue>6</Issue>
<PubDate>
<Year>2014</Year>
</PubDate>
</JournalIssue>
<Title>Sensors (Basel, Switzerland)</Title>
<ISOAbbreviation>Sensors (Basel)</ISOAbbreviation>
</Journal>
<ArticleTitle>Stereo camera based virtual cane system with identifiable distance tactile feedback for the blind.</ArticleTitle>
<Pagination>
<MedlinePgn>10412-31</MedlinePgn>
</Pagination>
<ELocationID EIdType="doi" ValidYN="Y">10.3390/s140610412</ELocationID>
<Abstract>
<AbstractText>In this paper, we propose a new haptic-assisted virtual cane system operated by a simple finger pointing gesture. The system is developed by two stages: development of visual information delivery assistant (VIDA) with a stereo camera and adding a tactile feedback interface with dual actuators for guidance and distance feedbacks. In the first stage, user's pointing finger is automatically detected using color and disparity data from stereo images and then a 3D pointing direction of the finger is estimated with its geometric and textural features. Finally, any object within the estimated pointing trajectory in 3D space is detected and the distance is then estimated in real time. For the second stage, identifiable tactile signals are designed through a series of identification experiments, and an identifiable tactile feedback interface is developed and integrated into the VIDA system. Our approach differs in that navigation guidance is provided by a simple finger pointing gesture and tactile distance feedbacks are perfectly identifiable to the blind.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Kim</LastName>
<ForeName>Donghun</ForeName>
<Initials>D</Initials>
<AffiliationInfo>
<Affiliation>School of Electrical and Computer Engineering, Purdue University, West Lafayette, IN 47906, USA. zava@purdue.edu.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Kim</LastName>
<ForeName>Kwangtaek</ForeName>
<Initials>K</Initials>
<AffiliationInfo>
<Affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-ITTechnology (Best), Yonsei University, Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. kwangtaekkim@yonsei.ac.kr.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Lee</LastName>
<ForeName>Sangyoun</ForeName>
<Initials>S</Initials>
<AffiliationInfo>
<Affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-ITTechnology (Best), Yonsei University, Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. syleee@yonsei.ac.kr.</Affiliation>
</AffiliationInfo>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
<PublicationType UI="D013485">Research Support, Non-U.S. Gov't</PublicationType>
</PublicationTypeList>
<ArticleDate DateType="Electronic">
<Year>2014</Year>
<Month>06</Month>
<Day>13</Day>
</ArticleDate>
</Article>
<MedlineJournalInfo>
<Country>Switzerland</Country>
<MedlineTA>Sensors (Basel)</MedlineTA>
<NlmUniqueID>101204366</NlmUniqueID>
<ISSNLinking>1424-8220</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>IM</CitationSubset>
<MeshHeadingList>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D001766">Blindness</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000534">rehabilitation</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D002183">Canes</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D004867">Equipment Design</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D019544">Equipment Failure Analysis</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D025461">Feedback, Physiological</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D006801">Humans</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D021621">Imaging, Three-Dimensional</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000295">instrumentation</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D009949">Orientation</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D010363">Pattern Recognition, Automated</DescriptorName>
<QualifierName MajorTopicYN="N" UI="Q000379">methods</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D010812">Physical Stimulation</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000295">instrumentation</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D012656">Self-Help Devices</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D012815">Signal Processing, Computer-Assisted</DescriptorName>
<QualifierName MajorTopicYN="N" UI="Q000295">instrumentation</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D014110">Touch</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D014159">Transducers</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D014584">User-Computer Interface</DescriptorName>
</MeshHeading>
</MeshHeadingList>
<OtherID Source="NLM">PMC4118356</OtherID>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="received">
<Year>2014</Year>
<Month>1</Month>
<Day>23</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="revised">
<Year>2014</Year>
<Month>5</Month>
<Day>26</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="accepted">
<Year>2014</Year>
<Month>6</Month>
<Day>05</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez">
<Year>2014</Year>
<Month>6</Month>
<Day>17</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed">
<Year>2014</Year>
<Month>6</Month>
<Day>17</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2015</Year>
<Month>6</Month>
<Day>24</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>epublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="pii">s140610412</ArticleId>
<ArticleId IdType="doi">10.3390/s140610412</ArticleId>
<ArticleId IdType="pubmed">24932864</ArticleId>
<ArticleId IdType="pmc">PMC4118356</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PubMed/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000623 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 000623 | SxmlIndent | more

To link to this page within the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PubMed
   |étape=   Corpus
   |type=    RBID
   |clé=     pubmed:24932864
   |texte=   Stereo camera based virtual cane system with identifiable distance tactile feedback for the blind.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i   -Sk "pubmed:24932864" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024