Exploration server for haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information is therefore not validated.

Internal identifier: 000440 (PubMed/Corpus); previous: 000439; next: 000441

Depth camera-based 3D hand gesture controls with immersive tactile feedback for natural mid-air gesture interactions.

Authors: Kwangtaek Kim; Joongrock Kim; Jaesung Choi; Junghyun Kim; Sangyoun Lee

Source: Sensors (Basel, Switzerland), 2015

RBID: pubmed:25580901


Abstract

Vision-based hand gesture interactions are natural and intuitive when interacting with computers, since we naturally exploit gestures to communicate with other people. However, it is agreed that users suffer from discomfort and fatigue when using gesture-controlled interfaces, due to the lack of physical feedback. To solve the problem, we propose a novel complete solution of a hand gesture control system employing immersive tactile feedback to the user's hand. For this goal, we first developed a fast and accurate hand-tracking algorithm with a Kinect sensor using the proposed MLBP (modified local binary pattern) that can efficiently analyze 3D shapes in depth images. The superiority of our tracking method was verified in terms of tracking accuracy and speed by comparing with existing methods, Natural Interaction Technology for End-user (NITE), 3D Hand Tracker and CamShift. As the second step, a new tactile feedback technology with a piezoelectric actuator has been developed and integrated into the developed hand tracking algorithm, including the DTW (dynamic time warping) gesture recognition algorithm for a complete solution of an immersive gesture control system. The quantitative and qualitative evaluations of the integrated system were conducted with human subjects, and the results demonstrate that our gesture control with tactile feedback is a promising technology compared to a vision-based gesture control system that has typically no feedback for the user's gesture inputs. Our study provides researchers and designers with informative guidelines to develop more natural gesture control systems or immersive user interfaces with haptic feedback.
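The abstract names two well-known building blocks: a local binary pattern (LBP) descriptor computed on depth pixels (the paper's MLBP is a modified variant whose details are not given on this page) and dynamic time warping (DTW) for matching gesture trajectories. The Python sketch below shows a plain 8-neighbour LBP and a textbook DTW distance under those standard definitions; it is illustrative only, and the function names and data layout are assumptions, not the authors' implementation.

import numpy as np

def lbp_code(depth, y, x):
    # Plain 8-neighbour local binary pattern at an interior pixel of a
    # depth image: each neighbour at least as deep as the centre sets
    # one bit. The paper's MLBP modifies this basic scheme.
    center = depth[y, x]
    neighbours = [depth[y - 1, x - 1], depth[y - 1, x], depth[y - 1, x + 1],
                  depth[y, x + 1], depth[y + 1, x + 1], depth[y + 1, x],
                  depth[y + 1, x - 1], depth[y, x - 1]]
    code = 0
    for i, n in enumerate(neighbours):
        if n >= center:
            code |= 1 << i
    return code

def dtw_distance(a, b):
    # Textbook DTW between two trajectories given as (T, d) arrays of
    # hand positions over time; returns the accumulated alignment cost.
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

A recognizer in this style would compare a tracked hand trajectory against one stored template per gesture and report the gesture with the smallest DTW distance.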

DOI: 10.3390/s150101022
PubMed: 25580901

Links to Exploration step

pubmed:25580901

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Depth camera-based 3D hand gesture controls with immersive tactile feedback for natural mid-air gesture interactions.</title>
<author>
<name sortKey="Kim, Kwangtaek" sort="Kim, Kwangtaek" uniqKey="Kim K" first="Kwangtaek" last="Kim">Kwangtaek Kim</name>
<affiliation>
<nlm:affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-IT Technology (Best), Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. kwangtaekkim@yonsei.ac.kr.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Kim, Joongrock" sort="Kim, Joongrock" uniqKey="Kim J" first="Joongrock" last="Kim">Joongrock Kim</name>
<affiliation>
<nlm:affiliation>Future IT Convergence Lab, LGE Advanced Research Institute, 38 Baumoe-ro, Seocho-gu, Seoul 137-724, Korea. jurock.kim@lge.com.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Choi, Jaesung" sort="Choi, Jaesung" uniqKey="Choi J" first="Jaesung" last="Choi">Jaesung Choi</name>
<affiliation>
<nlm:affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-IT Technology (Best), Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. ciyciyciy@yonsei.ac.kr.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Kim, Junghyun" sort="Kim, Junghyun" uniqKey="Kim J" first="Junghyun" last="Kim">Junghyun Kim</name>
<affiliation>
<nlm:affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-IT Technology (Best), Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. jhkim_1012@yonsei.ac.kr.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Lee, Sangyoun" sort="Lee, Sangyoun" uniqKey="Lee S" first="Sangyoun" last="Lee">Sangyoun Lee</name>
<affiliation>
<nlm:affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-IT Technology (Best), Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. syleee@yonsei.ac.kr.</nlm:affiliation>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="2015">2015</date>
<idno type="doi">10.3390/s150101022</idno>
<idno type="RBID">pubmed:25580901</idno>
<idno type="pmid">25580901</idno>
<idno type="wicri:Area/PubMed/Corpus">000440</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Depth camera-based 3D hand gesture controls with immersive tactile feedback for natural mid-air gesture interactions.</title>
<author>
<name sortKey="Kim, Kwangtaek" sort="Kim, Kwangtaek" uniqKey="Kim K" first="Kwangtaek" last="Kim">Kwangtaek Kim</name>
<affiliation>
<nlm:affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-IT Technology (Best), Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. kwangtaekkim@yonsei.ac.kr.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Kim, Joongrock" sort="Kim, Joongrock" uniqKey="Kim J" first="Joongrock" last="Kim">Joongrock Kim</name>
<affiliation>
<nlm:affiliation>Future IT Convergence Lab, LGE Advanced Research Institute, 38 Baumoe-ro, Seocho-gu, Seoul 137-724, Korea. jurock.kim@lge.com.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Choi, Jaesung" sort="Choi, Jaesung" uniqKey="Choi J" first="Jaesung" last="Choi">Jaesung Choi</name>
<affiliation>
<nlm:affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-IT Technology (Best), Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. ciyciyciy@yonsei.ac.kr.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Kim, Junghyun" sort="Kim, Junghyun" uniqKey="Kim J" first="Junghyun" last="Kim">Junghyun Kim</name>
<affiliation>
<nlm:affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-IT Technology (Best), Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. jhkim_1012@yonsei.ac.kr.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Lee, Sangyoun" sort="Lee, Sangyoun" uniqKey="Lee S" first="Sangyoun" last="Lee">Sangyoun Lee</name>
<affiliation>
<nlm:affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-IT Technology (Best), Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. syleee@yonsei.ac.kr.</nlm:affiliation>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Sensors (Basel, Switzerland)</title>
<idno type="eISSN">1424-8220</idno>
<imprint>
<date when="2015" type="published">2015</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Adult</term>
<term>Air</term>
<term>Algorithms</term>
<term>Feedback, Physiological</term>
<term>Female</term>
<term>Gestures</term>
<term>Hand (physiology)</term>
<term>Humans</term>
<term>Male</term>
<term>Photography (instrumentation)</term>
<term>Signal Processing, Computer-Assisted</term>
<term>Surveys and Questionnaires</term>
<term>Touch (physiology)</term>
</keywords>
<keywords scheme="MESH" qualifier="instrumentation" xml:lang="en">
<term>Photography</term>
</keywords>
<keywords scheme="MESH" qualifier="physiology" xml:lang="en">
<term>Hand</term>
<term>Touch</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Adult</term>
<term>Air</term>
<term>Algorithms</term>
<term>Feedback, Physiological</term>
<term>Female</term>
<term>Gestures</term>
<term>Humans</term>
<term>Male</term>
<term>Signal Processing, Computer-Assisted</term>
<term>Surveys and Questionnaires</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">Vision-based hand gesture interactions are natural and intuitive when interacting with computers, since we naturally exploit gestures to communicate with other people. However, it is agreed that users suffer from discomfort and fatigue when using gesture-controlled interfaces, due to the lack of physical feedback. To solve the problem, we propose a novel complete solution of a hand gesture control system employing immersive tactile feedback to the user's hand. For this goal, we first developed a fast and accurate hand-tracking algorithm with a Kinect sensor using the proposed MLBP (modified local binary pattern) that can efficiently analyze 3D shapes in depth images. The superiority of our tracking method was verified in terms of tracking accuracy and speed by comparing with existing methods, Natural Interaction Technology for End-user (NITE), 3D Hand Tracker and CamShift. As the second step, a new tactile feedback technology with a piezoelectric actuator has been developed and integrated into the developed hand tracking algorithm, including the DTW (dynamic time warping) gesture recognition algorithm for a complete solution of an immersive gesture control system. The quantitative and qualitative evaluations of the integrated system were conducted with human subjects, and the results demonstrate that our gesture control with tactile feedback is a promising technology compared to a vision-based gesture control system that has typically no feedback for the user's gesture inputs. Our study provides researchers and designers with informative guidelines to develop more natural gesture control systems or immersive user interfaces with haptic feedback.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Owner="NLM" Status="MEDLINE">
<PMID Version="1">25580901</PMID>
<DateCreated>
<Year>2015</Year>
<Month>01</Month>
<Day>13</Day>
</DateCreated>
<DateCompleted>
<Year>2015</Year>
<Month>08</Month>
<Day>27</Day>
</DateCompleted>
<DateRevised>
<Year>2015</Year>
<Month>11</Month>
<Day>19</Day>
</DateRevised>
<Article PubModel="Electronic">
<Journal>
<ISSN IssnType="Electronic">1424-8220</ISSN>
<JournalIssue CitedMedium="Internet">
<Volume>15</Volume>
<Issue>1</Issue>
<PubDate>
<Year>2015</Year>
</PubDate>
</JournalIssue>
<Title>Sensors (Basel, Switzerland)</Title>
<ISOAbbreviation>Sensors (Basel)</ISOAbbreviation>
</Journal>
<ArticleTitle>Depth camera-based 3D hand gesture controls with immersive tactile feedback for natural mid-air gesture interactions.</ArticleTitle>
<Pagination>
<MedlinePgn>1022-46</MedlinePgn>
</Pagination>
<ELocationID EIdType="doi" ValidYN="Y">10.3390/s150101022</ELocationID>
<Abstract>
<AbstractText>Vision-based hand gesture interactions are natural and intuitive when interacting with computers, since we naturally exploit gestures to communicate with other people. However, it is agreed that users suffer from discomfort and fatigue when using gesture-controlled interfaces, due to the lack of physical feedback. To solve the problem, we propose a novel complete solution of a hand gesture control system employing immersive tactile feedback to the user's hand. For this goal, we first developed a fast and accurate hand-tracking algorithm with a Kinect sensor using the proposed MLBP (modified local binary pattern) that can efficiently analyze 3D shapes in depth images. The superiority of our tracking method was verified in terms of tracking accuracy and speed by comparing with existing methods, Natural Interaction Technology for End-user (NITE), 3D Hand Tracker and CamShift. As the second step, a new tactile feedback technology with a piezoelectric actuator has been developed and integrated into the developed hand tracking algorithm, including the DTW (dynamic time warping) gesture recognition algorithm for a complete solution of an immersive gesture control system. The quantitative and qualitative evaluations of the integrated system were conducted with human subjects, and the results demonstrate that our gesture control with tactile feedback is a promising technology compared to a vision-based gesture control system that has typically no feedback for the user's gesture inputs. Our study provides researchers and designers with informative guidelines to develop more natural gesture control systems or immersive user interfaces with haptic feedback.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Kim</LastName>
<ForeName>Kwangtaek</ForeName>
<Initials>K</Initials>
<AffiliationInfo>
<Affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-IT Technology (Best), Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. kwangtaekkim@yonsei.ac.kr.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Kim</LastName>
<ForeName>Joongrock</ForeName>
<Initials>J</Initials>
<AffiliationInfo>
<Affiliation>Future IT Convergence Lab, LGE Advanced Research Institute, 38 Baumoe-ro, Seocho-gu, Seoul 137-724, Korea. jurock.kim@lge.com.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Choi</LastName>
<ForeName>Jaesung</ForeName>
<Initials>J</Initials>
<AffiliationInfo>
<Affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-IT Technology (Best), Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. ciyciyciy@yonsei.ac.kr.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Kim</LastName>
<ForeName>Junghyun</ForeName>
<Initials>J</Initials>
<AffiliationInfo>
<Affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-IT Technology (Best), Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. jhkim_1012@yonsei.ac.kr.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Lee</LastName>
<ForeName>Sangyoun</ForeName>
<Initials>S</Initials>
<AffiliationInfo>
<Affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-IT Technology (Best), Yonsei University, 50 Yonsei-ro, Seodaemun-gu, Seoul 120-749, Korea. syleee@yonsei.ac.kr.</Affiliation>
</AffiliationInfo>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
<PublicationType UI="D013485">Research Support, Non-U.S. Gov't</PublicationType>
</PublicationTypeList>
<ArticleDate DateType="Electronic">
<Year>2015</Year>
<Month>01</Month>
<Day>08</Day>
</ArticleDate>
</Article>
<MedlineJournalInfo>
<Country>Switzerland</Country>
<MedlineTA>Sensors (Basel)</MedlineTA>
<NlmUniqueID>101204366</NlmUniqueID>
<ISSNLinking>1424-8220</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>IM</CitationSubset>
<CommentsCorrectionsList>
<CommentsCorrections RefType="Cites">
<RefSource>IEEE Trans Pattern Anal Mach Intell. 2006 Sep;28(9):1372-84</RefSource>
<PMID Version="1">16929725</PMID>
</CommentsCorrections>
<CommentsCorrections RefType="Cites">
<RefSource>J Am Med Inform Assoc. 2008 May-Jun;15(3):321-3</RefSource>
<PMID Version="1">18451034</PMID>
</CommentsCorrections>
<CommentsCorrections RefType="Cites">
<RefSource>World J Urol. 2012 Oct;30(5):687-91</RefSource>
<PMID Version="1">22580994</PMID>
</CommentsCorrections>
<CommentsCorrections RefType="Cites">
<RefSource>Res Dev Disabil. 2011 Nov-Dec;32(6):2566-70</RefSource>
<PMID Version="1">21784612</PMID>
</CommentsCorrections>
<CommentsCorrections RefType="Cites">
<RefSource>Conf Proc IEEE Eng Med Biol Soc. 2010;2010:3690-3</RefSource>
<PMID Version="1">21096856</PMID>
</CommentsCorrections>
</CommentsCorrectionsList>
<MeshHeadingList>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D000328">Adult</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D000388">Air</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D000465">Algorithms</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D025461">Feedback, Physiological</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D005260">Female</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D005868">Gestures</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D006225">Hand</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000502">physiology</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D006801">Humans</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D008297">Male</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D010781">Photography</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000295">instrumentation</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D012815">Signal Processing, Computer-Assisted</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D011795">Surveys and Questionnaires</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D014110">Touch</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000502">physiology</QualifierName>
</MeshHeading>
</MeshHeadingList>
<OtherID Source="NLM">PMC4327062</OtherID>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="received">
<Year>2014</Year>
<Month>10</Month>
<Day>28</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="revised">
<Year>-0001</Year>
<Month>11</Month>
<Day>30</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="accepted">
<Year>2014</Year>
<Month>12</Month>
<Day>25</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez">
<Year>2015</Year>
<Month>1</Month>
<Day>13</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed">
<Year>2015</Year>
<Month>1</Month>
<Day>13</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2015</Year>
<Month>8</Month>
<Day>28</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>epublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="pii">s150101022</ArticleId>
<ArticleId IdType="doi">10.3390/s150101022</ArticleId>
<ArticleId IdType="pubmed">25580901</ArticleId>
<ArticleId IdType="pmc">PMC4327062</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PubMed/Corpus
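# HfdSelect extracts record 000440 from the biblio.hfd store; SxmlIndent pretty-prints the XML for paging (assumed reading of the pipeline).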
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000440 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 000440 | SxmlIndent | more

To link to this page within the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PubMed
   |étape=   Corpus
   |type=    RBID
   |clé=     pubmed:25580901
   |texte=   Depth camera-based 3D hand gesture controls with immersive tactile feedback for natural mid-air gesture interactions.
}}

To generate wiki pages

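# Look up the record by its RBID in the index, pull it from the bibliographic store, then convert it to a Wicri wiki page in the HapticV1 area (assumed reading of the pipeline).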
HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i   -Sk "pubmed:25580901" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024