Perception-based 3D tactile rendering from a single image for human skin examinations by dynamic touch.
Internal identifier: 000584 (PubMed/Corpus); previous: 000583; next: 000585
Authors: K. Kim; S. Lee
Source:
- Skin research and technology : official journal of International Society for Bioengineering and the Skin (ISBS) [and] International Society for Digital Imaging of Skin (ISDIS) [and] International Society for Skin Imaging (ISSI) [ 1600-0846 ] ; 2015.
English descriptors
- KwdEn :
- Adult, Computer Simulation, Elastic Modulus (physiology), Equipment Design, Equipment Failure Analysis, Female, Friction, Humans, Image Interpretation, Computer-Assisted (methods), Imaging, Three-Dimensional (instrumentation), Male, Models, Biological, Palpation (instrumentation), Reproducibility of Results, Sensitivity and Specificity, Skin (anatomy & histology), Skin Physiological Phenomena, Touch (physiology), Touch Perception (physiology), User-Computer Interface, Young Adult.
- MESH :
- anatomy & histology : Skin.
- instrumentation : Imaging, Three-Dimensional, Palpation.
- methods : Image Interpretation, Computer-Assisted.
- physiology : Elastic Modulus, Touch, Touch Perception.
- Adult, Computer Simulation, Equipment Design, Equipment Failure Analysis, Female, Friction, Humans, Male, Models, Biological, Reproducibility of Results, Sensitivity and Specificity, Skin Physiological Phenomena, User-Computer Interface, Young Adult.
Abstract
Diagnosis of skin conditions depends on the assessment of skin surface properties, which are characterized more by tactile properties such as stiffness, roughness, and friction than by visual information. For this reason, adding tactile feedback to existing vision-based diagnosis systems can help dermatologists diagnose skin diseases and disorders more accurately. The goal of our research was therefore to develop a tactile rendering system for skin examinations by dynamic touch.
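The underlying paper converts a single 2D skin image into a 3D haptic surface and then renders it in real time through a force-feedback device. As a purely illustrative sketch of that image-to-surface step (a naive intensity-to-height heuristic, not the authors' perception-based mapping), one might write:

```python
import numpy as np

def image_to_height_map(gray, height_scale=0.5, smooth=1):
    """Convert a grayscale image (2D array, values in [0, 1]) into a toy
    height map: brighter pixels are treated as higher relief.
    NOTE: this is a hypothetical stand-in, NOT the paper's
    perception-based conversion."""
    h = gray.astype(float) * height_scale
    # Crude box-filter smoothing (wrap-around edges) to suppress pixel noise.
    for _ in range(smooth):
        h = (h
             + np.roll(h, 1, axis=0) + np.roll(h, -1, axis=0)
             + np.roll(h, 1, axis=1) + np.roll(h, -1, axis=1)) / 5.0
    return h

def surface_normals(h):
    """Per-pixel unit surface normals of z = h(x, y) via finite differences;
    a haptic renderer would use these to orient contact forces."""
    gy, gx = np.gradient(h)                      # gradients along rows (y) and columns (x)
    n = np.dstack([-gx, -gy, np.ones_like(h)])   # normal of the surface z - h(x, y) = 0
    return n / np.linalg.norm(n, axis=2, keepdims=True)
```

In a rendering loop, the normal under the haptic probe would typically scale a penalty force proportional to penetration depth; the paper additionally incorporates measured skin biomechanics, which this sketch omits.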
DOI: 10.1111/srt.12173
PubMed: 25087469
Links to Exploration step
pubmed:25087469
The document in XML format
<record><TEI><teiHeader><fileDesc><titleStmt><title xml:lang="en">Perception-based 3D tactile rendering from a single image for human skin examinations by dynamic touch.</title>
<author><name sortKey="Kim, K" sort="Kim, K" uniqKey="Kim K" first="K" last="Kim">K. Kim</name>
<affiliation><nlm:affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-IT Technology (BEST), Yonsei University, Seoul, Seodaemun-gu, 120-749, Korea.</nlm:affiliation>
</affiliation>
</author>
<author><name sortKey="Lee, S" sort="Lee, S" uniqKey="Lee S" first="S" last="Lee">S. Lee</name>
</author>
</titleStmt>
<publicationStmt><idno type="wicri:source">PubMed</idno>
<date when="2015">2015</date>
<idno type="RBID">pubmed:25087469</idno>
<idno type="pmid">25087469</idno>
<idno type="doi">10.1111/srt.12173</idno>
<idno type="wicri:Area/PubMed/Corpus">000584</idno>
</publicationStmt>
<sourceDesc><biblStruct><analytic><title xml:lang="en">Perception-based 3D tactile rendering from a single image for human skin examinations by dynamic touch.</title>
<author><name sortKey="Kim, K" sort="Kim, K" uniqKey="Kim K" first="K" last="Kim">K. Kim</name>
<affiliation><nlm:affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-IT Technology (BEST), Yonsei University, Seoul, Seodaemun-gu, 120-749, Korea.</nlm:affiliation>
</affiliation>
</author>
<author><name sortKey="Lee, S" sort="Lee, S" uniqKey="Lee S" first="S" last="Lee">S. Lee</name>
</author>
</analytic>
<series><title level="j">Skin research and technology : official journal of International Society for Bioengineering and the Skin (ISBS) [and] International Society for Digital Imaging of Skin (ISDIS) [and] International Society for Skin Imaging (ISSI)</title>
<idno type="eISSN">1600-0846</idno>
<imprint><date when="2015" type="published">2015</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc><textClass><keywords scheme="KwdEn" xml:lang="en"><term>Adult</term>
<term>Computer Simulation</term>
<term>Elastic Modulus (physiology)</term>
<term>Equipment Design</term>
<term>Equipment Failure Analysis</term>
<term>Female</term>
<term>Friction</term>
<term>Humans</term>
<term>Image Interpretation, Computer-Assisted (methods)</term>
<term>Imaging, Three-Dimensional (instrumentation)</term>
<term>Male</term>
<term>Models, Biological</term>
<term>Palpation (instrumentation)</term>
<term>Reproducibility of Results</term>
<term>Sensitivity and Specificity</term>
<term>Skin (anatomy &amp; histology)</term>
<term>Skin Physiological Phenomena</term>
<term>Touch (physiology)</term>
<term>Touch Perception (physiology)</term>
<term>User-Computer Interface</term>
<term>Young Adult</term>
</keywords>
<keywords scheme="MESH" qualifier="anatomy &amp; histology" xml:lang="en"><term>Skin</term>
</keywords>
<keywords scheme="MESH" qualifier="instrumentation" xml:lang="en"><term>Imaging, Three-Dimensional</term>
<term>Palpation</term>
</keywords>
<keywords scheme="MESH" qualifier="methods" xml:lang="en"><term>Image Interpretation, Computer-Assisted</term>
</keywords>
<keywords scheme="MESH" qualifier="physiology" xml:lang="en"><term>Elastic Modulus</term>
<term>Touch</term>
<term>Touch Perception</term>
</keywords>
<keywords scheme="MESH" xml:lang="en"><term>Adult</term>
<term>Computer Simulation</term>
<term>Equipment Design</term>
<term>Equipment Failure Analysis</term>
<term>Female</term>
<term>Friction</term>
<term>Humans</term>
<term>Male</term>
<term>Models, Biological</term>
<term>Reproducibility of Results</term>
<term>Sensitivity and Specificity</term>
<term>Skin Physiological Phenomena</term>
<term>User-Computer Interface</term>
<term>Young Adult</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front><div type="abstract" xml:lang="en">Diagnosis of skin conditions depends on the assessment of skin surface properties, which are characterized more by tactile properties such as stiffness, roughness, and friction than by visual information. For this reason, adding tactile feedback to existing vision-based diagnosis systems can help dermatologists diagnose skin diseases and disorders more accurately. The goal of our research was therefore to develop a tactile rendering system for skin examinations by dynamic touch.</div>
</front>
</TEI>
<pubmed><MedlineCitation Owner="NLM" Status="MEDLINE"><PMID Version="1">25087469</PMID>
<DateCreated><Year>2015</Year>
<Month>04</Month>
<Day>07</Day>
</DateCreated>
<DateCompleted><Year>2015</Year>
<Month>12</Month>
<Day>28</Day>
</DateCompleted>
<Article PubModel="Print-Electronic"><Journal><ISSN IssnType="Electronic">1600-0846</ISSN>
<JournalIssue CitedMedium="Internet"><Volume>21</Volume>
<Issue>2</Issue>
<PubDate><Year>2015</Year>
<Month>May</Month>
</PubDate>
</JournalIssue>
<Title>Skin research and technology : official journal of International Society for Bioengineering and the Skin (ISBS) [and] International Society for Digital Imaging of Skin (ISDIS) [and] International Society for Skin Imaging (ISSI)</Title>
<ISOAbbreviation>Skin Res Technol</ISOAbbreviation>
</Journal>
<ArticleTitle>Perception-based 3D tactile rendering from a single image for human skin examinations by dynamic touch.</ArticleTitle>
<Pagination><MedlinePgn>164-74</MedlinePgn>
</Pagination>
<ELocationID EIdType="doi" ValidYN="Y">10.1111/srt.12173</ELocationID>
<Abstract><AbstractText Label="BACKGROUND/AIMS" NlmCategory="OBJECTIVE">Diagnosis of skin conditions depends on the assessment of skin surface properties, which are characterized more by tactile properties such as stiffness, roughness, and friction than by visual information. For this reason, adding tactile feedback to existing vision-based diagnosis systems can help dermatologists diagnose skin diseases and disorders more accurately. The goal of our research was therefore to develop a tactile rendering system for skin examinations by dynamic touch.</AbstractText>
<AbstractText Label="METHODS" NlmCategory="METHODS">Our development consists of two stages: converting a single image to a 3D haptic surface, and rendering the generated haptic surface in real time. Conversion from single 2D images to 3D surfaces was implemented using human perception data collected in a psychophysical experiment that measured human visual and haptic sensitivity to 3D skin surface changes. For the second stage, we used real skin biomechanical properties reported in prior studies. Our tactile rendering system is a standalone system that can be used with any single camera and haptic feedback device.</AbstractText>
<AbstractText Label="RESULTS" NlmCategory="RESULTS">We evaluated the performance of our system by conducting an identification experiment with three different skin images and five subjects. The participants had to identify one of the three skin surfaces using a haptic device (Falcon) alone; no visual cue was provided. The results indicate that our system provides sufficient performance to render discernibly different tactile sensations for different skin surfaces.</AbstractText>
<AbstractText Label="CONCLUSION" NlmCategory="CONCLUSIONS">Our system uses only a single skin image and automatically generates a 3D haptic surface based on human haptic perception. Realistic skin interactions can be provided in real time for the purposes of skin diagnosis, simulation, or training. Our system can also be used for other applications such as virtual reality and cosmetics.</AbstractText>
<CopyrightInformation>© 2014 John Wiley &amp; Sons A/S. Published by John Wiley &amp; Sons Ltd.</CopyrightInformation>
</Abstract>
<AuthorList CompleteYN="Y"><Author ValidYN="Y"><LastName>Kim</LastName>
<ForeName>K</ForeName>
<Initials>K</Initials>
<AffiliationInfo><Affiliation>Department of Electrical and Electronic Engineering, Institute of BioMed-IT, Energy-IT and Smart-IT Technology (BEST), Yonsei University, Seoul, Seodaemun-gu, 120-749, Korea.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y"><LastName>Lee</LastName>
<ForeName>S</ForeName>
<Initials>S</Initials>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList><PublicationType UI="D016428">Journal Article</PublicationType>
<PublicationType UI="D013485">Research Support, Non-U.S. Gov't</PublicationType>
</PublicationTypeList>
<ArticleDate DateType="Electronic"><Year>2014</Year>
<Month>08</Month>
<Day>04</Day>
</ArticleDate>
</Article>
<MedlineJournalInfo><Country>England</Country>
<MedlineTA>Skin Res Technol</MedlineTA>
<NlmUniqueID>9504453</NlmUniqueID>
<ISSNLinking>0909-752X</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>IM</CitationSubset>
<MeshHeadingList><MeshHeading><DescriptorName MajorTopicYN="N" UI="D000328">Adult</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D003198">Computer Simulation</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D055119">Elastic Modulus</DescriptorName>
<QualifierName MajorTopicYN="N" UI="Q000502">physiology</QualifierName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D004867">Equipment Design</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D019544">Equipment Failure Analysis</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D005260">Female</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D017276">Friction</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D006801">Humans</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D007090">Image Interpretation, Computer-Assisted</DescriptorName>
<QualifierName MajorTopicYN="N" UI="Q000379">methods</QualifierName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D021621">Imaging, Three-Dimensional</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000295">instrumentation</QualifierName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D008297">Male</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D008954">Models, Biological</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D010173">Palpation</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000295">instrumentation</QualifierName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D015203">Reproducibility of Results</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D012680">Sensitivity and Specificity</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D012867">Skin</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000033">anatomy &amp; histology</QualifierName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="Y" UI="D012879">Skin Physiological Phenomena</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D014110">Touch</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000502">physiology</QualifierName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D055698">Touch Perception</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000502">physiology</QualifierName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D014584">User-Computer Interface</DescriptorName>
</MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D055815">Young Adult</DescriptorName>
</MeshHeading>
</MeshHeadingList>
<KeywordList Owner="NOTNLM"><Keyword MajorTopicYN="N">haptics</Keyword>
<Keyword MajorTopicYN="N">perceptual rendering</Keyword>
<Keyword MajorTopicYN="N">skin</Keyword>
<Keyword MajorTopicYN="N">tactile feedback</Keyword>
</KeywordList>
</MedlineCitation>
<PubmedData><History><PubMedPubDate PubStatus="accepted"><Year>2014</Year>
<Month>5</Month>
<Day>16</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="aheadofprint"><Year>2014</Year>
<Month>8</Month>
<Day>4</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez"><Year>2014</Year>
<Month>8</Month>
<Day>5</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed"><Year>2014</Year>
<Month>8</Month>
<Day>5</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline"><Year>2015</Year>
<Month>12</Month>
<Day>29</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>ppublish</PublicationStatus>
<ArticleIdList><ArticleId IdType="pubmed">25087469</ArticleId>
<ArticleId IdType="doi">10.1111/srt.12173</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
</record>
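The record above follows the standard PubMed/MEDLINE XML layout. Assuming that layout (and only elements actually present in the record, such as `PMID`, `ELocationID`, and `MeshHeadingList`), a minimal sketch of extracting the key fields with Python's standard library could look like:

```python
import xml.etree.ElementTree as ET

# A trimmed copy of the MedlineCitation layout from the record above.
RECORD = """<pubmed><MedlineCitation Owner="NLM" Status="MEDLINE">
<PMID Version="1">25087469</PMID>
<Article PubModel="Print-Electronic">
<ArticleTitle>Perception-based 3D tactile rendering from a single image
 for human skin examinations by dynamic touch.</ArticleTitle>
<ELocationID EIdType="doi" ValidYN="Y">10.1111/srt.12173</ELocationID>
</Article>
<MeshHeadingList>
<MeshHeading><DescriptorName MajorTopicYN="N" UI="D014110">Touch</DescriptorName></MeshHeading>
<MeshHeading><DescriptorName MajorTopicYN="Y" UI="D012879">Skin Physiological Phenomena</DescriptorName></MeshHeading>
</MeshHeadingList>
</MedlineCitation></pubmed>"""

def parse_record(xml_text):
    """Extract PMID, title, DOI, and MeSH descriptors from a
    <pubmed><MedlineCitation> record."""
    root = ET.fromstring(xml_text)
    cite = root.find("MedlineCitation")
    return {
        "pmid": cite.findtext("PMID"),
        # Collapse the line break inside the title text.
        "title": " ".join(cite.findtext("Article/ArticleTitle").split()),
        "doi": cite.findtext("Article/ELocationID"),
        "mesh": [d.text for d in cite.iter("DescriptorName")],
    }

info = parse_record(RECORD)
print(info["pmid"], info["doi"])
```

Real PubMed exports can carry multiple `ELocationID` elements (DOI, PII), so production code should filter on the `EIdType` attribute rather than taking the first match as this sketch does.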
To manipulate this document under Unix (Dilib)
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PubMed/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000584 | SxmlIndent | more
Or
HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 000584 | SxmlIndent | more
To link to this page within the Wicri network
{{Explor lien |wiki= Ticri/CIDE |area= HapticV1 |flux= PubMed |étape= Corpus |type= RBID |clé= pubmed:25087469 |texte= Perception-based 3D tactile rendering from a single image for human skin examinations by dynamic touch. }}
To generate wiki pages
HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i -Sk "pubmed:25087469" \
  | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd \
  | NlmPubMed2Wicri -a HapticV1
This area was generated with Dilib version V0.6.23.