Exploration server on haptic devices

Warning: this site is under development!
Warning: this site was generated automatically from raw corpora.
The information has therefore not been validated.

A new haptic interface for VR medical training.

Internal identifier: 000604 (Ncbi/Merge); previous: 000603; next: 000605


Authors: Robert Riener [Germany]; Rainer Burgkart

Source:

RBID: pubmed:15458120

English descriptors

Abstract

Successful applications of haptic displays have been limited to tool-based interfaces that simulate haptic effects on surgical and other medical instruments. However, no satisfactory haptic display exists so far that enables the simulation of high-fidelity palpation of human tissue or body segments. Existing approaches developed for medical training fail due to unrealistic haptic effects, time-consuming donning and doffing, and inconvenient use (e.g., mechatronic tactile and kinesthetic displays), or due to restricted function and adjustability (e.g., passive mannequins). The key idea of the new haptic interface is to attach artificial organs or segments (e.g., a plastic leg) to a force-actuating mechatronic unit (e.g., a robot). A set of different materials combined in certain layers yields components that look and feel like real objects. When the user touches the artificial object, the contact forces and position changes are measured and fed into a model-based controller. The actuator then moves the object so that the user gets the impression of having induced the movement. The new haptic display has been verified with a setup developed for training functional joint evaluation after knee injuries. Compared to classical approaches, this display is convenient to use, provides realistic tactile properties, and can be partly adjusted to different system properties (e.g., pathological joint properties). This kind of interface can be applied to many medical applications in which the clinician directly touches human limbs or tissue, such as obstetrics, reanimation, and organ palpation.
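The control loop described in the abstract (measured contact forces fed into a model-based controller that commands the actuator) is, in essence, an admittance-control scheme. The following is a minimal illustrative sketch only, assuming a simple mass-damper model and made-up parameters; the function and its constants are hypothetical and are not the authors' implementation:

```python
def simulate_palpation(forces, mass=2.0, damping=8.0, dt=0.01):
    """Integrate an assumed mass-damper admittance model,
    m*a + b*v = f_measured, so the commanded limb position
    follows the user's contact force and the motion feels
    self-induced. All parameters are illustrative."""
    position, velocity = 0.0, 0.0
    trajectory = []
    for f in forces:
        # model-based controller: force in, motion out
        acceleration = (f - damping * velocity) / mass
        velocity += acceleration * dt
        position += velocity * dt
        trajectory.append(position)  # setpoint sent to the actuator each cycle
    return trajectory

# A constant push should move the limb steadily in the force direction.
traj = simulate_palpation([5.0] * 200)
```

Under this sketch, a sustained 5 N push makes the commanded position rise monotonically toward a steady drift, which is the behavior a compliant limb segment would exhibit under palpation.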

PubMed: 15458120

Links to previous steps (curation, corpus, ...)


Links to Exploration step

pubmed:15458120

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">A new haptic interface for VR medical training.</title>
<author>
<name sortKey="Riener, Robert" sort="Riener, Robert" uniqKey="Riener R" first="Robert" last="Riener">Robert Riener</name>
<affiliation wicri:level="4">
<nlm:affiliation>Institute of Automatic Control Engineering, Klinikum Rechts der Isar, Technische Universität München, 80290 Munich, Germany.</nlm:affiliation>
<country xml:lang="fr">Allemagne</country>
<wicri:regionArea>Institute of Automatic Control Engineering, Klinikum Rechts der Isar, Technische Universität München, 80290 Munich</wicri:regionArea>
<wicri:noRegion>80290 Munich</wicri:noRegion>
<placeName>
<settlement type="city">Munich</settlement>
<region type="land" nuts="1">Bavière</region>
<region type="district" nuts="2">District de Haute-Bavière</region>
</placeName>
<orgName type="university">Université technique de Munich</orgName>
<placeName>
<settlement type="city">Munich</settlement>
<region type="land" nuts="1">Bavière</region>
<region type="district" nuts="2">District de Haute-Bavière</region>
</placeName>
</affiliation>
</author>
<author>
<name sortKey="Burgkart, Rainer" sort="Burgkart, Rainer" uniqKey="Burgkart R" first="Rainer" last="Burgkart">Rainer Burgkart</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="2002">2002</date>
<idno type="RBID">pubmed:15458120</idno>
<idno type="pmid">15458120</idno>
<idno type="wicri:Area/PubMed/Corpus">001A62</idno>
<idno type="wicri:Area/PubMed/Curation">001A62</idno>
<idno type="wicri:Area/PubMed/Checkpoint">001A46</idno>
<idno type="wicri:Area/Ncbi/Merge">000604</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">A new haptic interface for VR medical training.</title>
<author>
<name sortKey="Riener, Robert" sort="Riener, Robert" uniqKey="Riener R" first="Robert" last="Riener">Robert Riener</name>
<affiliation wicri:level="4">
<nlm:affiliation>Institute of Automatic Control Engineering, Klinikum Rechts der Isar, Technische Universität München, 80290 Munich, Germany.</nlm:affiliation>
<country xml:lang="fr">Allemagne</country>
<wicri:regionArea>Institute of Automatic Control Engineering, Klinikum Rechts der Isar, Technische Universität München, 80290 Munich</wicri:regionArea>
<wicri:noRegion>80290 Munich</wicri:noRegion>
<placeName>
<settlement type="city">Munich</settlement>
<region type="land" nuts="1">Bavière</region>
<region type="district" nuts="2">District de Haute-Bavière</region>
</placeName>
<orgName type="university">Université technique de Munich</orgName>
<placeName>
<settlement type="city">Munich</settlement>
<region type="land" nuts="1">Bavière</region>
<region type="district" nuts="2">District de Haute-Bavière</region>
</placeName>
</affiliation>
</author>
<author>
<name sortKey="Burgkart, Rainer" sort="Burgkart, Rainer" uniqKey="Burgkart R" first="Rainer" last="Burgkart">Rainer Burgkart</name>
</author>
</analytic>
<series>
<title level="j">Studies in health technology and informatics</title>
<idno type="ISSN">0926-9630</idno>
<imprint>
<date when="2002" type="published">2002</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Artificial Organs</term>
<term>Biomechanical Phenomena</term>
<term>Biophysical Phenomena</term>
<term>Biophysics</term>
<term>Computer Simulation</term>
<term>Computer Systems</term>
<term>Data Display</term>
<term>Education, Medical</term>
<term>Feedback</term>
<term>Humans</term>
<term>Knee Injuries (physiopathology)</term>
<term>Knee Joint (physiopathology)</term>
<term>Manikins</term>
<term>Range of Motion, Articular</term>
<term>Robotics (instrumentation)</term>
<term>Software</term>
<term>Touch</term>
<term>User-Computer Interface</term>
</keywords>
<keywords scheme="MESH" qualifier="instrumentation" xml:lang="en">
<term>Robotics</term>
</keywords>
<keywords scheme="MESH" qualifier="physiopathology" xml:lang="en">
<term>Knee Injuries</term>
<term>Knee Joint</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Artificial Organs</term>
<term>Biomechanical Phenomena</term>
<term>Biophysical Phenomena</term>
<term>Biophysics</term>
<term>Computer Simulation</term>
<term>Computer Systems</term>
<term>Data Display</term>
<term>Education, Medical</term>
<term>Feedback</term>
<term>Humans</term>
<term>Manikins</term>
<term>Range of Motion, Articular</term>
<term>Software</term>
<term>Touch</term>
<term>User-Computer Interface</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">Successful applications of haptic displays are limited to tool-based interfaces that simulate haptic effects on surgical and other medical instruments. However, no satisfactory haptic display exist so far, that enable the simulation of high fidelity palpation of human tissue or body segments. Existing approaches developed for medical training fail due to unrealistic haptic effects, time-consuming donning and doffing, and inconvenient use (e.g., mechatronic tactile and kinesthetic displays) or due to restricted function and adjustability (e.g., passive mannequins). The key idea of the new haptic interface is to attach artificial organs or segments (e.g. a plastic leg) to a force actuating mechatronic unit (e.g. robot). A set of different materials combined in certain layers yield components that look and feel like real objects. When the user touches the artificial object the contact forces and position changes are measured and fed into a model-based controller. Thus, the actuator moves the object so that the user gets the impression that he had induced the movement. The new haptic display has been verified with a setup developed for the training of functional joint evaluation after knee injuries. Compared to classical approaches, this display is convenient to use, provides realistic tactile properties and can be partly adjusted to different system properties (e.g. pathological joint properties). This kind of new interface can be applied to many different medical applications, where the clinician directly touches human limbs or tissue, such as in obstetrics, reanimation, organ palpation, etc.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Owner="NLM" Status="MEDLINE">
<PMID Version="1">15458120</PMID>
<DateCreated>
<Year>2004</Year>
<Month>10</Month>
<Day>01</Day>
</DateCreated>
<DateCompleted>
<Year>2004</Year>
<Month>11</Month>
<Day>02</Day>
</DateCompleted>
<DateRevised>
<Year>2013</Year>
<Month>11</Month>
<Day>21</Day>
</DateRevised>
<Article PubModel="Print">
<Journal>
<ISSN IssnType="Print">0926-9630</ISSN>
<JournalIssue CitedMedium="Print">
<Volume>85</Volume>
<PubDate>
<Year>2002</Year>
</PubDate>
</JournalIssue>
<Title>Studies in health technology and informatics</Title>
<ISOAbbreviation>Stud Health Technol Inform</ISOAbbreviation>
</Journal>
<ArticleTitle>A new haptic interface for VR medical training.</ArticleTitle>
<Pagination>
<MedlinePgn>388-94</MedlinePgn>
</Pagination>
<Abstract>
<AbstractText>Successful applications of haptic displays are limited to tool-based interfaces that simulate haptic effects on surgical and other medical instruments. However, no satisfactory haptic display exist so far, that enable the simulation of high fidelity palpation of human tissue or body segments. Existing approaches developed for medical training fail due to unrealistic haptic effects, time-consuming donning and doffing, and inconvenient use (e.g., mechatronic tactile and kinesthetic displays) or due to restricted function and adjustability (e.g., passive mannequins). The key idea of the new haptic interface is to attach artificial organs or segments (e.g. a plastic leg) to a force actuating mechatronic unit (e.g. robot). A set of different materials combined in certain layers yield components that look and feel like real objects. When the user touches the artificial object the contact forces and position changes are measured and fed into a model-based controller. Thus, the actuator moves the object so that the user gets the impression that he had induced the movement. The new haptic display has been verified with a setup developed for the training of functional joint evaluation after knee injuries. Compared to classical approaches, this display is convenient to use, provides realistic tactile properties and can be partly adjusted to different system properties (e.g. pathological joint properties). This kind of new interface can be applied to many different medical applications, where the clinician directly touches human limbs or tissue, such as in obstetrics, reanimation, organ palpation, etc.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Riener</LastName>
<ForeName>Robert</ForeName>
<Initials>R</Initials>
<AffiliationInfo>
<Affiliation>Institute of Automatic Control Engineering, Klinikum Rechts der Isar, Technische Universität München, 80290 Munich, Germany.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Burgkart</LastName>
<ForeName>Rainer</ForeName>
<Initials>R</Initials>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
<PublicationType UI="D013485">Research Support, Non-U.S. Gov't</PublicationType>
</PublicationTypeList>
</Article>
<MedlineJournalInfo>
<Country>Netherlands</Country>
<MedlineTA>Stud Health Technol Inform</MedlineTA>
<NlmUniqueID>9214582</NlmUniqueID>
<ISSNLinking>0926-9630</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>T</CitationSubset>
<MeshHeadingList>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D001187">Artificial Organs</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D001696">Biomechanical Phenomena</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D055592">Biophysical Phenomena</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D001703">Biophysics</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D003198">Computer Simulation</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D003199">Computer Systems</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D003626">Data Display</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D004501">Education, Medical</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D005246">Feedback</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D006801">Humans</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D007718">Knee Injuries</DescriptorName>
<QualifierName MajorTopicYN="N" UI="Q000503">physiopathology</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D007719">Knee Joint</DescriptorName>
<QualifierName MajorTopicYN="N" UI="Q000503">physiopathology</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D008348">Manikins</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D016059">Range of Motion, Articular</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D012371">Robotics</DescriptorName>
<QualifierName MajorTopicYN="N" UI="Q000295">instrumentation</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D012984">Software</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D014110">Touch</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D014584">User-Computer Interface</DescriptorName>
</MeshHeading>
</MeshHeadingList>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="pubmed">
<Year>2004</Year>
<Month>10</Month>
<Day>2</Day>
<Hour>5</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2004</Year>
<Month>11</Month>
<Day>4</Day>
<Hour>9</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez">
<Year>2004</Year>
<Month>10</Month>
<Day>2</Day>
<Hour>5</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>ppublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="pubmed">15458120</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
<affiliations>
<list>
<country>
<li>Allemagne</li>
</country>
<region>
<li>Bavière</li>
<li>District de Haute-Bavière</li>
</region>
<settlement>
<li>Munich</li>
</settlement>
<orgName>
<li>Université technique de Munich</li>
</orgName>
</list>
<tree>
<noCountry>
<name sortKey="Burgkart, Rainer" sort="Burgkart, Rainer" uniqKey="Burgkart R" first="Rainer" last="Burgkart">Rainer Burgkart</name>
</noCountry>
<country name="Allemagne">
<region name="Bavière">
<name sortKey="Riener, Robert" sort="Riener, Robert" uniqKey="Riener R" first="Robert" last="Riener">Robert Riener</name>
</region>
</country>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Ncbi/Merge
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000604 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd -nk 000604 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Ncbi
   |étape=   Merge
   |type=    RBID
   |clé=     pubmed:15458120
   |texte=   A new haptic interface for VR medical training.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/RBID.i   -Sk "pubmed:15458120" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024