Exploration server on haptic devices

Warning: this site is under development!
Warning: this site was generated automatically from raw corpora.
The information has therefore not been validated.

Multi-sensory surgical support system incorporating, tactile, visual and auditory perception modalities.

Internal identifier: 006518 (Main/Merge); previous: 006517; next: 006519


Authors: Sadao Omata [Japan]; Yoshinobu Murayama; Christos E. Constantinou

Source:

RBID: pubmed:15718762

French descriptors

English descriptors

Abstract

The incorporation of novel broad band sensory modalities, integrating tactile technology, with visual and auditory signals into the evolution of the next generation of surgical robotic is likely to significantly enhance their utility and safety. In this paper considerations are made of a system, where tactile information together with visual and audio feedback are integrated into a multisensory surgical support platform. The tactile sensor system uses a piezoelectric transducer (PZT) system to evaluate the haptic properties of tissues. The spatial position of the sensor is tracked by a video camera, visualizing the location of the marker. Tactile information is additionally converted to an audio signal, to represent tissue properties in terms of a frequency/amplitude modulated signal. Representative data were obtained from biological tissues demonstrating that the technology developed has potential applications in virtual systems or robotic tele-medical care. In view of these technical developments, consideration is made as to whether visual audio and tactile modalities act as independent sources of information.
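The abstract describes converting tactile (stiffness) readings from the PZT sensor into a frequency/amplitude-modulated audio cue. The Python sketch below is purely illustrative and is not the authors' published algorithm: it assumes a normalized stiffness value and a linear mapping onto pitch and loudness, and the names stiffness_to_tone, BASE_FREQ_HZ and MAX_FREQ_HZ are hypothetical names introduced here.

    # Illustrative sketch only: maps a normalized tissue-stiffness reading to a short
    # audio tone whose frequency and amplitude both grow with stiffness (assumed mapping).
    import numpy as np

    SAMPLE_RATE = 44_100      # audio samples per second
    BASE_FREQ_HZ = 220.0      # tone for the softest tissue (assumed)
    MAX_FREQ_HZ = 880.0       # tone for the stiffest tissue (assumed)

    def stiffness_to_tone(stiffness: float, duration_s: float = 0.2) -> np.ndarray:
        """Map a stiffness reading (0.0 = soft, 1.0 = stiff) to a short audio tone."""
        stiffness = float(np.clip(stiffness, 0.0, 1.0))
        freq = BASE_FREQ_HZ + stiffness * (MAX_FREQ_HZ - BASE_FREQ_HZ)
        amplitude = 0.2 + 0.8 * stiffness
        t = np.linspace(0.0, duration_s, int(SAMPLE_RATE * duration_s), endpoint=False)
        return amplitude * np.sin(2.0 * np.pi * freq * t)

    if __name__ == "__main__":
        # Example: three probe readings of increasing stiffness produce three tones.
        tones = [stiffness_to_tone(s) for s in (0.1, 0.5, 0.9)]
        audio = np.concatenate(tones)
        print(f"Generated {audio.size} samples ({audio.size / SAMPLE_RATE:.2f} s of audio)")

Under these assumptions, a stiffer contact is heard as a higher and louder tone, which is one plausible reading of the "frequency/amplitude modulated signal" mentioned in the abstract.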

PubMed: 15718762

Links to previous steps (curation, corpus, ...)


Links to Exploration step

pubmed:15718762

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Multi-sensory surgical support system incorporating, tactile, visual and auditory perception modalities.</title>
<author>
<name sortKey="Omata, Sadao" sort="Omata, Sadao" uniqKey="Omata S" first="Sadao" last="Omata">Sadao Omata</name>
<affiliation wicri:level="1">
<nlm:affiliation>College of Engineering, Nihon University, Koriyama, Fukushima, Japan.</nlm:affiliation>
<country xml:lang="fr">Japon</country>
<wicri:regionArea>College of Engineering, Nihon University, Koriyama, Fukushima</wicri:regionArea>
<wicri:noRegion>Fukushima</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Murayama, Yoshinobu" sort="Murayama, Yoshinobu" uniqKey="Murayama Y" first="Yoshinobu" last="Murayama">Yoshinobu Murayama</name>
</author>
<author>
<name sortKey="Constantinou, Christos E" sort="Constantinou, Christos E" uniqKey="Constantinou C" first="Christos E" last="Constantinou">Christos E. Constantinou</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="2005">2005</date>
<idno type="RBID">pubmed:15718762</idno>
<idno type="pmid">15718762</idno>
<idno type="wicri:Area/PubMed/Corpus">001973</idno>
<idno type="wicri:Area/PubMed/Curation">001973</idno>
<idno type="wicri:Area/PubMed/Checkpoint">001695</idno>
<idno type="wicri:Area/Ncbi/Merge">000709</idno>
<idno type="wicri:Area/Ncbi/Curation">000709</idno>
<idno type="wicri:Area/Ncbi/Checkpoint">000709</idno>
<idno type="wicri:doubleKey">0926-9630:2005:Omata S:multi:sensory:surgical</idno>
<idno type="wicri:Area/Main/Merge">006518</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Multi-sensory surgical support system incorporating, tactile, visual and auditory perception modalities.</title>
<author>
<name sortKey="Omata, Sadao" sort="Omata, Sadao" uniqKey="Omata S" first="Sadao" last="Omata">Sadao Omata</name>
<affiliation wicri:level="1">
<nlm:affiliation>College of Engineering, Nihon University, Koriyama, Fukushima, Japan.</nlm:affiliation>
<country xml:lang="fr">Japon</country>
<wicri:regionArea>College of Engineering, Nihon University, Koriyama, Fukushima</wicri:regionArea>
<wicri:noRegion>Fukushima</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Murayama, Yoshinobu" sort="Murayama, Yoshinobu" uniqKey="Murayama Y" first="Yoshinobu" last="Murayama">Yoshinobu Murayama</name>
</author>
<author>
<name sortKey="Constantinou, Christos E" sort="Constantinou, Christos E" uniqKey="Constantinou C" first="Christos E" last="Constantinou">Christos E. Constantinou</name>
</author>
</analytic>
<series>
<title level="j">Studies in health technology and informatics</title>
<idno type="ISSN">0926-9630</idno>
<imprint>
<date when="2005" type="published">2005</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Hearing</term>
<term>Humans</term>
<term>Surgery, Computer-Assisted (instrumentation)</term>
<term>Touch</term>
<term>United States</term>
<term>Vision, Ocular</term>
</keywords>
<keywords scheme="MESH" type="geographic" xml:lang="en">
<term>United States</term>
</keywords>
<keywords scheme="MESH" qualifier="instrumentation" xml:lang="en">
<term>Surgery, Computer-Assisted</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Hearing</term>
<term>Humans</term>
<term>Touch</term>
<term>Vision, Ocular</term>
</keywords>
<keywords scheme="Wicri" type="geographic" xml:lang="fr">
<term>États-Unis</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">The incorporation of novel broad band sensory modalities, integrating tactile technology, with visual and auditory signals into the evolution of the next generation of surgical robotic is likely to significantly enhance their utility and safety. In this paper considerations are made of a system, where tactile information together with visual and audio feedback are integrated into a multisensory surgical support platform. The tactile sensor system uses a piezoelectric transducer (PZT) system to evaluate the haptic properties of tissues. The spatial position of the sensor is tracked by a video camera, visualizing the location of the marker. Tactile information is additionally converted to an audio signal, to represent tissue properties in terms of a frequency/amplitude modulated signal. Representative data were obtained from biological tissues demonstrating that the technology developed has potential applications in virtual systems or robotic tele-medical care. In view of these technical developments, consideration is made as to whether visual audio and tactile modalities act as independent sources of information.</div>
</front>
</TEI>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Main/Merge
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 006518 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Main/Merge/biblio.hfd -nk 006518 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Main
   |étape=   Merge
   |type=    RBID
   |clé=     pubmed:15718762
   |texte=   Multi-sensory surgical support system incorporating, tactile, visual and auditory perception modalities.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Main/Merge/RBID.i   -Sk "pubmed:15718762" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Main/Merge/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024