Exploration server on haptic devices

Warning: this site is under development!
Warning: this site was generated by automated means from raw corpora.
The information is therefore not validated.

BrainTrain: brain simulator for medical VR application.

Internal identifier: 006568 (Main/Merge); previous: 006567; next: 006569

Authors: Bundit Panchaphongsaphak [Switzerland]; Rainer Burgkart; Robert Riener

Source:

RBID : pubmed:15718764

English descriptors

Abstract

The brain is known as the most complex organ in the human body. Due to this complexity, learning and understanding the anatomy and functions of the cerebral cortex without effective learning assistance is difficult for medical novices and students in the health and biological sciences. In this paper, we present a new virtual reality (VR) simulator for neurological education and neurosurgery. The system is based on a new three-dimensional (3D) user-computer interface design with a tangible object and a force-torque sensor. It is combined with highly interactive computer-generated graphics and acoustics to provide multi-modal interaction through the user's sensory channels (visual, tactile, haptic and auditory). The system allows the user to feel the simulated object through the physical model that forms the interface device, while exploring or interacting with the corresponding computer-generated object in the virtual environment (VE). Unlike other passive interface devices, our system can detect the position and orientation of the interacting force in real time, based on the system's set-up and a force-torque data acquisition technique. As long as the user is touching the model, the position of the user's fingertip in the VE can be determined and synchronized with the finger's motion in the physical world, without requiring an additional six-degree-of-freedom tracking device. Prior work has shown the use of this set-up in medical applications. We demonstrate the system for neurological education and neurosurgery as a recent application. The main functions of the simulator contribute to education in neuroanatomy and to visualization for diagnosis and pre-surgical planning. Once the user has touched the model, the system marks the associated anatomical region and provides information about it as a text note and/or sound.
The user can switch from the anatomy module to the brain-function module, which gives details of the motor, sensory or other cortical functions associated with the touched areas. In addition, the user can generate and visualize arbitrary cross-sectional images corresponding to the magnetic resonance imaging (MRI) datasets, either for training or for diagnostic purposes. The user can manipulate the cross-section image interactively and intuitively by moving a finger on the interface device.
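The fingertip-localization idea in the abstract — recovering the contact point from a single force-torque reading instead of a 6-DOF tracker — can be sketched as follows. The record does not include the authors' algorithm, so this is a generic single-point-contact scheme under stated assumptions: the contact transmits a pure force (no contact torque), and the model surface is approximated by a sphere centered at the sensor origin (the real device would intersect against the brain model's actual geometry). From the measured force f and torque t, the relation t = r × f fixes the line of action of the force; intersecting that line with the known surface yields the contact point.

```python
import numpy as np

def contact_point(force, torque, radius):
    """Estimate a single contact point on a sphere of the given radius,
    centered at the force-torque sensor origin.

    Assumes the contact transmits a pure force f, so the measured torque
    is t = r x f. That fixes r only up to a shift along f's line of
    action; intersecting the line with the known surface removes the
    ambiguity.
    """
    f = np.asarray(force, dtype=float)
    t = np.asarray(torque, dtype=float)
    f2 = f @ f
    if f2 < 1e-12:
        raise ValueError("no significant contact force")
    # Point on the line of action closest to the origin; r0 is
    # perpendicular to f by construction (r0 = (f x t) / |f|^2).
    r0 = np.cross(f, t) / f2
    d2 = radius**2 - r0 @ r0
    if d2 < 0:
        raise ValueError("line of action misses the surface")
    # The line meets the sphere twice; the contact point is the one the
    # force pushes into the surface from (r . f < 0): the negative root.
    s = -np.sqrt(d2 / f2)
    return r0 + s * f
```

For the actual device, the sphere would be replaced by a ray-mesh intersection against the model's surface mesh; the resulting point can then index the anatomical region to highlight.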

PubMed: 15718764

Links to previous steps (curation, corpus, ...)


Links to Exploration step

pubmed:15718764

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">BrainTrain: brain simulator for medical VR application.</title>
<author>
<name sortKey="Panchaphongsaphak, Bundit" sort="Panchaphongsaphak, Bundit" uniqKey="Panchaphongsaphak B" first="Bundit" last="Panchaphongsaphak">Bundit Panchaphongsaphak</name>
<affiliation wicri:level="1">
<nlm:affiliation>Automatic Control Laboratory, Swiss Federal Institute of Technology, Switzerland.</nlm:affiliation>
<country xml:lang="fr">Suisse</country>
<wicri:regionArea>Automatic Control Laboratory, Swiss Federal Institute of Technology</wicri:regionArea>
<wicri:noRegion>Swiss Federal Institute of Technology</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Burgkart, Rainer" sort="Burgkart, Rainer" uniqKey="Burgkart R" first="Rainer" last="Burgkart">Rainer Burgkart</name>
</author>
<author>
<name sortKey="Riener, Robert" sort="Riener, Robert" uniqKey="Riener R" first="Robert" last="Riener">Robert Riener</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="2005">2005</date>
<idno type="RBID">pubmed:15718764</idno>
<idno type="pmid">15718764</idno>
<idno type="wicri:Area/PubMed/Corpus">001972</idno>
<idno type="wicri:Area/PubMed/Curation">001972</idno>
<idno type="wicri:Area/PubMed/Checkpoint">001739</idno>
<idno type="wicri:Area/Ncbi/Merge">000710</idno>
<idno type="wicri:Area/Ncbi/Curation">000710</idno>
<idno type="wicri:Area/Ncbi/Checkpoint">000710</idno>
<idno type="wicri:doubleKey">0926-9630:2005:Panchaphongsaphak B:braintrain:brain:simulator</idno>
<idno type="wicri:Area/Main/Merge">006568</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">BrainTrain: brain simulator for medical VR application.</title>
<author>
<name sortKey="Panchaphongsaphak, Bundit" sort="Panchaphongsaphak, Bundit" uniqKey="Panchaphongsaphak B" first="Bundit" last="Panchaphongsaphak">Bundit Panchaphongsaphak</name>
<affiliation wicri:level="1">
<nlm:affiliation>Automatic Control Laboratory, Swiss Federal Institute of Technology, Switzerland.</nlm:affiliation>
<country xml:lang="fr">Suisse</country>
<wicri:regionArea>Automatic Control Laboratory, Swiss Federal Institute of Technology</wicri:regionArea>
<wicri:noRegion>Swiss Federal Institute of Technology</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Burgkart, Rainer" sort="Burgkart, Rainer" uniqKey="Burgkart R" first="Rainer" last="Burgkart">Rainer Burgkart</name>
</author>
<author>
<name sortKey="Riener, Robert" sort="Riener, Robert" uniqKey="Riener R" first="Robert" last="Riener">Robert Riener</name>
</author>
</analytic>
<series>
<title level="j">Studies in health technology and informatics</title>
<idno type="ISSN">0926-9630</idno>
<imprint>
<date when="2005" type="published">2005</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Brain (anatomy &amp; histology)</term>
<term>Computer Simulation</term>
<term>Humans</term>
<term>User-Computer Interface</term>
</keywords>
<keywords scheme="MESH" qualifier="anatomy &amp; histology" xml:lang="en">
<term>Brain</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Computer Simulation</term>
<term>Humans</term>
<term>User-Computer Interface</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">The brain is known as the most complex organ in the human body. Due to this complexity, learning and understanding the anatomy and functions of the cerebral cortex without effective learning assistance is difficult for medical novices and students in the health and biological sciences. In this paper, we present a new virtual reality (VR) simulator for neurological education and neurosurgery. The system is based on a new three-dimensional (3D) user-computer interface design with a tangible object and a force-torque sensor. It is combined with highly interactive computer-generated graphics and acoustics to provide multi-modal interaction through the user's sensory channels (visual, tactile, haptic and auditory). The system allows the user to feel the simulated object through the physical model that forms the interface device, while exploring or interacting with the corresponding computer-generated object in the virtual environment (VE). Unlike other passive interface devices, our system can detect the position and orientation of the interacting force in real time, based on the system's set-up and a force-torque data acquisition technique. As long as the user is touching the model, the position of the user's fingertip in the VE can be determined and synchronized with the finger's motion in the physical world, without requiring an additional six-degree-of-freedom tracking device. Prior work has shown the use of this set-up in medical applications. We demonstrate the system for neurological education and neurosurgery as a recent application. The main functions of the simulator contribute to education in neuroanatomy and to visualization for diagnosis and pre-surgical planning. Once the user has touched the model, the system marks the associated anatomical region and provides information about it as a text note and/or sound.
The user can switch from the anatomy module to the brain-function module, which gives details of the motor, sensory or other cortical functions associated with the touched areas. In addition, the user can generate and visualize arbitrary cross-sectional images corresponding to the magnetic resonance imaging (MRI) datasets, either for training or for diagnostic purposes. The user can manipulate the cross-section image interactively and intuitively by moving a finger on the interface device.</div>
</front>
</TEI>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Main/Merge
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 006568 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Main/Merge/biblio.hfd -nk 006568 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Main
   |étape=   Merge
   |type=    RBID
   |clé=     pubmed:15718764
   |texte=   BrainTrain: brain simulator for medical VR application.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Main/Merge/RBID.i   -Sk "pubmed:15718764" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Main/Merge/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024