Virtualized reality interface for tele-micromanipulation
Internal identifier: 000D32 (PascalFrancis/Corpus); previous: 000D31; next: 000D33
Authors: Mehdi Ammi; Antoine Ferreira; Jean-Guy Fontaine
French descriptors
- Pascal (Inist)
English descriptors
- KwdEn
Abstract
Operators have great difficulty manipulating micro/nano-sized objects without the assistance of human-machine interfaces, owing to scaling effects. We developed an immersive telemanipulation system using haptic/visual/sound interfaces for the observation of micro-objects under an optical microscope. Because the microscope image is two-dimensional, it is hard to observe the workspace in 3-D space. To improve real-time observation and manipulation, we proposed a real-time 3-D reconstruction of the microworld using image processing and virtualized-reality techniques. Then, feasible haptically generated paths based on potential-field reaction forces are selected for efficient pushing-based manipulation without collisions. The proposed system guides the operator's gesture while fully immersed in the virtual workspace.
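The abstract mentions path selection driven by potential-field reaction forces. As an illustrative sketch only (the classic Khatib-style repulsive term, not the authors' actual implementation), such a force can be computed like this:

```python
import math

def repulsive_force(probe, obstacle, d0=1.0, eta=0.5):
    """Sketch of a classic repulsive potential-field force.

    probe, obstacle: (x, y) positions; d0: influence radius; eta: gain.
    These parameter names and values are illustrative assumptions,
    not taken from the paper. Returns the 2-D force pushing the probe
    away from the obstacle, or (0, 0) outside the influence radius.
    """
    dx, dy = probe[0] - obstacle[0], probe[1] - obstacle[1]
    d = math.hypot(dx, dy)
    if d >= d0 or d == 0.0:
        return (0.0, 0.0)
    # Gradient magnitude of U_rep = 0.5 * eta * (1/d - 1/d0)^2
    mag = eta * (1.0 / d - 1.0 / d0) / (d * d)
    # Direct the force along the unit vector from obstacle to probe
    return (mag * dx / d, mag * dy / d)

# The force grows sharply near the obstacle and vanishes at distance d0,
# which is what lets a haptic device render obstacles as repelling walls.
near = repulsive_force((0.2, 0.0), (0.0, 0.0))
far = repulsive_force((0.9, 0.0), (0.0, 0.0))
```

In a haptic teleoperation loop, a force of this shape is typically fed back to the master device so the operator feels obstacles before the probe touches them, which is consistent with the collision-free pushing described in the abstract.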
Record in standard format (ISO 2709)
See the documentation on the Inist Standard format.
Inist format (server)
NO: PASCAL 06-0270144 INIST
ET: Virtualized reality interface for tele-micromanipulation
AU: AMMI (Mehdi); FERREIRA (Antoine); FONTAINE (Jean-Guy)
AF: Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle/18020 Bourges/France (1 aut., 2 aut., 3 aut.)
DT: Congrès; Niveau analytique
SO: IEEE International Conference on Robotics and Automation/21/2004/New Orleans LA USA; Etats-Unis; Piscataway NJ: IEEE; Da. 2004; Pp. 2776-2781; ISBN 0-7803-8232-3
LA: Anglais
EA: Operators have great difficulty manipulating micro/nano-sized objects without the assistance of human-machine interfaces, owing to scaling effects. We developed an immersive telemanipulation system using haptic/visual/sound interfaces for the observation of micro-objects under an optical microscope. Because the microscope image is two-dimensional, it is hard to observe the workspace in 3-D space. To improve real-time observation and manipulation, we proposed a real-time 3-D reconstruction of the microworld using image processing and virtualized-reality techniques. Then, feasible haptically generated paths based on potential-field reaction forces are selected for efficient pushing-based manipulation without collisions. The proposed system guides the operator's gesture while fully immersed in the virtual workspace.
CC: 001D02D11; 001D02B04
FD: Homme; Téléopération; Temps réel; Micromanipulation; Echelle nanométrique; Interface utilisateur; Sensibilité tactile; Microscope optique; Domaine travail; Collision; Geste; Traitement image; Force réaction
ED: Human; Remote operation; Real time; Micromanipulation; Nanometer scale; User interface; Tactile sensitivity; Optical microscope; Workspace; Collision; Gesture; Image processing; Reaction force
SD: Hombre; Teleacción; Tiempo real; Micromanipulación; Interfase usuario; Sensibilidad tactil; Microscopio óptico; Dominio trabajo; Colisión; Gesto; Procesamiento imagen; Fuerza reacción
LO: INIST-Y 38842.354000153471324470
ID: 06-0270144
Links to Exploration step
Pascal:06-0270144
The document in XML format
<record><TEI><teiHeader><fileDesc><titleStmt><title xml:lang="en" level="a">Virtualized reality interface for tele-micromanipulation</title>
<author><name sortKey="Ammi, Mehdi" sort="Ammi, Mehdi" uniqKey="Ammi M" first="Mehdi" last="Ammi">Mehdi Ammi</name>
<affiliation><inist:fA14 i1="01"><s1>Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle</s1>
<s2>18020 Bourges</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Ferreira, Antoine" sort="Ferreira, Antoine" uniqKey="Ferreira A" first="Antoine" last="Ferreira">Antoine Ferreira</name>
<affiliation><inist:fA14 i1="01"><s1>Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle</s1>
<s2>18020 Bourges</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Fontaine, Jean Guy" sort="Fontaine, Jean Guy" uniqKey="Fontaine J" first="Jean-Guy" last="Fontaine">Jean-Guy Fontaine</name>
<affiliation><inist:fA14 i1="01"><s1>Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle</s1>
<s2>18020 Bourges</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</titleStmt>
<publicationStmt><idno type="wicri:source">INIST</idno>
<idno type="inist">06-0270144</idno>
<date when="2004">2004</date>
<idno type="stanalyst">PASCAL 06-0270144 INIST</idno>
<idno type="RBID">Pascal:06-0270144</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000D32</idno>
</publicationStmt>
<sourceDesc><biblStruct><analytic><title xml:lang="en" level="a">Virtualized reality interface for tele-micromanipulation</title>
<author><name sortKey="Ammi, Mehdi" sort="Ammi, Mehdi" uniqKey="Ammi M" first="Mehdi" last="Ammi">Mehdi Ammi</name>
<affiliation><inist:fA14 i1="01"><s1>Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle</s1>
<s2>18020 Bourges</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Ferreira, Antoine" sort="Ferreira, Antoine" uniqKey="Ferreira A" first="Antoine" last="Ferreira">Antoine Ferreira</name>
<affiliation><inist:fA14 i1="01"><s1>Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle</s1>
<s2>18020 Bourges</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Fontaine, Jean Guy" sort="Fontaine, Jean Guy" uniqKey="Fontaine J" first="Jean-Guy" last="Fontaine">Jean-Guy Fontaine</name>
<affiliation><inist:fA14 i1="01"><s1>Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle</s1>
<s2>18020 Bourges</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</analytic>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc><textClass><keywords scheme="KwdEn" xml:lang="en"><term>Collision</term>
<term>Gesture</term>
<term>Human</term>
<term>Image processing</term>
<term>Micromanipulation</term>
<term>Nanometer scale</term>
<term>Optical microscope</term>
<term>Reaction force</term>
<term>Real time</term>
<term>Remote operation</term>
<term>Tactile sensitivity</term>
<term>User interface</term>
<term>Workspace</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr"><term>Homme</term>
<term>Téléopération</term>
<term>Temps réel</term>
<term>Micromanipulation</term>
<term>Echelle nanométrique</term>
<term>Interface utilisateur</term>
<term>Sensibilité tactile</term>
<term>Microscope optique</term>
<term>Domaine travail</term>
<term>Collision</term>
<term>Geste</term>
<term>Traitement image</term>
<term>Force réaction</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front><div type="abstract" xml:lang="en">Operators have great difficulty manipulating micro/nano-sized objects without the assistance of human-machine interfaces, owing to scaling effects. We developed an immersive telemanipulation system using haptic/visual/sound interfaces for the observation of micro-objects under an optical microscope. Because the microscope image is two-dimensional, it is hard to observe the workspace in 3-D space. To improve real-time observation and manipulation, we proposed a real-time 3-D reconstruction of the microworld using image processing and virtualized-reality techniques. Then, feasible haptically generated paths based on potential-field reaction forces are selected for efficient pushing-based manipulation without collisions. The proposed system guides the operator's gesture while fully immersed in the virtual workspace.</div>
</front>
</TEI>
<inist><standard h6="B"><pA><fA08 i1="01" i2="1" l="ENG"><s1>Virtualized reality interface for tele-micromanipulation</s1>
</fA08>
<fA09 i1="01" i2="1" l="ENG"><s1>2004 IEEE International Conference on Robotics and Automation : April 26-May 1, 2004, Hilton New Orleans Riverside, New Orleans, LA, USA : Proceedings</s1>
</fA09>
<fA11 i1="01" i2="1"><s1>AMMI (Mehdi)</s1>
</fA11>
<fA11 i1="02" i2="1"><s1>FERREIRA (Antoine)</s1>
</fA11>
<fA11 i1="03" i2="1"><s1>FONTAINE (Jean-Guy)</s1>
</fA11>
<fA14 i1="01"><s1>Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle</s1>
<s2>18020 Bourges</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</fA14>
<fA18 i1="01" i2="1"><s1>IEEE Robotics and automation society</s1>
<s3>USA</s3>
<s9>org-cong.</s9>
</fA18>
<fA20><s1>2776-2781</s1>
</fA20>
<fA21><s1>2004</s1>
</fA21>
<fA23 i1="01"><s0>ENG</s0>
</fA23>
<fA25 i1="01"><s1>IEEE</s1>
<s2>Piscataway NJ</s2>
</fA25>
<fA26 i1="01"><s0>0-7803-8232-3</s0>
</fA26>
<fA30 i1="01" i2="1" l="ENG"><s1>IEEE International Conference on Robotics and Automation</s1>
<s2>21</s2>
<s3>New Orleans LA USA</s3>
<s4>2004</s4>
</fA30>
<fA43 i1="01"><s1>INIST</s1>
<s2>Y 38842</s2>
<s5>354000153471324470</s5>
</fA43>
<fA44><s0>0000</s0>
<s1>© 2006 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45><s0>11 ref.</s0>
</fA45>
<fA47 i1="01" i2="1"><s0>06-0270144</s0>
</fA47>
<fA60><s1>C</s1>
</fA60>
<fA61><s0>A</s0>
</fA61>
<fA66 i1="01"><s0>USA</s0>
</fA66>
<fC01 i1="01" l="ENG"><s0>Operators have great difficulty manipulating micro/nano-sized objects without the assistance of human-machine interfaces, owing to scaling effects. We developed an immersive telemanipulation system using haptic/visual/sound interfaces for the observation of micro-objects under an optical microscope. Because the microscope image is two-dimensional, it is hard to observe the workspace in 3-D space. To improve real-time observation and manipulation, we proposed a real-time 3-D reconstruction of the microworld using image processing and virtualized-reality techniques. Then, feasible haptically generated paths based on potential-field reaction forces are selected for efficient pushing-based manipulation without collisions. The proposed system guides the operator's gesture while fully immersed in the virtual workspace.</s0>
</fC01>
<fC02 i1="01" i2="X"><s0>001D02D11</s0>
</fC02>
<fC02 i1="02" i2="X"><s0>001D02B04</s0>
</fC02>
<fC03 i1="01" i2="X" l="FRE"><s0>Homme</s0>
<s5>06</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG"><s0>Human</s0>
<s5>06</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA"><s0>Hombre</s0>
<s5>06</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE"><s0>Téléopération</s0>
<s5>07</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG"><s0>Remote operation</s0>
<s5>07</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA"><s0>Teleacción</s0>
<s5>07</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE"><s0>Temps réel</s0>
<s5>08</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG"><s0>Real time</s0>
<s5>08</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA"><s0>Tiempo real</s0>
<s5>08</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE"><s0>Micromanipulation</s0>
<s5>18</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG"><s0>Micromanipulation</s0>
<s5>18</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA"><s0>Micromanipulación</s0>
<s5>18</s5>
</fC03>
<fC03 i1="05" i2="3" l="FRE"><s0>Echelle nanométrique</s0>
<s5>19</s5>
</fC03>
<fC03 i1="05" i2="3" l="ENG"><s0>Nanometer scale</s0>
<s5>19</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE"><s0>Interface utilisateur</s0>
<s5>20</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG"><s0>User interface</s0>
<s5>20</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA"><s0>Interfase usuario</s0>
<s5>20</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE"><s0>Sensibilité tactile</s0>
<s5>21</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG"><s0>Tactile sensitivity</s0>
<s5>21</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA"><s0>Sensibilidad tactil</s0>
<s5>21</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE"><s0>Microscope optique</s0>
<s5>22</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG"><s0>Optical microscope</s0>
<s5>22</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA"><s0>Microscopio óptico</s0>
<s5>22</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE"><s0>Domaine travail</s0>
<s5>23</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG"><s0>Workspace</s0>
<s5>23</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA"><s0>Dominio trabajo</s0>
<s5>23</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE"><s0>Collision</s0>
<s5>24</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG"><s0>Collision</s0>
<s5>24</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA"><s0>Colisión</s0>
<s5>24</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE"><s0>Geste</s0>
<s5>25</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG"><s0>Gesture</s0>
<s5>25</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA"><s0>Gesto</s0>
<s5>25</s5>
</fC03>
<fC03 i1="12" i2="X" l="FRE"><s0>Traitement image</s0>
<s5>28</s5>
</fC03>
<fC03 i1="12" i2="X" l="ENG"><s0>Image processing</s0>
<s5>28</s5>
</fC03>
<fC03 i1="12" i2="X" l="SPA"><s0>Procesamiento imagen</s0>
<s5>28</s5>
</fC03>
<fC03 i1="13" i2="X" l="FRE"><s0>Force réaction</s0>
<s5>29</s5>
</fC03>
<fC03 i1="13" i2="X" l="ENG"><s0>Reaction force</s0>
<s5>29</s5>
</fC03>
<fC03 i1="13" i2="X" l="SPA"><s0>Fuerza reacción</s0>
<s5>29</s5>
</fC03>
<fN21><s1>170</s1>
</fN21>
<fN44 i1="01"><s1>OTO</s1>
</fN44>
<fN82><s1>OTO</s1>
</fN82>
</pA>
</standard>
<server><NO>PASCAL 06-0270144 INIST</NO>
<ET>Virtualized reality interface for tele-micromanipulation</ET>
<AU>AMMI (Mehdi); FERREIRA (Antoine); FONTAINE (Jean-Guy)</AU>
<AF>Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle/18020 Bourges/France (1 aut., 2 aut., 3 aut.)</AF>
<DT>Congrès; Niveau analytique</DT>
<SO>IEEE International Conference on Robotics and Automation/21/2004/New Orleans LA USA; Etats-Unis; Piscataway NJ: IEEE; Da. 2004; Pp. 2776-2781; ISBN 0-7803-8232-3</SO>
<LA>Anglais</LA>
<EA>Operators have great difficulty manipulating micro/nano-sized objects without the assistance of human-machine interfaces, owing to scaling effects. We developed an immersive telemanipulation system using haptic/visual/sound interfaces for the observation of micro-objects under an optical microscope. Because the microscope image is two-dimensional, it is hard to observe the workspace in 3-D space. To improve real-time observation and manipulation, we proposed a real-time 3-D reconstruction of the microworld using image processing and virtualized-reality techniques. Then, feasible haptically generated paths based on potential-field reaction forces are selected for efficient pushing-based manipulation without collisions. The proposed system guides the operator's gesture while fully immersed in the virtual workspace.</EA>
<CC>001D02D11; 001D02B04</CC>
<FD>Homme; Téléopération; Temps réel; Micromanipulation; Echelle nanométrique; Interface utilisateur; Sensibilité tactile; Microscope optique; Domaine travail; Collision; Geste; Traitement image; Force réaction</FD>
<ED>Human; Remote operation; Real time; Micromanipulation; Nanometer scale; User interface; Tactile sensitivity; Optical microscope; Workspace; Collision; Gesture; Image processing; Reaction force</ED>
<SD>Hombre; Teleacción; Tiempo real; Micromanipulación; Interfase usuario; Sensibilidad tactil; Microscopio óptico; Dominio trabajo; Colisión; Gesto; Procesamiento imagen; Fuerza reacción</SD>
<LO>INIST-Y 38842.354000153471324470</LO>
<ID>06-0270144</ID>
</server>
</inist>
</record>
To manipulate this document under Unix (Dilib)
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000D32 | SxmlIndent | more
Or
HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Corpus/biblio.hfd -nk 000D32 | SxmlIndent | more
To link to this page in the Wicri network
{{Explor lien |wiki= Ticri/CIDE |area= HapticV1 |flux= PascalFrancis |étape= Corpus |type= RBID |clé= Pascal:06-0270144 |texte= Virtualized reality interface for tele-micromanipulation }}
This area was generated with Dilib version V0.6.23.