Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Virtualized reality interface for tele-micromanipulation

Internal identifier: 000D32 (PascalFrancis/Corpus); previous: 000D31; next: 000D33


Authors: Mehdi Ammi; Antoine Ferreira; Jean-Guy Fontaine

Source:

RBID : Pascal:06-0270144

French descriptors

English descriptors

Abstract

Due to scaling effects, operators have great difficulty manipulating micro/nano-sized objects without the assistance of human-machine interfaces. We developed an immersive telemanipulation system using haptic/visual/sound interfaces for the observation of micro-objects under an optical microscope. Because the microscope image is two-dimensional, the workspace is hard to observe in 3-D space. To improve real-time observation and manipulation, we proposed a real-time 3-D reconstruction of the microworld using image processing and virtualized reality techniques. Feasible haptically generated paths, based on potential-field reaction forces, are then selected for efficient, collision-free pushing-based manipulation. The proposed system guides the gesture of an operator fully immersed in the virtual workspace.
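The abstract mentions path generation from potential-field reaction forces. As a rough illustration of that general technique only (not the authors' implementation; all function names, gains, and coordinates below are hypothetical), a minimal 2-D artificial-potential-field planner can be sketched as:

```python
import math

def potential_force(pos, goal, obstacles, k_att=1.0, k_rep=100.0, rho0=2.0):
    """Net force at `pos`: attractive pull toward `goal`, plus a repulsive
    push away from each obstacle closer than the influence radius rho0."""
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        rho = math.hypot(dx, dy)
        if 0.0 < rho < rho0:
            mag = k_rep * (1.0 / rho - 1.0 / rho0) / rho ** 2
            fx += mag * dx / rho
            fy += mag * dy / rho
    return fx, fy

def plan_path(start, goal, obstacles, step=0.05, tol=0.1, max_iters=2000):
    """Follow the force field in small fixed steps until near the goal."""
    path = [start]
    pos = start
    for _ in range(max_iters):
        fx, fy = potential_force(pos, goal, obstacles)
        norm = math.hypot(fx, fy)
        if norm < 1e-9:
            break  # stalled in a local minimum of the field
        pos = (pos[0] + step * fx / norm, pos[1] + step * fy / norm)
        path.append(pos)
        if math.hypot(goal[0] - pos[0], goal[1] - pos[1]) < tol:
            break
    return path

# Example: the path bends around an obstacle placed near the straight line.
path = plan_path((0.0, 0.0), (5.0, 5.0), obstacles=[(2.0, 3.0)])
print(len(path), path[-1])
```

The repulsive term grows without bound as the probe approaches an obstacle, which is what makes the same field usable both for planning and for rendering reaction forces on a haptic device.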

Record in standard format (ISO 2709)

See the documentation on the Inist Standard format.

pA  
A08 01  1  ENG  @1 Virtualized reality interface for tele-micromanipulation
A09 01  1  ENG  @1 2004 IEEE International Conference on Robotics and Automation : April 26-May 1, 2004, Hilton New Orleans Riverside, New Orleans, LA, USA : Proceedings
A11 01  1    @1 AMMI (Mehdi)
A11 02  1    @1 FERREIRA (Antoine)
A11 03  1    @1 FONTAINE (Jean-Guy)
A14 01      @1 Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle @2 18020 Bourges @3 FRA @Z 1 aut. @Z 2 aut. @Z 3 aut.
A18 01  1    @1 IEEE Robotics and automation society @3 USA @9 org-cong.
A20       @1 2776-2781
A21       @1 2004
A23 01      @0 ENG
A25 01      @1 IEEE @2 Piscataway NJ
A26 01      @0 0-7803-8232-3
A30 01  1  ENG  @1 IEEE International Conference on Robotics and Automation @2 21 @3 New Orleans LA USA @4 2004
A43 01      @1 INIST @2 Y 38842 @5 354000153471324470
A44       @0 0000 @1 © 2006 INIST-CNRS. All rights reserved.
A45       @0 11 ref.
A47 01  1    @0 06-0270144
A60       @1 C
A61       @0 A
A66 01      @0 USA
C01 01    ENG  @0 Operators suffer much difficulty in manipulating micro/nano-sized objects without the assistance of human-machine interfaces due to scaling effects. We developed an immersive telemanipulation system using haptic/visual/sound interfaces for observation of micro-objects under an optical microscope. As the image of the microscope is two-dimensional, so it is hard to observe the workspace in the 3-D space. To improve the real-time observation and manipulation, we proposed real-time 3-D reconstruction of the microworld using image processing and virtualized reality techniques. Then, feasible haptically-generated paths based on potentials fields reaction forces are selected for efficient pushing-based manipulation without collisions. The proposed system guides the operator's gesture fully immerged in the virtual workspace.
C02 01  X    @0 001D02D11
C02 02  X    @0 001D02B04
C03 01  X  FRE  @0 Homme @5 06
C03 01  X  ENG  @0 Human @5 06
C03 01  X  SPA  @0 Hombre @5 06
C03 02  X  FRE  @0 Téléopération @5 07
C03 02  X  ENG  @0 Remote operation @5 07
C03 02  X  SPA  @0 Teleacción @5 07
C03 03  X  FRE  @0 Temps réel @5 08
C03 03  X  ENG  @0 Real time @5 08
C03 03  X  SPA  @0 Tiempo real @5 08
C03 04  X  FRE  @0 Micromanipulation @5 18
C03 04  X  ENG  @0 Micromanipulation @5 18
C03 04  X  SPA  @0 Micromanipulación @5 18
C03 05  3  FRE  @0 Echelle nanométrique @5 19
C03 05  3  ENG  @0 Nanometer scale @5 19
C03 06  X  FRE  @0 Interface utilisateur @5 20
C03 06  X  ENG  @0 User interface @5 20
C03 06  X  SPA  @0 Interfase usuario @5 20
C03 07  X  FRE  @0 Sensibilité tactile @5 21
C03 07  X  ENG  @0 Tactile sensitivity @5 21
C03 07  X  SPA  @0 Sensibilidad tactil @5 21
C03 08  X  FRE  @0 Microscope optique @5 22
C03 08  X  ENG  @0 Optical microscope @5 22
C03 08  X  SPA  @0 Microscopio óptico @5 22
C03 09  X  FRE  @0 Domaine travail @5 23
C03 09  X  ENG  @0 Workspace @5 23
C03 09  X  SPA  @0 Dominio trabajo @5 23
C03 10  X  FRE  @0 Collision @5 24
C03 10  X  ENG  @0 Collision @5 24
C03 10  X  SPA  @0 Colisión @5 24
C03 11  X  FRE  @0 Geste @5 25
C03 11  X  ENG  @0 Gesture @5 25
C03 11  X  SPA  @0 Gesto @5 25
C03 12  X  FRE  @0 Traitement image @5 28
C03 12  X  ENG  @0 Image processing @5 28
C03 12  X  SPA  @0 Procesamiento imagen @5 28
C03 13  X  FRE  @0 Force réaction @5 29
C03 13  X  ENG  @0 Reaction force @5 29
C03 13  X  SPA  @0 Fuerza reacción @5 29
N21       @1 170
N44 01      @1 OTO
N82       @1 OTO

Inist format (server)

NO : PASCAL 06-0270144 INIST
ET : Virtualized reality interface for tele-micromanipulation
AU : AMMI (Mehdi); FERREIRA (Antoine); FONTAINE (Jean-Guy)
AF : Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle/18020 Bourges/France (1 aut., 2 aut., 3 aut.)
DT : Congrès; Niveau analytique
SO : IEEE International Conference on Robotics and Automation/21/2004/New Orleans LA USA; Etats-Unis; Piscataway NJ: IEEE; Da. 2004; Pp. 2776-2781; ISBN 0-7803-8232-3
LA : Anglais
EA : Operators suffer much difficulty in manipulating micro/nano-sized objects without the assistance of human-machine interfaces due to scaling effects. We developed an immersive telemanipulation system using haptic/visual/sound interfaces for observation of micro-objects under an optical microscope. As the image of the microscope is two-dimensional, so it is hard to observe the workspace in the 3-D space. To improve the real-time observation and manipulation, we proposed real-time 3-D reconstruction of the microworld using image processing and virtualized reality techniques. Then, feasible haptically-generated paths based on potentials fields reaction forces are selected for efficient pushing-based manipulation without collisions. The proposed system guides the operator's gesture fully immerged in the virtual workspace.
CC : 001D02D11; 001D02B04
FD : Homme; Téléopération; Temps réel; Micromanipulation; Echelle nanométrique; Interface utilisateur; Sensibilité tactile; Microscope optique; Domaine travail; Collision; Geste; Traitement image; Force réaction
ED : Human; Remote operation; Real time; Micromanipulation; Nanometer scale; User interface; Tactile sensitivity; Optical microscope; Workspace; Collision; Gesture; Image processing; Reaction force
SD : Hombre; Teleacción; Tiempo real; Micromanipulación; Interfase usuario; Sensibilidad tactil; Microscopio óptico; Dominio trabajo; Colisión; Gesto; Procesamiento imagen; Fuerza reacción
LO : INIST-Y 38842.354000153471324470
ID : 06-0270144

Links to Exploration step

Pascal:06-0270144

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">Virtualized reality interface for tele-micromanipulation</title>
<author>
<name sortKey="Ammi, Mehdi" sort="Ammi, Mehdi" uniqKey="Ammi M" first="Mehdi" last="Ammi">Mehdi Ammi</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle</s1>
<s2>18020 Bourges</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Ferreira, Antoine" sort="Ferreira, Antoine" uniqKey="Ferreira A" first="Antoine" last="Ferreira">Antoine Ferreira</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle</s1>
<s2>18020 Bourges</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Fontaine, Jean Guy" sort="Fontaine, Jean Guy" uniqKey="Fontaine J" first="Jean-Guy" last="Fontaine">Jean-Guy Fontaine</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle</s1>
<s2>18020 Bourges</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">06-0270144</idno>
<date when="2004">2004</date>
<idno type="stanalyst">PASCAL 06-0270144 INIST</idno>
<idno type="RBID">Pascal:06-0270144</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000D32</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">Virtualized reality interface for tele-micromanipulation</title>
<author>
<name sortKey="Ammi, Mehdi" sort="Ammi, Mehdi" uniqKey="Ammi M" first="Mehdi" last="Ammi">Mehdi Ammi</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle</s1>
<s2>18020 Bourges</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Ferreira, Antoine" sort="Ferreira, Antoine" uniqKey="Ferreira A" first="Antoine" last="Ferreira">Antoine Ferreira</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle</s1>
<s2>18020 Bourges</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Fontaine, Jean Guy" sort="Fontaine, Jean Guy" uniqKey="Fontaine J" first="Jean-Guy" last="Fontaine">Jean-Guy Fontaine</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle</s1>
<s2>18020 Bourges</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</analytic>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Collision</term>
<term>Gesture</term>
<term>Human</term>
<term>Image processing</term>
<term>Micromanipulation</term>
<term>Nanometer scale</term>
<term>Optical microscope</term>
<term>Reaction force</term>
<term>Real time</term>
<term>Remote operation</term>
<term>Tactile sensitivity</term>
<term>User interface</term>
<term>Workspace</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Homme</term>
<term>Téléopération</term>
<term>Temps réel</term>
<term>Micromanipulation</term>
<term>Echelle nanométrique</term>
<term>Interface utilisateur</term>
<term>Sensibilité tactile</term>
<term>Microscope optique</term>
<term>Domaine travail</term>
<term>Collision</term>
<term>Geste</term>
<term>Traitement image</term>
<term>Force réaction</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">Operators suffer much difficulty in manipulating micro/nano-sized objects without the assistance of human-machine interfaces due to scaling effects. We developed an immersive telemanipulation system using haptic/visual/sound interfaces for observation of micro-objects under an optical microscope. As the image of the microscope is two-dimensional, so it is hard to observe the workspace in the 3-D space. To improve the real-time observation and manipulation, we proposed real-time 3-D reconstruction of the microworld using image processing and virtualized reality techniques. Then, feasible haptically-generated paths based on potentials fields reaction forces are selected for efficient pushing-based manipulation without collisions. The proposed system guides the operator's gesture fully immerged in the virtual workspace.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA08 i1="01" i2="1" l="ENG">
<s1>Virtualized reality interface for tele-micromanipulation</s1>
</fA08>
<fA09 i1="01" i2="1" l="ENG">
<s1>2004 IEEE International Conference on Robotics and Automation : April 26-May 1, 2004, Hilton New Orleans Riverside, New Orleans, LA, USA : Proceedings</s1>
</fA09>
<fA11 i1="01" i2="1">
<s1>AMMI (Mehdi)</s1>
</fA11>
<fA11 i1="02" i2="1">
<s1>FERREIRA (Antoine)</s1>
</fA11>
<fA11 i1="03" i2="1">
<s1>FONTAINE (Jean-Guy)</s1>
</fA11>
<fA14 i1="01">
<s1>Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle</s1>
<s2>18020 Bourges</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</fA14>
<fA18 i1="01" i2="1">
<s1>IEEE Robotics and automation society</s1>
<s3>USA</s3>
<s9>org-cong.</s9>
</fA18>
<fA20>
<s1>2776-2781</s1>
</fA20>
<fA21>
<s1>2004</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA25 i1="01">
<s1>IEEE</s1>
<s2>Piscataway NJ</s2>
</fA25>
<fA26 i1="01">
<s0>0-7803-8232-3</s0>
</fA26>
<fA30 i1="01" i2="1" l="ENG">
<s1>IEEE International Conference on Robotics and Automation</s1>
<s2>21</s2>
<s3>New Orleans LA USA</s3>
<s4>2004</s4>
</fA30>
<fA43 i1="01">
<s1>INIST</s1>
<s2>Y 38842</s2>
<s5>354000153471324470</s5>
</fA43>
<fA44>
<s0>0000</s0>
<s1>© 2006 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45>
<s0>11 ref.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>06-0270144</s0>
</fA47>
<fA60>
<s1>C</s1>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA66 i1="01">
<s0>USA</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>Operators suffer much difficulty in manipulating micro/nano-sized objects without the assistance of human-machine interfaces due to scaling effects. We developed an immersive telemanipulation system using haptic/visual/sound interfaces for observation of micro-objects under an optical microscope. As the image of the microscope is two-dimensional, so it is hard to observe the workspace in the 3-D space. To improve the real-time observation and manipulation, we proposed real-time 3-D reconstruction of the microworld using image processing and virtualized reality techniques. Then, feasible haptically-generated paths based on potentials fields reaction forces are selected for efficient pushing-based manipulation without collisions. The proposed system guides the operator's gesture fully immerged in the virtual workspace.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>001D02D11</s0>
</fC02>
<fC02 i1="02" i2="X">
<s0>001D02B04</s0>
</fC02>
<fC03 i1="01" i2="X" l="FRE">
<s0>Homme</s0>
<s5>06</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG">
<s0>Human</s0>
<s5>06</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA">
<s0>Hombre</s0>
<s5>06</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE">
<s0>Téléopération</s0>
<s5>07</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG">
<s0>Remote operation</s0>
<s5>07</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA">
<s0>Teleacción</s0>
<s5>07</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE">
<s0>Temps réel</s0>
<s5>08</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG">
<s0>Real time</s0>
<s5>08</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA">
<s0>Tiempo real</s0>
<s5>08</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE">
<s0>Micromanipulation</s0>
<s5>18</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG">
<s0>Micromanipulation</s0>
<s5>18</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA">
<s0>Micromanipulación</s0>
<s5>18</s5>
</fC03>
<fC03 i1="05" i2="3" l="FRE">
<s0>Echelle nanométrique</s0>
<s5>19</s5>
</fC03>
<fC03 i1="05" i2="3" l="ENG">
<s0>Nanometer scale</s0>
<s5>19</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE">
<s0>Interface utilisateur</s0>
<s5>20</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG">
<s0>User interface</s0>
<s5>20</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA">
<s0>Interfase usuario</s0>
<s5>20</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE">
<s0>Sensibilité tactile</s0>
<s5>21</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG">
<s0>Tactile sensitivity</s0>
<s5>21</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA">
<s0>Sensibilidad tactil</s0>
<s5>21</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE">
<s0>Microscope optique</s0>
<s5>22</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG">
<s0>Optical microscope</s0>
<s5>22</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA">
<s0>Microscopio óptico</s0>
<s5>22</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE">
<s0>Domaine travail</s0>
<s5>23</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG">
<s0>Workspace</s0>
<s5>23</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA">
<s0>Dominio trabajo</s0>
<s5>23</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE">
<s0>Collision</s0>
<s5>24</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG">
<s0>Collision</s0>
<s5>24</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA">
<s0>Colisión</s0>
<s5>24</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE">
<s0>Geste</s0>
<s5>25</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG">
<s0>Gesture</s0>
<s5>25</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA">
<s0>Gesto</s0>
<s5>25</s5>
</fC03>
<fC03 i1="12" i2="X" l="FRE">
<s0>Traitement image</s0>
<s5>28</s5>
</fC03>
<fC03 i1="12" i2="X" l="ENG">
<s0>Image processing</s0>
<s5>28</s5>
</fC03>
<fC03 i1="12" i2="X" l="SPA">
<s0>Procesamiento imagen</s0>
<s5>28</s5>
</fC03>
<fC03 i1="13" i2="X" l="FRE">
<s0>Force réaction</s0>
<s5>29</s5>
</fC03>
<fC03 i1="13" i2="X" l="ENG">
<s0>Reaction force</s0>
<s5>29</s5>
</fC03>
<fC03 i1="13" i2="X" l="SPA">
<s0>Fuerza reacción</s0>
<s5>29</s5>
</fC03>
<fN21>
<s1>170</s1>
</fN21>
<fN44 i1="01">
<s1>OTO</s1>
</fN44>
<fN82>
<s1>OTO</s1>
</fN82>
</pA>
</standard>
<server>
<NO>PASCAL 06-0270144 INIST</NO>
<ET>Virtualized reality interface for tele-micromanipulation</ET>
<AU>AMMI (Mehdi); FERREIRA (Antoine); FONTAINE (Jean-Guy)</AU>
<AF>Laboratoire Vision et Robotique, ENSI de Bourges-Université d'Orléans, 10 Bld. Lahitolle/18020 Bourges/France (1 aut., 2 aut., 3 aut.)</AF>
<DT>Congrès; Niveau analytique</DT>
<SO>IEEE International Conference on Robotics and Automation/21/2004/New Orleans LA USA; Etats-Unis; Piscataway NJ: IEEE; Da. 2004; Pp. 2776-2781; ISBN 0-7803-8232-3</SO>
<LA>Anglais</LA>
<EA>Operators suffer much difficulty in manipulating micro/nano-sized objects without the assistance of human-machine interfaces due to scaling effects. We developed an immersive telemanipulation system using haptic/visual/sound interfaces for observation of micro-objects under an optical microscope. As the image of the microscope is two-dimensional, so it is hard to observe the workspace in the 3-D space. To improve the real-time observation and manipulation, we proposed real-time 3-D reconstruction of the microworld using image processing and virtualized reality techniques. Then, feasible haptically-generated paths based on potentials fields reaction forces are selected for efficient pushing-based manipulation without collisions. The proposed system guides the operator's gesture fully immerged in the virtual workspace.</EA>
<CC>001D02D11; 001D02B04</CC>
<FD>Homme; Téléopération; Temps réel; Micromanipulation; Echelle nanométrique; Interface utilisateur; Sensibilité tactile; Microscope optique; Domaine travail; Collision; Geste; Traitement image; Force réaction</FD>
<ED>Human; Remote operation; Real time; Micromanipulation; Nanometer scale; User interface; Tactile sensitivity; Optical microscope; Workspace; Collision; Gesture; Image processing; Reaction force</ED>
<SD>Hombre; Teleacción; Tiempo real; Micromanipulación; Interfase usuario; Sensibilidad tactil; Microscopio óptico; Dominio trabajo; Colisión; Gesto; Procesamiento imagen; Fuerza reacción</SD>
<LO>INIST-Y 38842.354000153471324470</LO>
<ID>06-0270144</ID>
</server>
</inist>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000D32 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Corpus/biblio.hfd -nk 000D32 | SxmlIndent | more

To link to this page from the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PascalFrancis
   |étape=   Corpus
   |type=    RBID
   |clé=     Pascal:06-0270144
   |texte=   Virtualized reality interface for tele-micromanipulation
}}

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024