Intelligent control for haptic displays
Internal identifier: 001701 (PascalFrancis/Corpus); previous: 001700; next: 001702
Intelligent control for haptic displays
Authors: S. Münch; M. Stangenberg
Source: Computer graphics forum [0167-7055]; 1996.
RBID : Pascal:96-0506574
French descriptors (Pascal, Inist):
- Système conversationnel
- Système asservi
- Système intelligent
- Interface utilisateur
- Entrée sortie
- Boucle réaction
- Représentation graphique
- Relation homme machine
- Interface graphique interactive
- Interface homme machine
- Interface multimodale
English descriptors (KwdEn):
- Feedback
- Feedback system
- Graphics
- Input output
- Intelligent system
- Interactive graphic interface
- Interactive system
- Man machine interface
- Man machine relation
- Multimodal interface
- User interface
Abstract
Usually, a mouse is used for input activities only, whereas output from the computer is sent via the monitor and one or two loudspeakers. But why not use the mouse for output, too? For instance, if it were possible to predict the next interaction object the user wants to click on, a mouse with a mechanical brake could stop the cursor movement at the desired position. This kind of aid is especially attractive for small targets like resize handles of windows or small buttons. In this paper, we present an approach for the integration of haptic feedback in everyday graphical user interfaces. We use a specialized mouse which is able to apply simple haptic information to the user's hand and index finger. A multi-agent system has been designed which 'observes' the user in order to predict the next interaction object and launch haptic feedback, thus supporting positioning actions with the mouse. Although primarily designed in order to provide 'intelligent' haptic feedback, the system can be combined with other output modalities as well, due to its modular and flexible architecture.
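The prediction step the abstract describes can be illustrated with a simple heuristic: score each on-screen target by how well it lies along the cursor's current direction of motion, and pick the nearest well-aligned one. This is only an illustrative sketch, not the authors' multi-agent predictor; all names and thresholds below are assumptions.

```python
import math

def predict_target(pos, vel, targets, max_angle=0.35):
    """Toy stand-in for the paper's predictor: return the target whose
    center lies closest to the cursor's direction of motion, or None."""
    speed = math.hypot(*vel)
    if speed == 0:
        return None  # no movement, nothing to predict
    best, best_score = None, float("inf")
    for t in targets:
        dx, dy = t["x"] - pos[0], t["y"] - pos[1]
        dist = math.hypot(dx, dy)
        if dist == 0:
            return t  # cursor is already on the target
        # Angle between the movement direction and the direction to the target.
        cos_a = (dx * vel[0] + dy * vel[1]) / (dist * speed)
        angle = math.acos(max(-1.0, min(1.0, cos_a)))
        if angle > max_angle:
            continue  # target is off the movement path
        score = angle * dist  # prefer targets that are aligned and near
        if score < best_score:
            best, best_score = t, score
    return best

targets = [{"name": "close button", "x": 100, "y": 10},
           {"name": "resize handle", "x": 200, "y": 150}]
print(predict_target((50, 5), (5.0, 0.5), targets)["name"])  # → close button
```

Once the predicted target is within braking range, a system like the one described could engage the mouse's mechanical brake to stop the cursor there.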
Record in standard format (ISO 2709)
See the documentation on the Inist Standard format.
pA
A01 | 01 | 1 | | @0 0167-7055
A03 | | 1 | | @0 Comput. graph. forum
A05 | | | | @2 15
A06 | | | | @2 3
A08 | 01 | 1 | ENG | @1 Intelligent control for haptic displays
A09 | 01 | 1 | ENG | @1 Computer graphics - Virtual reality - Graphics highways
A11 | 01 | 1 | | @1 MÜNCH (S.)
A11 | 02 | 1 | | @1 STANGENBERG (M.)
A12 | 01 | 1 | | @1 ROSSIGNAC (Jarek) @9 ed.
A12 | 02 | 1 | | @1 SILLION (François) @9 ed.
A14 | 01 | | | @1 Institute for Real-Time Computer Systems & Robotics, University of Karlsruhe, Kaiserstr. 12 @2 76128 Karlsruhe @3 DEU @Z 1 aut. @Z 2 aut.
A15 | 01 | | | @1 IBM T.J. Watson Research Center, P.O.Box 704 @2 Yorktown Heights, NY 10598 @3 USA @Z 1 aut.
A18 | 01 | 1 | | @1 Eurographics Association EG @3 EUR @9 patr.
A18 | 02 | 1 | | @1 INRIA @3 FRA @9 patr.
A20 | | | | @2 C.217-C.226
A21 | | | | @1 1996
A23 | 01 | | | @0 ENG
A43 | 01 | | | @1 INIST @2 21796 @5 354000064208980210
A44 | | | | @0 0000 @1 © 1996 INIST-CNRS. All rights reserved.
A45 | | | | @0 13 ref.
A47 | 01 | 1 | | @0 96-0506574
A60 | | | | @1 P @2 C
A61 | | | | @0 A
A64 | 01 | 1 | | @0 Computer graphics forum
A66 | 01 | | | @0 GBR
C01 | 01 | | ENG | @0 Usually, a mouse is used for input activities only, whereas output from the computer is sent via the monitor and one or two loudspeakers. But why not use the mouse for output, too? For instance, if it were possible to predict the next interaction object the user wants to click on, a mouse with a mechanical brake could stop the cursor movement at the desired position. This kind of aid is especially attractive for small targets like resize handles of windows or small buttons. In this paper, we present an approach for the integration of haptic feedback in everyday graphical user interfaces. We use a specialized mouse which is able to apply simple haptic information to the user's hand and index finger. A multi-agent system has been designed which 'observes' the user in order to predict the next interaction object and launch haptic feedback, thus supporting positioning actions with the mouse. Although primarily designed in order to provide 'intelligent' haptic feedback, the system can be combined with other output modalities as well, due to its modular and flexible architecture.
C02 | 01 | X | | @0 001D02C03
C02 | 02 | X | | @0 001D02B04
C02 | 03 | X | | @0 001D02C04
C03 | 01 | X | FRE | @0 Système conversationnel @5 01
C03 | 01 | X | ENG | @0 Interactive system @5 01
C03 | 01 | X | SPA | @0 Sistema conversacional @5 01
C03 | 02 | X | FRE | @0 Système asservi @5 02
C03 | 02 | X | ENG | @0 Feedback system @5 02
C03 | 02 | X | SPA | @0 Servomecanismo @5 02
C03 | 03 | X | FRE | @0 Système intelligent @5 03
C03 | 03 | X | ENG | @0 Intelligent system @5 03
C03 | 03 | X | SPA | @0 Sistema inteligente @5 03
C03 | 04 | X | FRE | @0 Interface utilisateur @5 04
C03 | 04 | X | ENG | @0 User interface @5 04
C03 | 04 | X | SPA | @0 Interfase usuario @5 04
C03 | 05 | X | FRE | @0 Entrée sortie @5 05
C03 | 05 | X | ENG | @0 Input output @5 05
C03 | 05 | X | SPA | @0 Entrada salida @5 05
C03 | 06 | X | FRE | @0 Boucle réaction @5 06
C03 | 06 | X | ENG | @0 Feedback @5 06
C03 | 06 | X | SPA | @0 Retroalimentación @5 06
C03 | 07 | X | FRE | @0 Représentation graphique @5 07
C03 | 07 | X | ENG | @0 Graphics @5 07
C03 | 07 | X | SPA | @0 Representación gráfica @5 07
C03 | 08 | X | FRE | @0 Relation homme machine @5 08
C03 | 08 | X | ENG | @0 Man machine relation @5 08
C03 | 08 | X | SPA | @0 Relación hombre máquina @5 08
C03 | 09 | X | FRE | @0 Interface graphique interactive @4 CD @5 96
C03 | 09 | X | ENG | @0 Interactive graphic interface @4 CD @5 96
C03 | 10 | X | FRE | @0 Interface homme machine @4 CD @5 97
C03 | 10 | X | ENG | @0 Man machine interface @4 CD @5 97
C03 | 11 | X | FRE | @0 Interface multimodale @4 CD @5 98
C03 | 11 | X | ENG | @0 Multimodal interface @4 CD @5 98
N21 | | | | @1 345

pR
A30 | 01 | 1 | ENG | @1 EUROGRAPHICS '96 @3 Poitiers FRA @4 1996-08-26
A30 | 02 | 1 | ENG | @1 European Association for Computer Graphics. Annual Conference and Exhibition @2 17 @3 Poitiers FRA @4 1996-08-26
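The field lines above follow a regular pattern: a tag (A01, C03, ...), occurrence and level columns, an optional language code, and a payload of @-prefixed subfields. A minimal parser sketch is shown below; the column split and subfield syntax are inferred from the lines in this record, not taken from INIST's format specification.

```python
import re

def parse_inist_line(line):
    """Split one pipe-delimited Inist field line into its tag, the three
    fixed columns (occurrence, level, language), and the @-prefixed
    subfield (code, value) pairs, kept as a list since codes can repeat."""
    tag, occ, level, lang, rest = (c.strip() for c in line.split("|", 4))
    # Subfields look like "@0 Multimodal interface @5 98".
    subfields = re.findall(r"@(\w)\s+([^@]+?)(?=\s*@\w\s|\s*$)", rest)
    return {"tag": tag, "occ": occ, "level": level,
            "lang": lang, "subfields": subfields}

rec = parse_inist_line("C03 | 11 | X | ENG | @0 Multimodal interface @5 98")
print(rec["tag"], dict(rec["subfields"]))
# → C03 {'0': 'Multimodal interface', '5': '98'}
```

Keeping subfields as a list of pairs preserves repeated codes such as the two `@Z` author pointers in the A14 affiliation line.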
Inist format (server)
NO : | PASCAL 96-0506574 INIST |
ET : | Intelligent control for haptic displays |
AU : | MÜNCH (S.); STANGENBERG (M.); ROSSIGNAC (Jarek); SILLION (François) |
AF : | Institute for Real-Time Computer Systems & Robotics, University of Karlsruhe, Kaiserstr. 12/76128 Karlsruhe/Allemagne (1 aut., 2 aut.); IBM T.J. Watson Research Center, P.O.Box 704/Yorktown Heights, NY 10598/Etats-Unis (1 aut.) |
DT : | Publication en série; Congrès; Niveau analytique |
SO : | Computer graphics forum; ISSN 0167-7055; Royaume-Uni; Da. 1996; Vol. 15; No. 3; C.217-C.226; Bibl. 13 ref. |
LA : | Anglais |
EA : | Usually, a mouse is used for input activities only, whereas output from the computer is sent via the monitor and one or two loudspeakers. But why not use the mouse for output, too? For instance, if it were possible to predict the next interaction object the user wants to click on, a mouse with a mechanical brake could stop the cursor movement at the desired position. This kind of aid is especially attractive for small targets like resize handles of windows or small buttons. In this paper, we present an approach for the integration of haptic feedback in everyday graphical user interfaces. We use a specialized mouse which is able to apply simple haptic information to the user's hand and index finger. A multi-agent system has been designed which 'observes' the user in order to predict the next interaction object and launch haptic feedback, thus supporting positioning actions with the mouse. Although primarily designed in order to provide 'intelligent' haptic feedback, the system can be combined with other output modalities as well, due to its modular and flexible architecture. |
CC : | 001D02C03; 001D02B04; 001D02C04 |
FD : | Système conversationnel; Système asservi; Système intelligent; Interface utilisateur; Entrée sortie; Boucle réaction; Représentation graphique; Relation homme machine; Interface graphique interactive; Interface homme machine; Interface multimodale |
ED : | Interactive system; Feedback system; Intelligent system; User interface; Input output; Feedback; Graphics; Man machine relation; Interactive graphic interface; Man machine interface; Multimodal interface |
SD : | Sistema conversacional; Servomecanismo; Sistema inteligente; Interfase usuario; Entrada salida; Retroalimentación; Representación gráfica; Relación hombre máquina |
LO : | INIST-21796.354000064208980210 |
ID : | 96-0506574 |
Links to Exploration step
Pascal:96-0506574
The document in XML format
<record><TEI><teiHeader><fileDesc><titleStmt><title xml:lang="en" level="a">Intelligent control for haptic displays</title>
<author><name sortKey="Munch, S" sort="Munch, S" uniqKey="Munch S" first="S." last="Münch">S. Münch</name>
<affiliation><inist:fA14 i1="01"><s1>Institute for Real-Time Computer Systems &amp; Robotics, University of Karlsruhe, Kaiserstr. 12</s1>
<s2>76128 Karlsruhe</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Stangenberg, M" sort="Stangenberg, M" uniqKey="Stangenberg M" first="M." last="Stangenberg">M. Stangenberg</name>
<affiliation><inist:fA14 i1="01"><s1>Institute for Real-Time Computer Systems &amp; Robotics, University of Karlsruhe, Kaiserstr. 12</s1>
<s2>76128 Karlsruhe</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</titleStmt>
<publicationStmt><idno type="wicri:source">INIST</idno>
<idno type="inist">96-0506574</idno>
<date when="1996">1996</date>
<idno type="stanalyst">PASCAL 96-0506574 INIST</idno>
<idno type="RBID">Pascal:96-0506574</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">001701</idno>
</publicationStmt>
<sourceDesc><biblStruct><analytic><title xml:lang="en" level="a">Intelligent control for haptic displays</title>
<author><name sortKey="Munch, S" sort="Munch, S" uniqKey="Munch S" first="S." last="Münch">S. Münch</name>
<affiliation><inist:fA14 i1="01"><s1>Institute for Real-Time Computer Systems &amp; Robotics, University of Karlsruhe, Kaiserstr. 12</s1>
<s2>76128 Karlsruhe</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Stangenberg, M" sort="Stangenberg, M" uniqKey="Stangenberg M" first="M." last="Stangenberg">M. Stangenberg</name>
<affiliation><inist:fA14 i1="01"><s1>Institute for Real-Time Computer Systems &amp; Robotics, University of Karlsruhe, Kaiserstr. 12</s1>
<s2>76128 Karlsruhe</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</analytic>
<series><title level="j" type="main">Computer graphics forum</title>
<title level="j" type="abbreviated">Comput. graph. forum</title>
<idno type="ISSN">0167-7055</idno>
<imprint><date when="1996">1996</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt><title level="j" type="main">Computer graphics forum</title>
<title level="j" type="abbreviated">Comput. graph. forum</title>
<idno type="ISSN">0167-7055</idno>
</seriesStmt>
</fileDesc>
<profileDesc><textClass><keywords scheme="KwdEn" xml:lang="en"><term>Feedback</term>
<term>Feedback system</term>
<term>Graphics</term>
<term>Input output</term>
<term>Intelligent system</term>
<term>Interactive graphic interface</term>
<term>Interactive system</term>
<term>Man machine interface</term>
<term>Man machine relation</term>
<term>Multimodal interface</term>
<term>User interface</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr"><term>Système conversationnel</term>
<term>Système asservi</term>
<term>Système intelligent</term>
<term>Interface utilisateur</term>
<term>Entrée sortie</term>
<term>Boucle réaction</term>
<term>Représentation graphique</term>
<term>Relation homme machine</term>
<term>Interface graphique interactive</term>
<term>Interface homme machine</term>
<term>Interface multimodale</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front><div type="abstract" xml:lang="en">Usually, a mouse is used for input activities only, whereas output from the computer is sent via the monitor and one or two loudspeakers. But why not use the mouse for output, too? For instance, if it were possible to predict the next interaction object the user wants to click on, a mouse with a mechanical brake could stop the cursor movement at the desired position. This kind of aid is especially attractive for small targets like resize handles of windows or small buttons. In this paper, we present an approach for the integration of haptic feedback in everyday graphical user interfaces. We use a specialized mouse which is able to apply simple haptic information to the user's hand and index finger. A multi-agent system has been designed which 'observes' the user in order to predict the next interaction object and launch haptic feedback, thus supporting positioning actions with the mouse. Although primarily designed in order to provide 'intelligent' haptic feedback, the system can be combined with other output modalities as well, due to its modular and flexible architecture.</div>
</front>
</TEI>
<inist><standard h6="B"><pA><fA01 i1="01" i2="1"><s0>0167-7055</s0>
</fA01>
<fA03 i2="1"><s0>Comput. graph. forum</s0>
</fA03>
<fA08 i1="01" i2="1" l="ENG"><s1>Intelligent control for haptic displays</s1>
</fA08>
<fA09 i1="01" i2="1" l="ENG"><s1>Computer graphics - Virtual reality - Graphics highways</s1>
</fA09>
<fA11 i1="01" i2="1"><s1>MÜNCH (S.)</s1>
</fA11>
<fA11 i1="02" i2="1"><s1>STANGENBERG (M.)</s1>
</fA11>
<fA12 i1="01" i2="1"><s1>ROSSIGNAC (Jarek)</s1>
<s9>ed.</s9>
</fA12>
<fA12 i1="02" i2="1"><s1>SILLION (François)</s1>
<s9>ed.</s9>
</fA12>
<fA14 i1="01"><s1>Institute for Real-Time Computer Systems &amp; Robotics, University of Karlsruhe, Kaiserstr. 12</s1>
<s2>76128 Karlsruhe</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
</fA14>
<fA15 i1="01"><s1>IBM T.J. Watson Research Center, P.O.Box 704</s1>
<s2>Yorktown Heights, NY 10598</s2>
<s3>USA</s3>
<sZ>1 aut.</sZ>
</fA15>
<fA18 i1="01" i2="1"><s1>Eurographics Association EG</s1>
<s3>EUR</s3>
<s9>patr.</s9>
</fA18>
<fA18 i1="02" i2="1"><s1>INRIA</s1>
<s3>FRA</s3>
<s9>patr.</s9>
</fA18>
<fA20><s2>C.217-C.226</s2>
</fA20>
<fA21><s1>1996</s1>
</fA21>
<fA23 i1="01"><s0>ENG</s0>
</fA23>
<fA43 i1="01"><s1>INIST</s1>
<s2>21796</s2>
<s5>354000064208980210</s5>
</fA43>
<fA44><s0>0000</s0>
<s1>© 1996 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45><s0>13 ref.</s0>
</fA45>
<fA47 i1="01" i2="1"><s0>96-0506574</s0>
</fA47>
<fA60><s1>P</s1>
<s2>C</s2>
</fA60>
<fA64 i1="01" i2="1"><s0>Computer graphics forum</s0>
</fA64>
<fA66 i1="01"><s0>GBR</s0>
</fA66>
<fC01 i1="01" l="ENG"><s0>Usually, a mouse is used for input activities only, whereas output from the computer is sent via the monitor and one or two loudspeakers. But why not use the mouse for output, too? For instance, if it were possible to predict the next interaction object the user wants to click on, a mouse with a mechanical brake could stop the cursor movement at the desired position. This kind of aid is especially attractive for small targets like resize handles of windows or small buttons. In this paper, we present an approach for the integration of haptic feedback in everyday graphical user interfaces. We use a specialized mouse which is able to apply simple haptic information to the user's hand and index finger. A multi-agent system has been designed which 'observes' the user in order to predict the next interaction object and launch haptic feedback, thus supporting positioning actions with the mouse. Although primarily designed in order to provide 'intelligent' haptic feedback, the system can be combined with other output modalities as well, due to its modular and flexible architecture.</s0>
</fC01>
<fC02 i1="01" i2="X"><s0>001D02C03</s0>
</fC02>
<fC02 i1="02" i2="X"><s0>001D02B04</s0>
</fC02>
<fC02 i1="03" i2="X"><s0>001D02C04</s0>
</fC02>
<fC03 i1="01" i2="X" l="FRE"><s0>Système conversationnel</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG"><s0>Interactive system</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA"><s0>Sistema conversacional</s0>
<s5>01</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE"><s0>Système asservi</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG"><s0>Feedback system</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA"><s0>Servomecanismo</s0>
<s5>02</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE"><s0>Système intelligent</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG"><s0>Intelligent system</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA"><s0>Sistema inteligente</s0>
<s5>03</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE"><s0>Interface utilisateur</s0>
<s5>04</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG"><s0>User interface</s0>
<s5>04</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA"><s0>Interfase usuario</s0>
<s5>04</s5>
</fC03>
<fC03 i1="05" i2="X" l="FRE"><s0>Entrée sortie</s0>
<s5>05</s5>
</fC03>
<fC03 i1="05" i2="X" l="ENG"><s0>Input output</s0>
<s5>05</s5>
</fC03>
<fC03 i1="05" i2="X" l="SPA"><s0>Entrada salida</s0>
<s5>05</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE"><s0>Boucle réaction</s0>
<s5>06</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG"><s0>Feedback</s0>
<s5>06</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA"><s0>Retroalimentación</s0>
<s5>06</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE"><s0>Représentation graphique</s0>
<s5>07</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG"><s0>Graphics</s0>
<s5>07</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA"><s0>Representación gráfica</s0>
<s5>07</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE"><s0>Relation homme machine</s0>
<s5>08</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG"><s0>Man machine relation</s0>
<s5>08</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA"><s0>Relación hombre máquina</s0>
<s5>08</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE"><s0>Interface graphique interactive</s0>
<s4>CD</s4>
<s5>96</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG"><s0>Interactive graphic interface</s0>
<s4>CD</s4>
<s5>96</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE"><s0>Interface homme machine</s0>
<s4>CD</s4>
<s5>97</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG"><s0>Man machine interface</s0>
<s4>CD</s4>
<s5>97</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE"><s0>Interface multimodale</s0>
<s4>CD</s4>
<s5>98</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG"><s0>Multimodal interface</s0>
<s4>CD</s4>
<s5>98</s5>
</fC03>
<fN21><s1>345</s1>
</fN21>
</pA>
<pR><fA30 i1="01" i2="1" l="ENG"><s1>EUROGRAPHICS '96</s1>
<s3>Poitiers FRA</s3>
<s4>1996-08-26</s4>
</fA30>
<fA30 i1="02" i2="1" l="ENG"><s1>European Association for Computer Graphics. Annual Conference and Exhibition</s1>
<s2>17</s2>
<s3>Poitiers FRA</s3>
<s4>1996-08-26</s4>
</fA30>
</pR>
</standard>
<server><NO>PASCAL 96-0506574 INIST</NO>
<ET>Intelligent control for haptic displays</ET>
<AU>MÜNCH (S.); STANGENBERG (M.); ROSSIGNAC (Jarek); SILLION (François)</AU>
<AF>Institute for Real-Time Computer Systems &amp; Robotics, University of Karlsruhe, Kaiserstr. 12/76128 Karlsruhe/Allemagne (1 aut., 2 aut.); IBM T.J. Watson Research Center, P.O.Box 704/Yorktown Heights, NY 10598/Etats-Unis (1 aut.)</AF>
<DT>Publication en série; Congrès; Niveau analytique</DT>
<SO>Computer graphics forum; ISSN 0167-7055; Royaume-Uni; Da. 1996; Vol. 15; No. 3; C.217-C.226; Bibl. 13 ref.</SO>
<LA>Anglais</LA>
<EA>Usually, a mouse is used for input activities only, whereas output from the computer is sent via the monitor and one or two loudspeakers. But why not use the mouse for output, too? For instance, if it were possible to predict the next interaction object the user wants to click on, a mouse with a mechanical brake could stop the cursor movement at the desired position. This kind of aid is especially attractive for small targets like resize handles of windows or small buttons. In this paper, we present an approach for the integration of haptic feedback in everyday graphical user interfaces. We use a specialized mouse which is able to apply simple haptic information to the user's hand and index finger. A multi-agent system has been designed which 'observes' the user in order to predict the next interaction object and launch haptic feedback, thus supporting positioning actions with the mouse. Although primarily designed in order to provide 'intelligent' haptic feedback, the system can be combined with other output modalities as well, due to its modular and flexible architecture.</EA>
<CC>001D02C03; 001D02B04; 001D02C04</CC>
<FD>Système conversationnel; Système asservi; Système intelligent; Interface utilisateur; Entrée sortie; Boucle réaction; Représentation graphique; Relation homme machine; Interface graphique interactive; Interface homme machine; Interface multimodale</FD>
<ED>Interactive system; Feedback system; Intelligent system; User interface; Input output; Feedback; Graphics; Man machine relation; Interactive graphic interface; Man machine interface; Multimodal interface</ED>
<SD>Sistema conversacional; Servomecanismo; Sistema inteligente; Interfase usuario; Entrada salida; Retroalimentación; Representación gráfica; Relación hombre máquina</SD>
<LO>INIST-21796.354000064208980210</LO>
<ID>96-0506574</ID>
</server>
</inist>
</record>
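The English keywords embedded in the TEI header above can be pulled out with Python's standard-library XML tools. The snippet below runs against a small inlined excerpt of the record's keyword block (note that a strict parser would reject the full record as-is, since its `inist:` prefix has no namespace declaration):

```python
import xml.etree.ElementTree as ET

# Abridged excerpt of the record's <keywords scheme="KwdEn"> block.
tei = """<textClass>
  <keywords scheme="KwdEn" xml:lang="en">
    <term>Feedback</term>
    <term>Multimodal interface</term>
    <term>User interface</term>
  </keywords>
</textClass>"""

root = ET.fromstring(tei)
# Select only the English (KwdEn) keyword scheme by attribute.
kwds = [t.text for t in root.findall('.//keywords[@scheme="KwdEn"]/term')]
print(kwds)  # → ['Feedback', 'Multimodal interface', 'User interface']
```

The same attribute-predicate query would distinguish the `KwdEn` block from the French `Pascal` keyword block when applied to the whole header.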
To manipulate this document under Unix (Dilib)
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 001701 | SxmlIndent | more
Or
HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Corpus/biblio.hfd -nk 001701 | SxmlIndent | more
To link to this page from the Wicri network
{{Explor lien
|wiki= Ticri/CIDE
|area= HapticV1
|flux= PascalFrancis
|étape= Corpus
|type= RBID
|clé= Pascal:96-0506574
|texte= Intelligent control for haptic displays
}}
This area was generated with Dilib version V0.6.23. Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024.