Haptic Communications
Internal identifier: 000329 (PascalFrancis/Corpus); previous: 000328; next: 000330
Authors: Eckehard Steinbach; Sandra Hirche; Marc Ernst; Fernanda Brandi; Rahul Chaudhari; Julius Kammerl; Iason Vittorias
Source:
- Proceedings of the IEEE [0018-9219]; 2012.
French descriptors
- Pascal (Inist)
- Communication multimédia, Système multimédia, Immersion, Relation homme machine, Etat actuel, Evaluation subjective, Evaluation performance, Synchronisation, Apprentissage, Codage, Transmission information, Milieu dissipatif, Temps retard, Variation temporelle, Analyse objective, Contrôle qualité, Téléopération, Psychophysique, Télémanipulation.
English descriptors
- KwdEn :
- Coding, Delay time, Immersion, Information transmission, Learning, Lossy medium, Man machine relation, Multimedia communication, Multimedia systems, Objective analysis, Performance evaluation, Psychophysics, Quality control, Remote handling, Remote operation, State of the art, Subjective evaluation, Synchronization, Time variation.
Abstract
Audiovisual communications is at the core of multimedia systems that allow users to interact across distances. It is common understanding that both audio and video are required for high-quality interaction. While audiovisual information provides a user with a satisfactory impression of being present in a remote environment, physical interaction and manipulation is not supported. True immersion into a distant environment and efficient distributed collaboration require the ability to physically interact with remote objects and to literally get in touch with other people. Touching and manipulating objects remotely becomes possible if we augment traditional audiovisual communications by the haptic modality. Haptic communications is a relatively young field of research that has the potential to substantially improve human-human and human-machine interaction. In this paper, we discuss the state-of-the-art in haptic communications both from psychophysical and technical points of view. From a human perception point of view, we mainly focus on the multimodal integration of video and haptics and the improved performance that can be achieved when combining them. We also discuss how the human adapts to discrepancies and synchronization errors between different modalities, a research area which is typically referred to as perceptual learning. From a technical perspective, we address perceptual coding of haptic information and the transmission of haptic data streams over resource-constrained and potentially lossy networks in the presence of unpredictable and time-varying communication delays. In this context, we also discuss the need for objective quality metrics for haptic communication. Throughout the paper, we stress the fact that haptic communications is not meant as a replacement of traditional audiovisual communications but rather as an additional dimension for telepresence that will allow us to advance in our quest for truly immersive communication.
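The abstract mentions perceptual coding of haptic information. A widely used idea in this literature is deadband coding based on Weber's law: a new haptic sample is transmitted only if it deviates from the last transmitted value by more than a perceptual fraction k of that value, and the receiver holds the last value otherwise. The following is a minimal illustrative sketch (the function names, the zero-order-hold reconstruction, and the threshold value are assumptions for illustration, not the paper's exact scheme):

```python
def deadband_encode(samples, k=0.1):
    """Weber-inspired deadband coding: transmit a sample only if it
    differs from the last transmitted value by more than k times the
    magnitude of that value. Returns (index, value) pairs to send."""
    if not samples:
        return []
    sent = [(0, samples[0])]  # always transmit the first sample
    last = samples[0]
    for i, x in enumerate(samples[1:], start=1):
        if abs(x - last) > k * abs(last):
            sent.append((i, x))
            last = x
    return sent


def deadband_decode(sent, n):
    """Reconstruct n samples with a zero-order hold:
    repeat the last received value until the next update arrives."""
    out = []
    j = 0
    for i in range(n):
        if j + 1 < len(sent) and sent[j + 1][0] == i:
            j += 1
        out.append(sent[j][1])
    return out


# Example: small force variations below the perceptual threshold
# are dropped; only perceptually significant changes are sent.
force = [1.0, 1.02, 1.05, 1.3, 1.31, 0.5]
packets = deadband_encode(force, k=0.1)   # [(0, 1.0), (3, 1.3), (5, 0.5)]
reconstructed = deadband_decode(packets, len(force))
```

With k = 0.1, six input samples reduce to three transmitted packets, which is the kind of packet-rate reduction that makes haptic streams feasible over resource-constrained networks.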
Record in standard format (ISO 2709)
See the documentation on the Inist Standard format.
Inist format (server)
NO: PASCAL 12-0208902 INIST
ET: Haptic Communications
AU: STEINBACH (Eckehard); HIRCHE (Sandra); ERNST (Marc); BRANDI (Fernanda); CHAUDHARI (Rahul); KAMMERL (Julius); VITTORIAS (Iason); JAYANT (Nikil)
AF: Institute for Media Technology, Technische Universität München/Munich 80333/Germany (1 aut., 4 aut., 5 aut., 6 aut.); Institute for Automatic Control Engineering, Technische Universität München/Munich 80333/Germany (2 aut., 7 aut.); Faculty of Biology/Kognitive Neurowissenschaften, Universität Bielefeld/Bielefeld 33615/Germany (3 aut.)
DT: Serial publication; Analytical level
SO: Proceedings of the IEEE; ISSN 0018-9219; Coden IEEPAD; United States; Da. 2012; Vol. 100; No. 4; Pp. 937-956; Bibl. 137 ref.
LA: English
EA: Audiovisual communications is at the core of multimedia systems that allow users to interact across distances. It is common understanding that both audio and video are required for high-quality interaction. While audiovisual information provides a user with a satisfactory impression of being present in a remote environment, physical interaction and manipulation is not supported. True immersion into a distant environment and efficient distributed collaboration require the ability to physically interact with remote objects and to literally get in touch with other people. Touching and manipulating objects remotely becomes possible if we augment traditional audiovisual communications by the haptic modality. Haptic communications is a relatively young field of research that has the potential to substantially improve human-human and human-machine interaction. In this paper, we discuss the state-of-the-art in haptic communications both from psychophysical and technical points of view. From a human perception point of view, we mainly focus on the multimodal integration of video and haptics and the improved performance that can be achieved when combining them. We also discuss how the human adapts to discrepancies and synchronization errors between different modalities, a research area which is typically referred to as perceptual learning. From a technical perspective, we address perceptual coding of haptic information and the transmission of haptic data streams over resource-constrained and potentially lossy networks in the presence of unpredictable and time-varying communication delays. In this context, we also discuss the need for objective quality metrics for haptic communication. Throughout the paper, we stress the fact that haptic communications is not meant as a replacement of traditional audiovisual communications but rather as an additional dimension for telepresence that will allow us to advance in our quest for truly immersive communication.
CC: 001D04A04B; 001D04B02G; 001D04B02H4
FD: Communication multimédia; Système multimédia; Immersion; Relation homme machine; Etat actuel; Evaluation subjective; Evaluation performance; Synchronisation; Apprentissage; Codage; Transmission information; Milieu dissipatif; Temps retard; Variation temporelle; Analyse objective; Contrôle qualité; Téléopération; Psychophysique; Télémanipulation
ED: Multimedia communication; Multimedia systems; Immersion; Man machine relation; State of the art; Subjective evaluation; Performance evaluation; Synchronization; Learning; Coding; Information transmission; Lossy medium; Delay time; Time variation; Objective analysis; Quality control; Remote operation; Psychophysics; Remote handling
SD: Inmersión; Relación hombre máquina; Estado actual; Evaluación subjetiva; Evaluación prestación; Sincronización; Aprendizaje; Codificación; Transmisión información; Medio dispersor; Tiempo retardo; Variación temporal; Análisis objetivos; Control de calidad; Teleacción; Psicofísica
LO: INIST-222.354000509643200100
ID: 12-0208902
Links to Exploration step
Pascal:12-0208902
The document in XML format
<record><TEI><teiHeader><fileDesc><titleStmt><title xml:lang="en" level="a">Haptic Communications</title>
<author><name sortKey="Steinbach, Eckehard" sort="Steinbach, Eckehard" uniqKey="Steinbach E" first="Eckehard" last="Steinbach">Eckehard Steinbach</name>
<affiliation><inist:fA14 i1="01"><s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Hirche, Sandra" sort="Hirche, Sandra" uniqKey="Hirche S" first="Sandra" last="Hirche">Sandra Hirche</name>
<affiliation><inist:fA14 i1="02"><s1>Institute for Automatic Control Engineering. Technische Universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>2 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Ernst, Marc" sort="Ernst, Marc" uniqKey="Ernst M" first="Marc" last="Ernst">Marc Ernst</name>
<affiliation><inist:fA14 i1="03"><s1>Faculty of Biology/Kognitive Neurowissenschaften, Universität Bielefeld</s1>
<s2>Bielefeld 33615</s2>
<s3>DEU</s3>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Brandi, Fernanda" sort="Brandi, Fernanda" uniqKey="Brandi F" first="Fernanda" last="Brandi">Fernanda Brandi</name>
<affiliation><inist:fA14 i1="01"><s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Chaudhari, Rahul" sort="Chaudhari, Rahul" uniqKey="Chaudhari R" first="Rahul" last="Chaudhari">Rahul Chaudhari</name>
<affiliation><inist:fA14 i1="01"><s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Kammerl, Julius" sort="Kammerl, Julius" uniqKey="Kammerl J" first="Julius" last="Kammerl">Julius Kammerl</name>
<affiliation><inist:fA14 i1="01"><s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Vittorias, Iason" sort="Vittorias, Iason" uniqKey="Vittorias I" first="Iason" last="Vittorias">Iason Vittorias</name>
<affiliation><inist:fA14 i1="02"><s1>Institute for Automatic Control Engineering. Technische Universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>2 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</titleStmt>
<publicationStmt><idno type="wicri:source">INIST</idno>
<idno type="inist">12-0208902</idno>
<date when="2012">2012</date>
<idno type="stanalyst">PASCAL 12-0208902 INIST</idno>
<idno type="RBID">Pascal:12-0208902</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000329</idno>
</publicationStmt>
<sourceDesc><biblStruct><analytic><title xml:lang="en" level="a">Haptic Communications</title>
<author><name sortKey="Steinbach, Eckehard" sort="Steinbach, Eckehard" uniqKey="Steinbach E" first="Eckehard" last="Steinbach">Eckehard Steinbach</name>
<affiliation><inist:fA14 i1="01"><s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Hirche, Sandra" sort="Hirche, Sandra" uniqKey="Hirche S" first="Sandra" last="Hirche">Sandra Hirche</name>
<affiliation><inist:fA14 i1="02"><s1>Institute for Automatic Control Engineering. Technische Universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>2 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Ernst, Marc" sort="Ernst, Marc" uniqKey="Ernst M" first="Marc" last="Ernst">Marc Ernst</name>
<affiliation><inist:fA14 i1="03"><s1>Faculty of Biology/Kognitive Neurowissenschaften, Universität Bielefeld</s1>
<s2>Bielefeld 33615</s2>
<s3>DEU</s3>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Brandi, Fernanda" sort="Brandi, Fernanda" uniqKey="Brandi F" first="Fernanda" last="Brandi">Fernanda Brandi</name>
<affiliation><inist:fA14 i1="01"><s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Chaudhari, Rahul" sort="Chaudhari, Rahul" uniqKey="Chaudhari R" first="Rahul" last="Chaudhari">Rahul Chaudhari</name>
<affiliation><inist:fA14 i1="01"><s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Kammerl, Julius" sort="Kammerl, Julius" uniqKey="Kammerl J" first="Julius" last="Kammerl">Julius Kammerl</name>
<affiliation><inist:fA14 i1="01"><s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Vittorias, Iason" sort="Vittorias, Iason" uniqKey="Vittorias I" first="Iason" last="Vittorias">Iason Vittorias</name>
<affiliation><inist:fA14 i1="02"><s1>Institute for Automatic Control Engineering. Technische Universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>2 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</analytic>
<series><title level="j" type="main">Proceedings of the IEEE</title>
<title level="j" type="abbreviated">Proc. IEEE</title>
<idno type="ISSN">0018-9219</idno>
<imprint><date when="2012">2012</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt><title level="j" type="main">Proceedings of the IEEE</title>
<title level="j" type="abbreviated">Proc. IEEE</title>
<idno type="ISSN">0018-9219</idno>
</seriesStmt>
</fileDesc>
<profileDesc><textClass><keywords scheme="KwdEn" xml:lang="en"><term>Coding</term>
<term>Delay time</term>
<term>Immersion</term>
<term>Information transmission</term>
<term>Learning</term>
<term>Lossy medium</term>
<term>Man machine relation</term>
<term>Multimedia communication</term>
<term>Multimedia systems</term>
<term>Objective analysis</term>
<term>Performance evaluation</term>
<term>Psychophysics</term>
<term>Quality control</term>
<term>Remote handling</term>
<term>Remote operation</term>
<term>State of the art</term>
<term>Subjective evaluation</term>
<term>Synchronization</term>
<term>Time variation</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr"><term>Communication multimédia</term>
<term>Système multimédia</term>
<term>Immersion</term>
<term>Relation homme machine</term>
<term>Etat actuel</term>
<term>Evaluation subjective</term>
<term>Evaluation performance</term>
<term>Synchronisation</term>
<term>Apprentissage</term>
<term>Codage</term>
<term>Transmission information</term>
<term>Milieu dissipatif</term>
<term>Temps retard</term>
<term>Variation temporelle</term>
<term>Analyse objective</term>
<term>Contrôle qualité</term>
<term>Téléopération</term>
<term>Psychophysique</term>
<term>Télémanipulation</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front><div type="abstract" xml:lang="en">Audiovisual communications is at the core of multimedia systems that allow users to interact across distances. It is common understanding that both audio and video are required for high-quality interaction. While audiovisual information provides a user with a satisfactory impression of being present in a remote environment, physical interaction and manipulation is not supported. True immersion into a distant environment and efficient distributed collaboration require the ability to physically interact with remote objects and to literally get in touch with other people. Touching and manipulating objects remotely becomes possible if we augment traditional audiovisual communications by the haptic modality. Haptic communications is a relatively young field of research that has the potential to substantially improve human-human and human-machine interaction. In this paper, we discuss the state-of-the-art in haptic communications both from psychophysical and technical points of view. From a human perception point of view, we mainly focus on the multimodal integration of video and haptics and the improved performance that can be achieved when combining them. We also discuss how the human adapts to discrepancies and synchronization errors between different modalities, a research area which is typically referred to as perceptual learning. From a technical perspective, we address perceptual coding of haptic information and the transmission of haptic data streams over resource-constrained and potentially lossy networks in the presence of unpredictable and time-varying communication delays. In this context, we also discuss the need for objective quality metrics for haptic communication. 
Throughout the paper, we stress the fact that haptic communications is not meant as a replacement of traditional audiovisual communications but rather as an additional dimension for telepresence that will allow us to advance in our quest for truly immersive communication.</div>
</front>
</TEI>
<inist><standard h6="B"><pA><fA01 i1="01" i2="1"><s0>0018-9219</s0>
</fA01>
<fA02 i1="01"><s0>IEEPAD</s0>
</fA02>
<fA03 i2="1"><s0>Proc. IEEE</s0>
</fA03>
<fA05><s2>100</s2>
</fA05>
<fA06><s2>4</s2>
</fA06>
<fA08 i1="01" i2="1" l="ENG"><s1>Haptic Communications</s1>
</fA08>
<fA09 i1="01" i2="1" l="ENG"><s1>Frontiers of Audiovisual Communications: New Convergence of Broadband Communications, Computing, and Rich Media</s1>
</fA09>
<fA11 i1="01" i2="1"><s1>STEINBACH (Eckehard)</s1>
</fA11>
<fA11 i1="02" i2="1"><s1>HIRCHE (Sandra)</s1>
</fA11>
<fA11 i1="03" i2="1"><s1>ERNST (Marc)</s1>
</fA11>
<fA11 i1="04" i2="1"><s1>BRANDI (Fernanda)</s1>
</fA11>
<fA11 i1="05" i2="1"><s1>CHAUDHARI (Rahul)</s1>
</fA11>
<fA11 i1="06" i2="1"><s1>KAMMERL (Julius)</s1>
</fA11>
<fA11 i1="07" i2="1"><s1>VITTORIAS (Iason)</s1>
</fA11>
<fA12 i1="01" i2="1"><s1>JAYANT (Nikil)</s1>
<s9>ed.</s9>
</fA12>
<fA14 i1="01"><s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</fA14>
<fA14 i1="02"><s1>Institute for Automatic Control Engineering. Technische Universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>2 aut.</sZ>
<sZ>7 aut.</sZ>
</fA14>
<fA14 i1="03"><s1>Faculty of Biology/Kognitive Neurowissenschaften, Universität Bielefeld</s1>
<s2>Bielefeld 33615</s2>
<s3>DEU</s3>
<sZ>3 aut.</sZ>
</fA14>
<fA20><s1>937-956</s1>
</fA20>
<fA21><s1>2012</s1>
</fA21>
<fA23 i1="01"><s0>ENG</s0>
</fA23>
<fA43 i1="01"><s1>INIST</s1>
<s2>222</s2>
<s5>354000509643200100</s5>
</fA43>
<fA44><s0>0000</s0>
<s1>© 2012 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45><s0>137 ref.</s0>
</fA45>
<fA47 i1="01" i2="1"><s0>12-0208902</s0>
</fA47>
<fA60><s1>P</s1>
</fA60>
<fA61><s0>A</s0>
</fA61>
<fA64 i1="01" i2="1"><s0>Proceedings of the IEEE</s0>
</fA64>
<fA66 i1="01"><s0>USA</s0>
</fA66>
<fC01 i1="01" l="ENG"><s0>Audiovisual communications is at the core of multimedia systems that allow users to interact across distances. It is common understanding that both audio and video are required for high-quality interaction. While audiovisual information provides a user with a satisfactory impression of being present in a remote environment, physical interaction and manipulation is not supported. True immersion into a distant environment and efficient distributed collaboration require the ability to physically interact with remote objects and to literally get in touch with other people. Touching and manipulating objects remotely becomes possible if we augment traditional audiovisual communications by the haptic modality. Haptic communications is a relatively young field of research that has the potential to substantially improve human-human and human-machine interaction. In this paper, we discuss the state-of-the-art in haptic communications both from psychophysical and technical points of view. From a human perception point of view, we mainly focus on the multimodal integration of video and haptics and the improved performance that can be achieved when combining them. We also discuss how the human adapts to discrepancies and synchronization errors between different modalities, a research area which is typically referred to as perceptual learning. From a technical perspective, we address perceptual coding of haptic information and the transmission of haptic data streams over resource-constrained and potentially lossy networks in the presence of unpredictable and time-varying communication delays. In this context, we also discuss the need for objective quality metrics for haptic communication. 
Throughout the paper, we stress the fact that haptic communications is not meant as a replacement of traditional audiovisual communications but rather as an additional dimension for telepresence that will allow us to advance in our quest for truly immersive communication.</s0>
</fC01>
<fC02 i1="01" i2="X"><s0>001D04A04B</s0>
</fC02>
<fC02 i1="02" i2="X"><s0>001D04B02G</s0>
</fC02>
<fC02 i1="03" i2="X"><s0>001D04B02H4</s0>
</fC02>
<fC03 i1="01" i2="3" l="FRE"><s0>Communication multimédia</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="3" l="ENG"><s0>Multimedia communication</s0>
<s5>01</s5>
</fC03>
<fC03 i1="02" i2="3" l="FRE"><s0>Système multimédia</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="3" l="ENG"><s0>Multimedia systems</s0>
<s5>02</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE"><s0>Immersion</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG"><s0>Immersion</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA"><s0>Inmersión</s0>
<s5>03</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE"><s0>Relation homme machine</s0>
<s5>04</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG"><s0>Man machine relation</s0>
<s5>04</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA"><s0>Relación hombre máquina</s0>
<s5>04</s5>
</fC03>
<fC03 i1="05" i2="X" l="FRE"><s0>Etat actuel</s0>
<s5>05</s5>
</fC03>
<fC03 i1="05" i2="X" l="ENG"><s0>State of the art</s0>
<s5>05</s5>
</fC03>
<fC03 i1="05" i2="X" l="SPA"><s0>Estado actual</s0>
<s5>05</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE"><s0>Evaluation subjective</s0>
<s5>06</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG"><s0>Subjective evaluation</s0>
<s5>06</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA"><s0>Evaluación subjetiva</s0>
<s5>06</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE"><s0>Evaluation performance</s0>
<s5>07</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG"><s0>Performance evaluation</s0>
<s5>07</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA"><s0>Evaluación prestación</s0>
<s5>07</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE"><s0>Synchronisation</s0>
<s5>08</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG"><s0>Synchronization</s0>
<s5>08</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA"><s0>Sincronización</s0>
<s5>08</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE"><s0>Apprentissage</s0>
<s5>09</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG"><s0>Learning</s0>
<s5>09</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA"><s0>Aprendizaje</s0>
<s5>09</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE"><s0>Codage</s0>
<s5>10</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG"><s0>Coding</s0>
<s5>10</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA"><s0>Codificación</s0>
<s5>10</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE"><s0>Transmission information</s0>
<s5>11</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG"><s0>Information transmission</s0>
<s5>11</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA"><s0>Transmisión información</s0>
<s5>11</s5>
</fC03>
<fC03 i1="12" i2="X" l="FRE"><s0>Milieu dissipatif</s0>
<s5>12</s5>
</fC03>
<fC03 i1="12" i2="X" l="ENG"><s0>Lossy medium</s0>
<s5>12</s5>
</fC03>
<fC03 i1="12" i2="X" l="SPA"><s0>Medio dispersor</s0>
<s5>12</s5>
</fC03>
<fC03 i1="13" i2="X" l="FRE"><s0>Temps retard</s0>
<s5>13</s5>
</fC03>
<fC03 i1="13" i2="X" l="ENG"><s0>Delay time</s0>
<s5>13</s5>
</fC03>
<fC03 i1="13" i2="X" l="SPA"><s0>Tiempo retardo</s0>
<s5>13</s5>
</fC03>
<fC03 i1="14" i2="X" l="FRE"><s0>Variation temporelle</s0>
<s5>14</s5>
</fC03>
<fC03 i1="14" i2="X" l="ENG"><s0>Time variation</s0>
<s5>14</s5>
</fC03>
<fC03 i1="14" i2="X" l="SPA"><s0>Variación temporal</s0>
<s5>14</s5>
</fC03>
<fC03 i1="15" i2="X" l="FRE"><s0>Analyse objective</s0>
<s5>15</s5>
</fC03>
<fC03 i1="15" i2="X" l="ENG"><s0>Objective analysis</s0>
<s5>15</s5>
</fC03>
<fC03 i1="15" i2="X" l="SPA"><s0>Análisis objetivos</s0>
<s5>15</s5>
</fC03>
<fC03 i1="16" i2="X" l="FRE"><s0>Contrôle qualité</s0>
<s5>16</s5>
</fC03>
<fC03 i1="16" i2="X" l="ENG"><s0>Quality control</s0>
<s5>16</s5>
</fC03>
<fC03 i1="16" i2="X" l="SPA"><s0>Control de calidad</s0>
<s5>16</s5>
</fC03>
<fC03 i1="17" i2="X" l="FRE"><s0>Téléopération</s0>
<s5>17</s5>
</fC03>
<fC03 i1="17" i2="X" l="ENG"><s0>Remote operation</s0>
<s5>17</s5>
</fC03>
<fC03 i1="17" i2="X" l="SPA"><s0>Teleacción</s0>
<s5>17</s5>
</fC03>
<fC03 i1="18" i2="X" l="FRE"><s0>Psychophysique</s0>
<s5>18</s5>
</fC03>
<fC03 i1="18" i2="X" l="ENG"><s0>Psychophysics</s0>
<s5>18</s5>
</fC03>
<fC03 i1="18" i2="X" l="SPA"><s0>Psicofísica</s0>
<s5>18</s5>
</fC03>
<fC03 i1="19" i2="3" l="FRE"><s0>Télémanipulation</s0>
<s5>19</s5>
</fC03>
<fC03 i1="19" i2="3" l="ENG"><s0>Remote handling</s0>
<s5>19</s5>
</fC03>
<fN21><s1>163</s1>
</fN21>
<fN44 i1="01"><s1>OTO</s1>
</fN44>
<fN82><s1>OTO</s1>
</fN82>
</pA>
</standard>
<server><NO>PASCAL 12-0208902 INIST</NO>
<ET>Haptic Communications</ET>
<AU>STEINBACH (Eckehard); HIRCHE (Sandra); ERNST (Marc); BRANDI (Fernanda); CHAUDHARI (Rahul); KAMMERL (Julius); VITTORIAS (Iason); JAYANT (Nikil)</AU>
<AF>Institute for Media Technology, Technische universität München/Munich 80333/Allemagne (1 aut., 4 aut., 5 aut., 6 aut.); Institute for Automatic Control Engineering. Technische Universität München/Munich 80333/Allemagne (2 aut., 7 aut.); Faculty of Biology/Kognitive Neurowissenschaften, Universität Bielefeld/Bielefeld 33615/Allemagne (3 aut.)</AF>
<DT>Publication en série; Niveau analytique</DT>
<SO>Proceedings of the IEEE; ISSN 0018-9219; Coden IEEPAD; Etats-Unis; Da. 2012; Vol. 100; No. 4; Pp. 937-956; Bibl. 137 ref.</SO>
<LA>Anglais</LA>
<EA>Audiovisual communications is at the core of multimedia systems that allow users to interact across distances. It is common understanding that both audio and video are required for high-quality interaction. While audiovisual information provides a user with a satisfactory impression of being present in a remote environment, physical interaction and manipulation is not supported. True immersion into a distant environment and efficient distributed collaboration require the ability to physically interact with remote objects and to literally get in touch with other people. Touching and manipulating objects remotely becomes possible if we augment traditional audiovisual communications by the haptic modality. Haptic communications is a relatively young field of research that has the potential to substantially improve human-human and human-machine interaction. In this paper, we discuss the state-of-the-art in haptic communications both from psychophysical and technical points of view. From a human perception point of view, we mainly focus on the multimodal integration of video and haptics and the improved performance that can be achieved when combining them. We also discuss how the human adapts to discrepancies and synchronization errors between different modalities, a research area which is typically referred to as perceptual learning. From a technical perspective, we address perceptual coding of haptic information and the transmission of haptic data streams over resource-constrained and potentially lossy networks in the presence of unpredictable and time-varying communication delays. In this context, we also discuss the need for objective quality metrics for haptic communication. Throughout the paper, we stress the fact that haptic communications is not meant as a replacement of traditional audiovisual communications but rather as an additional dimension for telepresence that will allow us to advance in our quest for truly immersive communication.</EA>
<CC>001D04A04B; 001D04B02G; 001D04B02H4</CC>
<FD>Communication multimédia; Système multimédia; Immersion; Relation homme machine; Etat actuel; Evaluation subjective; Evaluation performance; Synchronisation; Apprentissage; Codage; Transmission information; Milieu dissipatif; Temps retard; Variation temporelle; Analyse objective; Contrôle qualité; Téléopération; Psychophysique; Télémanipulation</FD>
<ED>Multimedia communication; Multimedia systems; Immersion; Man machine relation; State of the art; Subjective evaluation; Performance evaluation; Synchronization; Learning; Coding; Information transmission; Lossy medium; Delay time; Time variation; Objective analysis; Quality control; Remote operation; Psychophysics; Remote handling</ED>
<SD>Inmersión; Relación hombre máquina; Estado actual; Evaluación subjetiva; Evaluación prestación; Sincronización; Aprendizaje; Codificación; Transmisión información; Medio dispersor; Tiempo retardo; Variación temporal; Análisis objetivos; Control de calidad; Teleacción; Psicofísica</SD>
<LO>INIST-222.354000509643200100</LO>
<ID>12-0208902</ID>
</server>
</inist>
</record>
To manipulate this document under Unix (Dilib)
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000329 | SxmlIndent | more
Or
HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Corpus/biblio.hfd -nk 000329 | SxmlIndent | more
To link to this page in the Wicri network
{{Explor lien |wiki= Ticri/CIDE |area= HapticV1 |flux= PascalFrancis |étape= Corpus |type= RBID |clé= Pascal:12-0208902 |texte= Haptic Communications }}
This area was generated with Dilib version V0.6.23.