Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Haptic Communications

Internal identifier: 000329 (PascalFrancis/Corpus); previous: 000328; next: 000330

Haptic Communications

Authors: Eckehard Steinbach; Sandra Hirche; Marc Ernst; Fernanda Brandi; Rahul Chaudhari; Julius Kammerl; Iason Vittorias

Source:

RBID: Pascal:12-0208902

French descriptors

English descriptors

Abstract

Audiovisual communications is at the core of multimedia systems that allow users to interact across distances. It is common understanding that both audio and video are required for high-quality interaction. While audiovisual information provides a user with a satisfactory impression of being present in a remote environment, physical interaction and manipulation is not supported. True immersion into a distant environment and efficient distributed collaboration require the ability to physically interact with remote objects and to literally get in touch with other people. Touching and manipulating objects remotely becomes possible if we augment traditional audiovisual communications by the haptic modality. Haptic communications is a relatively young field of research that has the potential to substantially improve human-human and human-machine interaction. In this paper, we discuss the state-of-the-art in haptic communications both from psychophysical and technical points of view. From a human perception point of view, we mainly focus on the multimodal integration of video and haptics and the improved performance that can be achieved when combining them. We also discuss how the human adapts to discrepancies and synchronization errors between different modalities, a research area which is typically referred to as perceptual learning. From a technical perspective, we address perceptual coding of haptic information and the transmission of haptic data streams over resource-constrained and potentially lossy networks in the presence of unpredictable and time-varying communication delays. In this context, we also discuss the need for objective quality metrics for haptic communication. Throughout the paper, we stress the fact that haptic communications is not meant as a replacement of traditional audiovisual communications but rather as an additional dimension for telepresence that will allow us to advance in our quest for truly immersive communication.

Record in standard format (ISO 2709)

For documentation on the Inist Standard format.

pA  
A01 01  1    @0 0018-9219
A02 01      @0 IEEPAD
A03   1    @0 Proc. IEEE
A05       @2 100
A06       @2 4
A08 01  1  ENG  @1 Haptic Communications
A09 01  1  ENG  @1 Frontiers of Audiovisual Communications: New Convergence of Broadband Communications, Computing, and Rich Media
A11 01  1    @1 STEINBACH (Eckehard)
A11 02  1    @1 HIRCHE (Sandra)
A11 03  1    @1 ERNST (Marc)
A11 04  1    @1 BRANDI (Fernanda)
A11 05  1    @1 CHAUDHARI (Rahul)
A11 06  1    @1 KAMMERL (Julius)
A11 07  1    @1 VITTORIAS (Iason)
A12 01  1    @1 JAYANT (Nikil) @9 ed.
A14 01      @1 Institute for Media Technology, Technische universität München @2 Munich 80333 @3 DEU @Z 1 aut. @Z 4 aut. @Z 5 aut. @Z 6 aut.
A14 02      @1 Institute for Automatic Control Engineering. Technische Universität München @2 Munich 80333 @3 DEU @Z 2 aut. @Z 7 aut.
A14 03      @1 Faculty of Biology/Kognitive Neurowissenschaften, Universität Bielefeld @2 Bielefeld 33615 @3 DEU @Z 3 aut.
A20       @1 937-956
A21       @1 2012
A23 01      @0 ENG
A43 01      @1 INIST @2 222 @5 354000509643200100
A44       @0 0000 @1 © 2012 INIST-CNRS. All rights reserved.
A45       @0 137 ref.
A47 01  1    @0 12-0208902
A60       @1 P
A61       @0 A
A64 01  1    @0 Proceedings of the IEEE
A66 01      @0 USA
C01 01    ENG  @0 Audiovisual communications is at the core of multimedia systems that allow users to interact across distances. It is common understanding that both audio and video are required for high-quality interaction. While audiovisual information provides a user with a satisfactory impression of being present in a remote environment, physical interaction and manipulation is not supported. True immersion into a distant environment and efficient distributed collaboration require the ability to physically interact with remote objects and to literally get in touch with other people. Touching and manipulating objects remotely becomes possible if we augment traditional audiovisual communications by the haptic modality. Haptic communications is a relatively young field of research that has the potential to substantially improve human-human and human-machine interaction. In this paper, we discuss the state-of-the-art in haptic communications both from psychophysical and technical points of view. From a human perception point of view, we mainly focus on the multimodal integration of video and haptics and the improved performance that can be achieved when combining them. We also discuss how the human adapts to discrepancies and synchronization errors between different modalities, a research area which is typically referred to as perceptual learning. From a technical perspective, we address perceptual coding of haptic information and the transmission of haptic data streams over resource-constrained and potentially lossy networks in the presence of unpredictable and time-varying communication delays. In this context, we also discuss the need for objective quality metrics for haptic communication. Throughout the paper, we stress the fact that haptic communications is not meant as a replacement of traditional audiovisual communications but rather as an additional dimension for telepresence that will allow us to advance in our quest for truly immersive communication.
C02 01  X    @0 001D04A04B
C02 02  X    @0 001D04B02G
C02 03  X    @0 001D04B02H4
C03 01  3  FRE  @0 Communication multimédia @5 01
C03 01  3  ENG  @0 Multimedia communication @5 01
C03 02  3  FRE  @0 Système multimédia @5 02
C03 02  3  ENG  @0 Multimedia systems @5 02
C03 03  X  FRE  @0 Immersion @5 03
C03 03  X  ENG  @0 Immersion @5 03
C03 03  X  SPA  @0 Inmersión @5 03
C03 04  X  FRE  @0 Relation homme machine @5 04
C03 04  X  ENG  @0 Man machine relation @5 04
C03 04  X  SPA  @0 Relación hombre máquina @5 04
C03 05  X  FRE  @0 Etat actuel @5 05
C03 05  X  ENG  @0 State of the art @5 05
C03 05  X  SPA  @0 Estado actual @5 05
C03 06  X  FRE  @0 Evaluation subjective @5 06
C03 06  X  ENG  @0 Subjective evaluation @5 06
C03 06  X  SPA  @0 Evaluación subjetiva @5 06
C03 07  X  FRE  @0 Evaluation performance @5 07
C03 07  X  ENG  @0 Performance evaluation @5 07
C03 07  X  SPA  @0 Evaluación prestación @5 07
C03 08  X  FRE  @0 Synchronisation @5 08
C03 08  X  ENG  @0 Synchronization @5 08
C03 08  X  SPA  @0 Sincronización @5 08
C03 09  X  FRE  @0 Apprentissage @5 09
C03 09  X  ENG  @0 Learning @5 09
C03 09  X  SPA  @0 Aprendizaje @5 09
C03 10  X  FRE  @0 Codage @5 10
C03 10  X  ENG  @0 Coding @5 10
C03 10  X  SPA  @0 Codificación @5 10
C03 11  X  FRE  @0 Transmission information @5 11
C03 11  X  ENG  @0 Information transmission @5 11
C03 11  X  SPA  @0 Transmisión información @5 11
C03 12  X  FRE  @0 Milieu dissipatif @5 12
C03 12  X  ENG  @0 Lossy medium @5 12
C03 12  X  SPA  @0 Medio dispersor @5 12
C03 13  X  FRE  @0 Temps retard @5 13
C03 13  X  ENG  @0 Delay time @5 13
C03 13  X  SPA  @0 Tiempo retardo @5 13
C03 14  X  FRE  @0 Variation temporelle @5 14
C03 14  X  ENG  @0 Time variation @5 14
C03 14  X  SPA  @0 Variación temporal @5 14
C03 15  X  FRE  @0 Analyse objective @5 15
C03 15  X  ENG  @0 Objective analysis @5 15
C03 15  X  SPA  @0 Análisis objetivos @5 15
C03 16  X  FRE  @0 Contrôle qualité @5 16
C03 16  X  ENG  @0 Quality control @5 16
C03 16  X  SPA  @0 Control de calidad @5 16
C03 17  X  FRE  @0 Téléopération @5 17
C03 17  X  ENG  @0 Remote operation @5 17
C03 17  X  SPA  @0 Teleacción @5 17
C03 18  X  FRE  @0 Psychophysique @5 18
C03 18  X  ENG  @0 Psychophysics @5 18
C03 18  X  SPA  @0 Psicofísica @5 18
C03 19  3  FRE  @0 Télémanipulation @5 19
C03 19  3  ENG  @0 Remote handling @5 19
N21       @1 163
N44 01      @1 OTO
N82       @1 OTO

Inist format (server)

NO : PASCAL 12-0208902 INIST
ET : Haptic Communications
AU : STEINBACH (Eckehard); HIRCHE (Sandra); ERNST (Marc); BRANDI (Fernanda); CHAUDHARI (Rahul); KAMMERL (Julius); VITTORIAS (Iason); JAYANT (Nikil)
AF : Institute for Media Technology, Technische universität München/Munich 80333/Allemagne (1 aut., 4 aut., 5 aut., 6 aut.); Institute for Automatic Control Engineering. Technische Universität München/Munich 80333/Allemagne (2 aut., 7 aut.); Faculty of Biology/Kognitive Neurowissenschaften, Universität Bielefeld/Bielefeld 33615/Allemagne (3 aut.)
DT : Publication en série; Niveau analytique
SO : Proceedings of the IEEE; ISSN 0018-9219; Coden IEEPAD; Etats-Unis; Da. 2012; Vol. 100; No. 4; Pp. 937-956; Bibl. 137 ref.
LA : Anglais
EA : Audiovisual communications is at the core of multimedia systems that allow users to interact across distances. It is common understanding that both audio and video are required for high-quality interaction. While audiovisual information provides a user with a satisfactory impression of being present in a remote environment, physical interaction and manipulation is not supported. True immersion into a distant environment and efficient distributed collaboration require the ability to physically interact with remote objects and to literally get in touch with other people. Touching and manipulating objects remotely becomes possible if we augment traditional audiovisual communications by the haptic modality. Haptic communications is a relatively young field of research that has the potential to substantially improve human-human and human-machine interaction. In this paper, we discuss the state-of-the-art in haptic communications both from psychophysical and technical points of view. From a human perception point of view, we mainly focus on the multimodal integration of video and haptics and the improved performance that can be achieved when combining them. We also discuss how the human adapts to discrepancies and synchronization errors between different modalities, a research area which is typically referred to as perceptual learning. From a technical perspective, we address perceptual coding of haptic information and the transmission of haptic data streams over resource-constrained and potentially lossy networks in the presence of unpredictable and time-varying communication delays. In this context, we also discuss the need for objective quality metrics for haptic communication. Throughout the paper, we stress the fact that haptic communications is not meant as a replacement of traditional audiovisual communications but rather as an additional dimension for telepresence that will allow us to advance in our quest for truly immersive communication.
CC : 001D04A04B; 001D04B02G; 001D04B02H4
FD : Communication multimédia; Système multimédia; Immersion; Relation homme machine; Etat actuel; Evaluation subjective; Evaluation performance; Synchronisation; Apprentissage; Codage; Transmission information; Milieu dissipatif; Temps retard; Variation temporelle; Analyse objective; Contrôle qualité; Téléopération; Psychophysique; Télémanipulation
ED : Multimedia communication; Multimedia systems; Immersion; Man machine relation; State of the art; Subjective evaluation; Performance evaluation; Synchronization; Learning; Coding; Information transmission; Lossy medium; Delay time; Time variation; Objective analysis; Quality control; Remote operation; Psychophysics; Remote handling
SD : Inmersión; Relación hombre máquina; Estado actual; Evaluación subjetiva; Evaluación prestación; Sincronización; Aprendizaje; Codificación; Transmisión información; Medio dispersor; Tiempo retardo; Variación temporal; Análisis objetivos; Control de calidad; Teleacción; Psicofísica
LO : INIST-222.354000509643200100
ID : 12-0208902

Links to Exploration step

Pascal:12-0208902

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">Haptic Communications</title>
<author>
<name sortKey="Steinbach, Eckehard" sort="Steinbach, Eckehard" uniqKey="Steinbach E" first="Eckehard" last="Steinbach">Eckehard Steinbach</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Hirche, Sandra" sort="Hirche, Sandra" uniqKey="Hirche S" first="Sandra" last="Hirche">Sandra Hirche</name>
<affiliation>
<inist:fA14 i1="02">
<s1>Institute for Automatic Control Engineering. Technische Universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>2 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Ernst, Marc" sort="Ernst, Marc" uniqKey="Ernst M" first="Marc" last="Ernst">Marc Ernst</name>
<affiliation>
<inist:fA14 i1="03">
<s1>Faculty of Biology/Kognitive Neurowissenschaften, Universität Bielefeld</s1>
<s2>Bielefeld 33615</s2>
<s3>DEU</s3>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Brandi, Fernanda" sort="Brandi, Fernanda" uniqKey="Brandi F" first="Fernanda" last="Brandi">Fernanda Brandi</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Chaudhari, Rahul" sort="Chaudhari, Rahul" uniqKey="Chaudhari R" first="Rahul" last="Chaudhari">Rahul Chaudhari</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Kammerl, Julius" sort="Kammerl, Julius" uniqKey="Kammerl J" first="Julius" last="Kammerl">Julius Kammerl</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Vittorias, Iason" sort="Vittorias, Iason" uniqKey="Vittorias I" first="Iason" last="Vittorias">Iason Vittorias</name>
<affiliation>
<inist:fA14 i1="02">
<s1>Institute for Automatic Control Engineering. Technische Universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>2 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">12-0208902</idno>
<date when="2012">2012</date>
<idno type="stanalyst">PASCAL 12-0208902 INIST</idno>
<idno type="RBID">Pascal:12-0208902</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000329</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">Haptic Communications</title>
<author>
<name sortKey="Steinbach, Eckehard" sort="Steinbach, Eckehard" uniqKey="Steinbach E" first="Eckehard" last="Steinbach">Eckehard Steinbach</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Hirche, Sandra" sort="Hirche, Sandra" uniqKey="Hirche S" first="Sandra" last="Hirche">Sandra Hirche</name>
<affiliation>
<inist:fA14 i1="02">
<s1>Institute for Automatic Control Engineering. Technische Universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>2 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Ernst, Marc" sort="Ernst, Marc" uniqKey="Ernst M" first="Marc" last="Ernst">Marc Ernst</name>
<affiliation>
<inist:fA14 i1="03">
<s1>Faculty of Biology/Kognitive Neurowissenschaften, Universität Bielefeld</s1>
<s2>Bielefeld 33615</s2>
<s3>DEU</s3>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Brandi, Fernanda" sort="Brandi, Fernanda" uniqKey="Brandi F" first="Fernanda" last="Brandi">Fernanda Brandi</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Chaudhari, Rahul" sort="Chaudhari, Rahul" uniqKey="Chaudhari R" first="Rahul" last="Chaudhari">Rahul Chaudhari</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Kammerl, Julius" sort="Kammerl, Julius" uniqKey="Kammerl J" first="Julius" last="Kammerl">Julius Kammerl</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Vittorias, Iason" sort="Vittorias, Iason" uniqKey="Vittorias I" first="Iason" last="Vittorias">Iason Vittorias</name>
<affiliation>
<inist:fA14 i1="02">
<s1>Institute for Automatic Control Engineering. Technische Universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>2 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</analytic>
<series>
<title level="j" type="main">Proceedings of the IEEE</title>
<title level="j" type="abbreviated">Proc. IEEE</title>
<idno type="ISSN">0018-9219</idno>
<imprint>
<date when="2012">2012</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt>
<title level="j" type="main">Proceedings of the IEEE</title>
<title level="j" type="abbreviated">Proc. IEEE</title>
<idno type="ISSN">0018-9219</idno>
</seriesStmt>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Coding</term>
<term>Delay time</term>
<term>Immersion</term>
<term>Information transmission</term>
<term>Learning</term>
<term>Lossy medium</term>
<term>Man machine relation</term>
<term>Multimedia communication</term>
<term>Multimedia systems</term>
<term>Objective analysis</term>
<term>Performance evaluation</term>
<term>Psychophysics</term>
<term>Quality control</term>
<term>Remote handling</term>
<term>Remote operation</term>
<term>State of the art</term>
<term>Subjective evaluation</term>
<term>Synchronization</term>
<term>Time variation</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Communication multimédia</term>
<term>Système multimédia</term>
<term>Immersion</term>
<term>Relation homme machine</term>
<term>Etat actuel</term>
<term>Evaluation subjective</term>
<term>Evaluation performance</term>
<term>Synchronisation</term>
<term>Apprentissage</term>
<term>Codage</term>
<term>Transmission information</term>
<term>Milieu dissipatif</term>
<term>Temps retard</term>
<term>Variation temporelle</term>
<term>Analyse objective</term>
<term>Contrôle qualité</term>
<term>Téléopération</term>
<term>Psychophysique</term>
<term>Télémanipulation</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">Audiovisual communications is at the core of multimedia systems that allow users to interact across distances. It is common understanding that both audio and video are required for high-quality interaction. While audiovisual information provides a user with a satisfactory impression of being present in a remote environment, physical interaction and manipulation is not supported. True immersion into a distant environment and efficient distributed collaboration require the ability to physically interact with remote objects and to literally get in touch with other people. Touching and manipulating objects remotely becomes possible if we augment traditional audiovisual communications by the haptic modality. Haptic communications is a relatively young field of research that has the potential to substantially improve human-human and human-machine interaction. In this paper, we discuss the state-of-the-art in haptic communications both from psychophysical and technical points of view. From a human perception point of view, we mainly focus on the multimodal integration of video and haptics and the improved performance that can be achieved when combining them. We also discuss how the human adapts to discrepancies and synchronization errors between different modalities, a research area which is typically referred to as perceptual learning. From a technical perspective, we address perceptual coding of haptic information and the transmission of haptic data streams over resource-constrained and potentially lossy networks in the presence of unpredictable and time-varying communication delays. In this context, we also discuss the need for objective quality metrics for haptic communication. Throughout the paper, we stress the fact that haptic communications is not meant as a replacement of traditional audiovisual communications but rather as an additional dimension for telepresence that will allow us to advance in our quest for truly immersive communication.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA01 i1="01" i2="1">
<s0>0018-9219</s0>
</fA01>
<fA02 i1="01">
<s0>IEEPAD</s0>
</fA02>
<fA03 i2="1">
<s0>Proc. IEEE</s0>
</fA03>
<fA05>
<s2>100</s2>
</fA05>
<fA06>
<s2>4</s2>
</fA06>
<fA08 i1="01" i2="1" l="ENG">
<s1>Haptic Communications</s1>
</fA08>
<fA09 i1="01" i2="1" l="ENG">
<s1>Frontiers of Audiovisual Communications: New Convergence of Broadband Communications, Computing, and Rich Media</s1>
</fA09>
<fA11 i1="01" i2="1">
<s1>STEINBACH (Eckehard)</s1>
</fA11>
<fA11 i1="02" i2="1">
<s1>HIRCHE (Sandra)</s1>
</fA11>
<fA11 i1="03" i2="1">
<s1>ERNST (Marc)</s1>
</fA11>
<fA11 i1="04" i2="1">
<s1>BRANDI (Fernanda)</s1>
</fA11>
<fA11 i1="05" i2="1">
<s1>CHAUDHARI (Rahul)</s1>
</fA11>
<fA11 i1="06" i2="1">
<s1>KAMMERL (Julius)</s1>
</fA11>
<fA11 i1="07" i2="1">
<s1>VITTORIAS (Iason)</s1>
</fA11>
<fA12 i1="01" i2="1">
<s1>JAYANT (Nikil)</s1>
<s9>ed.</s9>
</fA12>
<fA14 i1="01">
<s1>Institute for Media Technology, Technische universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
</fA14>
<fA14 i1="02">
<s1>Institute for Automatic Control Engineering. Technische Universität München</s1>
<s2>Munich 80333</s2>
<s3>DEU</s3>
<sZ>2 aut.</sZ>
<sZ>7 aut.</sZ>
</fA14>
<fA14 i1="03">
<s1>Faculty of Biology/Kognitive Neurowissenschaften, Universität Bielefeld</s1>
<s2>Bielefeld 33615</s2>
<s3>DEU</s3>
<sZ>3 aut.</sZ>
</fA14>
<fA20>
<s1>937-956</s1>
</fA20>
<fA21>
<s1>2012</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA43 i1="01">
<s1>INIST</s1>
<s2>222</s2>
<s5>354000509643200100</s5>
</fA43>
<fA44>
<s0>0000</s0>
<s1>© 2012 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45>
<s0>137 ref.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>12-0208902</s0>
</fA47>
<fA60>
<s1>P</s1>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA64 i1="01" i2="1">
<s0>Proceedings of the IEEE</s0>
</fA64>
<fA66 i1="01">
<s0>USA</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>Audiovisual communications is at the core of multimedia systems that allow users to interact across distances. It is common understanding that both audio and video are required for high-quality interaction. While audiovisual information provides a user with a satisfactory impression of being present in a remote environment, physical interaction and manipulation is not supported. True immersion into a distant environment and efficient distributed collaboration require the ability to physically interact with remote objects and to literally get in touch with other people. Touching and manipulating objects remotely becomes possible if we augment traditional audiovisual communications by the haptic modality. Haptic communications is a relatively young field of research that has the potential to substantially improve human-human and human-machine interaction. In this paper, we discuss the state-of-the-art in haptic communications both from psychophysical and technical points of view. From a human perception point of view, we mainly focus on the multimodal integration of video and haptics and the improved performance that can be achieved when combining them. We also discuss how the human adapts to discrepancies and synchronization errors between different modalities, a research area which is typically referred to as perceptual learning. From a technical perspective, we address perceptual coding of haptic information and the transmission of haptic data streams over resource-constrained and potentially lossy networks in the presence of unpredictable and time-varying communication delays. In this context, we also discuss the need for objective quality metrics for haptic communication. Throughout the paper, we stress the fact that haptic communications is not meant as a replacement of traditional audiovisual communications but rather as an additional dimension for telepresence that will allow us to advance in our quest for truly immersive communication.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>001D04A04B</s0>
</fC02>
<fC02 i1="02" i2="X">
<s0>001D04B02G</s0>
</fC02>
<fC02 i1="03" i2="X">
<s0>001D04B02H4</s0>
</fC02>
<fC03 i1="01" i2="3" l="FRE">
<s0>Communication multimédia</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="3" l="ENG">
<s0>Multimedia communication</s0>
<s5>01</s5>
</fC03>
<fC03 i1="02" i2="3" l="FRE">
<s0>Système multimédia</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="3" l="ENG">
<s0>Multimedia systems</s0>
<s5>02</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE">
<s0>Immersion</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG">
<s0>Immersion</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA">
<s0>Inmersión</s0>
<s5>03</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE">
<s0>Relation homme machine</s0>
<s5>04</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG">
<s0>Man machine relation</s0>
<s5>04</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA">
<s0>Relación hombre máquina</s0>
<s5>04</s5>
</fC03>
<fC03 i1="05" i2="X" l="FRE">
<s0>Etat actuel</s0>
<s5>05</s5>
</fC03>
<fC03 i1="05" i2="X" l="ENG">
<s0>State of the art</s0>
<s5>05</s5>
</fC03>
<fC03 i1="05" i2="X" l="SPA">
<s0>Estado actual</s0>
<s5>05</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE">
<s0>Evaluation subjective</s0>
<s5>06</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG">
<s0>Subjective evaluation</s0>
<s5>06</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA">
<s0>Evaluación subjetiva</s0>
<s5>06</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE">
<s0>Evaluation performance</s0>
<s5>07</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG">
<s0>Performance evaluation</s0>
<s5>07</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA">
<s0>Evaluación prestación</s0>
<s5>07</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE">
<s0>Synchronisation</s0>
<s5>08</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG">
<s0>Synchronization</s0>
<s5>08</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA">
<s0>Sincronización</s0>
<s5>08</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE">
<s0>Apprentissage</s0>
<s5>09</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG">
<s0>Learning</s0>
<s5>09</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA">
<s0>Aprendizaje</s0>
<s5>09</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE">
<s0>Codage</s0>
<s5>10</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG">
<s0>Coding</s0>
<s5>10</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA">
<s0>Codificación</s0>
<s5>10</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE">
<s0>Transmission information</s0>
<s5>11</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG">
<s0>Information transmission</s0>
<s5>11</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA">
<s0>Transmisión información</s0>
<s5>11</s5>
</fC03>
<fC03 i1="12" i2="X" l="FRE">
<s0>Milieu dissipatif</s0>
<s5>12</s5>
</fC03>
<fC03 i1="12" i2="X" l="ENG">
<s0>Lossy medium</s0>
<s5>12</s5>
</fC03>
<fC03 i1="12" i2="X" l="SPA">
<s0>Medio dispersor</s0>
<s5>12</s5>
</fC03>
<fC03 i1="13" i2="X" l="FRE">
<s0>Temps retard</s0>
<s5>13</s5>
</fC03>
<fC03 i1="13" i2="X" l="ENG">
<s0>Delay time</s0>
<s5>13</s5>
</fC03>
<fC03 i1="13" i2="X" l="SPA">
<s0>Tiempo retardo</s0>
<s5>13</s5>
</fC03>
<fC03 i1="14" i2="X" l="FRE">
<s0>Variation temporelle</s0>
<s5>14</s5>
</fC03>
<fC03 i1="14" i2="X" l="ENG">
<s0>Time variation</s0>
<s5>14</s5>
</fC03>
<fC03 i1="14" i2="X" l="SPA">
<s0>Variación temporal</s0>
<s5>14</s5>
</fC03>
<fC03 i1="15" i2="X" l="FRE">
<s0>Analyse objective</s0>
<s5>15</s5>
</fC03>
<fC03 i1="15" i2="X" l="ENG">
<s0>Objective analysis</s0>
<s5>15</s5>
</fC03>
<fC03 i1="15" i2="X" l="SPA">
<s0>Análisis objetivos</s0>
<s5>15</s5>
</fC03>
<fC03 i1="16" i2="X" l="FRE">
<s0>Contrôle qualité</s0>
<s5>16</s5>
</fC03>
<fC03 i1="16" i2="X" l="ENG">
<s0>Quality control</s0>
<s5>16</s5>
</fC03>
<fC03 i1="16" i2="X" l="SPA">
<s0>Control de calidad</s0>
<s5>16</s5>
</fC03>
<fC03 i1="17" i2="X" l="FRE">
<s0>Téléopération</s0>
<s5>17</s5>
</fC03>
<fC03 i1="17" i2="X" l="ENG">
<s0>Remote operation</s0>
<s5>17</s5>
</fC03>
<fC03 i1="17" i2="X" l="SPA">
<s0>Teleacción</s0>
<s5>17</s5>
</fC03>
<fC03 i1="18" i2="X" l="FRE">
<s0>Psychophysique</s0>
<s5>18</s5>
</fC03>
<fC03 i1="18" i2="X" l="ENG">
<s0>Psychophysics</s0>
<s5>18</s5>
</fC03>
<fC03 i1="18" i2="X" l="SPA">
<s0>Psicofísica</s0>
<s5>18</s5>
</fC03>
<fC03 i1="19" i2="3" l="FRE">
<s0>Télémanipulation</s0>
<s5>19</s5>
</fC03>
<fC03 i1="19" i2="3" l="ENG">
<s0>Remote handling</s0>
<s5>19</s5>
</fC03>
<fN21>
<s1>163</s1>
</fN21>
<fN44 i1="01">
<s1>OTO</s1>
</fN44>
<fN82>
<s1>OTO</s1>
</fN82>
</pA>
</standard>
<server>
<NO>PASCAL 12-0208902 INIST</NO>
<ET>Haptic Communications</ET>
<AU>STEINBACH (Eckehard); HIRCHE (Sandra); ERNST (Marc); BRANDI (Fernanda); CHAUDHARI (Rahul); KAMMERL (Julius); VITTORIAS (Iason); JAYANT (Nikil)</AU>
<AF>Institute for Media Technology, Technische universität München/Munich 80333/Allemagne (1 aut., 4 aut., 5 aut., 6 aut.); Institute for Automatic Control Engineering. Technische Universität München/Munich 80333/Allemagne (2 aut., 7 aut.); Faculty of Biology/Kognitive Neurowissenschaften, Universität Bielefeld/Bielefeld 33615/Allemagne (3 aut.)</AF>
<DT>Publication en série; Niveau analytique</DT>
<SO>Proceedings of the IEEE; ISSN 0018-9219; Coden IEEPAD; Etats-Unis; Da. 2012; Vol. 100; No. 4; Pp. 937-956; Bibl. 137 ref.</SO>
<LA>Anglais</LA>
<EA>Audiovisual communications is at the core of multimedia systems that allow users to interact across distances. It is common understanding that both audio and video are required for high-quality interaction. While audiovisual information provides a user with a satisfactory impression of being present in a remote environment, physical interaction and manipulation is not supported. True immersion into a distant environment and efficient distributed collaboration require the ability to physically interact with remote objects and to literally get in touch with other people. Touching and manipulating objects remotely becomes possible if we augment traditional audiovisual communications by the haptic modality. Haptic communications is a relatively young field of research that has the potential to substantially improve human-human and human-machine interaction. In this paper, we discuss the state-of-the-art in haptic communications both from psychophysical and technical points of view. From a human perception point of view, we mainly focus on the multimodal integration of video and haptics and the improved performance that can be achieved when combining them. We also discuss how the human adapts to discrepancies and synchronization errors between different modalities, a research area which is typically referred to as perceptual learning. From a technical perspective, we address perceptual coding of haptic information and the transmission of haptic data streams over resource-constrained and potentially lossy networks in the presence of unpredictable and time-varying communication delays. In this context, we also discuss the need for objective quality metrics for haptic communication. Throughout the paper, we stress the fact that haptic communications is not meant as a replacement of traditional audiovisual communications but rather as an additional dimension for telepresence that will allow us to advance in our quest for truly immersive communication.</EA>
<CC>001D04A04B; 001D04B02G; 001D04B02H4</CC>
<FD>Communication multimédia; Système multimédia; Immersion; Relation homme machine; Etat actuel; Evaluation subjective; Evaluation performance; Synchronisation; Apprentissage; Codage; Transmission information; Milieu dissipatif; Temps retard; Variation temporelle; Analyse objective; Contrôle qualité; Téléopération; Psychophysique; Télémanipulation</FD>
<ED>Multimedia communication; Multimedia systems; Immersion; Man machine relation; State of the art; Subjective evaluation; Performance evaluation; Synchronization; Learning; Coding; Information transmission; Lossy medium; Delay time; Time variation; Objective analysis; Quality control; Remote operation; Psychophysics; Remote handling</ED>
<SD>Inmersión; Relación hombre máquina; Estado actual; Evaluación subjetiva; Evaluación prestación; Sincronización; Aprendizaje; Codificación; Transmisión información; Medio dispersor; Tiempo retardo; Variación temporal; Análisis objetivos; Control de calidad; Teleacción; Psicofísica</SD>
<LO>INIST-222.354000509643200100</LO>
<ID>12-0208902</ID>
</server>
</inist>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000329 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Corpus/biblio.hfd -nk 000329 | SxmlIndent | more
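
For example, to list only the English keyword terms of this record, the same selection can be piped through standard Unix text tools. This is a hypothetical sketch, not part of the Dilib documentation: it assumes grep and sed are available, that SxmlIndent places each element on its own line, and it relies on the <keywords scheme="KwdEn"> markup shown in the XML above.

# Keep the English keyword block, then strip the XML tags around each term
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000329 | SxmlIndent \
  | sed -n '/scheme="KwdEn"/,/<\/keywords>/p' \
  | grep '<term>' \
  | sed -e 's/ *<[^>]*>//g'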

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PascalFrancis
   |étape=   Corpus
   |type=    RBID
   |clé=     Pascal:12-0208902
   |texte=   Haptic Communications
}}

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024