Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Haptic interaction with depth video media

Internal identifier: 000685 (PascalFrancis/Curation); previous: 000684; next: 000686

Haptic interaction with depth video media

Authors: Jongeun Cha [South Korea]; Seung-Man Kim [South Korea]; Ian Oakley [South Korea]; Jeha Ryu [South Korea]; Kwan H. Lee [South Korea]

Source:

RBID: Pascal:06-0051619

French descriptors

English descriptors

Abstract

In this paper we propose a touch-enabled video player system. A conventional video player only allows viewers to passively experience visual and audio media. In virtual environments, touch or haptic interaction has been shown to convey a powerful illusion of the tangible nature - the reality - of the displayed environments, and we feel the same benefits may be conferred on the broadcast viewing domain. To this end, this paper describes a system that uses a video representation based on depth images to add a haptic component to an audio-visual stream. We generate this stream through the combination of a regular RGB image and a synchronized depth image composed of per-pixel depth-from-camera information. The depth video, a unified stream of the color and depth images, can be synthesized from a computer graphics animation by rendering with commercial packages, or captured from a real environment using an active depth camera such as the Zcam™. In order to provide a haptic representation of this data, we propose a modified proxy graph algorithm for depth video streams. The modified proxy graph algorithm can (i) detect collisions between a moving virtual proxy and time-varying video scenes, (ii) generate a smooth touch sensation by handling the implications of the radically different display update rates required by visual (30 Hz) and haptic (on the order of 1000 Hz) systems, and (iii) avoid sudden changes in contact forces. A sample experiment shows the effectiveness of the proposed system.
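The three algorithmic points in the abstract can be illustrated with a toy one-pixel sketch. This is not the paper's modified proxy graph algorithm (which tracks a full 3-DOF proxy over the depth image); it is a minimal, assumption-laden illustration of (i) collision detection against a depth surface, (ii) interpolating depth between 30 Hz video frames so the ~1000 Hz haptic loop sees smooth motion, and (iii) rate-limiting the force to avoid sudden jumps. All class names and constants are invented for this sketch.

```python
import numpy as np

# Rates quoted in the abstract: video at 30 Hz, haptics near 1000 Hz.
VIDEO_HZ = 30
HAPTIC_HZ = 1000
TICKS_PER_FRAME = HAPTIC_HZ // VIDEO_HZ  # haptic ticks between video frames

class DepthHapticLoop:
    """Minimal 1-DOF haptic loop over a depth-image stream (illustrative only)."""

    def __init__(self, stiffness=100.0, max_delta=0.5):
        self.stiffness = stiffness  # spring constant (force per unit depth)
        self.max_delta = max_delta  # max force change allowed per haptic tick
        self.force = 0.0

    def step(self, prev_depth, next_depth, tick, px, py, proxy_depth):
        # (ii) interpolate the touched pixel between two consecutive video
        # frames, so the surface moves smoothly at the haptic rate instead
        # of jumping once per video frame
        alpha = tick / TICKS_PER_FRAME
        surface = (1 - alpha) * prev_depth[py, px] + alpha * next_depth[py, px]
        # (i) collision: depth is distance from the camera, so the proxy
        # has penetrated the surface when it lies farther away than the
        # surface pixel at its screen position
        penetration = max(0.0, proxy_depth - surface)
        target = self.stiffness * penetration
        # (iii) clamp the per-tick force change to avoid sudden jumps when
        # the scene (or contact state) changes abruptly
        delta = float(np.clip(target - self.force, -self.max_delta, self.max_delta))
        self.force += delta
        return self.force
```

A real renderer would also resolve lateral proxy motion and friction; here the clamp alone shows why contact forces ramp rather than snap when a new depth frame arrives.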
pA  
A01 01  1    @0 0302-9743
A05       @2 3767
A08 01  1  ENG  @1 Haptic interaction with depth video media
A09 01  1  ENG  @1 Advances in multimedia information processing : PCM 2005 : 6th Pacific Rim Conference on Multimedia, Jeju Island, Korea, November 13-16, 2005 : proceedings
A11 01  1    @1 CHA (Jongeun)
A11 02  1    @1 KIM (Seung-Man)
A11 03  1    @1 OAKLEY (Ian)
A11 04  1    @1 RYU (Jeha)
A11 05  1    @1 LEE (Kwan H.)
A14 01      @1 Human-Machine-Computer Interface Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong @2 Buk-gu, Gwangju 500-712 @3 KOR @Z 1 aut. @Z 3 aut. @Z 4 aut.
A14 02      @1 Intelligent Design & Graphics Lab., Dept. of Mechatronics @3 KOR @Z 2 aut. @Z 5 aut.
A20       @2 Part I, 420-430
A21       @1 2005
A23 01      @0 ENG
A26 01      @0 3-540-30027-9
A43 01      @1 INIST @2 16343 @5 354000138665540370
A44       @0 0000 @1 © 2006 INIST-CNRS. All rights reserved.
A45       @0 14 ref.
A47 01  1    @0 06-0051619
A60       @1 P @2 C
A61       @0 A
A64 01  1    @0 Lecture notes in computer science
A66 01      @0 DEU
C01 01    ENG  @0 In this paper we propose a touch-enabled video player system. A conventional video player only allows viewers to passively experience visual and audio media. In virtual environments, touch or haptic interaction has been shown to convey a powerful illusion of the tangible nature - the reality - of the displayed environments, and we feel the same benefits may be conferred on the broadcast viewing domain. To this end, this paper describes a system that uses a video representation based on depth images to add a haptic component to an audio-visual stream. We generate this stream through the combination of a regular RGB image and a synchronized depth image composed of per-pixel depth-from-camera information. The depth video, a unified stream of the color and depth images, can be synthesized from a computer graphics animation by rendering with commercial packages, or captured from a real environment using an active depth camera such as the Zcam™. In order to provide a haptic representation of this data, we propose a modified proxy graph algorithm for depth video streams. The modified proxy graph algorithm can (i) detect collisions between a moving virtual proxy and time-varying video scenes, (ii) generate a smooth touch sensation by handling the implications of the radically different display update rates required by visual (30 Hz) and haptic (on the order of 1000 Hz) systems, and (iii) avoid sudden changes in contact forces. A sample experiment shows the effectiveness of the proposed system.
C02 01  X    @0 001D02C03
C03 01  X  FRE  @0 Multimédia @5 01
C03 01  X  ENG  @0 Multimedia @5 01
C03 01  X  SPA  @0 Multimedia @5 01
C03 02  X  FRE  @0 Signal vidéo @5 06
C03 02  X  ENG  @0 Video signal @5 06
C03 02  X  SPA  @0 Señal video @5 06
C03 03  X  FRE  @0 Vision ordinateur @5 07
C03 03  X  ENG  @0 Computer vision @5 07
C03 03  X  SPA  @0 Visión ordenador @5 07
C03 04  X  FRE  @0 Réalité virtuelle @5 08
C03 04  X  ENG  @0 Virtual reality @5 08
C03 04  X  SPA  @0 Realidad virtual @5 08
C03 05  3  FRE  @0 Acoustique audio @5 09
C03 05  3  ENG  @0 Audio acoustics @5 09
C03 06  X  FRE  @0 Reproduction son @5 10
C03 06  X  ENG  @0 Sound reproduction @5 10
C03 06  X  SPA  @0 Reproducción sonido @5 10
C03 07  X  FRE  @0 Image couleur @5 11
C03 07  X  ENG  @0 Color image @5 11
C03 07  X  SPA  @0 Imagen color @5 11
C03 08  X  FRE  @0 Animation par ordinateur @5 12
C03 08  X  ENG  @0 Computer animation @5 12
C03 08  X  SPA  @0 Animación por computador @5 12
C03 09  X  FRE  @0 Infographie @5 13
C03 09  X  ENG  @0 Computer graphics @5 13
C03 09  X  SPA  @0 Gráfico computadora @5 13
C03 10  X  FRE  @0 Rendu image @5 14
C03 10  X  ENG  @0 Image rendering @5 14
C03 10  X  SPA  @0 Restitucíon imagen @5 14
C03 11  X  FRE  @0 Antémémoire @5 15
C03 11  X  ENG  @0 Cache memory @5 15
C03 11  X  SPA  @0 Antememoria @5 15
C03 12  X  FRE  @0 Serveur informatique @5 16
C03 12  X  ENG  @0 Computer server @5 16
C03 12  X  SPA  @0 Servidor informático @5 16
C03 13  X  FRE  @0 Mise à jour @5 17
C03 13  X  ENG  @0 Updating @5 17
C03 13  X  SPA  @0 Actualización @5 17
C03 14  X  FRE  @0 Sensibilité tactile @5 18
C03 14  X  ENG  @0 Tactile sensitivity @5 18
C03 14  X  SPA  @0 Sensibilidad tactil @5 18
C03 15  X  FRE  @0 Illusion @5 19
C03 15  X  ENG  @0 Illusion @5 19
C03 15  X  SPA  @0 Ilusión @5 19
C03 16  X  FRE  @0 Sensation @5 20
C03 16  X  ENG  @0 Sensation @5 20
C03 16  X  SPA  @0 Sensación @5 20
C03 17  X  FRE  @0 Appareil visuel @5 21
C03 17  X  ENG  @0 Visual system @5 21
C03 17  X  SPA  @0 Aparato visual @5 21
C03 18  X  FRE  @0 Système paramètre variable @5 23
C03 18  X  ENG  @0 Time varying system @5 23
C03 18  X  SPA  @0 Sistema parámetro variable @5 23
C03 19  X  FRE  @0 . @4 INC @5 82
C03 20  X  FRE  @0 Transmission en continu @4 CD @5 96
C03 20  X  ENG  @0 Streaming @4 CD @5 96
C03 20  X  SPA  @0 Transmisión continua @4 CD @5 96
N21       @1 023
N44 01      @1 OTO
N82       @1 OTO
pR  
A30 01  1  ENG  @1 Pacific Rim Conference on Multimedia @2 6 @3 KOR @4 2005-11-13

Links to previous steps (curation, corpus...)


Links to Exploration step

Pascal:06-0051619

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">Haptic interaction with depth video media</title>
<author>
<name sortKey="Cha, Jongeun" sort="Cha, Jongeun" uniqKey="Cha J" first="Jongeun" last="Cha">Jongeun Cha</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Human-Machine-Computer Interface Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong</s1>
<s2>Buk-gu, Gwangju 500-712</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
</affiliation>
</author>
<author>
<name sortKey="Kim, Seung Man" sort="Kim, Seung Man" uniqKey="Kim S" first="Seung-Man" last="Kim">Seung-Man Kim</name>
<affiliation wicri:level="1">
<inist:fA14 i1="02">
<s1>Intelligent Design & Graphics Lab., Dept. of Mechatronics</s1>
<s3>KOR</s3>
<sZ>2 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
</affiliation>
</author>
<author>
<name sortKey="Oakley, Ian" sort="Oakley, Ian" uniqKey="Oakley I" first="Ian" last="Oakley">Ian Oakley</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Human-Machine-Computer Interface Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong</s1>
<s2>Buk-gu, Gwangju 500-712</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
</affiliation>
</author>
<author>
<name sortKey="Ryu, Jeha" sort="Ryu, Jeha" uniqKey="Ryu J" first="Jeha" last="Ryu">Jeha Ryu</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Human-Machine-Computer Interface Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong</s1>
<s2>Buk-gu, Gwangju 500-712</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
</affiliation>
</author>
<author>
<name sortKey="Lee, Kwan H" sort="Lee, Kwan H" uniqKey="Lee K" first="Kwan H." last="Lee">Kwan H. Lee</name>
<affiliation wicri:level="1">
<inist:fA14 i1="02">
<s1>Intelligent Design & Graphics Lab., Dept. of Mechatronics</s1>
<s3>KOR</s3>
<sZ>2 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">06-0051619</idno>
<date when="2005">2005</date>
<idno type="stanalyst">PASCAL 06-0051619 INIST</idno>
<idno type="RBID">Pascal:06-0051619</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000E21</idno>
<idno type="wicri:Area/PascalFrancis/Curation">000685</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">Haptic interaction with depth video media</title>
<author>
<name sortKey="Cha, Jongeun" sort="Cha, Jongeun" uniqKey="Cha J" first="Jongeun" last="Cha">Jongeun Cha</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Human-Machine-Computer Interface Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong</s1>
<s2>Buk-gu, Gwangju 500-712</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
</affiliation>
</author>
<author>
<name sortKey="Kim, Seung Man" sort="Kim, Seung Man" uniqKey="Kim S" first="Seung-Man" last="Kim">Seung-Man Kim</name>
<affiliation wicri:level="1">
<inist:fA14 i1="02">
<s1>Intelligent Design & Graphics Lab., Dept. of Mechatronics</s1>
<s3>KOR</s3>
<sZ>2 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
</affiliation>
</author>
<author>
<name sortKey="Oakley, Ian" sort="Oakley, Ian" uniqKey="Oakley I" first="Ian" last="Oakley">Ian Oakley</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Human-Machine-Computer Interface Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong</s1>
<s2>Buk-gu, Gwangju 500-712</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
</affiliation>
</author>
<author>
<name sortKey="Ryu, Jeha" sort="Ryu, Jeha" uniqKey="Ryu J" first="Jeha" last="Ryu">Jeha Ryu</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Human-Machine-Computer Interface Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong</s1>
<s2>Buk-gu, Gwangju 500-712</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
</affiliation>
</author>
<author>
<name sortKey="Lee, Kwan H" sort="Lee, Kwan H" uniqKey="Lee K" first="Kwan H." last="Lee">Kwan H. Lee</name>
<affiliation wicri:level="1">
<inist:fA14 i1="02">
<s1>Intelligent Design & Graphics Lab., Dept. of Mechatronics</s1>
<s3>KOR</s3>
<sZ>2 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
</affiliation>
</author>
</analytic>
<series>
<title level="j" type="main">Lecture notes in computer science</title>
<idno type="ISSN">0302-9743</idno>
<imprint>
<date when="2005">2005</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt>
<title level="j" type="main">Lecture notes in computer science</title>
<idno type="ISSN">0302-9743</idno>
</seriesStmt>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Audio acoustics</term>
<term>Cache memory</term>
<term>Color image</term>
<term>Computer animation</term>
<term>Computer graphics</term>
<term>Computer server</term>
<term>Computer vision</term>
<term>Illusion</term>
<term>Image rendering</term>
<term>Multimedia</term>
<term>Sensation</term>
<term>Sound reproduction</term>
<term>Streaming</term>
<term>Tactile sensitivity</term>
<term>Time varying system</term>
<term>Updating</term>
<term>Video signal</term>
<term>Virtual reality</term>
<term>Visual system</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Multimédia</term>
<term>Signal vidéo</term>
<term>Vision ordinateur</term>
<term>Réalité virtuelle</term>
<term>Acoustique audio</term>
<term>Reproduction son</term>
<term>Image couleur</term>
<term>Animation par ordinateur</term>
<term>Infographie</term>
<term>Rendu image</term>
<term>Antémémoire</term>
<term>Serveur informatique</term>
<term>Mise à jour</term>
<term>Sensibilité tactile</term>
<term>Illusion</term>
<term>Sensation</term>
<term>Appareil visuel</term>
<term>Système paramètre variable</term>
<term>.</term>
<term>Transmission en continu</term>
</keywords>
<keywords scheme="Wicri" type="topic" xml:lang="fr">
<term>Multimédia</term>
<term>Réalité virtuelle</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">In this paper we propose a touch-enabled video player system. A conventional video player only allows viewers to passively experience visual and audio media. In virtual environments, touch or haptic interaction has been shown to convey a powerful illusion of the tangible nature - the reality - of the displayed environments, and we feel the same benefits may be conferred on the broadcast viewing domain. To this end, this paper describes a system that uses a video representation based on depth images to add a haptic component to an audio-visual stream. We generate this stream through the combination of a regular RGB image and a synchronized depth image composed of per-pixel depth-from-camera information. The depth video, a unified stream of the color and depth images, can be synthesized from a computer graphics animation by rendering with commercial packages, or captured from a real environment using an active depth camera such as the Zcam<sup>TM</sup>. In order to provide a haptic representation of this data, we propose a modified proxy graph algorithm for depth video streams. The modified proxy graph algorithm can (i) detect collisions between a moving virtual proxy and time-varying video scenes, (ii) generate a smooth touch sensation by handling the implications of the radically different display update rates required by visual (30 Hz) and haptic (on the order of 1000 Hz) systems, and (iii) avoid sudden changes in contact forces. A sample experiment shows the effectiveness of the proposed system.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA01 i1="01" i2="1">
<s0>0302-9743</s0>
</fA01>
<fA05>
<s2>3767</s2>
</fA05>
<fA08 i1="01" i2="1" l="ENG">
<s1>Haptic interaction with depth video media</s1>
</fA08>
<fA09 i1="01" i2="1" l="ENG">
<s1>Advances in multimedia information processing : PCM 2005 : 6th Pacific Rim Conference on Multimedia, Jeju Island, Korea, November 13-16, 2005 : proceedings</s1>
</fA09>
<fA11 i1="01" i2="1">
<s1>CHA (Jongeun)</s1>
</fA11>
<fA11 i1="02" i2="1">
<s1>KIM (Seung-Man)</s1>
</fA11>
<fA11 i1="03" i2="1">
<s1>OAKLEY (Ian)</s1>
</fA11>
<fA11 i1="04" i2="1">
<s1>RYU (Jeha)</s1>
</fA11>
<fA11 i1="05" i2="1">
<s1>LEE (Kwan H.)</s1>
</fA11>
<fA14 i1="01">
<s1>Human-Machine-Computer Interface Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong</s1>
<s2>Buk-gu, Gwangju 500-712</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</fA14>
<fA14 i1="02">
<s1>Intelligent Design & Graphics Lab., Dept. of Mechatronics</s1>
<s3>KOR</s3>
<sZ>2 aut.</sZ>
<sZ>5 aut.</sZ>
</fA14>
<fA20>
<s2>Part I, 420-430</s2>
</fA20>
<fA21>
<s1>2005</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA26 i1="01">
<s0>3-540-30027-9</s0>
</fA26>
<fA43 i1="01">
<s1>INIST</s1>
<s2>16343</s2>
<s5>354000138665540370</s5>
</fA43>
<fA44>
<s0>0000</s0>
<s1>© 2006 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45>
<s0>14 ref.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>06-0051619</s0>
</fA47>
<fA60>
<s1>P</s1>
<s2>C</s2>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA64 i1="01" i2="1">
<s0>Lecture notes in computer science</s0>
</fA64>
<fA66 i1="01">
<s0>DEU</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>In this paper we propose a touch-enabled video player system. A conventional video player only allows viewers to passively experience visual and audio media. In virtual environments, touch or haptic interaction has been shown to convey a powerful illusion of the tangible nature - the reality - of the displayed environments, and we feel the same benefits may be conferred on the broadcast viewing domain. To this end, this paper describes a system that uses a video representation based on depth images to add a haptic component to an audio-visual stream. We generate this stream through the combination of a regular RGB image and a synchronized depth image composed of per-pixel depth-from-camera information. The depth video, a unified stream of the color and depth images, can be synthesized from a computer graphics animation by rendering with commercial packages, or captured from a real environment using an active depth camera such as the Zcam<sup>TM</sup>. In order to provide a haptic representation of this data, we propose a modified proxy graph algorithm for depth video streams. The modified proxy graph algorithm can (i) detect collisions between a moving virtual proxy and time-varying video scenes, (ii) generate a smooth touch sensation by handling the implications of the radically different display update rates required by visual (30 Hz) and haptic (on the order of 1000 Hz) systems, and (iii) avoid sudden changes in contact forces. A sample experiment shows the effectiveness of the proposed system.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>001D02C03</s0>
</fC02>
<fC03 i1="01" i2="X" l="FRE">
<s0>Multimédia</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG">
<s0>Multimedia</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA">
<s0>Multimedia</s0>
<s5>01</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE">
<s0>Signal vidéo</s0>
<s5>06</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG">
<s0>Video signal</s0>
<s5>06</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA">
<s0>Señal video</s0>
<s5>06</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE">
<s0>Vision ordinateur</s0>
<s5>07</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG">
<s0>Computer vision</s0>
<s5>07</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA">
<s0>Visión ordenador</s0>
<s5>07</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE">
<s0>Réalité virtuelle</s0>
<s5>08</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG">
<s0>Virtual reality</s0>
<s5>08</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA">
<s0>Realidad virtual</s0>
<s5>08</s5>
</fC03>
<fC03 i1="05" i2="3" l="FRE">
<s0>Acoustique audio</s0>
<s5>09</s5>
</fC03>
<fC03 i1="05" i2="3" l="ENG">
<s0>Audio acoustics</s0>
<s5>09</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE">
<s0>Reproduction son</s0>
<s5>10</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG">
<s0>Sound reproduction</s0>
<s5>10</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA">
<s0>Reproducción sonido</s0>
<s5>10</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE">
<s0>Image couleur</s0>
<s5>11</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG">
<s0>Color image</s0>
<s5>11</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA">
<s0>Imagen color</s0>
<s5>11</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE">
<s0>Animation par ordinateur</s0>
<s5>12</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG">
<s0>Computer animation</s0>
<s5>12</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA">
<s0>Animación por computador</s0>
<s5>12</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE">
<s0>Infographie</s0>
<s5>13</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG">
<s0>Computer graphics</s0>
<s5>13</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA">
<s0>Gráfico computadora</s0>
<s5>13</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE">
<s0>Rendu image</s0>
<s5>14</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG">
<s0>Image rendering</s0>
<s5>14</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA">
<s0>Restitucíon imagen</s0>
<s5>14</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE">
<s0>Antémémoire</s0>
<s5>15</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG">
<s0>Cache memory</s0>
<s5>15</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA">
<s0>Antememoria</s0>
<s5>15</s5>
</fC03>
<fC03 i1="12" i2="X" l="FRE">
<s0>Serveur informatique</s0>
<s5>16</s5>
</fC03>
<fC03 i1="12" i2="X" l="ENG">
<s0>Computer server</s0>
<s5>16</s5>
</fC03>
<fC03 i1="12" i2="X" l="SPA">
<s0>Servidor informático</s0>
<s5>16</s5>
</fC03>
<fC03 i1="13" i2="X" l="FRE">
<s0>Mise à jour</s0>
<s5>17</s5>
</fC03>
<fC03 i1="13" i2="X" l="ENG">
<s0>Updating</s0>
<s5>17</s5>
</fC03>
<fC03 i1="13" i2="X" l="SPA">
<s0>Actualización</s0>
<s5>17</s5>
</fC03>
<fC03 i1="14" i2="X" l="FRE">
<s0>Sensibilité tactile</s0>
<s5>18</s5>
</fC03>
<fC03 i1="14" i2="X" l="ENG">
<s0>Tactile sensitivity</s0>
<s5>18</s5>
</fC03>
<fC03 i1="14" i2="X" l="SPA">
<s0>Sensibilidad tactil</s0>
<s5>18</s5>
</fC03>
<fC03 i1="15" i2="X" l="FRE">
<s0>Illusion</s0>
<s5>19</s5>
</fC03>
<fC03 i1="15" i2="X" l="ENG">
<s0>Illusion</s0>
<s5>19</s5>
</fC03>
<fC03 i1="15" i2="X" l="SPA">
<s0>Ilusión</s0>
<s5>19</s5>
</fC03>
<fC03 i1="16" i2="X" l="FRE">
<s0>Sensation</s0>
<s5>20</s5>
</fC03>
<fC03 i1="16" i2="X" l="ENG">
<s0>Sensation</s0>
<s5>20</s5>
</fC03>
<fC03 i1="16" i2="X" l="SPA">
<s0>Sensación</s0>
<s5>20</s5>
</fC03>
<fC03 i1="17" i2="X" l="FRE">
<s0>Appareil visuel</s0>
<s5>21</s5>
</fC03>
<fC03 i1="17" i2="X" l="ENG">
<s0>Visual system</s0>
<s5>21</s5>
</fC03>
<fC03 i1="17" i2="X" l="SPA">
<s0>Aparato visual</s0>
<s5>21</s5>
</fC03>
<fC03 i1="18" i2="X" l="FRE">
<s0>Système paramètre variable</s0>
<s5>23</s5>
</fC03>
<fC03 i1="18" i2="X" l="ENG">
<s0>Time varying system</s0>
<s5>23</s5>
</fC03>
<fC03 i1="18" i2="X" l="SPA">
<s0>Sistema parámetro variable</s0>
<s5>23</s5>
</fC03>
<fC03 i1="19" i2="X" l="FRE">
<s0>.</s0>
<s4>INC</s4>
<s5>82</s5>
</fC03>
<fC03 i1="20" i2="X" l="FRE">
<s0>Transmission en continu</s0>
<s4>CD</s4>
<s5>96</s5>
</fC03>
<fC03 i1="20" i2="X" l="ENG">
<s0>Streaming</s0>
<s4>CD</s4>
<s5>96</s5>
</fC03>
<fC03 i1="20" i2="X" l="SPA">
<s0>Transmisión continua</s0>
<s4>CD</s4>
<s5>96</s5>
</fC03>
<fN21>
<s1>023</s1>
</fN21>
<fN44 i1="01">
<s1>OTO</s1>
</fN44>
<fN82>
<s1>OTO</s1>
</fN82>
</pA>
<pR>
<fA30 i1="01" i2="1" l="ENG">
<s1>Pacific Rim Conference on Multimedia</s1>
<s2>6</s2>
<s3>KOR</s3>
<s4>2005-11-13</s4>
</fA30>
</pR>
</standard>
</inist>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Curation
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000685 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Curation/biblio.hfd -nk 000685 | SxmlIndent | more

To link to this page from the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PascalFrancis
   |étape=   Curation
   |type=    RBID
   |clé=     Pascal:06-0051619
   |texte=   Haptic interaction with depth video media
}}

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024