Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Haptic interaction with depth video media

Internal identifier: 000B26 (PascalFrancis/Checkpoint); previous: 000B25; next: 000B27

Authors: Jongeun Cha [South Korea]; Seung-Man Kim [South Korea]; Ian Oakley [South Korea]; Jeha Ryu [South Korea]; Kwan H. Lee [South Korea]

Source: Lecture notes in computer science, 2005 (ISSN 0302-9743)

RBID: Pascal:06-0051619

French descriptors

English descriptors

Abstract

In this paper we propose a touch-enabled video player system. A conventional video player only allows viewers to passively experience visual and audio media. In virtual environments, touch or haptic interaction has been shown to convey a powerful illusion of the tangible nature - the reality - of the displayed environments, and we feel the same benefits may be conferred on the broadcast viewing domain. To this end, this paper describes a system that uses a video representation based on depth images to add a haptic component to an audio-visual stream. We generate this stream by combining a regular RGB image with a synchronized depth image composed of per-pixel depth-from-camera information. The depth video, a unified stream of the color and depth images, can be synthesized from a computer graphics animation by rendering with commercial packages, or captured from a real environment using an active depth camera such as the Zcam™. To provide a haptic representation of this data, we propose a modified proxy graph algorithm for depth video streams. The modified proxy graph algorithm can (i) detect collisions between a moving virtual proxy and time-varying video scenes, (ii) generate a smooth touch sensation by handling the implications of the radically different display update rates required by visual (30 Hz) and haptic (on the order of 1000 Hz) systems, and (iii) avoid sudden changes in contact forces. A sample experiment shows the effectiveness of the proposed system.
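The three numbered requirements map naturally onto a per-tick haptic update. The sketch below (Python with NumPy) is a minimal illustration, not the authors' modified proxy graph algorithm, which constrains a proxy point to the surface rather than applying a raw penetration spring. It shows how a roughly 1 kHz loop can bilinearly sample a per-pixel depth image, blend consecutive 30 Hz frames in time, and rate-limit the output force; all function names and constants here are hypothetical.

import numpy as np

# Illustrative constants; the 30 Hz / ~1000 Hz rates come from the abstract.
HAPTIC_RATE = 1000.0                      # haptic updates per second
VIDEO_RATE = 30.0                         # depth-video frames per second
STEPS_PER_FRAME = int(HAPTIC_RATE // VIDEO_RATE)
STIFFNESS = 300.0                         # N/m, hypothetical spring constant
MAX_FORCE_STEP = 0.5                      # N per tick; rate-limits force changes

def depth_at(frame, x, y):
    """Bilinearly sample a per-pixel depth image at a continuous (x, y)."""
    h, w = frame.shape
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * frame[y0, x0] + fx * frame[y0, x1]
    bot = (1 - fx) * frame[y1, x0] + fx * frame[y1, x1]
    return (1 - fy) * top + fy * bot

def haptic_step(prev_frame, next_frame, alpha, probe, prev_force):
    """One ~1 kHz update against a 30 Hz depth stream.

    Blending the two most recent depth frames in time (alpha in [0, 1])
    hides the 30 Hz frame steps from the fast loop (goal ii); rate-limiting
    the output force avoids sudden contact-force jumps when a new frame
    arrives with very different depths (goal iii)."""
    x, y, z = probe                        # z: distance from the camera
    surface_z = ((1 - alpha) * depth_at(prev_frame, x, y)
                 + alpha * depth_at(next_frame, x, y))
    penetration = z - surface_z            # > 0: probe behind the surface (goal i)
    target = np.array([0.0, 0.0, -STIFFNESS * max(penetration, 0.0)])
    delta = target - prev_force
    n = np.linalg.norm(delta)
    if n > MAX_FORCE_STEP:                 # clamp the per-tick force change
        delta *= MAX_FORCE_STEP / n
    return prev_force + delta

# Usage: run the fast loop across one video-frame interval.
prev_f = np.full((480, 640), 1.00)         # flat surface 1 m from the camera
next_f = np.full((480, 640), 0.90)         # the surface has moved 10 cm closer
force = np.zeros(3)
for k in range(STEPS_PER_FRAME):
    force = haptic_step(prev_f, next_f, k / STEPS_PER_FRAME,
                        (320.0, 240.0, 0.95), force)

The per-tick force clamp is a crude stand-in for the smoothing the abstract promises; a full player would also re-validate the proxy position against each newly arrived frame rather than relying on the spring model alone.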


Affiliations: South Korea


Links toward previous steps (curation, corpus...)


Links to Exploration step

Pascal:06-0051619

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">Haptic interaction with depth video media</title>
<author>
<name sortKey="Cha, Jongeun" sort="Cha, Jongeun" uniqKey="Cha J" first="Jongeun" last="Cha">Jongeun Cha</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Human-Machine-Computer Interface Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong</s1>
<s2>Buk-gu, Gwangju 500-712</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
<wicri:noRegion>Buk-gu, Gwangju 500-712</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Kim, Seung Man" sort="Kim, Seung Man" uniqKey="Kim S" first="Seung-Man" last="Kim">Seung-Man Kim</name>
<affiliation wicri:level="1">
<inist:fA14 i1="02">
<s1>Intelligent Design & Graphics Lab., Dept. of Mechatronics</s1>
<s3>KOR</s3>
<sZ>2 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
<wicri:noRegion>Intelligent Design & Graphics Lab., Dept. of Mechatronics</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Oakley, Ian" sort="Oakley, Ian" uniqKey="Oakley I" first="Ian" last="Oakley">Ian Oakley</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Human-Machine-Computer Interface Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong</s1>
<s2>Buk-gu, Gwangju 500-712</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
<wicri:noRegion>Buk-gu, Gwangju 500-712</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Ryu, Jeha" sort="Ryu, Jeha" uniqKey="Ryu J" first="Jeha" last="Ryu">Jeha Ryu</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Human-Machine-Computer Interface Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong</s1>
<s2>Buk-gu, Gwangju 500-712</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
<wicri:noRegion>Buk-gu, Gwangju 500-712</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Lee, Kwan H" sort="Lee, Kwan H" uniqKey="Lee K" first="Kwan H." last="Lee">Kwan H. Lee</name>
<affiliation wicri:level="1">
<inist:fA14 i1="02">
<s1>Intelligent Design & Graphics Lab., Dept. of Mechatronics</s1>
<s3>KOR</s3>
<sZ>2 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
<wicri:noRegion>Intelligent Design & Graphics Lab., Dept. of Mechatronics</wicri:noRegion>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">06-0051619</idno>
<date when="2005">2005</date>
<idno type="stanalyst">PASCAL 06-0051619 INIST</idno>
<idno type="RBID">Pascal:06-0051619</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000E21</idno>
<idno type="wicri:Area/PascalFrancis/Curation">000685</idno>
<idno type="wicri:Area/PascalFrancis/Checkpoint">000B26</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">Haptic interaction with depth video media</title>
<author>
<name sortKey="Cha, Jongeun" sort="Cha, Jongeun" uniqKey="Cha J" first="Jongeun" last="Cha">Jongeun Cha</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Human-Machine-Computer Interface Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong</s1>
<s2>Buk-gu, Gwangju 500-712</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
<wicri:noRegion>Buk-gu, Gwangju 500-712</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Kim, Seung Man" sort="Kim, Seung Man" uniqKey="Kim S" first="Seung-Man" last="Kim">Seung-Man Kim</name>
<affiliation wicri:level="1">
<inist:fA14 i1="02">
<s1>Intelligent Design & Graphics Lab., Dept. of Mechatronics</s1>
<s3>KOR</s3>
<sZ>2 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
<wicri:noRegion>Intelligent Design & Graphics Lab., Dept. of Mechatronics</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Oakley, Ian" sort="Oakley, Ian" uniqKey="Oakley I" first="Ian" last="Oakley">Ian Oakley</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Human-Machine-Computer Interface Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong</s1>
<s2>Buk-gu, Gwangju 500-712</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
<wicri:noRegion>Buk-gu, Gwangju 500-712</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Ryu, Jeha" sort="Ryu, Jeha" uniqKey="Ryu J" first="Jeha" last="Ryu">Jeha Ryu</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Human-Machine-Computer Interface Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong</s1>
<s2>Buk-gu, Gwangju 500-712</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
<wicri:noRegion>Buk-gu, Gwangju 500-712</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Lee, Kwan H" sort="Lee, Kwan H" uniqKey="Lee K" first="Kwan H." last="Lee">Kwan H. Lee</name>
<affiliation wicri:level="1">
<inist:fA14 i1="02">
<s1>Intelligent Design & Graphics Lab., Dept. of Mechatronics</s1>
<s3>KOR</s3>
<sZ>2 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
<country>Corée du Sud</country>
<wicri:noRegion>Intelligent Design & Graphics Lab., Dept. of Mechatronics</wicri:noRegion>
</affiliation>
</author>
</analytic>
<series>
<title level="j" type="main">Lecture notes in computer science</title>
<idno type="ISSN">0302-9743</idno>
<imprint>
<date when="2005">2005</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt>
<title level="j" type="main">Lecture notes in computer science</title>
<idno type="ISSN">0302-9743</idno>
</seriesStmt>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Audio acoustics</term>
<term>Cache memory</term>
<term>Color image</term>
<term>Computer animation</term>
<term>Computer graphics</term>
<term>Computer server</term>
<term>Computer vision</term>
<term>Illusion</term>
<term>Image rendering</term>
<term>Multimedia</term>
<term>Sensation</term>
<term>Sound reproduction</term>
<term>Streaming</term>
<term>Tactile sensitivity</term>
<term>Time varying system</term>
<term>Updating</term>
<term>Video signal</term>
<term>Virtual reality</term>
<term>Visual system</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Multimédia</term>
<term>Signal vidéo</term>
<term>Vision ordinateur</term>
<term>Réalité virtuelle</term>
<term>Acoustique audio</term>
<term>Reproduction son</term>
<term>Image couleur</term>
<term>Animation par ordinateur</term>
<term>Infographie</term>
<term>Rendu image</term>
<term>Antémémoire</term>
<term>Serveur informatique</term>
<term>Mise à jour</term>
<term>Sensibilité tactile</term>
<term>Illusion</term>
<term>Sensation</term>
<term>Appareil visuel</term>
<term>Système paramètre variable</term>
<term>.</term>
<term>Transmission en continu</term>
</keywords>
<keywords scheme="Wicri" type="topic" xml:lang="fr">
<term>Multimédia</term>
<term>Réalité virtuelle</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">In this paper we propose a touch enabled video player system. A conventional video player only allows viewers to passively experience visual and audio media. In virtual environment, touch or haptic interaction has been shown to convey a powerful illusion of the tangible nature - the reality - of the displayed environments and we feel the same benefits may be conferred to a broadcast, viewing domain. To this end, this paper describes a system that uses a video representation based on depth images to add a haptic component to an audio-visual stream. We generate this stream through the combination of a regular RGB image and a synchronized depth image composed of per-pixel depth-from-camera information. The depth video, a unified stream of the color and depth images, can be synthesized from a computer graphics animation by rendering with commercial packages or captured from a real environment by using a active depth camera such as the Zcam<
<sup>TM</sup>
>. In order to provide a haptic representation of this data, we propose a modified proxy graph algorithm for depth video streams. The modified proxy graph algorithm can (i) detect collisions between a moving virtual proxy and time-varying video scenes, (ii) generates smooth touch sensation by handling the implications of the radically different display update rates required by visual (30Hz) and haptic systems (in the order of 1000Hz), (iii) avoid sudden change of contact forces. A sample experiment shows the effectiveness of the proposed system.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA01 i1="01" i2="1">
<s0>0302-9743</s0>
</fA01>
<fA05>
<s2>3767</s2>
</fA05>
<fA08 i1="01" i2="1" l="ENG">
<s1>Haptic interaction with depth video media</s1>
</fA08>
<fA09 i1="01" i2="1" l="ENG">
<s1>Advances in multimedia information processing : PCM 2005 : 6th Pacific Rim Conference on Multimedia, Jeju Island, Korea, November 13-16, 2005 : proceedings</s1>
</fA09>
<fA11 i1="01" i2="1">
<s1>CHA (Jongeun)</s1>
</fA11>
<fA11 i1="02" i2="1">
<s1>KIM (Seung-Man)</s1>
</fA11>
<fA11 i1="03" i2="1">
<s1>OAKLEY (Ian)</s1>
</fA11>
<fA11 i1="04" i2="1">
<s1>RYU (Jeha)</s1>
</fA11>
<fA11 i1="05" i2="1">
<s1>LEE (Kwan H.)</s1>
</fA11>
<fA14 i1="01">
<s1>Human-Machine-Computer Interface Lab., Dept. of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong</s1>
<s2>Buk-gu, Gwangju 500-712</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
</fA14>
<fA14 i1="02">
<s1>Intelligent Design & Graphics Lab., Dept. of Mechatronics</s1>
<s3>KOR</s3>
<sZ>2 aut.</sZ>
<sZ>5 aut.</sZ>
</fA14>
<fA20>
<s2>Part I, 420-430</s2>
</fA20>
<fA21>
<s1>2005</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA26 i1="01">
<s0>3-540-30027-9</s0>
</fA26>
<fA43 i1="01">
<s1>INIST</s1>
<s2>16343</s2>
<s5>354000138665540370</s5>
</fA43>
<fA44>
<s0>0000</s0>
<s1>© 2006 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45>
<s0>14 ref.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>06-0051619</s0>
</fA47>
<fA60>
<s1>P</s1>
<s2>C</s2>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA64 i1="01" i2="1">
<s0>Lecture notes in computer science</s0>
</fA64>
<fA66 i1="01">
<s0>DEU</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>In this paper we propose a touch enabled video player system. A conventional video player only allows viewers to passively experience visual and audio media. In virtual environment, touch or haptic interaction has been shown to convey a powerful illusion of the tangible nature - the reality - of the displayed environments and we feel the same benefits may be conferred to a broadcast, viewing domain. To this end, this paper describes a system that uses a video representation based on depth images to add a haptic component to an audio-visual stream. We generate this stream through the combination of a regular RGB image and a synchronized depth image composed of per-pixel depth-from-camera information. The depth video, a unified stream of the color and depth images, can be synthesized from a computer graphics animation by rendering with commercial packages or captured from a real environment by using a active depth camera such as the Zcam<sup>TM</sup>. In order to provide a haptic representation of this data, we propose a modified proxy graph algorithm for depth video streams. The modified proxy graph algorithm can (i) detect collisions between a moving virtual proxy and time-varying video scenes, (ii) generates smooth touch sensation by handling the implications of the radically different display update rates required by visual (30Hz) and haptic systems (in the order of 1000Hz), (iii) avoid sudden change of contact forces. A sample experiment shows the effectiveness of the proposed system.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>001D02C03</s0>
</fC02>
<fC03 i1="01" i2="X" l="FRE">
<s0>Multimédia</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG">
<s0>Multimedia</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA">
<s0>Multimedia</s0>
<s5>01</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE">
<s0>Signal vidéo</s0>
<s5>06</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG">
<s0>Video signal</s0>
<s5>06</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA">
<s0>Señal video</s0>
<s5>06</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE">
<s0>Vision ordinateur</s0>
<s5>07</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG">
<s0>Computer vision</s0>
<s5>07</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA">
<s0>Visión ordenador</s0>
<s5>07</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE">
<s0>Réalité virtuelle</s0>
<s5>08</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG">
<s0>Virtual reality</s0>
<s5>08</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA">
<s0>Realidad virtual</s0>
<s5>08</s5>
</fC03>
<fC03 i1="05" i2="3" l="FRE">
<s0>Acoustique audio</s0>
<s5>09</s5>
</fC03>
<fC03 i1="05" i2="3" l="ENG">
<s0>Audio acoustics</s0>
<s5>09</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE">
<s0>Reproduction son</s0>
<s5>10</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG">
<s0>Sound reproduction</s0>
<s5>10</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA">
<s0>Reproducción sonido</s0>
<s5>10</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE">
<s0>Image couleur</s0>
<s5>11</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG">
<s0>Color image</s0>
<s5>11</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA">
<s0>Imagen color</s0>
<s5>11</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE">
<s0>Animation par ordinateur</s0>
<s5>12</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG">
<s0>Computer animation</s0>
<s5>12</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA">
<s0>Animación por computador</s0>
<s5>12</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE">
<s0>Infographie</s0>
<s5>13</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG">
<s0>Computer graphics</s0>
<s5>13</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA">
<s0>Gráfico computadora</s0>
<s5>13</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE">
<s0>Rendu image</s0>
<s5>14</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG">
<s0>Image rendering</s0>
<s5>14</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA">
<s0>Restitucíon imagen</s0>
<s5>14</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE">
<s0>Antémémoire</s0>
<s5>15</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG">
<s0>Cache memory</s0>
<s5>15</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA">
<s0>Antememoria</s0>
<s5>15</s5>
</fC03>
<fC03 i1="12" i2="X" l="FRE">
<s0>Serveur informatique</s0>
<s5>16</s5>
</fC03>
<fC03 i1="12" i2="X" l="ENG">
<s0>Computer server</s0>
<s5>16</s5>
</fC03>
<fC03 i1="12" i2="X" l="SPA">
<s0>Servidor informático</s0>
<s5>16</s5>
</fC03>
<fC03 i1="13" i2="X" l="FRE">
<s0>Mise à jour</s0>
<s5>17</s5>
</fC03>
<fC03 i1="13" i2="X" l="ENG">
<s0>Updating</s0>
<s5>17</s5>
</fC03>
<fC03 i1="13" i2="X" l="SPA">
<s0>Actualización</s0>
<s5>17</s5>
</fC03>
<fC03 i1="14" i2="X" l="FRE">
<s0>Sensibilité tactile</s0>
<s5>18</s5>
</fC03>
<fC03 i1="14" i2="X" l="ENG">
<s0>Tactile sensitivity</s0>
<s5>18</s5>
</fC03>
<fC03 i1="14" i2="X" l="SPA">
<s0>Sensibilidad tactil</s0>
<s5>18</s5>
</fC03>
<fC03 i1="15" i2="X" l="FRE">
<s0>Illusion</s0>
<s5>19</s5>
</fC03>
<fC03 i1="15" i2="X" l="ENG">
<s0>Illusion</s0>
<s5>19</s5>
</fC03>
<fC03 i1="15" i2="X" l="SPA">
<s0>Ilusión</s0>
<s5>19</s5>
</fC03>
<fC03 i1="16" i2="X" l="FRE">
<s0>Sensation</s0>
<s5>20</s5>
</fC03>
<fC03 i1="16" i2="X" l="ENG">
<s0>Sensation</s0>
<s5>20</s5>
</fC03>
<fC03 i1="16" i2="X" l="SPA">
<s0>Sensación</s0>
<s5>20</s5>
</fC03>
<fC03 i1="17" i2="X" l="FRE">
<s0>Appareil visuel</s0>
<s5>21</s5>
</fC03>
<fC03 i1="17" i2="X" l="ENG">
<s0>Visual system</s0>
<s5>21</s5>
</fC03>
<fC03 i1="17" i2="X" l="SPA">
<s0>Aparato visual</s0>
<s5>21</s5>
</fC03>
<fC03 i1="18" i2="X" l="FRE">
<s0>Système paramètre variable</s0>
<s5>23</s5>
</fC03>
<fC03 i1="18" i2="X" l="ENG">
<s0>Time varying system</s0>
<s5>23</s5>
</fC03>
<fC03 i1="18" i2="X" l="SPA">
<s0>Sistema parámetro variable</s0>
<s5>23</s5>
</fC03>
<fC03 i1="19" i2="X" l="FRE">
<s0>.</s0>
<s4>INC</s4>
<s5>82</s5>
</fC03>
<fC03 i1="20" i2="X" l="FRE">
<s0>Transmission en continu</s0>
<s4>CD</s4>
<s5>96</s5>
</fC03>
<fC03 i1="20" i2="X" l="ENG">
<s0>Streaming</s0>
<s4>CD</s4>
<s5>96</s5>
</fC03>
<fC03 i1="20" i2="X" l="SPA">
<s0>Transmisión continua</s0>
<s4>CD</s4>
<s5>96</s5>
</fC03>
<fN21>
<s1>023</s1>
</fN21>
<fN44 i1="01">
<s1>OTO</s1>
</fN44>
<fN82>
<s1>OTO</s1>
</fN82>
</pA>
<pR>
<fA30 i1="01" i2="1" l="ENG">
<s1>Pacific Rim Conference on Multimedia</s1>
<s2>6</s2>
<s3>KOR</s3>
<s4>2005-11-13</s4>
</fA30>
</pR>
</standard>
</inist>
<affiliations>
<list>
<country>
<li>Corée du Sud</li>
</country>
</list>
<tree>
<country name="Corée du Sud">
<noRegion>
<name sortKey="Cha, Jongeun" sort="Cha, Jongeun" uniqKey="Cha J" first="Jongeun" last="Cha">Jongeun Cha</name>
</noRegion>
<name sortKey="Kim, Seung Man" sort="Kim, Seung Man" uniqKey="Kim S" first="Seung-Man" last="Kim">Seung-Man Kim</name>
<name sortKey="Lee, Kwan H" sort="Lee, Kwan H" uniqKey="Lee K" first="Kwan H." last="Lee">Kwan H. Lee</name>
<name sortKey="Oakley, Ian" sort="Oakley, Ian" uniqKey="Oakley I" first="Ian" last="Oakley">Ian Oakley</name>
<name sortKey="Ryu, Jeha" sort="Ryu, Jeha" uniqKey="Ryu J" first="Jeha" last="Ryu">Jeha Ryu</name>
</country>
</tree>
</affiliations>
</record>

To process this document under Unix (Dilib)

# Point EXPLOR_STEP at the Checkpoint step of the HapticV1 exploration area,
# then select record 000B26 from the biblio.hfd index and pretty-print its XML.
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Checkpoint
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000B26 | SxmlIndent | more

Or

# Equivalent, addressed from the exploration-area root:
HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Checkpoint/biblio.hfd -nk 000B26 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PascalFrancis
   |étape=   Checkpoint
   |type=    RBID
   |clé=     Pascal:06-0051619
   |texte=   Haptic interaction with depth video media
}}

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024