Exploration server for haptic devices


A novel test-bed for immersive and interactive broadcasting production using augmented reality and haptics

Internal identifier: 000D67 (PascalFrancis/Corpus); previous: 000D66; next: 000D68


Authors: Seungjun Kim; Jongeun Cha; Jongphil Kim; Jeha Ryu; Seongeun Eom; Nitaigour P. Mahalik; Byungha Ahn

Source: IEICE transactions on information and systems; ISSN 0916-8532; Japan; 2006; Vol. 89; No. 1; Pp. 106-110; Bibl. 17 ref.

RBID : Pascal:06-0226095


Abstract

In this paper, we demonstrate an immersive and interactive broadcasting production system with a new haptically enhanced multimedia broadcasting chain. The system adapts Augmented Reality (AR) techniques, which merges captured videos and virtual 3D media seamlessly through multimedia streaming technology, and haptic interaction technology in near real-time. In this system, viewers at the haptic multimedia client can interact with AR broadcasting production transmitted via communication network. We demonstrate two test applications, which show that the addition of AR- and haptic-interaction to the conventional audio-visual contents can improve immersiveness and interactivity of viewers with rich contents service.

Record in standard format (ISO 2709)

See the documentation on the Inist Standard format.

pA  
A01 01  1    @0 0916-8532
A03   1    @0 IEICE trans. inf. syst.
A05       @2 89
A06       @2 1
A08 01  1  ENG  @1 A novel test-bed for immersive and interactive broadcasting production using augmented reality and haptics
A11 01  1    @1 KIM (Seungjun)
A11 02  1    @1 CHA (Jongeun)
A11 03  1    @1 KIM (Jongphil)
A11 04  1    @1 RYU (Jeha)
A11 05  1    @1 EOM (Seongeun)
A11 06  1    @1 MAHALIK (Nitaigour P.)
A11 07  1    @1 AHN (Byungha)
A14 01      @1 Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong @2 Buk-gu, Gwangju @3 KOR @Z 1 aut. @Z 2 aut. @Z 3 aut. @Z 4 aut. @Z 5 aut. @Z 6 aut. @Z 7 aut.
A20       @1 106-110
A21       @1 2006
A23 01      @0 ENG
A43 01      @1 INIST @2 7315E4 @5 354000135706510120
A44       @0 0000 @1 © 2006 INIST-CNRS. All rights reserved.
A45       @0 17 ref.
A47 01  1    @0 06-0226095
A60       @1 P @2 C @3 CR
A61       @0 A
A64 01  1    @0 IEICE transactions on information and systems
A66 01      @0 JPN
C01 01    ENG  @0 In this paper, we demonstrate an immersive and interactive broadcasting production system with a new haptically enhanced multimedia broadcasting chain. The system adapts Augmented Reality (AR) techniques, which merges captured videos and virtual 3D media seamlessly through multimedia streaming technology, and haptic interaction technology in near real-time. In this system, viewers at the haptic multimedia client can interact with AR broadcasting production transmitted via communication network. We demonstrate two test applications, which show that the addition of AR- and haptic-interaction to the conventional audio-visual contents can improve immersiveness and interactivity of viewers with rich contents service.
C02 01  X    @0 001D04A05D
C03 01  3  FRE  @0 Système multimédia @5 01
C03 01  3  ENG  @0 Multimedia systems @5 01
C03 02  X  FRE  @0 Studio radiodiffusion @5 02
C03 02  X  ENG  @0 Broadcast studio @5 02
C03 02  X  SPA  @0 Estudio radiodifusión @5 02
C03 03  3  FRE  @0 Vidéo interactive @5 03
C03 03  3  ENG  @0 Interactive video @5 03
C03 04  X  FRE  @0 Réalité augmentée @5 04
C03 04  X  ENG  @0 Augmented reality @5 04
C03 04  X  SPA  @0 Realidad aumentada @5 04
C03 05  3  FRE  @0 Interface haptique @5 05
C03 05  3  ENG  @0 Haptic interfaces @5 05
N21       @1 142
N44 01      @1 PSI
N82       @1 PSI
pR  
A30 01  1  ENG  @1 International Conference on Artificial Reality and Telexistence @2 14 @3 Seoul KOR @4 2004-12-02

Inist format (server)

NO : PASCAL 06-0226095 INIST
ET : A novel test-bed for immersive and interactive broadcasting production using augmented reality and haptics
AU : KIM (Seungjun); CHA (Jongeun); KIM (Jongphil); RYU (Jeha); EOM (Seongeun); MAHALIK (Nitaigour P.); AHN (Byungha)
AF : Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong/Buk-gu, Gwangju/Corée, République de (1 aut., 2 aut., 3 aut., 4 aut., 5 aut., 6 aut., 7 aut.)
DT : Publication en série; Congrès; Correspondance, lettre; Niveau analytique
SO : IEICE transactions on information and systems; ISSN 0916-8532; Japon; Da. 2006; Vol. 89; No. 1; Pp. 106-110; Bibl. 17 ref.
LA : Anglais
EA : In this paper, we demonstrate an immersive and interactive broadcasting production system with a new haptically enhanced multimedia broadcasting chain. The system adapts Augmented Reality (AR) techniques, which merges captured videos and virtual 3D media seamlessly through multimedia streaming technology, and haptic interaction technology in near real-time. In this system, viewers at the haptic multimedia client can interact with AR broadcasting production transmitted via communication network. We demonstrate two test applications, which show that the addition of AR- and haptic-interaction to the conventional audio-visual contents can improve immersiveness and interactivity of viewers with rich contents service.
CC : 001D04A05D
FD : Système multimédia; Studio radiodiffusion; Vidéo interactive; Réalité augmentée; Interface haptique
ED : Multimedia systems; Broadcast studio; Interactive video; Augmented reality; Haptic interfaces
SD : Estudio radiodifusión; Realidad aumentada
LO : INIST-7315E4.354000135706510120
ID : 06-0226095

Links to Exploration step

Pascal:06-0226095

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">A novel test-bed for immersive and interactive broadcasting production using augmented reality and haptics</title>
<author>
<name sortKey="Kim, Seungjun" sort="Kim, Seungjun" uniqKey="Kim S" first="Seungjun" last="Kim">Seungjun Kim</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong</s1>
<s2>Buk-gu, Gwangju</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Cha, Jongeun" sort="Cha, Jongeun" uniqKey="Cha J" first="Jongeun" last="Cha">Jongeun Cha</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong</s1>
<s2>Buk-gu, Gwangju</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Kim, Jongphil" sort="Kim, Jongphil" uniqKey="Kim J" first="Jongphil" last="Kim">Jongphil Kim</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong</s1>
<s2>Buk-gu, Gwangju</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Ryu, Jeha" sort="Ryu, Jeha" uniqKey="Ryu J" first="Jeha" last="Ryu">Jeha Ryu</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong</s1>
<s2>Buk-gu, Gwangju</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Eom, Seongeun" sort="Eom, Seongeun" uniqKey="Eom S" first="Seongeun" last="Eom">Seongeun Eom</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong</s1>
<s2>Buk-gu, Gwangju</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Mahalik, Nitaigour P" sort="Mahalik, Nitaigour P" uniqKey="Mahalik N" first="Nitaigour P." last="Mahalik">Nitaigour P. Mahalik</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong</s1>
<s2>Buk-gu, Gwangju</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Ahn, Byungha" sort="Ahn, Byungha" uniqKey="Ahn B" first="Byungha" last="Ahn">Byungha Ahn</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong</s1>
<s2>Buk-gu, Gwangju</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">06-0226095</idno>
<date when="2006">2006</date>
<idno type="stanalyst">PASCAL 06-0226095 INIST</idno>
<idno type="RBID">Pascal:06-0226095</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000D67</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">A novel test-bed for immersive and interactive broadcasting production using augmented reality and haptics</title>
<author>
<name sortKey="Kim, Seungjun" sort="Kim, Seungjun" uniqKey="Kim S" first="Seungjun" last="Kim">Seungjun Kim</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong</s1>
<s2>Buk-gu, Gwangju</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Cha, Jongeun" sort="Cha, Jongeun" uniqKey="Cha J" first="Jongeun" last="Cha">Jongeun Cha</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong</s1>
<s2>Buk-gu, Gwangju</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Kim, Jongphil" sort="Kim, Jongphil" uniqKey="Kim J" first="Jongphil" last="Kim">Jongphil Kim</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong</s1>
<s2>Buk-gu, Gwangju</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Ryu, Jeha" sort="Ryu, Jeha" uniqKey="Ryu J" first="Jeha" last="Ryu">Jeha Ryu</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong</s1>
<s2>Buk-gu, Gwangju</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Eom, Seongeun" sort="Eom, Seongeun" uniqKey="Eom S" first="Seongeun" last="Eom">Seongeun Eom</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong</s1>
<s2>Buk-gu, Gwangju</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Mahalik, Nitaigour P" sort="Mahalik, Nitaigour P" uniqKey="Mahalik N" first="Nitaigour P." last="Mahalik">Nitaigour P. Mahalik</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong</s1>
<s2>Buk-gu, Gwangju</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Ahn, Byungha" sort="Ahn, Byungha" uniqKey="Ahn B" first="Byungha" last="Ahn">Byungha Ahn</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong</s1>
<s2>Buk-gu, Gwangju</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
<sZ>7 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</analytic>
<series>
<title level="j" type="main">IEICE transactions on information and systems</title>
<title level="j" type="abbreviated">IEICE trans. inf. syst.</title>
<idno type="ISSN">0916-8532</idno>
<imprint>
<date when="2006">2006</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt>
<title level="j" type="main">IEICE transactions on information and systems</title>
<title level="j" type="abbreviated">IEICE trans. inf. syst.</title>
<idno type="ISSN">0916-8532</idno>
</seriesStmt>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Augmented reality</term>
<term>Broadcast studio</term>
<term>Haptic interfaces</term>
<term>Interactive video</term>
<term>Multimedia systems</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Système multimédia</term>
<term>Studio radiodiffusion</term>
<term>Vidéo interactive</term>
<term>Réalité augmentée</term>
<term>Interface haptique</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">In this paper, we demonstrate an immersive and interactive broadcasting production system with a new haptically enhanced multimedia broadcasting chain. The system adapts Augmented Reality (AR) techniques, which merges captured videos and virtual 3D media seamlessly through multimedia streaming technology, and haptic interaction technology in near real-time. In this system, viewers at the haptic multimedia client can interact with AR broadcasting production transmitted via communication network. We demonstrate two test applications, which show that the addition of AR- and haptic-interaction to the conventional audio-visual contents can improve immersiveness and interactivity of viewers with rich contents service.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA01 i1="01" i2="1">
<s0>0916-8532</s0>
</fA01>
<fA03 i2="1">
<s0>IEICE trans. inf. syst.</s0>
</fA03>
<fA05>
<s2>89</s2>
</fA05>
<fA06>
<s2>1</s2>
</fA06>
<fA08 i1="01" i2="1" l="ENG">
<s1>A novel test-bed for immersive and interactive broadcasting production using augmented reality and haptics</s1>
</fA08>
<fA11 i1="01" i2="1">
<s1>KIM (Seungjun)</s1>
</fA11>
<fA11 i1="02" i2="1">
<s1>CHA (Jongeun)</s1>
</fA11>
<fA11 i1="03" i2="1">
<s1>KIM (Jongphil)</s1>
</fA11>
<fA11 i1="04" i2="1">
<s1>RYU (Jeha)</s1>
</fA11>
<fA11 i1="05" i2="1">
<s1>EOM (Seongeun)</s1>
</fA11>
<fA11 i1="06" i2="1">
<s1>MAHALIK (Nitaigour P.)</s1>
</fA11>
<fA11 i1="07" i2="1">
<s1>AHN (Byungha)</s1>
</fA11>
<fA14 i1="01">
<s1>Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong</s1>
<s2>Buk-gu, Gwangju</s2>
<s3>KOR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
<sZ>6 aut.</sZ>
<sZ>7 aut.</sZ>
</fA14>
<fA20>
<s1>106-110</s1>
</fA20>
<fA21>
<s1>2006</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA43 i1="01">
<s1>INIST</s1>
<s2>7315E4</s2>
<s5>354000135706510120</s5>
</fA43>
<fA44>
<s0>0000</s0>
<s1>© 2006 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45>
<s0>17 ref.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>06-0226095</s0>
</fA47>
<fA60>
<s1>P</s1>
<s2>C</s2>
<s3>CR</s3>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA64 i1="01" i2="1">
<s0>IEICE transactions on information and systems</s0>
</fA64>
<fA66 i1="01">
<s0>JPN</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>In this paper, we demonstrate an immersive and interactive broadcasting production system with a new haptically enhanced multimedia broadcasting chain. The system adapts Augmented Reality (AR) techniques, which merges captured videos and virtual 3D media seamlessly through multimedia streaming technology, and haptic interaction technology in near real-time. In this system, viewers at the haptic multimedia client can interact with AR broadcasting production transmitted via communication network. We demonstrate two test applications, which show that the addition of AR- and haptic-interaction to the conventional audio-visual contents can improve immersiveness and interactivity of viewers with rich contents service.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>001D04A05D</s0>
</fC02>
<fC03 i1="01" i2="3" l="FRE">
<s0>Système multimédia</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="3" l="ENG">
<s0>Multimedia systems</s0>
<s5>01</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE">
<s0>Studio radiodiffusion</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG">
<s0>Broadcast studio</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA">
<s0>Estudio radiodifusión</s0>
<s5>02</s5>
</fC03>
<fC03 i1="03" i2="3" l="FRE">
<s0>Vidéo interactive</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="3" l="ENG">
<s0>Interactive video</s0>
<s5>03</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE">
<s0>Réalité augmentée</s0>
<s5>04</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG">
<s0>Augmented reality</s0>
<s5>04</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA">
<s0>Realidad aumentada</s0>
<s5>04</s5>
</fC03>
<fC03 i1="05" i2="3" l="FRE">
<s0>Interface haptique</s0>
<s5>05</s5>
</fC03>
<fC03 i1="05" i2="3" l="ENG">
<s0>Haptic interfaces</s0>
<s5>05</s5>
</fC03>
<fN21>
<s1>142</s1>
</fN21>
<fN44 i1="01">
<s1>PSI</s1>
</fN44>
<fN82>
<s1>PSI</s1>
</fN82>
</pA>
<pR>
<fA30 i1="01" i2="1" l="ENG">
<s1>International Conference on Artificial Reality and Telexistence</s1>
<s2>14</s2>
<s3>Seoul KOR</s3>
<s4>2004-12-02</s4>
</fA30>
</pR>
</standard>
<server>
<NO>PASCAL 06-0226095 INIST</NO>
<ET>A novel test-bed for immersive and interactive broadcasting production using augmented reality and haptics</ET>
<AU>KIM (Seungjun); CHA (Jongeun); KIM (Jongphil); RYU (Jeha); EOM (Seongeun); MAHALIK (Nitaigour P.); AHN (Byungha)</AU>
<AF>Department of Mechatronics, Gwangju Institute of Science and Technology (GIST), 1 Oryoung- dong/Buk-gu, Gwangju/Corée, République de (1 aut., 2 aut., 3 aut., 4 aut., 5 aut., 6 aut., 7 aut.)</AF>
<DT>Publication en série; Congrès; Correspondance, lettre; Niveau analytique</DT>
<SO>IEICE transactions on information and systems; ISSN 0916-8532; Japon; Da. 2006; Vol. 89; No. 1; Pp. 106-110; Bibl. 17 ref.</SO>
<LA>Anglais</LA>
<EA>In this paper, we demonstrate an immersive and interactive broadcasting production system with a new haptically enhanced multimedia broadcasting chain. The system adapts Augmented Reality (AR) techniques, which merges captured videos and virtual 3D media seamlessly through multimedia streaming technology, and haptic interaction technology in near real-time. In this system, viewers at the haptic multimedia client can interact with AR broadcasting production transmitted via communication network. We demonstrate two test applications, which show that the addition of AR- and haptic-interaction to the conventional audio-visual contents can improve immersiveness and interactivity of viewers with rich contents service.</EA>
<CC>001D04A05D</CC>
<FD>Système multimédia; Studio radiodiffusion; Vidéo interactive; Réalité augmentée; Interface haptique</FD>
<ED>Multimedia systems; Broadcast studio; Interactive video; Augmented reality; Haptic interfaces</ED>
<SD>Estudio radiodifusión; Realidad aumentada</SD>
<LO>INIST-7315E4.354000135706510120</LO>
<ID>06-0226095</ID>
</server>
</inist>
</record>

To manipulate this document under Unix (Dilib)

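# Select record 000D67 from the corpus index (biblio.hfd) and pretty-print its XML: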
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000D67 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Corpus/biblio.hfd -nk 000D67 | SxmlIndent | more
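
The indented XML can then be filtered with standard Unix tools. A minimal sketch (assuming an ordinary grep, which is not part of Dilib) that lists the descriptor terms of this record:

# List the English and French descriptor terms (<term> elements) of record 000D67
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000D67 | SxmlIndent | grep '<term>'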

To put a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PascalFrancis
   |étape=   Corpus
   |type=    RBID
   |clé=     Pascal:06-0226095
   |texte=   A novel test-bed for immersive and interactive broadcasting production using augmented reality and haptics
}}

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024