Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Multi-modal Interaction in Collaborative Virtual Environments: Study and analysis of performance in collaborative work

Internal identifier: 000386 (Hal/Corpus); previous: 000385; next: 000387


Authors: Sehat Ullah

Source:

RBID: Hal:tel-00562081

French descriptors

English descriptors

Abstract

Recent advances in high-quality computer graphics and the ability of inexpensive computers to render realistic 3D scenes have made it possible to develop virtual environments in which two or more users can co-exist and work collaboratively towards a common goal. Such environments are called Collaborative Virtual Environments (CVEs). CVEs have many potential application domains, such as military, medical, assembly, computer-aided design, teleoperation, education, games and social networks. One of the problems related to CVEs is the user's low level of awareness of the status, actions and intentions of his/her collaborator, which not only reduces users' performance but also leads to unsatisfactory results. In addition, collaborative tasks performed without proper computer-generated assistance are very difficult and more prone to errors. The basic theme of this thesis is to provide assistance for collaborative 3D interaction in CVEs. In this context, we study and develop the concept of multimodal (audio, visual and haptic) assistance for a user or group of users. Our study focuses on how to assist users in collaboratively interacting with the entities of CVEs. We propose to study and analyze the contribution of multimodal assistance to collaborative (synchronous and asynchronous) interaction with objects in the virtual environment. To this end, we propose and implement various multimodal virtual guides. These guides are evaluated through a series of experiments in which a selection/manipulation task is carried out by users in both synchronous and asynchronous modes. The experiments were carried out at the LISA (Laboratoire d'Ingénierie et Systèmes Automatisés) lab at the University of Angers and the IBISC (Informatique, Biologie Intégrative et Systèmes Complexes) lab at the University of Evry. In these experiments, users were asked to perform a task under various conditions (with and without guides). Analysis was based on task completion time, errors and user learning. Questionnaires were used for subjective evaluations. The findings of this research can contribute to the development of collaborative systems for teleoperation, assembly tasks, e-learning, rehabilitation, computer-aided design and entertainment.

Url:

Links to Exploration step

Hal:tel-00562081

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Multi-modal Interaction in Collaborative Virtual Environments: Study and analysis of performance in collaborative work</title>
<title xml:lang="fr">Assistance multimodale pour l'interaction 3D collaborative : étude et analyse des performances pour le travail collaboratif</title>
<author>
<name sortKey="Ullah, Sehat" sort="Ullah, Sehat" uniqKey="Ullah S" first="Sehat" last="Ullah">Sehat Ullah</name>
<affiliation>
<hal:affiliation type="laboratory" xml:id="struct-143247" status="VALID">
<orgName>Informatique, Biologie Intégrative et Systèmes Complexes</orgName>
<orgName type="acronym">IBISC</orgName>
<desc>
<address>
<addrLine>40 rue du Pelvoux, Courcouronnes ; 91020 Evry Cedex</addrLine>
<country key="FR"></country>
</address>
<ref type="url">http://www.ibisc.univ-evry.fr/</ref>
</desc>
<listRelation>
<relation active="#struct-93105" type="direct"></relation>
</listRelation>
<tutelles>
<tutelle active="#struct-93105" type="direct">
<org type="institution" xml:id="struct-93105" status="VALID">
<orgName>Université d'Evry-Val d'Essonne</orgName>
<desc>
<address>
<addrLine>Boulevard François Mitterrand 91025 Evry Cedex</addrLine>
<country key="FR"></country>
</address>
<ref type="url">http://www.univ-evry.fr/fr/index.html</ref>
</desc>
</org>
</tutelle>
</tutelles>
</hal:affiliation>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">HAL</idno>
<idno type="RBID">Hal:tel-00562081</idno>
<idno type="halId">tel-00562081</idno>
<idno type="halUri">https://tel.archives-ouvertes.fr/tel-00562081</idno>
<idno type="url">https://tel.archives-ouvertes.fr/tel-00562081</idno>
<date when="2011-01-26">2011-01-26</date>
<idno type="wicri:Area/Hal/Corpus">000386</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Multi-modal Interaction in Collaborative Virtual Environments: Study and analysis of performance in collaborative work</title>
<title xml:lang="fr">Assistance multimodale pour l'interaction 3D collaborative : étude et analyse des performances pour le travail collaboratif</title>
<author>
<name sortKey="Ullah, Sehat" sort="Ullah, Sehat" uniqKey="Ullah S" first="Sehat" last="Ullah">Sehat Ullah</name>
<affiliation>
<hal:affiliation type="laboratory" xml:id="struct-143247" status="VALID">
<orgName>Informatique, Biologie Intégrative et Systèmes Complexes</orgName>
<orgName type="acronym">IBISC</orgName>
<desc>
<address>
<addrLine>40 rue du Pelvoux, Courcouronnes ; 91020 Evry Cedex</addrLine>
<country key="FR"></country>
</address>
<ref type="url">http://www.ibisc.univ-evry.fr/</ref>
</desc>
<listRelation>
<relation active="#struct-93105" type="direct"></relation>
</listRelation>
<tutelles>
<tutelle active="#struct-93105" type="direct">
<org type="institution" xml:id="struct-93105" status="VALID">
<orgName>Université d'Evry-Val d'Essonne</orgName>
<desc>
<address>
<addrLine>Boulevard François Mitterrand 91025 Evry Cedex</addrLine>
<country key="FR"></country>
</address>
<ref type="url">http://www.univ-evry.fr/fr/index.html</ref>
</desc>
</org>
</tutelle>
</tutelles>
</hal:affiliation>
</affiliation>
</author>
</analytic>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="mix" xml:lang="en">
<term>Collaborative Virtual Environments</term>
<term>Virtual Reality</term>
<term>aura set</term>
<term>haptic guides</term>
<term>human performance</term>
<term>multimodal guides</term>
<term>software architecture</term>
<term>task set</term>
</keywords>
<keywords scheme="mix" xml:lang="fr">
<term>Réalité virtuelle</term>
<term>architecture logicielle</term>
<term>assistance multimodale</term>
<term>environnement virtuel collaboratif</term>
<term>guides haptiques</term>
<term>guides virtuels</term>
<term>performance humaine</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">Recent advances in high-quality computer graphics and the ability of inexpensive computers to render realistic 3D scenes have made it possible to develop virtual environments in which two or more users can co-exist and work collaboratively towards a common goal. Such environments are called Collaborative Virtual Environments (CVEs). CVEs have many potential application domains, such as military, medical, assembly, computer-aided design, teleoperation, education, games and social networks. One of the problems related to CVEs is the user's low level of awareness of the status, actions and intentions of his/her collaborator, which not only reduces users' performance but also leads to unsatisfactory results. In addition, collaborative tasks performed without proper computer-generated assistance are very difficult and more prone to errors. The basic theme of this thesis is to provide assistance for collaborative 3D interaction in CVEs. In this context, we study and develop the concept of multimodal (audio, visual and haptic) assistance for a user or group of users. Our study focuses on how to assist users in collaboratively interacting with the entities of CVEs. We propose to study and analyze the contribution of multimodal assistance to collaborative (synchronous and asynchronous) interaction with objects in the virtual environment. To this end, we propose and implement various multimodal virtual guides. These guides are evaluated through a series of experiments in which a selection/manipulation task is carried out by users in both synchronous and asynchronous modes. The experiments were carried out at the LISA (Laboratoire d'Ingénierie et Systèmes Automatisés) lab at the University of Angers and the IBISC (Informatique, Biologie Intégrative et Systèmes Complexes) lab at the University of Evry. In these experiments, users were asked to perform a task under various conditions (with and without guides). Analysis was based on task completion time, errors and user learning. Questionnaires were used for subjective evaluations. The findings of this research can contribute to the development of collaborative systems for teleoperation, assembly tasks, e-learning, rehabilitation, computer-aided design and entertainment.</div>
</front>
</TEI>
<hal api="V3">
<titleStmt>
<title xml:lang="en">Multi-modal Interaction in Collaborative Virtual Environments: Study and analysis of performance in collaborative work</title>
<title xml:lang="fr">Assistance multimodale pour l'interaction 3D collaborative : étude et analyse des performances pour le travail collaboratif</title>
<author role="aut">
<persName>
<forename type="first">Sehat</forename>
<surname>Ullah</surname>
</persName>
<email>Sehat.Ullah@ibisc.univ-evry.fr</email>
<idno type="halauthor">367708</idno>
<affiliation ref="#struct-143247"></affiliation>
</author>
<editor role="depositor">
<persName>
<forename>Frédéric</forename>
<surname>Davesne</surname>
</persName>
<email>frederic.davesne@ibisc.univ-evry.fr</email>
</editor>
</titleStmt>
<editionStmt>
<edition n="v1" type="current">
<date type="whenSubmitted">2011-02-02 16:50:40</date>
<date type="whenModified">2016-03-21 17:35:26</date>
<date type="whenReleased">2011-02-03 08:32:42</date>
<date type="whenProduced">2011-01-26</date>
<date type="whenEndEmbargoed">2011-02-02</date>
<ref type="file" target="https://tel.archives-ouvertes.fr/tel-00562081/document">
<date notBefore="2011-02-02"></date>
</ref>
<ref type="file" n="1" target="https://tel.archives-ouvertes.fr/tel-00562081/file/These.pdf">
<date notBefore="2011-02-02"></date>
</ref>
</edition>
<respStmt>
<resp>contributor</resp>
<name key="127748">
<persName>
<forename>Frédéric</forename>
<surname>Davesne</surname>
</persName>
<email>frederic.davesne@ibisc.univ-evry.fr</email>
</name>
</respStmt>
</editionStmt>
<publicationStmt>
<distributor>CCSD</distributor>
<idno type="halId">tel-00562081</idno>
<idno type="halUri">https://tel.archives-ouvertes.fr/tel-00562081</idno>
<idno type="halBibtex">ullah:tel-00562081</idno>
<idno type="halRefHtml">Human-Computer Interaction [cs.HC]. Université d'Evry-Val d'Essonne, 2011. English</idno>
<idno type="halRef">Human-Computer Interaction [cs.HC]. Université d'Evry-Val d'Essonne, 2011. English</idno>
</publicationStmt>
<seriesStmt>
<idno type="stamp" n="IBISC-RATC" p="IBISC">RATC</idno>
<idno type="stamp" n="IBISC-EVRA" p="IBISC">Environnement Virtuel et Réalité Augmentée</idno>
<idno type="stamp" n="UNIV-EVRY">Université d'Evry-Val d'Essonne</idno>
<idno type="stamp" n="IBISC-IRA2" p="IBISC">Interactions, Réalité Augmentée, Robotique Ambiante</idno>
<idno type="stamp" n="IBISC">Informatique, Biologie Intégrative et Systèmes Complexes</idno>
<idno type="stamp" n="TDS-MACS">Réseau de recherche en Théorie des Systèmes Distribués, Modélisation, Analyse et Contrôle des Systèmes</idno>
</seriesStmt>
<notesStmt></notesStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Multi-modal Interaction in Collaborative Virtual Environments: Study and analysis of performance in collaborative work</title>
<title xml:lang="fr">Assistance multimodale pour l'interaction 3D collaborative : étude et analyse des performances pour le travail collaboratif</title>
<author role="aut">
<persName>
<forename type="first">Sehat</forename>
<surname>Ullah</surname>
</persName>
<email>Sehat.Ullah@ibisc.univ-evry.fr</email>
<idno type="halAuthorId">367708</idno>
<affiliation ref="#struct-143247"></affiliation>
</author>
</analytic>
<monogr>
<imprint>
<date type="dateDefended">2011-01-26</date>
</imprint>
<authority type="institution">Université d'Evry-Val d'Essonne</authority>
<authority type="school">Science et Ingénierie</authority>
<authority type="supervisor">Malik Mallem (malik.mallem@ibisc.univ-evry.fr)</authority>
<authority type="jury">Sabine Coquillart (examinateur)</authority>
<authority type="jury">Anatole Lécuyer (rapporteur)</authority>
<authority type="jury">Malik Mallem (directeur)</authority>
<authority type="jury">Guillaume Moreau (rapporteur)</authority>
<authority type="jury">Samir Otmane (co-encadrant)</authority>
<authority type="jury">Paul Richard (co-encadrant)</authority>
</monogr>
</biblStruct>
</sourceDesc>
<profileDesc>
<langUsage>
<language ident="en">English</language>
</langUsage>
<textClass>
<keywords scheme="author">
<term xml:lang="en">Virtual Reality</term>
<term xml:lang="en">Collaborative Virtual Environments</term>
<term xml:lang="en">haptic guides</term>
<term xml:lang="en">multimodal guides</term>
<term xml:lang="en">aura set</term>
<term xml:lang="en">task set</term>
<term xml:lang="en">software architecture</term>
<term xml:lang="en">human performance</term>
<term xml:lang="fr">Réalité virtuelle</term>
<term xml:lang="fr">environnement virtuel collaboratif</term>
<term xml:lang="fr">guides virtuels</term>
<term xml:lang="fr">guides haptiques</term>
<term xml:lang="fr">assistance multimodale</term>
<term xml:lang="fr">architecture logicielle</term>
<term xml:lang="fr">performance humaine</term>
</keywords>
<classCode scheme="halDomain" n="info.info-hc">Computer Science [cs]/Human-Computer Interaction [cs.HC]</classCode>
<classCode scheme="halDomain" n="spi.auto">Engineering Sciences [physics]/Automatic</classCode>
<classCode scheme="halTypology" n="THESE">Theses</classCode>
</textClass>
<abstract xml:lang="en">Recent advances in high-quality computer graphics and the ability of inexpensive computers to render realistic 3D scenes have made it possible to develop virtual environments in which two or more users can co-exist and work collaboratively towards a common goal. Such environments are called Collaborative Virtual Environments (CVEs). CVEs have many potential application domains, such as military, medical, assembly, computer-aided design, teleoperation, education, games and social networks. One of the problems related to CVEs is the user's low level of awareness of the status, actions and intentions of his/her collaborator, which not only reduces users' performance but also leads to unsatisfactory results. In addition, collaborative tasks performed without proper computer-generated assistance are very difficult and more prone to errors. The basic theme of this thesis is to provide assistance for collaborative 3D interaction in CVEs. In this context, we study and develop the concept of multimodal (audio, visual and haptic) assistance for a user or group of users. Our study focuses on how to assist users in collaboratively interacting with the entities of CVEs. We propose to study and analyze the contribution of multimodal assistance to collaborative (synchronous and asynchronous) interaction with objects in the virtual environment. To this end, we propose and implement various multimodal virtual guides. These guides are evaluated through a series of experiments in which a selection/manipulation task is carried out by users in both synchronous and asynchronous modes. The experiments were carried out at the LISA (Laboratoire d'Ingénierie et Systèmes Automatisés) lab at the University of Angers and the IBISC (Informatique, Biologie Intégrative et Systèmes Complexes) lab at the University of Evry. In these experiments, users were asked to perform a task under various conditions (with and without guides). Analysis was based on task completion time, errors and user learning. Questionnaires were used for subjective evaluations. The findings of this research can contribute to the development of collaborative systems for teleoperation, assembly tasks, e-learning, rehabilitation, computer-aided design and entertainment.</abstract>
<abstract xml:lang="fr">Les progrès récents dans le domaine de l'infographie et la capacité des ordinateurs personnels à rendre des scènes 3D réalistes ont permis de développer des environnements virtuels dans lesquels plusieurs utilisateurs peuvent coexister et travailler ensemble pour atteindre un objectif commun. Ces environnements sont appelés Environnements Virtuels Collaboratifs (EVCs). Les applications potentielles des EVCs concernent les domaines militaire et médical, l'assemblage, la conception assistée par ordinateur, la téléopération, l'éducation, les jeux et les réseaux sociaux. Un des problèmes liés aux EVCs est la faible connaissance qu'ont les utilisateurs de l'état, des actions et des intentions de leur(s) collaborateur(s). Ceci réduit non seulement la performance collective, mais conduit également à des résultats non satisfaisants. En outre, les tâches collaboratives ou coopératives réalisées sans aide ou assistance sont plus difficiles et plus sujettes aux erreurs. Dans ce travail de thèse, nous étudions l'influence de guides multimodaux sur la performance des utilisateurs lors de tâches collaboratives en environnement virtuel (EV). Nous proposons un certain nombre de guides basés sur les modalités visuelle, auditive et haptique. Dans ce contexte, nous étudions leur qualité de guidage et examinons leur influence sur l'awareness, la co-présence et la coordination des utilisateurs pendant la réalisation des tâches. À cet effet, nous avons développé une architecture logicielle qui permet la collaboration de deux utilisateurs (distribués ou co-localisés) et peut être étendue à plusieurs utilisateurs. En utilisant cette architecture, nous avons développé des applications qui non seulement permettent un travail collaboratif, mais fournissent aussi des assistances multimodales aux utilisateurs. Le travail collaboratif soutenu par ces applications comprend des tâches de type « Peg-in-hole », de télé-manipulation coopérative via deux robots, et de télé-guidage pour l'écriture ou le dessin. Afin d'évaluer la pertinence et l'influence des guides proposés, une série d'expériences a été effectuée au LISA (Laboratoire d'Ingénierie et Systèmes Automatisés) à l'Université d'Angers et au laboratoire IBISC (Informatique, Biologie Intégrative et Systèmes Complexes) à l'Université d'Évry. Dans ces expériences, les utilisateurs ont été invités à effectuer des tâches variées, dans des conditions différentes (avec et sans guides). L'analyse a été effectuée sur la base du temps de réalisation des tâches, des erreurs et de l'apprentissage des utilisateurs. Des questionnaires ont été utilisés pour les évaluations subjectives. Ce travail contribue de manière significative au développement de systèmes collaboratifs pour la téléopération, la simulation d'assemblage, l'apprentissage de gestes techniques, la rééducation, la conception assistée par ordinateur et le divertissement.</abstract>
</profileDesc>
</hal>
</record>
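Records like the one above can be processed programmatically once extracted. The following is a minimal sketch in Python; it parses a trimmed inline excerpt of the record for self-containment, but in practice you would load the full record (e.g. the output of the HfdSelect command shown below) from a file. Note one assumption: the exported fragment uses a `hal:` prefix without declaring its namespace, so a full record needs that namespace bound (or the `hal:`-prefixed elements stripped) before strict XML parsing.

```python
import xml.etree.ElementTree as ET

# Trimmed excerpt of the record above, kept inline so the sketch is runnable.
RECORD = """\
<record>
  <TEI>
    <teiHeader>
      <fileDesc>
        <titleStmt>
          <title xml:lang="en">Multi-modal Interaction in Collaborative Virtual Environments</title>
          <title xml:lang="fr">Assistance multimodale pour l'interaction 3D collaborative</title>
        </titleStmt>
        <publicationStmt>
          <idno type="halId">tel-00562081</idno>
          <date when="2011-01-26">2011-01-26</date>
        </publicationStmt>
      </fileDesc>
    </teiHeader>
  </TEI>
</record>"""

# xml:lang is stored under the reserved XML namespace by ElementTree.
XML_LANG = "{http://www.w3.org/XML/1998/namespace}lang"

root = ET.fromstring(RECORD)

# English title: filter <title> elements on their xml:lang attribute.
title_en = next(t.text for t in root.iter("title") if t.get(XML_LANG) == "en")

# HAL identifier: select the <idno> element whose type attribute is halId.
hal_id = root.find(".//idno[@type='halId']").text

print(title_en)  # Multi-modal Interaction in Collaborative Virtual Environments
print(hal_id)    # tel-00562081
```

The same pattern extends to the keywords, abstracts and affiliations: each carries either an `xml:lang` or a `type` attribute that disambiguates parallel entries.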

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Hal/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000386 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Hal/Corpus/biblio.hfd -nk 000386 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Hal
   |étape=   Corpus
   |type=    RBID
   |clé=     Hal:tel-00562081
   |texte=   Multi-modal Interaction in Collaborative Virtual Environments: Study and analysis of performance in collaborative work
}}

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024