Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Intention recognition for dynamic role exchange in haptic collaboration.

Internal identifier: 002332 (PubMed/Checkpoint); previous: 002331; next: 002333

Authors: Ayse Kucukyilmaz; Tevfik Metin Sezgin; Cagatay Basdogan

Source: IEEE transactions on haptics, 2013 Jan-Mar; 6(1): 58-68.

RBID : pubmed:24808268

English descriptors

Abstract

In human-computer collaboration involving haptics, a key issue that remains to be solved is to establish an intuitive communication between the partners. Even though computers are widely used to aid human operators in teleoperation, guidance, and training, because they lack the adaptability, versatility, and awareness of a human, their ability to improve efficiency and effectiveness in dynamic tasks is limited. We suggest that the communication between a human and a computer can be improved if it involves a decision-making process in which the computer is programmed to infer the intentions of the human operator and dynamically adjust the control levels of the interacting parties to facilitate a more intuitive interaction setup. In this paper, we investigate the utility of such a dynamic role exchange mechanism, where partners negotiate through the haptic channel to trade their control levels on a collaborative task. We examine the energy consumption, the work done on the manipulated object, and the joint efficiency in addition to the task performance. We show that when compared to an equal control condition, a role exchange mechanism improves task performance and the joint efficiency of the partners. We also show that augmenting the system with additional informative visual and vibrotactile cues, which are used to display the state of interaction, allows the users to become aware of the underlying role exchange mechanism and utilize it in favor of the task. These cues also improve the user's sense of interaction and reinforce his/her belief that the computer aids with the execution of the task.
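
One way to picture the control-level adjustment described in the abstract is a weighted blend of the human's and the computer's forces, with the weight shifting toward whichever partner the system infers should lead. The Python sketch below is a toy illustration of that idea only; it is not the paper's actual controller, and the blending rule, update rule, and names (blend_forces, update_control_level, alpha) are all hypothetical.

    # Toy sketch of dynamic role exchange as force blending.
    # NOT the paper's controller: the blending rule and the intent
    # signal below are illustrative assumptions only.

    def blend_forces(f_human, f_computer, alpha):
        """Net force on the shared object; alpha is the human's control level in [0, 1]."""
        return alpha * f_human + (1.0 - alpha) * f_computer

    def update_control_level(alpha, human_intent, rate=0.05):
        """Drift control toward the human when inferred intent is strong (hypothetical rule)."""
        alpha += rate * (human_intent - alpha)
        return min(max(alpha, 0.0), 1.0)

    # Equal control (the baseline condition) corresponds to alpha fixed at 0.5;
    # the role exchange condition lets alpha follow the inferred intent instead.
    alpha = 0.5
    for intent in (0.9, 0.9, 0.2, 0.1):
        alpha = update_control_level(alpha, intent)
        print(f"alpha={alpha:.3f}  net_force={blend_forces(1.0, -0.5, alpha):+.3f}")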

DOI: 10.1109/TOH.2012.21
PubMed: 24808268


Affiliations: (none recorded for this record)


Links toward previous steps (curation, corpus...)


Links to Exploration step

pubmed:24808268

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Intention recognition for dynamic role exchange in haptic collaboration.</title>
<author>
<name sortKey="Kucukyilmaz, Ayse" sort="Kucukyilmaz, Ayse" uniqKey="Kucukyilmaz A" first="Ayse" last="Kucukyilmaz">Ayse Kucukyilmaz</name>
</author>
<author>
<name sortKey="Sezgin, Tevfik Metin" sort="Sezgin, Tevfik Metin" uniqKey="Sezgin T" first="Tevfik Metin" last="Sezgin">Tevfik Metin Sezgin</name>
</author>
<author>
<name sortKey="Basdogan, Cagatay" sort="Basdogan, Cagatay" uniqKey="Basdogan C" first="Cagatay" last="Basdogan">Cagatay Basdogan</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="????">
<PubDate>
<MedlineDate>2013 Jan-Mar</MedlineDate>
</PubDate>
</date>
<idno type="doi">10.1109/TOH.2012.21</idno>
<idno type="RBID">pubmed:24808268</idno>
<idno type="pmid">24808268</idno>
<idno type="wicri:Area/PubMed/Corpus">000A68</idno>
<idno type="wicri:Area/PubMed/Curation">000A68</idno>
<idno type="wicri:Area/PubMed/Checkpoint">002332</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Intention recognition for dynamic role exchange in haptic collaboration.</title>
<author>
<name sortKey="Kucukyilmaz, Ayse" sort="Kucukyilmaz, Ayse" uniqKey="Kucukyilmaz A" first="Ayse" last="Kucukyilmaz">Ayse Kucukyilmaz</name>
</author>
<author>
<name sortKey="Sezgin, Tevfik Metin" sort="Sezgin, Tevfik Metin" uniqKey="Sezgin T" first="Tevfik Metin" last="Sezgin">Tevfik Metin Sezgin</name>
</author>
<author>
<name sortKey="Basdogan, Cagatay" sort="Basdogan, Cagatay" uniqKey="Basdogan C" first="Cagatay" last="Basdogan">Cagatay Basdogan</name>
</author>
</analytic>
<series>
<title level="j">IEEE transactions on haptics</title>
<idno type="eISSN">2329-4051</idno>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Adult</term>
<term>Algorithms</term>
<term>Cooperative Behavior</term>
<term>Female</term>
<term>Games, Experimental</term>
<term>Humans</term>
<term>Intention</term>
<term>Male</term>
<term>Touch Perception (physiology)</term>
<term>User-Computer Interface</term>
<term>Young Adult</term>
</keywords>
<keywords scheme="MESH" qualifier="physiology" xml:lang="en">
<term>Touch Perception</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Adult</term>
<term>Algorithms</term>
<term>Cooperative Behavior</term>
<term>Female</term>
<term>Games, Experimental</term>
<term>Humans</term>
<term>Intention</term>
<term>Male</term>
<term>User-Computer Interface</term>
<term>Young Adult</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">In human-computer collaboration involving haptics, a key issue that remains to be solved is to establish an intuitive communication between the partners. Even though computers are widely used to aid human operators in teleoperation, guidance, and training, because they lack the adaptability, versatility, and awareness of a human, their ability to improve efficiency and effectiveness in dynamic tasks is limited. We suggest that the communication between a human and a computer can be improved if it involves a decision-making process in which the computer is programmed to infer the intentions of the human operator and dynamically adjust the control levels of the interacting parties to facilitate a more intuitive interaction setup. In this paper, we investigate the utility of such a dynamic role exchange mechanism, where partners negotiate through the haptic channel to trade their control levels on a collaborative task. We examine the energy consumption, the work done on the manipulated object, and the joint efficiency in addition to the task performance. We show that when compared to an equal control condition, a role exchange mechanism improves task performance and the joint efficiency of the partners. We also show that augmenting the system with additional informative visual and vibrotactile cues, which are used to display the state of interaction, allows the users to become aware of the underlying role exchange mechanism and utilize it in favor of the task. These cues also improve the user's sense of interaction and reinforce his/her belief that the computer aids with the execution of the task.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Owner="NLM" Status="MEDLINE">
<PMID Version="1">24808268</PMID>
<DateCreated>
<Year>2014</Year>
<Month>05</Month>
<Day>08</Day>
</DateCreated>
<DateCompleted>
<Year>2015</Year>
<Month>11</Month>
<Day>20</Day>
</DateCompleted>
<Article PubModel="Print">
<Journal>
<ISSN IssnType="Electronic">2329-4051</ISSN>
<JournalIssue CitedMedium="Internet">
<Volume>6</Volume>
<Issue>1</Issue>
<PubDate>
<MedlineDate>2013 Jan-Mar</MedlineDate>
</PubDate>
</JournalIssue>
<Title>IEEE transactions on haptics</Title>
<ISOAbbreviation>IEEE Trans Haptics</ISOAbbreviation>
</Journal>
<ArticleTitle>Intention recognition for dynamic role exchange in haptic collaboration.</ArticleTitle>
<Pagination>
<MedlinePgn>58-68</MedlinePgn>
</Pagination>
<ELocationID EIdType="doi" ValidYN="Y">10.1109/TOH.2012.21</ELocationID>
<Abstract>
<AbstractText>In human-computer collaboration involving haptics, a key issue that remains to be solved is to establish an intuitive communication between the partners. Even though computers are widely used to aid human operators in teleoperation, guidance, and training, because they lack the adaptability, versatility, and awareness of a human, their ability to improve efficiency and effectiveness in dynamic tasks is limited. We suggest that the communication between a human and a computer can be improved if it involves a decision-making process in which the computer is programmed to infer the intentions of the human operator and dynamically adjust the control levels of the interacting parties to facilitate a more intuitive interaction setup. In this paper, we investigate the utility of such a dynamic role exchange mechanism, where partners negotiate through the haptic channel to trade their control levels on a collaborative task. We examine the energy consumption, the work done on the manipulated object, and the joint efficiency in addition to the task performance. We show that when compared to an equal control condition, a role exchange mechanism improves task performance and the joint efficiency of the partners. We also show that augmenting the system with additional informative visual and vibrotactile cues, which are used to display the state of interaction, allows the users to become aware of the underlying role exchange mechanism and utilize it in favor of the task. These cues also improve the user's sense of interaction and reinforce his/her belief that the computer aids with the execution of the task.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Kucukyilmaz</LastName>
<ForeName>Ayse</ForeName>
<Initials>A</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Sezgin</LastName>
<ForeName>Tevfik Metin</ForeName>
<Initials>TM</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Basdogan</LastName>
<ForeName>Cagatay</ForeName>
<Initials>C</Initials>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
</PublicationTypeList>
</Article>
<MedlineJournalInfo>
<Country>United States</Country>
<MedlineTA>IEEE Trans Haptics</MedlineTA>
<NlmUniqueID>101491191</NlmUniqueID>
<ISSNLinking>1939-1412</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>IM</CitationSubset>
<MeshHeadingList>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D000328">Adult</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D000465">Algorithms</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D003299">Cooperative Behavior</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D005260">Female</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D005717">Games, Experimental</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D006801">Humans</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D033182">Intention</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D008297">Male</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D055698">Touch Perception</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000502">physiology</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D014584">User-Computer Interface</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D055815">Young Adult</DescriptorName>
</MeshHeading>
</MeshHeadingList>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="entrez">
<Year>2014</Year>
<Month>5</Month>
<Day>9</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed">
<Year>2013</Year>
<Month>1</Month>
<Day>1</Day>
<Hour>0</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2015</Year>
<Month>12</Month>
<Day>15</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>ppublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="doi">10.1109/TOH.2012.21</ArticleId>
<ArticleId IdType="pubmed">24808268</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
<affiliations>
<list></list>
<tree>
<noCountry>
<name sortKey="Basdogan, Cagatay" sort="Basdogan, Cagatay" uniqKey="Basdogan C" first="Cagatay" last="Basdogan">Cagatay Basdogan</name>
<name sortKey="Kucukyilmaz, Ayse" sort="Kucukyilmaz, Ayse" uniqKey="Kucukyilmaz A" first="Ayse" last="Kucukyilmaz">Ayse Kucukyilmaz</name>
<name sortKey="Sezgin, Tevfik Metin" sort="Sezgin, Tevfik Metin" uniqKey="Sezgin T" first="Tevfik Metin" last="Sezgin">Tevfik Metin Sezgin</name>
</noCountry>
</tree>
</affiliations>
</record>
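
Since the record is plain XML, its main fields can be extracted with any XML parser. A minimal sketch using Python's standard xml.etree.ElementTree, assuming the record above has been saved as record.xml:

    # Minimal sketch: pull title, authors, and MeSH terms out of the
    # <record> XML shown above (assumes it was saved as record.xml).
    import xml.etree.ElementTree as ET

    root = ET.parse("record.xml").getroot()

    title = root.findtext(".//ArticleTitle")
    authors = [
        f"{a.findtext('ForeName')} {a.findtext('LastName')}"
        for a in root.findall(".//AuthorList/Author")
    ]
    mesh_terms = [h.findtext("DescriptorName") for h in root.findall(".//MeshHeading")]

    print(title)
    print("; ".join(authors))
    print(", ".join(mesh_terms))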

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PubMed/Checkpoint
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 002332 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PubMed/Checkpoint/biblio.hfd -nk 002332 | SxmlIndent | more
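
The same pipeline can also be driven from a script. A minimal Python sketch, assuming the Dilib tools (HfdSelect, SxmlIndent) are on the PATH and EXPLOR_AREA is set as above:

    # Sketch: run the documented HfdSelect pipeline from Python and capture
    # the indented XML (assumes Dilib is installed and EXPLOR_AREA is set).
    import os
    import subprocess

    biblio = os.path.join(os.environ["EXPLOR_AREA"], "Data/PubMed/Checkpoint/biblio.hfd")
    result = subprocess.run(
        f"HfdSelect -h {biblio} -nk 002332 | SxmlIndent",
        shell=True, capture_output=True, text=True,
    )
    print(result.stdout)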

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PubMed
   |étape=   Checkpoint
   |type=    RBID
   |clé=     pubmed:24808268
   |texte=   Intention recognition for dynamic role exchange in haptic collaboration.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Checkpoint/RBID.i   -Sk "pubmed:24808268" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Checkpoint/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024