Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information it contains has therefore not been validated.

Recognition of haptic interaction patterns in dyadic joint object manipulation.

Internal identifier: 000463 (PubMed/Corpus); previous: 000462; next: 000464

Authors: Cigil Ece Madan; Ayse Kucukyilmaz; Tevfik Metin Sezgin; Cagatay Basdogan

Source: IEEE transactions on haptics, 2015 Jan-Mar; 8(1): 54-66

RBID: pubmed:25532210

English descriptors

Abstract

The development of robots that can physically cooperate with humans has attracted interest in recent decades. This effort requires a deep understanding of the intrinsic properties of interaction. Up to now, many researchers have focused on inferring human intents in terms of intermediate or terminal goals in physical tasks. However, to work side by side with people, an autonomous robot additionally needs in-depth information about the underlying haptic interaction patterns that are typically encountered during human-human cooperation. To our knowledge, no study has yet focused on characterizing such detailed information. In this sense, this work is a pioneering effort to gain a deeper understanding of interaction patterns involving two or more humans in a physical task. We present a labeled human-human interaction dataset, which captures the interaction of two humans who collaboratively transport an object in a haptics-enabled virtual environment. In light of the information gained by studying this dataset, we propose that the actions of cooperating partners can be examined under three interaction types: in any cooperative task, the interacting humans either 1) work in harmony, 2) cope with conflicts, or 3) remain passive during interaction. In line with this conception, we present a taxonomy of human interaction patterns, and then propose five feature sets, comprising force-, velocity-, and power-related information, for the classification of these patterns. Our evaluation shows that a multi-class support vector machine (SVM) classifier achieves a correct classification rate of 86 percent for the identification of interaction patterns, an accuracy obtained by fusing a selected set of the most informative features chosen with the Minimum Redundancy Maximum Relevance (mRMR) feature selection method.

DOI: 10.1109/TOH.2014.2384049
PubMed: 25532210
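
The pipeline described in the abstract — rank candidate features by informativeness, keep the best subset, and feed it to a multi-class SVM — can be sketched as follows. This is a minimal illustration assuming scikit-learn: the synthetic data and feature counts are invented, and mutual-information ranking stands in for mRMR, which additionally penalizes redundancy between features and is not shipped with scikit-learn.

import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples, n_features = 300, 20                 # hypothetical dataset size
X = rng.normal(size=(n_samples, n_features))    # stand-ins for force/velocity/power features
y = rng.integers(0, 3, size=n_samples)          # 0 = harmony, 1 = conflict, 2 = passive

clf = make_pipeline(
    StandardScaler(),
    SelectKBest(mutual_info_classif, k=8),      # keep the 8 most informative features
    SVC(kernel="rbf", C=1.0),                   # multi-class SVM (one-vs-one by default)
)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")  # near chance on this noise data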

Links to Exploration step

pubmed:25532210

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Recognition of haptic interaction patterns in dyadic joint object manipulation.</title>
<author>
<name sortKey="Madan, Cigil Ece" sort="Madan, Cigil Ece" uniqKey="Madan C" first="Cigil Ece" last="Madan">Cigil Ece Madan</name>
</author>
<author>
<name sortKey="Kucukyilmaz, Ayse" sort="Kucukyilmaz, Ayse" uniqKey="Kucukyilmaz A" first="Ayse" last="Kucukyilmaz">Ayse Kucukyilmaz</name>
</author>
<author>
<name sortKey="Sezgin, Tevfik Metin" sort="Sezgin, Tevfik Metin" uniqKey="Sezgin T" first="Tevfik Metin" last="Sezgin">Tevfik Metin Sezgin</name>
</author>
<author>
<name sortKey="Basdogan, Cagatay" sort="Basdogan, Cagatay" uniqKey="Basdogan C" first="Cagatay" last="Basdogan">Cagatay Basdogan</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="????">
<PubDate>
<MedlineDate>2015 Jan-Mar</MedlineDate>
</PubDate>
</date>
<idno type="doi">10.1109/TOH.2014.2384049</idno>
<idno type="RBID">pubmed:25532210</idno>
<idno type="pmid">25532210</idno>
<idno type="wicri:Area/PubMed/Corpus">000463</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Recognition of haptic interaction patterns in dyadic joint object manipulation.</title>
<author>
<name sortKey="Madan, Cigil Ece" sort="Madan, Cigil Ece" uniqKey="Madan C" first="Cigil Ece" last="Madan">Cigil Ece Madan</name>
</author>
<author>
<name sortKey="Kucukyilmaz, Ayse" sort="Kucukyilmaz, Ayse" uniqKey="Kucukyilmaz A" first="Ayse" last="Kucukyilmaz">Ayse Kucukyilmaz</name>
</author>
<author>
<name sortKey="Sezgin, Tevfik Metin" sort="Sezgin, Tevfik Metin" uniqKey="Sezgin T" first="Tevfik Metin" last="Sezgin">Tevfik Metin Sezgin</name>
</author>
<author>
<name sortKey="Basdogan, Cagatay" sort="Basdogan, Cagatay" uniqKey="Basdogan C" first="Cagatay" last="Basdogan">Cagatay Basdogan</name>
</author>
</analytic>
<series>
<title level="j">IEEE transactions on haptics</title>
<idno type="eISSN">2329-4051</idno>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Cooperative Behavior</term>
<term>Humans</term>
<term>Man-Machine Systems</term>
<term>Models, Biological</term>
<term>Pattern Recognition, Automated (methods)</term>
<term>Support Vector Machine</term>
<term>Task Performance and Analysis</term>
<term>Touch (physiology)</term>
</keywords>
<keywords scheme="MESH" qualifier="methods" xml:lang="en">
<term>Pattern Recognition, Automated</term>
</keywords>
<keywords scheme="MESH" qualifier="physiology" xml:lang="en">
<term>Touch</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Cooperative Behavior</term>
<term>Humans</term>
<term>Man-Machine Systems</term>
<term>Models, Biological</term>
<term>Support Vector Machine</term>
<term>Task Performance and Analysis</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">The development of robots that can physically cooperate with humans has attained interest in the last decades. Obviously, this effort requires a deep understanding of the intrinsic properties of interaction. Up to now, many researchers have focused on inferring human intents in terms of intermediate or terminal goals in physical tasks. On the other hand, working side by side with people, an autonomous robot additionally needs to come up with in-depth information about underlying haptic interaction patterns that are typically encountered during human-human cooperation. However, to our knowledge, no study has yet focused on characterizing such detailed information. In this sense, this work is pioneering as an effort to gain deeper understanding of interaction patterns involving two or more humans in a physical task. We present a labeled human-human-interaction dataset, which captures the interaction of two humans, who collaboratively transport an object in an haptics-enabled virtual environment. In the light of information gained by studying this dataset, we propose that the actions of cooperating partners can be examined under three interaction types: In any cooperative task, the interacting humans either 1) work in harmony, 2) cope with conflicts, or 3) remain passive during interaction. In line with this conception, we present a taxonomy of human interaction patterns; then propose five different feature sets, comprising force-, velocity-and power-related information, for the classification of these patterns. Our evaluation shows that using a multi-class support vector machine (SVM) classifier, we can accomplish a correct classification rate of 86 percent for the identification of interaction patterns, an accuracy obtained by fusing a selected set of most informative features by Minimum Redundancy Maximum Relevance (mRMR) feature selection method.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Owner="NLM" Status="MEDLINE">
<PMID Version="1">25532210</PMID>
<DateCreated>
<Year>2015</Year>
<Month>03</Month>
<Day>21</Day>
</DateCreated>
<DateCompleted>
<Year>2015</Year>
<Month>11</Month>
<Day>16</Day>
</DateCompleted>
<DateRevised>
<Year>2015</Year>
<Month>11</Month>
<Day>19</Day>
</DateRevised>
<Article PubModel="Print-Electronic">
<Journal>
<ISSN IssnType="Electronic">2329-4051</ISSN>
<JournalIssue CitedMedium="Internet">
<Volume>8</Volume>
<Issue>1</Issue>
<PubDate>
<MedlineDate>2015 Jan-Mar</MedlineDate>
</PubDate>
</JournalIssue>
<Title>IEEE transactions on haptics</Title>
<ISOAbbreviation>IEEE Trans Haptics</ISOAbbreviation>
</Journal>
<ArticleTitle>Recognition of haptic interaction patterns in dyadic joint object manipulation.</ArticleTitle>
<Pagination>
<MedlinePgn>54-66</MedlinePgn>
</Pagination>
<ELocationID EIdType="doi" ValidYN="Y">10.1109/TOH.2014.2384049</ELocationID>
<Abstract>
<AbstractText>The development of robots that can physically cooperate with humans has attained interest in the last decades. Obviously, this effort requires a deep understanding of the intrinsic properties of interaction. Up to now, many researchers have focused on inferring human intents in terms of intermediate or terminal goals in physical tasks. On the other hand, working side by side with people, an autonomous robot additionally needs to come up with in-depth information about underlying haptic interaction patterns that are typically encountered during human-human cooperation. However, to our knowledge, no study has yet focused on characterizing such detailed information. In this sense, this work is pioneering as an effort to gain deeper understanding of interaction patterns involving two or more humans in a physical task. We present a labeled human-human-interaction dataset, which captures the interaction of two humans, who collaboratively transport an object in an haptics-enabled virtual environment. In the light of information gained by studying this dataset, we propose that the actions of cooperating partners can be examined under three interaction types: In any cooperative task, the interacting humans either 1) work in harmony, 2) cope with conflicts, or 3) remain passive during interaction. In line with this conception, we present a taxonomy of human interaction patterns; then propose five different feature sets, comprising force-, velocity-and power-related information, for the classification of these patterns. Our evaluation shows that using a multi-class support vector machine (SVM) classifier, we can accomplish a correct classification rate of 86 percent for the identification of interaction patterns, an accuracy obtained by fusing a selected set of most informative features by Minimum Redundancy Maximum Relevance (mRMR) feature selection method.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Madan</LastName>
<ForeName>Cigil Ece</ForeName>
<Initials>CE</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Kucukyilmaz</LastName>
<ForeName>Ayse</ForeName>
<Initials>A</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Sezgin</LastName>
<ForeName>Tevfik Metin</ForeName>
<Initials>TM</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Basdogan</LastName>
<ForeName>Cagatay</ForeName>
<Initials>C</Initials>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
</PublicationTypeList>
<ArticleDate DateType="Electronic">
<Year>2014</Year>
<Month>12</Month>
<Day>18</Day>
</ArticleDate>
</Article>
<MedlineJournalInfo>
<Country>United States</Country>
<MedlineTA>IEEE Trans Haptics</MedlineTA>
<NlmUniqueID>101491191</NlmUniqueID>
<ISSNLinking>1939-1412</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>IM</CitationSubset>
<MeshHeadingList>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D003299">Cooperative Behavior</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D006801">Humans</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D008328">Man-Machine Systems</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D008954">Models, Biological</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D010363">Pattern Recognition, Automated</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000379">methods</QualifierName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D060388">Support Vector Machine</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D013647">Task Performance and Analysis</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D014110">Touch</DescriptorName>
<QualifierName MajorTopicYN="Y" UI="Q000502">physiology</QualifierName>
</MeshHeading>
</MeshHeadingList>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="aheadofprint">
<Year>2014</Year>
<Month>12</Month>
<Day>18</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez">
<Year>2014</Year>
<Month>12</Month>
<Day>23</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed">
<Year>2014</Year>
<Month>12</Month>
<Day>23</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2015</Year>
<Month>11</Month>
<Day>17</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>ppublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="doi">10.1109/TOH.2014.2384049</ArticleId>
<ArticleId IdType="pubmed">25532210</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
</record>
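
Since the record above is plain XML, its fields can also be extracted with standard tooling rather than Dilib. A minimal sketch using Python's standard library, assuming the record has been saved to a file named record.xml (for instance by redirecting the output of the HfdSelect command shown below):

import xml.etree.ElementTree as ET

root = ET.parse("record.xml").getroot()                 # the <record> element

title = root.findtext(".//ArticleTitle")
doi = root.findtext(".//ArticleId[@IdType='doi']")
authors = [f"{a.findtext('ForeName')} {a.findtext('LastName')}"
           for a in root.iter("Author")]                # PubMed <Author> entries only
mesh = [d.text for d in root.iter("DescriptorName")]    # MeSH descriptors

print(title)             # Recognition of haptic interaction patterns ...
print(doi)               # 10.1109/TOH.2014.2384049
print("; ".join(authors))
print("; ".join(mesh))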

To manipulate this document under Unix (Dilib)

# Point at the PubMed/Corpus step of the HapticV1 exploration area
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PubMed/Corpus
# Select record 000463 from the bibliographic base, indent the XML, and page through it
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000463 | SxmlIndent | more

Or

# Equivalent, starting from the $EXPLOR_AREA root
HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 000463 | SxmlIndent | more

To link to this page from the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PubMed
   |étape=   Corpus
   |type=    RBID
   |clé=     pubmed:25532210
   |texte=   Recognition of haptic interaction patterns in dyadic joint object manipulation.
}}

To generate wiki pages

# Look up the record by its RBID key, fetch the full entry from the
# bibliographic base, and convert it into a Wicri wiki page
HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i   -Sk "pubmed:25532210" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024