Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Categorizing natural objects: a comparison of the visual and the haptic modalities

Internal identifier: 000360 (PascalFrancis/Corpus); previous: 000359; next: 000361

Authors: Nina Gaissert; Christian Wallraven

Source:

RBID : Pascal:12-0125456

French descriptors

English descriptors

Abstract

Although the hands are the most important tool for humans to manipulate objects, only little is known about haptic processing of natural objects. Here, we selected a unique set of natural objects, namely seashells, which vary along a variety of object features, while others are shared across all stimuli. To correctly interact with objects, they have to be identified or categorized. For both processes, measuring similarities between objects is crucial. Our goal is to better understand the haptic similarity percept by comparing it to the visual similarity percept. First, direct similarity measures were analyzed using multidimensional scaling techniques to visualize the perceptual spaces of both modalities. We find that the visual and the haptic modality form almost identical perceptual spaces. Next, we performed three different categorization tasks. All tasks exhibit a highly accurate processing of complex shapes of the haptic modality. Moreover, we find that objects grouped into the same category form regions within the perceptual space. Hence, in both modalities, perceived similarity constitutes the basis for categorizing objects. Moreover, both modalities focus on shape to form categories. Taken together, our results lead to the assumption that the same cognitive processes link haptic and visual similarity perception and the resulting categorization behavior.
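The abstract describes analyzing direct similarity ratings with multidimensional scaling (MDS) to visualize a perceptual space. As a rough illustration only (the paper's actual ratings and stimulus count are not reproduced here), such an analysis is typically run on a symmetric matrix of pairwise dissimilarities; the sketch below uses scikit-learn with random placeholder data:

```python
# Sketch of the MDS step described in the abstract: embed a matrix of
# pairwise dissimilarity ratings into a low-dimensional "perceptual space".
# The ratings below are random placeholders, not the study's data.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
n_objects = 6  # hypothetical number of seashell stimuli

# Symmetric dissimilarity matrix with a zero diagonal, as pairwise
# ratings between distinct objects would yield.
d = rng.random((n_objects, n_objects))
dissim = (d + d.T) / 2.0
np.fill_diagonal(dissim, 0.0)

# Metric MDS on precomputed dissimilarities: each object becomes a 2-D
# point whose inter-point distances approximate the rated dissimilarities.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissim)
print(coords.shape)  # one 2-D coordinate per object
```

Running the same embedding separately on visual and on haptic ratings, then comparing the resulting configurations, is the kind of modality comparison the abstract reports.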

Record in standard format (ISO 2709)

See the documentation on the Inist Standard format.

pA  
A01 01  1    @0 0014-4819
A02 01      @0 EXBRAP
A03   1    @0 Exp. brain res.
A05       @2 216
A06       @2 1
A08 01  1  ENG  @1 Categorizing natural objects: a comparison of the visual and the haptic modalities
A11 01  1    @1 GAISSERT (Nina)
A11 02  1    @1 WALLRAVEN (Christian)
A14 01      @1 Max Planck Institute for Biological Cybernetics @2 Tübingen @3 DEU @Z 1 aut.
A14 02      @1 Korea University @2 Seoul @3 KOR @Z 2 aut.
A20       @1 123-134
A21       @1 2012
A23 01      @0 ENG
A43 01      @1 INIST @2 12535 @5 354000502856190130
A44       @0 0000 @1 © 2012 INIST-CNRS. All rights reserved.
A45       @0 3/4 p.
A47 01  1    @0 12-0125456
A60       @1 P
A61       @0 A
A64 01  1    @0 Experimental brain research
A66 01      @0 DEU
C01 01    ENG  @0 Although the hands are the most important tool for humans to manipulate objects, only little is known about haptic processing of natural objects. Here, we selected a unique set of natural objects, namely seashells, which vary along a variety of object features, while others are shared across all stimuli. To correctly interact with objects, they have to be identified or categorized. For both processes, measuring similarities between objects is crucial. Our goal is to better understand the haptic similarity percept by comparing it to the visual similarity percept. First, direct similarity measures were analyzed using multidimensional scaling techniques to visualize the perceptual spaces of both modalities. We find that the visual and the haptic modality form almost identical perceptual spaces. Next, we performed three different categorization tasks. All tasks exhibit a highly accurate processing of complex shapes of the haptic modality. Moreover, we find that objects grouped into the same category form regions within the perceptual space. Hence, in both modalities, perceived similarity constitutes the basis for categorizing objects. Moreover, both modalities focus on shape to form categories. Taken together, our results lead to the assumption that the same cognitive processes link haptic and visual similarity perception and the resulting categorization behavior.
C02 01  X    @0 002A25I
C02 02  X    @0 002B29C02
C03 01  X  FRE  @0 Main @5 01
C03 01  X  ENG  @0 Hand @5 01
C03 01  X  SPA  @0 Mano @5 01
C03 02  X  FRE  @0 Catégorisation @5 02
C03 02  X  ENG  @0 Categorization @5 02
C03 02  X  SPA  @0 Categorización @5 02
C03 03  X  FRE  @0 Perception visuelle @5 03
C03 03  X  ENG  @0 Visual perception @5 03
C03 03  X  SPA  @0 Percepción visual @5 03
C03 04  X  FRE  @0 Homme @5 54
C03 04  X  ENG  @0 Human @5 54
C03 04  X  SPA  @0 Hombre @5 54
C03 05  X  FRE  @0 Perception haptique @4 CD @5 96
C03 05  X  ENG  @0 Haptic perception @4 CD @5 96
N21       @1 093
N44 01      @1 OTO
N82       @1 OTO

Inist format (server)

NO : PASCAL 12-0125456 INIST
ET : Categorizing natural objects: a comparison of the visual and the haptic modalities
AU : GAISSERT (Nina); WALLRAVEN (Christian)
AF : Max Planck Institute for Biological Cybernetics/Tübingen/Allemagne (1 aut.); Korea University/Seoul/Corée, République de (2 aut.)
DT : Publication en série; Niveau analytique
SO : Experimental brain research; ISSN 0014-4819; Coden EXBRAP; Allemagne; Da. 2012; Vol. 216; No. 1; Pp. 123-134; Bibl. 3/4 p.
LA : Anglais
EA : Although the hands are the most important tool for humans to manipulate objects, only little is known about haptic processing of natural objects. Here, we selected a unique set of natural objects, namely seashells, which vary along a variety of object features, while others are shared across all stimuli. To correctly interact with objects, they have to be identified or categorized. For both processes, measuring similarities between objects is crucial. Our goal is to better understand the haptic similarity percept by comparing it to the visual similarity percept. First, direct similarity measures were analyzed using multidimensional scaling techniques to visualize the perceptual spaces of both modalities. We find that the visual and the haptic modality form almost identical perceptual spaces. Next, we performed three different categorization tasks. All tasks exhibit a highly accurate processing of complex shapes of the haptic modality. Moreover, we find that objects grouped into the same category form regions within the perceptual space. Hence, in both modalities, perceived similarity constitutes the basis for categorizing objects. Moreover, both modalities focus on shape to form categories. Taken together, our results lead to the assumption that the same cognitive processes link haptic and visual similarity perception and the resulting categorization behavior.
CC : 002A25I; 002B29C02
FD : Main; Catégorisation; Perception visuelle; Homme; Perception haptique
ED : Hand; Categorization; Visual perception; Human; Haptic perception
SD : Mano; Categorización; Percepción visual; Hombre
LO : INIST-12535.354000502856190130
ID : 12-0125456

Links to Exploration step

Pascal:12-0125456

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">Categorizing natural objects: a comparison of the visual and the haptic modalities</title>
<author>
<name sortKey="Gaissert, Nina" sort="Gaissert, Nina" uniqKey="Gaissert N" first="Nina" last="Gaissert">Nina Gaissert</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Max Planck Institute for Biological Cybernetics</s1>
<s2>Tübingen</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Wallraven, Christian" sort="Wallraven, Christian" uniqKey="Wallraven C" first="Christian" last="Wallraven">Christian Wallraven</name>
<affiliation>
<inist:fA14 i1="02">
<s1>Korea University</s1>
<s2>Seoul</s2>
<s3>KOR</s3>
<sZ>2 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">12-0125456</idno>
<date when="2012">2012</date>
<idno type="stanalyst">PASCAL 12-0125456 INIST</idno>
<idno type="RBID">Pascal:12-0125456</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000360</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">Categorizing natural objects: a comparison of the visual and the haptic modalities</title>
<author>
<name sortKey="Gaissert, Nina" sort="Gaissert, Nina" uniqKey="Gaissert N" first="Nina" last="Gaissert">Nina Gaissert</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Max Planck Institute for Biological Cybernetics</s1>
<s2>Tübingen</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Wallraven, Christian" sort="Wallraven, Christian" uniqKey="Wallraven C" first="Christian" last="Wallraven">Christian Wallraven</name>
<affiliation>
<inist:fA14 i1="02">
<s1>Korea University</s1>
<s2>Seoul</s2>
<s3>KOR</s3>
<sZ>2 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</analytic>
<series>
<title level="j" type="main">Experimental brain research</title>
<title level="j" type="abbreviated">Exp. brain res.</title>
<idno type="ISSN">0014-4819</idno>
<imprint>
<date when="2012">2012</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt>
<title level="j" type="main">Experimental brain research</title>
<title level="j" type="abbreviated">Exp. brain res.</title>
<idno type="ISSN">0014-4819</idno>
</seriesStmt>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Categorization</term>
<term>Hand</term>
<term>Haptic perception</term>
<term>Human</term>
<term>Visual perception</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Main</term>
<term>Catégorisation</term>
<term>Perception visuelle</term>
<term>Homme</term>
<term>Perception haptique</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">Although the hands are the most important tool for humans to manipulate objects, only little is known about haptic processing of natural objects. Here, we selected a unique set of natural objects, namely seashells, which vary along a variety of object features, while others are shared across all stimuli. To correctly interact with objects, they have to be identified or categorized. For both processes, measuring similarities between objects is crucial. Our goal is to better understand the haptic similarity percept by comparing it to the visual similarity percept. First, direct similarity measures were analyzed using multidimensional scaling techniques to visualize the perceptual spaces of both modalities. We find that the visual and the haptic modality form almost identical perceptual spaces. Next, we performed three different categorization tasks. All tasks exhibit a highly accurate processing of complex shapes of the haptic modality. Moreover, we find that objects grouped into the same category form regions within the perceptual space. Hence, in both modalities, perceived similarity constitutes the basis for categorizing objects. Moreover, both modalities focus on shape to form categories. Taken together, our results lead to the assumption that the same cognitive processes link haptic and visual similarity perception and the resulting categorization behavior.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA01 i1="01" i2="1">
<s0>0014-4819</s0>
</fA01>
<fA02 i1="01">
<s0>EXBRAP</s0>
</fA02>
<fA03 i2="1">
<s0>Exp. brain res.</s0>
</fA03>
<fA05>
<s2>216</s2>
</fA05>
<fA06>
<s2>1</s2>
</fA06>
<fA08 i1="01" i2="1" l="ENG">
<s1>Categorizing natural objects: a comparison of the visual and the haptic modalities</s1>
</fA08>
<fA11 i1="01" i2="1">
<s1>GAISSERT (Nina)</s1>
</fA11>
<fA11 i1="02" i2="1">
<s1>WALLRAVEN (Christian)</s1>
</fA11>
<fA14 i1="01">
<s1>Max Planck Institute for Biological Cybernetics</s1>
<s2>Tübingen</s2>
<s3>DEU</s3>
<sZ>1 aut.</sZ>
</fA14>
<fA14 i1="02">
<s1>Korea University</s1>
<s2>Seoul</s2>
<s3>KOR</s3>
<sZ>2 aut.</sZ>
</fA14>
<fA20>
<s1>123-134</s1>
</fA20>
<fA21>
<s1>2012</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA43 i1="01">
<s1>INIST</s1>
<s2>12535</s2>
<s5>354000502856190130</s5>
</fA43>
<fA44>
<s0>0000</s0>
<s1>© 2012 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45>
<s0>3/4 p.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>12-0125456</s0>
</fA47>
<fA60>
<s1>P</s1>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA64 i1="01" i2="1">
<s0>Experimental brain research</s0>
</fA64>
<fA66 i1="01">
<s0>DEU</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>Although the hands are the most important tool for humans to manipulate objects, only little is known about haptic processing of natural objects. Here, we selected a unique set of natural objects, namely seashells, which vary along a variety of object features, while others are shared across all stimuli. To correctly interact with objects, they have to be identified or categorized. For both processes, measuring similarities between objects is crucial. Our goal is to better understand the haptic similarity percept by comparing it to the visual similarity percept. First, direct similarity measures were analyzed using multidimensional scaling techniques to visualize the perceptual spaces of both modalities. We find that the visual and the haptic modality form almost identical perceptual spaces. Next, we performed three different categorization tasks. All tasks exhibit a highly accurate processing of complex shapes of the haptic modality. Moreover, we find that objects grouped into the same category form regions within the perceptual space. Hence, in both modalities, perceived similarity constitutes the basis for categorizing objects. Moreover, both modalities focus on shape to form categories. Taken together, our results lead to the assumption that the same cognitive processes link haptic and visual similarity perception and the resulting categorization behavior.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>002A25I</s0>
</fC02>
<fC02 i1="02" i2="X">
<s0>002B29C02</s0>
</fC02>
<fC03 i1="01" i2="X" l="FRE">
<s0>Main</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG">
<s0>Hand</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA">
<s0>Mano</s0>
<s5>01</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE">
<s0>Catégorisation</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG">
<s0>Categorization</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA">
<s0>Categorización</s0>
<s5>02</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE">
<s0>Perception visuelle</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG">
<s0>Visual perception</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA">
<s0>Percepción visual</s0>
<s5>03</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE">
<s0>Homme</s0>
<s5>54</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG">
<s0>Human</s0>
<s5>54</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA">
<s0>Hombre</s0>
<s5>54</s5>
</fC03>
<fC03 i1="05" i2="X" l="FRE">
<s0>Perception haptique</s0>
<s4>CD</s4>
<s5>96</s5>
</fC03>
<fC03 i1="05" i2="X" l="ENG">
<s0>Haptic perception</s0>
<s4>CD</s4>
<s5>96</s5>
</fC03>
<fN21>
<s1>093</s1>
</fN21>
<fN44 i1="01">
<s1>OTO</s1>
</fN44>
<fN82>
<s1>OTO</s1>
</fN82>
</pA>
</standard>
<server>
<NO>PASCAL 12-0125456 INIST</NO>
<ET>Categorizing natural objects: a comparison of the visual and the haptic modalities</ET>
<AU>GAISSERT (Nina); WALLRAVEN (Christian)</AU>
<AF>Max Planck Institute for Biological Cybernetics/Tübingen/Allemagne (1 aut.); Korea University/Seoul/Corée, République de (2 aut.)</AF>
<DT>Publication en série; Niveau analytique</DT>
<SO>Experimental brain research; ISSN 0014-4819; Coden EXBRAP; Allemagne; Da. 2012; Vol. 216; No. 1; Pp. 123-134; Bibl. 3/4 p.</SO>
<LA>Anglais</LA>
<EA>Although the hands are the most important tool for humans to manipulate objects, only little is known about haptic processing of natural objects. Here, we selected a unique set of natural objects, namely seashells, which vary along a variety of object features, while others are shared across all stimuli. To correctly interact with objects, they have to be identified or categorized. For both processes, measuring similarities between objects is crucial. Our goal is to better understand the haptic similarity percept by comparing it to the visual similarity percept. First, direct similarity measures were analyzed using multidimensional scaling techniques to visualize the perceptual spaces of both modalities. We find that the visual and the haptic modality form almost identical perceptual spaces. Next, we performed three different categorization tasks. All tasks exhibit a highly accurate processing of complex shapes of the haptic modality. Moreover, we find that objects grouped into the same category form regions within the perceptual space. Hence, in both modalities, perceived similarity constitutes the basis for categorizing objects. Moreover, both modalities focus on shape to form categories. Taken together, our results lead to the assumption that the same cognitive processes link haptic and visual similarity perception and the resulting categorization behavior.</EA>
<CC>002A25I; 002B29C02</CC>
<FD>Main; Catégorisation; Perception visuelle; Homme; Perception haptique</FD>
<ED>Hand; Categorization; Visual perception; Human; Haptic perception</ED>
<SD>Mano; Categorización; Percepción visual; Hombre</SD>
<LO>INIST-12535.354000502856190130</LO>
<ID>12-0125456</ID>
</server>
</inist>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000360 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Corpus/biblio.hfd -nk 000360 | SxmlIndent | more

To put a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PascalFrancis
   |étape=   Corpus
   |type=    RBID
   |clé=     Pascal:12-0125456
   |texte=   Categorizing natural objects: a comparison of the visual and the haptic modalities
}}

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024