Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Perceptual equivalence between vision and touch is complexity dependent.

Internal identifier: 001233 (PubMed/Corpus); previous: 001232; next: 001234

Authors: F. Phillips; E. J. L. Egan; B. N. Perry

Source: Acta psychologica, 2009; 132(3): 259-66.

RBID: pubmed:19691944

English descriptors

Abstract

We experience the shape of objects in our world largely by way of our vision and touch but the availability and integration of information between the senses remains an open question. The research presented in this article examines the effect of stimulus complexity on visual, haptic and crossmodal discrimination. Using sculpted three-dimensional objects whose features vary systematically, we perform a series of three experiments to determine perceptual equivalence as a function of complexity. Two unimodal experiments--vision and touch-only, and one crossmodal experiment investigating the availability of information across the senses, were performed. We find that, for the class of stimuli used, subjects were able to visually discriminate them reliably across the entire range of complexity, while the experiments involving haptic information show a marked decrease in performance as the objects become more complex. Performance in the crossmodal condition appears to be constrained by the limits of the subjects' haptic representation, but the combination of the two sources of information is of some benefit over vision alone when comparing the simpler, low-frequency stimuli. This result shows that there is crossmodal transfer, and therefore perceptual equivalency, but that this transfer is limited by the object's complexity.

DOI: 10.1016/j.actpsy.2009.07.010
PubMed: 19691944

Links to Exploration step

pubmed:19691944

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Perceptual equivalence between vision and touch is complexity dependent.</title>
<author>
<name sortKey="Phillips, F" sort="Phillips, F" uniqKey="Phillips F" first="F" last="Phillips">F. Phillips</name>
<affiliation>
<nlm:affiliation>Psychology and Neuroscience, Skidmore College, Saratoga Springs, NY 12866, USA. flip@skidmore.edu</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Egan, E J L" sort="Egan, E J L" uniqKey="Egan E" first="E J L" last="Egan">E J L. Egan</name>
</author>
<author>
<name sortKey="Perry, B N" sort="Perry, B N" uniqKey="Perry B" first="B N" last="Perry">B N Perry</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="2009">2009</date>
<idno type="doi">10.1016/j.actpsy.2009.07.010</idno>
<idno type="RBID">pubmed:19691944</idno>
<idno type="pmid">19691944</idno>
<idno type="wicri:Area/PubMed/Corpus">001233</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Perceptual equivalence between vision and touch is complexity dependent.</title>
<author>
<name sortKey="Phillips, F" sort="Phillips, F" uniqKey="Phillips F" first="F" last="Phillips">F. Phillips</name>
<affiliation>
<nlm:affiliation>Psychology and Neuroscience, Skidmore College, Saratoga Springs, NY 12866, USA. flip@skidmore.edu</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Egan, E J L" sort="Egan, E J L" uniqKey="Egan E" first="E J L" last="Egan">E J L. Egan</name>
</author>
<author>
<name sortKey="Perry, B N" sort="Perry, B N" uniqKey="Perry B" first="B N" last="Perry">B N Perry</name>
</author>
</analytic>
<series>
<title level="j">Acta psychologica</title>
<idno type="eISSN">1873-6297</idno>
<imprint>
<date when="2009" type="published">2009</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Color Perception</term>
<term>Discrimination (Psychology)</term>
<term>Female</term>
<term>Form Perception</term>
<term>Humans</term>
<term>Male</term>
<term>Pattern Recognition, Visual</term>
<term>Touch Perception</term>
<term>Visual Perception</term>
<term>Young Adult</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Color Perception</term>
<term>Discrimination (Psychology)</term>
<term>Female</term>
<term>Form Perception</term>
<term>Humans</term>
<term>Male</term>
<term>Pattern Recognition, Visual</term>
<term>Touch Perception</term>
<term>Visual Perception</term>
<term>Young Adult</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">We experience the shape of objects in our world largely by way of our vision and touch but the availability and integration of information between the senses remains an open question. The research presented in this article examines the effect of stimulus complexity on visual, haptic and crossmodal discrimination. Using sculpted three-dimensional objects whose features vary systematically, we perform a series of three experiments to determine perceptual equivalence as a function of complexity. Two unimodal experiments--vision and touch-only, and one crossmodal experiment investigating the availability of information across the senses, were performed. We find that, for the class of stimuli used, subjects were able to visually discriminate them reliably across the entire range of complexity, while the experiments involving haptic information show a marked decrease in performance as the objects become more complex. Performance in the crossmodal condition appears to be constrained by the limits of the subjects' haptic representation, but the combination of the two sources of information is of some benefit over vision alone when comparing the simpler, low-frequency stimuli. This result shows that there is crossmodal transfer, and therefore perceptual equivalency, but that this transfer is limited by the object's complexity.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Owner="NLM" Status="MEDLINE">
<PMID Version="1">19691944</PMID>
<DateCreated>
<Year>2009</Year>
<Month>09</Month>
<Day>30</Day>
</DateCreated>
<DateCompleted>
<Year>2010</Year>
<Month>01</Month>
<Day>05</Day>
</DateCompleted>
<Article PubModel="Print-Electronic">
<Journal>
<ISSN IssnType="Electronic">1873-6297</ISSN>
<JournalIssue CitedMedium="Internet">
<Volume>132</Volume>
<Issue>3</Issue>
<PubDate>
<Year>2009</Year>
<Month>Nov</Month>
</PubDate>
</JournalIssue>
<Title>Acta psychologica</Title>
<ISOAbbreviation>Acta Psychol (Amst)</ISOAbbreviation>
</Journal>
<ArticleTitle>Perceptual equivalence between vision and touch is complexity dependent.</ArticleTitle>
<Pagination>
<MedlinePgn>259-66</MedlinePgn>
</Pagination>
<ELocationID EIdType="doi" ValidYN="Y">10.1016/j.actpsy.2009.07.010</ELocationID>
<Abstract>
<AbstractText>We experience the shape of objects in our world largely by way of our vision and touch but the availability and integration of information between the senses remains an open question. The research presented in this article examines the effect of stimulus complexity on visual, haptic and crossmodal discrimination. Using sculpted three-dimensional objects whose features vary systematically, we perform a series of three experiments to determine perceptual equivalence as a function of complexity. Two unimodal experiments--vision and touch-only, and one crossmodal experiment investigating the availability of information across the senses, were performed. We find that, for the class of stimuli used, subjects were able to visually discriminate them reliably across the entire range of complexity, while the experiments involving haptic information show a marked decrease in performance as the objects become more complex. Performance in the crossmodal condition appears to be constrained by the limits of the subjects' haptic representation, but the combination of the two sources of information is of some benefit over vision alone when comparing the simpler, low-frequency stimuli. This result shows that there is crossmodal transfer, and therefore perceptual equivalency, but that this transfer is limited by the object's complexity.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Phillips</LastName>
<ForeName>F</ForeName>
<Initials>F</Initials>
<AffiliationInfo>
<Affiliation>Psychology and Neuroscience, Skidmore College, Saratoga Springs, NY 12866, USA. flip@skidmore.edu</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Egan</LastName>
<ForeName>E J L</ForeName>
<Initials>EJ</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Perry</LastName>
<ForeName>B N</ForeName>
<Initials>BN</Initials>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
</PublicationTypeList>
<ArticleDate DateType="Electronic">
<Year>2009</Year>
<Month>08</Month>
<Day>18</Day>
</ArticleDate>
</Article>
<MedlineJournalInfo>
<Country>Netherlands</Country>
<MedlineTA>Acta Psychol (Amst)</MedlineTA>
<NlmUniqueID>0370366</NlmUniqueID>
<ISSNLinking>0001-6918</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>IM</CitationSubset>
<MeshHeadingList>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D003118">Color Perception</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D004192">Discrimination (Psychology)</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D005260">Female</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D005556">Form Perception</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D006801">Humans</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D008297">Male</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D010364">Pattern Recognition, Visual</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D055698">Touch Perception</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D014796">Visual Perception</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D055815">Young Adult</DescriptorName>
</MeshHeading>
</MeshHeadingList>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="received">
<Year>2009</Year>
<Month>2</Month>
<Day>25</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="revised">
<Year>2009</Year>
<Month>7</Month>
<Day>9</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="accepted">
<Year>2009</Year>
<Month>7</Month>
<Day>16</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="aheadofprint">
<Year>2009</Year>
<Month>8</Month>
<Day>18</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez">
<Year>2009</Year>
<Month>8</Month>
<Day>21</Day>
<Hour>9</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed">
<Year>2009</Year>
<Month>8</Month>
<Day>21</Day>
<Hour>9</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2010</Year>
<Month>1</Month>
<Day>6</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>ppublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="pii">S0001-6918(09)00099-7</ArticleId>
<ArticleId IdType="doi">10.1016/j.actpsy.2009.07.010</ArticleId>
<ArticleId IdType="pubmed">19691944</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PubMed/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 001233 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 001233 | SxmlIndent | more
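
For a quick check from the shell, the same pipeline can be narrowed with standard Unix filters. The line below is a minimal sketch, assuming grep is available and using the record key 001233 shown above; it keeps only the DOI line of the indented record:

HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 001233 | SxmlIndent | grep '<idno type="doi">'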

To link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PubMed
   |étape=   Corpus
   |type=    RBID
   |clé=     pubmed:19691944
   |texte=   Perceptual equivalence between vision and touch is complexity dependent.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i   -Sk "pubmed:19691944" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 
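
As a hypothetical shortcut (not produced by the site generator), the record could also be fetched directly by its internal key 001233 instead of going through the RBID index, assuming NlmPubMed2Wicri accepts the HfdSelect record stream just as in the pipeline above:

HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 001233 \
       | NlmPubMed2Wicri -a HapticV1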

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024