Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Crossmodal Association of Visual and Haptic Material Properties of Objects in the Monkey Ventral Visual Cortex.

Internal identifier: 000069 (PubMed/Corpus); previous: 000068; next: 000070

Authors: Naokazu Goda; Isao Yokoi; Atsumichi Tachibana; Takafumi Minamimoto; Hidehiko Komatsu

Source:

RBID: pubmed:26996504

Abstract

Just by looking at an object, we can recognize its non-visual properties, such as hardness. The visual recognition of non-visual object properties is generally accurate [1], and influences actions toward the object [2]. Recent studies suggest that, in the primate brain, this may involve the ventral visual cortex, which represents objects in a way that reflects not only visual but also non-visual object properties, such as haptic roughness, hardness, and weight [3-7]. This new insight raises a fundamental question: how does the visual cortex come to represent non-visual properties-knowledge that cannot be acquired directly through vision? Here we addressed this unresolved question using fMRI in macaque monkeys. Specifically, we explored whether and how simple visuo-haptic experience-just seeing and touching objects made of various materials-can shape representational content in the visual cortex. We measured brain activity evoked by viewing images of objects before and after the monkeys acquired the visuo-haptic experience and decoded the representational space from the activity patterns [8]. We show that simple long-term visuo-haptic experience greatly impacts representation in the posterior inferior temporal cortex, the higher ventral visual cortex. After the experience, but not before, the activity pattern in this region well reflected the haptic material properties of the experienced objects. Our results suggest that neural representation of non-visual object properties in the visual cortex emerges through long-term crossmodal exposure to objects. This highlights the importance of unsupervised learning of crossmodal associations through everyday experience [9-12] for shaping representation in the visual cortex.

DOI: 10.1016/j.cub.2016.02.003
PubMed: 26996504

Links to Exploration step

pubmed:26996504

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Crossmodal Association of Visual and Haptic Material Properties of Objects in the Monkey Ventral Visual Cortex.</title>
<author>
<name sortKey="Goda, Naokazu" sort="Goda, Naokazu" uniqKey="Goda N" first="Naokazu" last="Goda">Naokazu Goda</name>
<affiliation>
<nlm:affiliation>Division of Sensory and Cognitive Information, National Institute for Physiological Sciences, Okazaki 444-8585, Japan; Department of Physiological Sciences, SOKENDAI (The Graduate University for Advanced Studies), Okazaki 444-8585, Japan. Electronic address: ngoda@nips.ac.jp.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Yokoi, Isao" sort="Yokoi, Isao" uniqKey="Yokoi I" first="Isao" last="Yokoi">Isao Yokoi</name>
<affiliation>
<nlm:affiliation>Division of Sensory and Cognitive Information, National Institute for Physiological Sciences, Okazaki 444-8585, Japan; Department of Physiological Sciences, SOKENDAI (The Graduate University for Advanced Studies), Okazaki 444-8585, Japan.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Tachibana, Atsumichi" sort="Tachibana, Atsumichi" uniqKey="Tachibana A" first="Atsumichi" last="Tachibana">Atsumichi Tachibana</name>
<affiliation>
<nlm:affiliation>Department of Histology and Neurobiology, Dokkyo Medical University, Tochigi 321-0293, Japan.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Minamimoto, Takafumi" sort="Minamimoto, Takafumi" uniqKey="Minamimoto T" first="Takafumi" last="Minamimoto">Takafumi Minamimoto</name>
<affiliation>
<nlm:affiliation>Department of Molecular Neuroimaging, National Institute of Radiological Sciences, Chiba 263-8555, Japan.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Komatsu, Hidehiko" sort="Komatsu, Hidehiko" uniqKey="Komatsu H" first="Hidehiko" last="Komatsu">Hidehiko Komatsu</name>
<affiliation>
<nlm:affiliation>Division of Sensory and Cognitive Information, National Institute for Physiological Sciences, Okazaki 444-8585, Japan; Department of Physiological Sciences, SOKENDAI (The Graduate University for Advanced Studies), Okazaki 444-8585, Japan.</nlm:affiliation>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="2016">2016</date>
<idno type="doi">10.1016/j.cub.2016.02.003</idno>
<idno type="RBID">pubmed:26996504</idno>
<idno type="pmid">26996504</idno>
<idno type="wicri:Area/PubMed/Corpus">000069</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Crossmodal Association of Visual and Haptic Material Properties of Objects in the Monkey Ventral Visual Cortex.</title>
<author>
<name sortKey="Goda, Naokazu" sort="Goda, Naokazu" uniqKey="Goda N" first="Naokazu" last="Goda">Naokazu Goda</name>
<affiliation>
<nlm:affiliation>Division of Sensory and Cognitive Information, National Institute for Physiological Sciences, Okazaki 444-8585, Japan; Department of Physiological Sciences, SOKENDAI (The Graduate University for Advanced Studies), Okazaki 444-8585, Japan. Electronic address: ngoda@nips.ac.jp.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Yokoi, Isao" sort="Yokoi, Isao" uniqKey="Yokoi I" first="Isao" last="Yokoi">Isao Yokoi</name>
<affiliation>
<nlm:affiliation>Division of Sensory and Cognitive Information, National Institute for Physiological Sciences, Okazaki 444-8585, Japan; Department of Physiological Sciences, SOKENDAI (The Graduate University for Advanced Studies), Okazaki 444-8585, Japan.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Tachibana, Atsumichi" sort="Tachibana, Atsumichi" uniqKey="Tachibana A" first="Atsumichi" last="Tachibana">Atsumichi Tachibana</name>
<affiliation>
<nlm:affiliation>Department of Histology and Neurobiology, Dokkyo Medical University, Tochigi 321-0293, Japan.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Minamimoto, Takafumi" sort="Minamimoto, Takafumi" uniqKey="Minamimoto T" first="Takafumi" last="Minamimoto">Takafumi Minamimoto</name>
<affiliation>
<nlm:affiliation>Department of Molecular Neuroimaging, National Institute of Radiological Sciences, Chiba 263-8555, Japan.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Komatsu, Hidehiko" sort="Komatsu, Hidehiko" uniqKey="Komatsu H" first="Hidehiko" last="Komatsu">Hidehiko Komatsu</name>
<affiliation>
<nlm:affiliation>Division of Sensory and Cognitive Information, National Institute for Physiological Sciences, Okazaki 444-8585, Japan; Department of Physiological Sciences, SOKENDAI (The Graduate University for Advanced Studies), Okazaki 444-8585, Japan.</nlm:affiliation>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Current biology : CB</title>
<idno type="eISSN">1879-0445</idno>
<imprint>
<date when="2016" type="published">2016</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">Just by looking at an object, we can recognize its non-visual properties, such as hardness. The visual recognition of non-visual object properties is generally accurate [1], and influences actions toward the object [2]. Recent studies suggest that, in the primate brain, this may involve the ventral visual cortex, which represents objects in a way that reflects not only visual but also non-visual object properties, such as haptic roughness, hardness, and weight [3-7]. This new insight raises a fundamental question: how does the visual cortex come to represent non-visual properties-knowledge that cannot be acquired directly through vision? Here we addressed this unresolved question using fMRI in macaque monkeys. Specifically, we explored whether and how simple visuo-haptic experience-just seeing and touching objects made of various materials-can shape representational content in the visual cortex. We measured brain activity evoked by viewing images of objects before and after the monkeys acquired the visuo-haptic experience and decoded the representational space from the activity patterns [8]. We show that simple long-term visuo-haptic experience greatly impacts representation in the posterior inferior temporal cortex, the higher ventral visual cortex. After the experience, but not before, the activity pattern in this region well reflected the haptic material properties of the experienced objects. Our results suggest that neural representation of non-visual object properties in the visual cortex emerges through long-term crossmodal exposure to objects. This highlights the importance of unsupervised learning of crossmodal associations through everyday experience [9-12] for shaping representation in the visual cortex.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Owner="NLM" Status="In-Data-Review">
<PMID Version="1">26996504</PMID>
<DateCreated>
<Year>2016</Year>
<Month>04</Month>
<Day>06</Day>
</DateCreated>
<Article PubModel="Print-Electronic">
<Journal>
<ISSN IssnType="Electronic">1879-0445</ISSN>
<JournalIssue CitedMedium="Internet">
<Volume>26</Volume>
<Issue>7</Issue>
<PubDate>
<Year>2016</Year>
<Month>Apr</Month>
<Day>4</Day>
</PubDate>
</JournalIssue>
<Title>Current biology : CB</Title>
<ISOAbbreviation>Curr. Biol.</ISOAbbreviation>
</Journal>
<ArticleTitle>Crossmodal Association of Visual and Haptic Material Properties of Objects in the Monkey Ventral Visual Cortex.</ArticleTitle>
<Pagination>
<MedlinePgn>928-34</MedlinePgn>
</Pagination>
<ELocationID EIdType="doi" ValidYN="Y">10.1016/j.cub.2016.02.003</ELocationID>
<ELocationID EIdType="pii" ValidYN="Y">S0960-9822(16)30022-7</ELocationID>
<Abstract>
<AbstractText>Just by looking at an object, we can recognize its non-visual properties, such as hardness. The visual recognition of non-visual object properties is generally accurate [1], and influences actions toward the object [2]. Recent studies suggest that, in the primate brain, this may involve the ventral visual cortex, which represents objects in a way that reflects not only visual but also non-visual object properties, such as haptic roughness, hardness, and weight [3-7]. This new insight raises a fundamental question: how does the visual cortex come to represent non-visual properties-knowledge that cannot be acquired directly through vision? Here we addressed this unresolved question using fMRI in macaque monkeys. Specifically, we explored whether and how simple visuo-haptic experience-just seeing and touching objects made of various materials-can shape representational content in the visual cortex. We measured brain activity evoked by viewing images of objects before and after the monkeys acquired the visuo-haptic experience and decoded the representational space from the activity patterns [8]. We show that simple long-term visuo-haptic experience greatly impacts representation in the posterior inferior temporal cortex, the higher ventral visual cortex. After the experience, but not before, the activity pattern in this region well reflected the haptic material properties of the experienced objects. Our results suggest that neural representation of non-visual object properties in the visual cortex emerges through long-term crossmodal exposure to objects. This highlights the importance of unsupervised learning of crossmodal associations through everyday experience [9-12] for shaping representation in the visual cortex.</AbstractText>
<CopyrightInformation>Copyright © 2016 Elsevier Ltd. All rights reserved.</CopyrightInformation>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Goda</LastName>
<ForeName>Naokazu</ForeName>
<Initials>N</Initials>
<AffiliationInfo>
<Affiliation>Division of Sensory and Cognitive Information, National Institute for Physiological Sciences, Okazaki 444-8585, Japan; Department of Physiological Sciences, SOKENDAI (The Graduate University for Advanced Studies), Okazaki 444-8585, Japan. Electronic address: ngoda@nips.ac.jp.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Yokoi</LastName>
<ForeName>Isao</ForeName>
<Initials>I</Initials>
<AffiliationInfo>
<Affiliation>Division of Sensory and Cognitive Information, National Institute for Physiological Sciences, Okazaki 444-8585, Japan; Department of Physiological Sciences, SOKENDAI (The Graduate University for Advanced Studies), Okazaki 444-8585, Japan.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Tachibana</LastName>
<ForeName>Atsumichi</ForeName>
<Initials>A</Initials>
<AffiliationInfo>
<Affiliation>Department of Histology and Neurobiology, Dokkyo Medical University, Tochigi 321-0293, Japan.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Minamimoto</LastName>
<ForeName>Takafumi</ForeName>
<Initials>T</Initials>
<AffiliationInfo>
<Affiliation>Department of Molecular Neuroimaging, National Institute of Radiological Sciences, Chiba 263-8555, Japan.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Komatsu</LastName>
<ForeName>Hidehiko</ForeName>
<Initials>H</Initials>
<AffiliationInfo>
<Affiliation>Division of Sensory and Cognitive Information, National Institute for Physiological Sciences, Okazaki 444-8585, Japan; Department of Physiological Sciences, SOKENDAI (The Graduate University for Advanced Studies), Okazaki 444-8585, Japan.</Affiliation>
</AffiliationInfo>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
</PublicationTypeList>
<ArticleDate DateType="Electronic">
<Year>2016</Year>
<Month>03</Month>
<Day>17</Day>
</ArticleDate>
</Article>
<MedlineJournalInfo>
<Country>England</Country>
<MedlineTA>Curr Biol</MedlineTA>
<NlmUniqueID>9107782</NlmUniqueID>
<ISSNLinking>0960-9822</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>IM</CitationSubset>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="received">
<Year>2015</Year>
<Month>10</Month>
<Day>30</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="revised">
<Year>2015</Year>
<Month>12</Month>
<Day>15</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="accepted">
<Year>2016</Year>
<Month>2</Month>
<Day>1</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="aheadofprint">
<Year>2016</Year>
<Month>3</Month>
<Day>17</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez">
<Year>2016</Year>
<Month>3</Month>
<Day>22</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed">
<Year>2016</Year>
<Month>3</Month>
<Day>22</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2016</Year>
<Month>3</Month>
<Day>22</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>ppublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="pii">S0960-9822(16)30022-7</ArticleId>
<ArticleId IdType="doi">10.1016/j.cub.2016.02.003</ArticleId>
<ArticleId IdType="pubmed">26996504</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PubMed/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000069 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 000069 | SxmlIndent | more
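
A minimal sketch of a follow-on step, assuming EXPLOR_AREA is the area root from which EXPLOR_STEP above is derived (that assignment is not shown on this page) and using a standard sed extraction that is illustrative, not part of the Dilib toolkit:

EXPLOR_AREA=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1          # assumed area root
HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 000069 \
       | SxmlIndent > record_000069.xml                     # save the indented XML record
sed -n 's/.*<idno type="doi">\([^<]*\)<\/idno>.*/\1/p' record_000069.xml   # print the DOI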

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PubMed
   |étape=   Corpus
   |type=    RBID
   |clé=     pubmed:26996504
   |texte=   Crossmodal Association of Visual and Haptic Material Properties of Objects in the Monkey Ventral Visual Cortex.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i   -Sk "pubmed:26996504" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 
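
The same pipeline can be reused record by record. A hedged sketch of a batch run (the loop and its single-entry RBID list are illustrative; only the commands already shown above are assumed to exist):

# Generate one wiki page per RBID; extend the list as needed.
for rbid in "pubmed:26996504"; do
       HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i -Sk "$rbid" \
              | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd \
              | NlmPubMed2Wicri -a HapticV1
done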

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024