Exploration server for haptic devices


Touch influences perceived gloss

Internal identifier: 000650 (Pmc/Curation); previous: 000649; next: 000651


Authors: Wendy J. Adams; Iona S. Kerrigan; Erich W. Graf

Source:

RBID: PMC:4768155

Abstract

Identifying an object’s material properties supports recognition and action planning: we grasp objects according to how heavy, hard or slippery we expect them to be. Visual cues to material qualities such as gloss have recently received attention, but how they interact with haptic (touch) information has been largely overlooked. Here, we show that touch modulates gloss perception: objects that feel slippery are perceived as glossier (more shiny). Participants explored virtual objects that varied in look and feel. A discrimination paradigm (Experiment 1) revealed that observers integrate visual gloss with haptic information. Observers could easily detect an increase in glossiness when it was paired with a decrease in friction. In contrast, increased glossiness coupled with decreased slipperiness produced a small perceptual change: the visual and haptic changes counteracted each other. Subjective ratings (Experiment 2) reflected a similar interaction – slippery objects were rated as glossier and vice versa. The sensory system treats visual gloss and haptic friction as correlated cues to surface material. Although friction is not a perfect predictor of gloss, the visual system appears to know and use a probabilistic relationship between these variables to bias perception – a sensible strategy given the ambiguity of visual cues to gloss.


URL:
DOI: 10.1038/srep21866
PubMed: 26915492
PubMed Central: 4768155

Links to previous steps (curation, corpus...)


Links to Exploration step

PMC:4768155

Curation

No country items

Wendy J. Adams
<affiliation>
<nlm:aff id="a1">
<institution>Psychology, University of Southampton</institution>
, Southampton, SO17 1BJ,
<country>ENGLAND</country>
</nlm:aff>
<wicri:noCountry code="nlm country">ENGLAND</wicri:noCountry>
</affiliation>
Iona S. Kerrigan
<affiliation>
<nlm:aff id="a1">
<institution>Psychology, University of Southampton</institution>
, Southampton, SO17 1BJ,
<country>ENGLAND</country>
</nlm:aff>
<wicri:noCountry code="nlm country">ENGLAND</wicri:noCountry>
</affiliation>
Erich W. Graf
<affiliation>
<nlm:aff id="a1">
<institution>Psychology, University of Southampton</institution>
, Southampton, SO17 1BJ,
<country>ENGLAND</country>
</nlm:aff>
<wicri:noCountry code="nlm country">ENGLAND</wicri:noCountry>
</affiliation>

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Touch influences perceived gloss</title>
<author>
<name sortKey="Adams, Wendy J" sort="Adams, Wendy J" uniqKey="Adams W" first="Wendy J." last="Adams">Wendy J. Adams</name>
<affiliation>
<nlm:aff id="a1">
<institution>Psychology, University of Southampton</institution>
, Southampton, SO17 1BJ,
<country>ENGLAND</country>
</nlm:aff>
<wicri:noCountry code="nlm country">ENGLAND</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Kerrigan, Iona S" sort="Kerrigan, Iona S" uniqKey="Kerrigan I" first="Iona S." last="Kerrigan">Iona S. Kerrigan</name>
<affiliation>
<nlm:aff id="a1">
<institution>Psychology, University of Southampton</institution>
, Southampton, SO17 1BJ,
<country>ENGLAND</country>
</nlm:aff>
<wicri:noCountry code="nlm country">ENGLAND</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Graf, Erich W" sort="Graf, Erich W" uniqKey="Graf E" first="Erich W." last="Graf">Erich W. Graf</name>
<affiliation>
<nlm:aff id="a1">
<institution>Psychology, University of Southampton</institution>
, Southampton, SO17 1BJ,
<country>ENGLAND</country>
</nlm:aff>
<wicri:noCountry code="nlm country">ENGLAND</wicri:noCountry>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">26915492</idno>
<idno type="pmc">4768155</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4768155</idno>
<idno type="RBID">PMC:4768155</idno>
<idno type="doi">10.1038/srep21866</idno>
<date when="2016">2016</date>
<idno type="wicri:Area/Pmc/Corpus">000650</idno>
<idno type="wicri:Area/Pmc/Curation">000650</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Touch influences perceived gloss</title>
<author>
<name sortKey="Adams, Wendy J" sort="Adams, Wendy J" uniqKey="Adams W" first="Wendy J." last="Adams">Wendy J. Adams</name>
<affiliation>
<nlm:aff id="a1">
<institution>Psychology, University of Southampton</institution>
, Southampton, SO17 1BJ,
<country>ENGLAND</country>
</nlm:aff>
<wicri:noCountry code="nlm country">ENGLAND</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Kerrigan, Iona S" sort="Kerrigan, Iona S" uniqKey="Kerrigan I" first="Iona S." last="Kerrigan">Iona S. Kerrigan</name>
<affiliation>
<nlm:aff id="a1">
<institution>Psychology, University of Southampton</institution>
, Southampton, SO17 1BJ,
<country>ENGLAND</country>
</nlm:aff>
<wicri:noCountry code="nlm country">ENGLAND</wicri:noCountry>
</affiliation>
</author>
<author>
<name sortKey="Graf, Erich W" sort="Graf, Erich W" uniqKey="Graf E" first="Erich W." last="Graf">Erich W. Graf</name>
<affiliation>
<nlm:aff id="a1">
<institution>Psychology, University of Southampton</institution>
, Southampton, SO17 1BJ,
<country>ENGLAND</country>
</nlm:aff>
<wicri:noCountry code="nlm country">ENGLAND</wicri:noCountry>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Scientific Reports</title>
<idno type="eISSN">2045-2322</idno>
<imprint>
<date when="2016">2016</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>Identifying an object’s material properties supports recognition and action planning: we grasp objects according to how heavy, hard or slippery we expect them to be. Visual cues to material qualities such as gloss have recently received attention, but how they interact with haptic (touch) information has been largely overlooked. Here, we show that touch modulates gloss perception: objects that feel slippery are perceived as glossier (more shiny). Participants explored virtual objects that varied in look and feel. A discrimination paradigm (Experiment 1) revealed that observers integrate visual gloss with haptic information. Observers could easily detect an increase in glossiness when it was paired with a decrease in friction. In contrast, increased glossiness coupled with decreased slipperiness produced a small perceptual change: the visual and haptic changes counteracted each other. Subjective ratings (Experiment 2) reflected a similar interaction – slippery objects were rated as glossier and vice versa. The sensory system treats visual gloss and haptic friction as correlated cues to surface material. Although friction is not a perfect predictor of gloss, the visual system appears to know and use a probabilistic relationship between these variables to bias perception – a sensible strategy given the ambiguity of visual cues to gloss.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Wiebel, C B" uniqKey="Wiebel C">C. B. Wiebel</name>
</author>
<author>
<name sortKey="Valsecchi, M" uniqKey="Valsecchi M">M. Valsecchi</name>
</author>
<author>
<name sortKey="Gegenfurtner, K R" uniqKey="Gegenfurtner K">K. R. Gegenfurtner</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Chadwick, A C" uniqKey="Chadwick A">A. C. Chadwick</name>
</author>
<author>
<name sortKey="Kentridge, R W" uniqKey="Kentridge R">R. W. Kentridge</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Beck, J" uniqKey="Beck J">J. Beck</name>
</author>
<author>
<name sortKey="Prazdny, S" uniqKey="Prazdny S">S. Prazdny</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Blake, A" uniqKey="Blake A">A. Blake</name>
</author>
<author>
<name sortKey="Bulthoff, H" uniqKey="Bulthoff H">H. Bülthoff</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Todd, J" uniqKey="Todd J">J. Todd</name>
</author>
<author>
<name sortKey="Norman, J" uniqKey="Norman J">J. Norman</name>
</author>
<author>
<name sortKey="Mingolla, E" uniqKey="Mingolla E">E. Mingolla</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pellacini, F" uniqKey="Pellacini F">F. Pellacini</name>
</author>
<author>
<name sortKey="Ferwerda, J A" uniqKey="Ferwerda J">J. A. Ferwerda</name>
</author>
<author>
<name sortKey="Greenberg, D P" uniqKey="Greenberg D">D. P. Greenberg</name>
</author>
<author>
<name sortKey="Acm, A C M" uniqKey="Acm A">A. C. M. Acm</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fleming, R W" uniqKey="Fleming R">R. W. Fleming</name>
</author>
<author>
<name sortKey="Dror, R O" uniqKey="Dror R">R. O. Dror</name>
</author>
<author>
<name sortKey="Adelson, E H" uniqKey="Adelson E">E. H. Adelson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kerrigan, I S" uniqKey="Kerrigan I">I. S. Kerrigan</name>
</author>
<author>
<name sortKey="Adams, W J" uniqKey="Adams W">W. J. Adams</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Okamoto, S" uniqKey="Okamoto S">S. Okamoto</name>
</author>
<author>
<name sortKey="Nagano, H" uniqKey="Nagano H">H. Nagano</name>
</author>
<author>
<name sortKey="Yamada, Y" uniqKey="Yamada Y">Y. Yamada</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Klatzky, R L" uniqKey="Klatzky R">R. L. Klatzky</name>
</author>
<author>
<name sortKey="Pawluk, D" uniqKey="Pawluk D">D. Pawluk</name>
</author>
<author>
<name sortKey="Peer, A" uniqKey="Peer A">A. Peer</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bergmann Tiest, W M" uniqKey="Bergmann Tiest W">W. M. Bergmann Tiest</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lederman, S J" uniqKey="Lederman S">S. J. Lederman</name>
</author>
<author>
<name sortKey="Abbott, S G" uniqKey="Abbott S">S. G. Abbott</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lederman, S J" uniqKey="Lederman S">S. J. Lederman</name>
</author>
<author>
<name sortKey="Thorne, G" uniqKey="Thorne G">G. Thorne</name>
</author>
<author>
<name sortKey="Jones, B" uniqKey="Jones B">B. Jones</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Klatzky, R L" uniqKey="Klatzky R">R. L. Klatzky</name>
</author>
<author>
<name sortKey="Lederman, S" uniqKey="Lederman S">S. Lederman</name>
</author>
<author>
<name sortKey="Reed, C" uniqKey="Reed C">C. Reed</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jones, B" uniqKey="Jones B">B. Jones</name>
</author>
<author>
<name sortKey="Oneil, S" uniqKey="Oneil S">S. Oneil</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Heller, M A" uniqKey="Heller M">M. A. Heller</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Guest, S" uniqKey="Guest S">S. Guest</name>
</author>
<author>
<name sortKey="Spence, C" uniqKey="Spence C">C. Spence</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bergmann Tiest, W M" uniqKey="Bergmann Tiest W">W. M. Bergmann Tiest</name>
</author>
<author>
<name sortKey="Kappers, A M L" uniqKey="Kappers A">A. M. L. Kappers</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Varadharajan, V" uniqKey="Varadharajan V">V. Varadharajan</name>
</author>
<author>
<name sortKey="Klatzky, R" uniqKey="Klatzky R">R. Klatzky</name>
</author>
<author>
<name sortKey="Unger, B" uniqKey="Unger B">B. Unger</name>
</author>
<author>
<name sortKey="Swendsen, R" uniqKey="Swendsen R">R. Swendsen</name>
</author>
<author>
<name sortKey="Hollis, R" uniqKey="Hollis R">R. Hollis</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kuschel, M" uniqKey="Kuschel M">M. Kuschel</name>
</author>
<author>
<name sortKey="Di Luca, M" uniqKey="Di Luca M">M. Di Luca</name>
</author>
<author>
<name sortKey="Buss, M" uniqKey="Buss M">M. Buss</name>
</author>
<author>
<name sortKey="Klatzky, R L" uniqKey="Klatzky R">R. L. Klatzky</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Drewing, K" uniqKey="Drewing K">K. Drewing</name>
</author>
<author>
<name sortKey="Ramisch, A" uniqKey="Ramisch A">A. Ramisch</name>
</author>
<author>
<name sortKey="Bayer, F" uniqKey="Bayer F">F. Bayer</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cellini, C" uniqKey="Cellini C">C. Cellini</name>
</author>
<author>
<name sortKey="Kaim, L" uniqKey="Kaim L">L. Kaim</name>
</author>
<author>
<name sortKey="Drewing, K" uniqKey="Drewing K">K. Drewing</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Di Luca, M" uniqKey="Di Luca M">M. Di Luca</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Di Luca, M" uniqKey="Di Luca M">M. Di Luca</name>
</author>
<author>
<name sortKey="Knoerlein, B" uniqKey="Knoerlein B">B. Knoerlein</name>
</author>
<author>
<name sortKey="Ernst, M O" uniqKey="Ernst M">M. O. Ernst</name>
</author>
<author>
<name sortKey="Harders, M" uniqKey="Harders M">M. Harders</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ernst, M O" uniqKey="Ernst M">M. O. Ernst</name>
</author>
<author>
<name sortKey="Banks, M S" uniqKey="Banks M">M. S. Banks</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ernst, M O" uniqKey="Ernst M">M. O. Ernst</name>
</author>
<author>
<name sortKey="Banks, M S" uniqKey="Banks M">M. S. Banks</name>
</author>
<author>
<name sortKey="Bulthoff, H H" uniqKey="Bulthoff H">H. H. Bulthoff</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Harshfield, S P" uniqKey="Harshfield S">S. P. Harshfield</name>
</author>
<author>
<name sortKey="Dehardt, D C" uniqKey="Dehardt D">D. C. Dehardt</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Buckingham, G" uniqKey="Buckingham G">G. Buckingham</name>
</author>
<author>
<name sortKey="Cant, J S" uniqKey="Cant J">J. S. Cant</name>
</author>
<author>
<name sortKey="Goodale, M A" uniqKey="Goodale M">M. A. Goodale</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Charpentier, A" uniqKey="Charpentier A">A. Charpentier</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Murray, D J" uniqKey="Murray D">D. J. Murray</name>
</author>
<author>
<name sortKey="Ellis, R R" uniqKey="Ellis R">R. R. Ellis</name>
</author>
<author>
<name sortKey="Bandomir, C A" uniqKey="Bandomir C">C. A. Bandomir</name>
</author>
<author>
<name sortKey="Ross, H E" uniqKey="Ross H">H. E. Ross</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ernst, M O" uniqKey="Ernst M">M. O. Ernst</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ernst, M O" uniqKey="Ernst M">M. O. Ernst</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ernst, M O" uniqKey="Ernst M">M. O. Ernst</name>
</author>
<author>
<name sortKey="Di Luca, M" uniqKey="Di Luca M">M. di Luca</name>
</author>
<author>
<name sortKey="Trommershauser, J" uniqKey="Trommershauser J">J. Trommershauser</name>
</author>
<author>
<name sortKey="Kording, K" uniqKey="Kording K">K. Kording</name>
</author>
<author>
<name sortKey="Landy, M S" uniqKey="Landy M">M. S. Landy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wismeijer, D A" uniqKey="Wismeijer D">D. A. Wismeijer</name>
</author>
<author>
<name sortKey="Gegenfurtner, K R" uniqKey="Gegenfurtner K">K. R. Gegenfurtner</name>
</author>
<author>
<name sortKey="Drewing, K" uniqKey="Drewing K">K. Drewing</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Adams, W" uniqKey="Adams W">W. Adams</name>
</author>
<author>
<name sortKey="Graf, E" uniqKey="Graf E">E. Graf</name>
</author>
<author>
<name sortKey="Ernst, M" uniqKey="Ernst M">M. Ernst</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kerrigan, I S" uniqKey="Kerrigan I">I. S. Kerrigan</name>
</author>
<author>
<name sortKey="Adams, W J" uniqKey="Adams W">W. J. Adams</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Adams, W" uniqKey="Adams W">W. Adams</name>
</author>
<author>
<name sortKey="Kerrigan, I" uniqKey="Kerrigan I">I. Kerrigan</name>
</author>
<author>
<name sortKey="Graf, E" uniqKey="Graf E">E. Graf</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Adelson, E" uniqKey="Adelson E">E. Adelson</name>
</author>
<author>
<name sortKey="Rogowitz, B" uniqKey="Rogowitz B">B. Rogowitz</name>
</author>
<author>
<name sortKey="Pappas, T" uniqKey="Pappas T">T. Pappas</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Doerschner, K" uniqKey="Doerschner K">K. Doerschner</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fleming, R" uniqKey="Fleming R">R. Fleming</name>
</author>
<author>
<name sortKey="Torralba, A" uniqKey="Torralba A">A. Torralba</name>
</author>
<author>
<name sortKey="Adelson, E" uniqKey="Adelson E">E. Adelson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Anderson, B" uniqKey="Anderson B">B. Anderson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Adams, W J" uniqKey="Adams W">W. J. Adams</name>
</author>
<author>
<name sortKey="Elder, J H" uniqKey="Elder J">J. H. Elder</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Krim, J" uniqKey="Krim J">J. Krim</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wilson, S" uniqKey="Wilson S">S. Wilson</name>
</author>
<author>
<name sortKey="Hutley, M" uniqKey="Hutley M">M. Hutley</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Maia, R" uniqKey="Maia R">R. Maia</name>
</author>
<author>
<name sortKey="D Lba, L" uniqKey="D Lba L">L. D’Alba</name>
</author>
<author>
<name sortKey="Shawkey, M" uniqKey="Shawkey M">M. Shawkey</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Joh, A" uniqKey="Joh A">A. Joh</name>
</author>
<author>
<name sortKey="Adolph, K" uniqKey="Adolph K">K. Adolph</name>
</author>
<author>
<name sortKey="Campbell, M" uniqKey="Campbell M">M. Campbell</name>
</author>
<author>
<name sortKey="Eppler, M" uniqKey="Eppler M">M. Eppler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lesch, M" uniqKey="Lesch M">M. Lesch</name>
</author>
<author>
<name sortKey="Chang, W" uniqKey="Chang W">W. Chang</name>
</author>
<author>
<name sortKey="Chang, C" uniqKey="Chang C">C. Chang</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Perlin, K" uniqKey="Perlin K">K. Perlin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Debevec, P" uniqKey="Debevec P">P. Debevec</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">Sci Rep</journal-id>
<journal-id journal-id-type="iso-abbrev">Sci Rep</journal-id>
<journal-title-group>
<journal-title>Scientific Reports</journal-title>
</journal-title-group>
<issn pub-type="epub">2045-2322</issn>
<publisher>
<publisher-name>Nature Publishing Group</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">26915492</article-id>
<article-id pub-id-type="pmc">4768155</article-id>
<article-id pub-id-type="pii">srep21866</article-id>
<article-id pub-id-type="doi">10.1038/srep21866</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Touch influences perceived gloss</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Adams</surname>
<given-names>Wendy J.</given-names>
</name>
<xref ref-type="corresp" rid="c1">a</xref>
<xref ref-type="aff" rid="a1">1</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Kerrigan</surname>
<given-names>Iona S.</given-names>
</name>
<xref ref-type="aff" rid="a1">1</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Graf</surname>
<given-names>Erich W.</given-names>
</name>
<xref ref-type="aff" rid="a1">1</xref>
</contrib>
<aff id="a1">
<label>1</label>
<institution>Psychology, University of Southampton</institution>
, Southampton, SO17 1BJ,
<country>ENGLAND</country>
</aff>
</contrib-group>
<author-notes>
<corresp id="c1">
<label>a</label>
<email>wendya@soton.ac.uk</email>
</corresp>
</author-notes>
<pub-date pub-type="epub">
<day>26</day>
<month>02</month>
<year>2016</year>
</pub-date>
<pub-date pub-type="collection">
<year>2016</year>
</pub-date>
<volume>6</volume>
<elocation-id>21866</elocation-id>
<history>
<date date-type="received">
<day>09</day>
<month>10</month>
<year>2015</year>
</date>
<date date-type="accepted">
<day>02</day>
<month>02</month>
<year>2016</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright © 2016, Macmillan Publishers Limited</copyright-statement>
<copyright-year>2016</copyright-year>
<copyright-holder>Macmillan Publishers Limited</copyright-holder>
<license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by/4.0/">
<pmc-comment>author-paid</pmc-comment>
<license-p>This work is licensed under a Creative Commons Attribution 4.0 International License. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in the credit line; if the material is not included under the Creative Commons license, users will need to obtain permission from the license holder to reproduce the material. To view a copy of this license, visit
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by/4.0/">http://creativecommons.org/licenses/by/4.0/</ext-link>
</license-p>
</license>
</permissions>
<abstract>
<p>Identifying an object’s material properties supports recognition and action planning: we grasp objects according to how heavy, hard or slippery we expect them to be. Visual cues to material qualities such as gloss have recently received attention, but how they interact with haptic (touch) information has been largely overlooked. Here, we show that touch modulates gloss perception: objects that feel slippery are perceived as glossier (more shiny). Participants explored virtual objects that varied in look and feel. A discrimination paradigm (Experiment 1) revealed that observers integrate visual gloss with haptic information. Observers could easily detect an increase in glossiness when it was paired with a decrease in friction. In contrast, increased glossiness coupled with decreased slipperiness produced a small perceptual change: the visual and haptic changes counteracted each other. Subjective ratings (Experiment 2) reflected a similar interaction – slippery objects were rated as glossier and vice versa. The sensory system treats visual gloss and haptic friction as correlated cues to surface material. Although friction is not a perfect predictor of gloss, the visual system appears to know and use a probabilistic relationship between these variables to bias perception – a sensible strategy given the ambiguity of visual cues to gloss.</p>
</abstract>
</article-meta>
</front>
<body>
<p>Humans are proficient at distinguishing different object materials, e.g. metal, glass and plastic, based upon their visual appearance
<xref ref-type="bibr" rid="b1">1</xref>
, an important skill for object recognition and for guiding interaction with the environment. Whilst material perception appears effortless, the underlying computations are complex and under-constrained: an object’s image is determined not only by how its surface reflects light, but also by the object’s shape and the structure of the illumination field. In recent years, one aspect of material perception – visual gloss – has been widely studied
<xref ref-type="bibr" rid="b2">2</xref>
. Glossy and matte surfaces can be differentiated using specular highlights: bright image patches that occur when light is reflected from a surface regularly, in a mirror-like way, rather than scattered diffusely
<xref ref-type="bibr" rid="b3">3</xref>
<xref ref-type="bibr" rid="b4">4</xref>
<xref ref-type="bibr" rid="b5">5</xref>
<xref ref-type="bibr" rid="b6">6</xref>
<xref ref-type="bibr" rid="b7">7</xref>
<xref ref-type="bibr" rid="b8">8</xref>
.</p>
<p>Glossiness is conceptualised as a visual property. In contrast, other material dimensions have been studied within the haptic (touch) domain, such as surface roughness, thermal conductivity, compliance (or softness, i.e. deformation in response to force) and slipperiness (related to the coefficients of friction)
<xref ref-type="bibr" rid="b9">9</xref>
<xref ref-type="bibr" rid="b10">10</xref>
<xref ref-type="bibr" rid="b11">11</xref>
. However, little is known about how visual and haptic information interact when we estimate material properties, with existing research almost entirely constrained to the perception of roughness
<xref ref-type="bibr" rid="b12">12</xref>
<xref ref-type="bibr" rid="b13">13</xref>
<xref ref-type="bibr" rid="b14">14</xref>
<xref ref-type="bibr" rid="b15">15</xref>
<xref ref-type="bibr" rid="b16">16</xref>
<xref ref-type="bibr" rid="b17">17</xref>
<xref ref-type="bibr" rid="b18">18</xref>
with a few studies on the visual-haptic cues to compliance
<xref ref-type="bibr" rid="b19">19</xref>
<xref ref-type="bibr" rid="b20">20</xref>
<xref ref-type="bibr" rid="b21">21</xref>
<xref ref-type="bibr" rid="b22">22</xref>
<xref ref-type="bibr" rid="b23">23</xref>
<xref ref-type="bibr" rid="b24">24</xref>
. Here we ask whether the feel of an object also affects our perception of its glossiness.</p>
<p>A substantial body of work has demonstrated the perceptual benefit of combining vision and haptic information when humans estimate geometric attributes such as surface slant or object size. Visual and haptic information is integrated optimally: judgments are more precise (less variable) for visual-haptic stimuli than when based on either vision or haptics alone
<xref ref-type="bibr" rid="b25">25</xref>
<xref ref-type="bibr" rid="b26">26</xref>
. However, this contrasts with multi-sensory findings in material perception: although some visual-haptic averaging has been found in surface roughness perception, studies have not found improvements in precision when both modalities provide information, relative to a single modality
<xref ref-type="bibr" rid="b12">12</xref>
<xref ref-type="bibr" rid="b15">15</xref>
. Moreover, robust contrast (repulsion) effects – the opposite of integration – have been found within material perception. For example, in the material-weight illusion, when two size-matched objects appear to be made of different materials (e.g. balsa wood vs. metal), the denser-looking object feels lighter, when both are lifted and have the same mass
<xref ref-type="bibr" rid="b27">27</xref>
<xref ref-type="bibr" rid="b28">28</xref>
. Similarly, in the size-weight illusion, when two objects appear to be constructed of the same material, but differ in size, the larger object feels lighter when both are lifted and have the same mass
<xref ref-type="bibr" rid="b29">29</xref>
<xref ref-type="bibr" rid="b30">30</xref>
. Standard models of sensory integration predict the opposite perceptual effects
<xref ref-type="bibr" rid="b31">31</xref>
.</p>
<p>Here we describe two experiments that investigate whether humans integrate visual gloss information with haptic ‘rubberiness’ information when judging material properties. In our experiments, ‘glossier’ objects have higher specular reflectance, and less specular scatter than less glossy objects (see
<xref ref-type="table" rid="t1">Table 1</xref>
). We define ‘rubbery’ objects (e.g. squash balls) as those with high friction and compliance. In contrast, non-rubbery objects (e.g. a snooker ball) have low friction and high stiffness (low compliance), and can be described as feeling slippery and hard (see
<xref ref-type="table" rid="t1">Table 1</xref>
). Our experimental set-up allows us to present visual-haptic objects that can vary independently along these visual and haptic dimensions (
<xref ref-type="fig" rid="f1">Fig. 1</xref>
). Visual and haptic stimuli were matched in size, shape and location, giving observers the compelling sense that they are touching and viewing the same object. Experiment 1 employed a discrimination paradigm to determine whether our participants’ ability to distinguish different visual-haptic objects is consistent with (i) independent processing of visual and haptic signals, or (ii) integration of visual and haptic cues. In Experiment 2, we used a completely different but complementary paradigm to further explore visual-haptic interactions: observers were presented with a single visual-haptic object per trial, and gave subjective ratings of a) how glossy each object looked, as well as b) how slippery and c) how hard the object felt.</p>
<sec disp-level="1">
<title>Results</title>
<sec disp-level="2">
<title>Experiment 1</title>
<p>Here we report our participants’ ability to discriminate between stimuli that differ in glossiness and rubberiness. On each trial, observers viewed and touched three visual-haptic stimuli and identified which one was ‘made of something different’. The manner in which the internal estimates of these visual and haptic signals interact and influence discrimination will reflect the degree to which they have been correlated during our previous interactions with the environment
<xref ref-type="bibr" rid="b32">32</xref>
. For example, humans have learnt that the felt size of an object is highly correlated with its visually-estimated size, and these two signals are, therefore, fully integrated to form a single, visual-haptic size estimate
<xref ref-type="bibr" rid="b25">25</xref>
.</p>
<p>The visual and haptic cues to material properties that we consider here are unlikely to be perfectly correlated in the environment; we can conceive of matte and glossy objects that feel equally rubbery. However, we hypothesise that on average, objects that feel more rubbery will look
<italic>less</italic>
glossy, i.e. there is a negative correlation between glossiness and rubberiness for objects that we experience in our environment. In other words, objects that we encounter in our environment might not be uniformly distributed across the stimulus parameter space, but be clustered around the negative correlation axis (
<xref ref-type="fig" rid="f1">Fig. 1b</xref>
). If observers have learnt such a correlation, this should result in partial integration
<xref ref-type="bibr" rid="b32">32</xref>
<xref ref-type="bibr" rid="b33">33</xref>
, which will be reflected by our observers’ ability to discriminate between visual-haptic objects that differ in gloss and rubberiness (see Methods for a detailed explanation of the discrimination predictions). In brief, stimuli that vary along the visual-haptic stimulus axis shown in
<xref ref-type="fig" rid="f1">Fig. 1b</xref>
(Type A stimuli), in which glossiness and rubberiness are negatively correlated, will be relatively easy to discriminate. Stimuli that vary along the orthogonal axis (Type B stimuli,
<xref ref-type="fig" rid="f1">Fig. 1c</xref>
) will be harder to discriminate. In other words, if observers have learnt a correlation between visual and haptic material cues, cross-modal integration will shift visual and haptic estimates towards the learnt visual-haptic mapping.</p>
<p>
<xref ref-type="fig" rid="f2">Figure 2a</xref>
shows three possible experimental outcomes – predicted visual-haptic discrimination thresholds given three different integration behaviours:</p>
<p>(i) No integration: If the visual and haptic dimensions are unrelated in the environment, then the two signals should not be integrated for perception. Independent processing of the visual and haptic estimates will produce identical thresholds for Type A and Type B stimuli. Because the observer can identify the odd stimulus based on vision and/or haptics, she/he will have a two-cue advantage, relative to uni-modal thresholds (1 JND). In the absence of integration we predict thresholds of 0.78 JNDs for both Type A and Type B trials (see Methods: Procedure, Experiment 1 for a full explanation of discrimination predictions; a numerical sketch of this prediction follows outcome (iii) below).</p>
<p>(ii) Partial integration, negative correlation: If observers have learnt that visual gloss and haptic rubberiness are negatively correlated, as described above, then we expect smaller thresholds for Type A than for Type B stimuli – the magnitude of this difference depends on the strength of integration (see Methods).</p>
<p>(iii) Partial integration, positive correlation: If observers have learnt that visual gloss and haptic rubberiness are positively correlated, i.e. that glossy stimuli feel more rubbery, on average, than matte stimuli, then discrimination will be better (smaller thresholds) for Type B than Type A stimuli.</p>
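
The 0.78-JND prediction in (i) can be checked numerically. Below is a minimal Monte Carlo sketch (not the authors' code) of the odd-one-out task under no integration: three two-dimensional visual-haptic estimates are drawn with independent, unit-JND Gaussian noise, and the stimulus furthest from the other two is chosen as the odd one out. The max-distance decision rule and the equal split of the offset across the visual and haptic axes are assumptions.

import numpy as np

rng = np.random.default_rng(0)

def oddity_error_rate(delta, n_trials=20000):
    # Error rate for a 3-stimulus odd-one-out task in 2D (visual, haptic) JND
    # space with independent unit-variance noise and no integration. The
    # comparison is offset by `delta` JNDs along the diagonal stimulus axis.
    offset = np.array([delta, delta]) / np.sqrt(2)        # split over V and H
    means = np.stack([np.zeros(2), np.zeros(2), offset])  # two standards + one comparison
    errors = 0
    for _ in range(n_trials):
        est = means + rng.standard_normal((3, 2))         # noisy 2D estimates
        # choose the stimulus with the largest summed distance to the other two
        d = ((est[:, None, :] - est[None, :, :]) ** 2).sum(-1)
        errors += d.sum(1).argmax() != 2
    return errors / n_trials

# locate the offset giving a 0.33 error rate (the threshold criterion used here)
for delta in np.arange(0.5, 1.2, 0.1):
    print(f"{delta:.1f} JND -> error rate {oddity_error_rate(delta):.3f}")
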
<p>Error rates for the bi-modal discrimination task for one naïve subject are shown in
<xref ref-type="fig" rid="f2">Fig. 2b</xref>
. Each subject’s data were fit with cumulative Gaussians using a single lapse rate (
<inline-formula id="d33e195">
<inline-graphic id="d33e196" xlink:href="srep21866-m1.jpg"></inline-graphic>
</inline-formula>
, corresponding to the lower asymptote), and an upper asymptote of 0.67 (chance performance: upper dashed line in
<xref ref-type="fig" rid="f2">Fig. 2b</xref>
). Four separate mean (
<inline-formula id="d33e201">
<inline-graphic id="d33e202" xlink:href="srep21866-m2.jpg"></inline-graphic>
</inline-formula>
) and slope (
<inline-formula id="d33e204">
<inline-graphic id="d33e205" xlink:href="srep21866-m3.jpg"></inline-graphic>
</inline-formula>
) parameter pairs were used to fit the data either side of the standard, for both Type A and Type B trials. These fits were used to estimate thresholds at the 0.33 error rate (i.e. halfway between chance and perfect performance: lower dashed line in
<xref ref-type="fig" rid="f2">Fig. 2b</xref>
). This method provided excellent fits to the data (
<italic>r</italic>
<sup>2</sup>
 = 0.95 ± 0.007 across observers). The two thresholds for each trial type were averaged, with the resultant Type A and Type B mean thresholds shown in
<xref ref-type="fig" rid="f2">Fig. 2c</xref>
, averaged across observers (left plot) and for individual observers (right plot). All observers show better discrimination for Type A than Type B trials, consistent with a negative learnt association between visual gloss and haptic rubberiness, i.e. matte objects more often feel rubbery, whereas glossy objects are more likely to feel hard and slippery.</p>
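
As an illustration of the fitting procedure above, the following sketch (with hypothetical data, and fitting only one side of the standard) fits an error-rate curve as a cumulative Gaussian scaled between a lapse-rate floor and the 0.67 chance ceiling, then reads off the threshold at the 0.33 error rate.

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

CHANCE = 2 / 3  # upper asymptote: guessing among three stimuli

def error_model(x, mu, sigma, lapse):
    # Error rate vs. comparison offset x (one side of the standard): falls
    # from chance at x = 0 toward the lapse rate for large offsets.
    return lapse + (CHANCE - lapse) * (1 - norm.cdf(x, mu, sigma))

# hypothetical data: comparison offsets (in JNDs) and observed error rates
x = np.array([0.2, 0.6, 1.0, 1.4, 1.8])
err = np.array([0.60, 0.48, 0.30, 0.14, 0.08])

(mu, sigma, lapse), _ = curve_fit(error_model, x, err, p0=[1.0, 0.5, 0.05],
                                  bounds=([0, 0.01, 0], [3, 3, 0.2]))

# threshold: offset at which the fitted error rate crosses 0.33
p = 1 - (0.33 - lapse) / (CHANCE - lapse)  # cumulative-Gaussian value required
threshold = mu + sigma * norm.ppf(p)
print(f"threshold = {threshold:.2f} JNDs")
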
</sec>
<sec disp-level="2">
<title>Experiment 2</title>
<p>Experiment 1 showed that visual gloss and haptic rubberiness are partially integrated – our sensory system has learnt that glossy objects are more likely to be slippery and hard than sticky (high in friction) and compliant, and the visual and haptic signals are integrated accordingly. However, in Experiment 1, compliance and friction always co-varied – thus we cannot determine whether the discrimination effects were driven by the integration of visual gloss with friction, with compliance, or with both. Experiment 2 addressed this question and also asked whether the visual-haptic integration suggested by discrimination performance in Experiment 1 is also evident in subjective judgements of perceived gloss, compliance and friction.</p>
<p>In each trial, observers rated the gloss, friction and compliance of a single visual-haptic object. Each of these three stimulus attributes varied independently across trials. To determine which of these visual and haptic stimulus dimensions had a reliable effect on our subjects’ ratings, we used leave-one-out cross-validation to compare regression models comprising all possible combinations of single predictors and their interactions for each rating type. To preview the key results: gloss ratings were predicted by stimulus gloss and stimulus friction. In complement to this, friction ratings were predicted by both stimulus friction and stimulus gloss. However, compliance and gloss appear to be processed independently – neither modulated perception of the other.</p>
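
The model-selection step might be sketched as follows: enumerate every combination of the three stimulus attributes and their pairwise interactions as linear-regression predictors, score each model by leave-one-out cross-validation, and keep the model with the lowest held-out error. The data below are hypothetical placeholders, not the experimental ratings.

import numpy as np
from itertools import combinations

def loo_mse(X, y):
    # leave-one-out mean squared error of an ordinary least-squares fit
    n = len(y)
    errs = []
    for i in range(n):
        keep = np.arange(n) != i
        beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        errs.append((y[i] - X[i] @ beta) ** 2)
    return np.mean(errs)

def design(cols, data):
    # design matrix: intercept plus the chosen main effects / interactions
    terms = {"G": data[:, 0], "F": data[:, 1], "C": data[:, 2],
             "GxF": data[:, 0] * data[:, 1],
             "GxC": data[:, 0] * data[:, 2],
             "FxC": data[:, 1] * data[:, 2]}
    return np.column_stack([np.ones(len(data))] + [terms[c] for c in cols])

rng = np.random.default_rng(1)
stim = rng.uniform(0, 1, size=(60, 3))   # stimulus gloss, friction, compliance
ratings = 0.8 * stim[:, 0] - 0.1 * stim[:, 1] + 0.05 * rng.standard_normal(60)

all_terms = ["G", "F", "C", "GxF", "GxC", "FxC"]
models = [c for k in range(1, len(all_terms) + 1)
          for c in combinations(all_terms, k)]
best = min(models, key=lambda c: loo_mse(design(c, stim), ratings))
print("best predictors:", best)
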
<sec disp-level="3">
<title>Gloss Ratings</title>
<p>
<xref ref-type="fig" rid="f3">Figure 3a</xref>
shows observers’ gloss perception as a function of stimulus gloss and stimulus friction (left plot), and stimulus gloss and stimulus compliance (right plot). The best model of perceived gloss (as determined by leave-one-out cross-validation) included both stimulus gloss and stimulus friction as predictors and accounted for an average of 66 ± 3% of response variance, across observers. As expected, perceived gloss increased significantly with stimulus gloss. However, as can be seen from
<xref ref-type="fig" rid="f3">Fig. 3a</xref>
, the objects rated as glossiest were not only high in stimulus gloss, but also had the lowest friction, although the effect of changing friction was smaller – 11% (on average, across subjects) of the effect of changing stimulus gloss.</p>
<p>Stimulus compliance had no significant effect on perceived gloss. Could this be because compliance is a less reliable cue than gloss or friction in this study? The correlation between stimulus compliance and compliance ratings (
<inline-formula id="d33e238">
<inline-graphic id="d33e239" xlink:href="srep21866-m4.jpg"></inline-graphic>
</inline-formula>
) is weaker than the parallel correlations for the other two attributes (stimulus gloss vs. gloss rating:
<inline-formula id="d33e241">
<inline-graphic id="d33e242" xlink:href="srep21866-m5.jpg"></inline-graphic>
</inline-formula>
, stimulus friction vs. friction rating:
<inline-formula id="d33e244">
<inline-graphic id="d33e245" xlink:href="srep21866-m6.jpg"></inline-graphic>
</inline-formula>
, see
<xref ref-type="fig" rid="f3">Fig. 3d</xref>
), suggesting that compliance is a less reliable signal, and would therefore have a smaller weight if gloss and compliance were (partially) integrated. However, by the same logic, gloss (a more reliable signal) would be expected to have a relatively large influence on perceived compliance, if the two were integrated. This is not evident in the data: neither the correlation between stimulus compliance and perceived gloss, nor the correlation between stimulus gloss and perceived compliance, is significantly different from 0. It seems unlikely, therefore, that the lack of interaction between compliance and gloss is due to reliability differences. Rather, our data suggest, in agreement with a previous study
<xref ref-type="bibr" rid="b34">34</xref>
, that observers have not learnt a significant correlation between compliance and gloss from their normal interactions with the world.</p>
<p>The gloss ratings agree with the discrimination data from Experiment 1: observers appear to have learnt a relationship between how glossy something looks and how it feels. Glossier objects feel less ‘rubbery’, and this relationship is driven by friction: glossier objects tend to be more slippery.</p>
</sec>
<sec disp-level="3">
<title>Friction ratings</title>
<p>Perceived friction was best predicted by stimulus friction, stimulus compliance, a friction × compliance interaction and stimulus gloss. This model explained an average of 67 ± 6% of the variance in friction responses, across observers. There was a small but significant effect of stimulus gloss on perceived friction: glossy objects were perceived as having lower friction, although the difference in friction rating from minimum to maximum stimulus gloss was only 3.5% of the full rating scale, averaged across observers.</p>
<p>The intra-modal interactions between friction and compliance, although not directly relevant to our research question, are interesting nonetheless. Compliance had a small but significant effect on perceived friction: more compliant objects were perceived as more slippery (the correlation between compliance and perceived friction did not reach significance – see
<xref ref-type="fig" rid="f3">Fig. 3d</xref>
, but compliance was a significant predictor alongside other terms in the regression model). Compliance could affect perceived friction in this way if observers were partially confusing lateral and tangential finger displacements. Further, in the physical model implemented in the Phantom, the frictional force is proportional to the force in the normal direction (see
<xref ref-type="supplementary-material" rid="S1">supplementary information</xref>
for details of the force models).</p>
</sec>
<sec disp-level="3">
<title>Compliance ratings</title>
<p>As noted above, gloss and compliance appear to be processed independently; compliance ratings were independent of stimulus gloss. However, interactions between compliance and friction were also apparent within compliance ratings – these were best predicted by a model that included both stimulus compliance and stimulus friction, accounting for 59 ± 5% of response variance, averaged across observers. Objects that were physically compliant, but also had high friction, were rated as most compliant. The effect of stimulus friction was large: around 40% of the effect of changing stimulus compliance. The reason for this interaction is not clear – when objects have high friction the finger doesn’t slip across the surface – introspection suggests that in this case the observer is confident that they are moving the finger only in the surface normal direction, and that this creates the impression of increased compliance (this might also be conceptualised as a contrast effect between normal and tangential resistance). We note, however, that the positive relationship between stimulus friction and perceived compliance is hard to reconcile with the negative relationship between perceived friction and stimulus compliance – further work is required to better understand these interactions.</p>
</sec>
</sec>
</sec>
<sec disp-level="1">
<title>Discussion</title>
<p>We show that observers have learnt a statistical relationship between the material cues of visually-defined gloss and haptically-defined friction, and that this knowledge is reflected in the integration of these cues.</p>
<p>Experiment 1 used a discrimination paradigm to demonstrate that observers partially integrate visual and haptic material cues: when a standard stimulus was compared to an object that was visually more glossy, but felt more rubbery, the difference was hard to detect. In contrast, an increase in visual gloss accompanied by a decrease in rubberiness was easier to detect.</p>
<p>Experiment 2 confirmed that friction, rather than compliance, is the haptic dimension that interacts with visual gloss. The pattern of discrimination seen in Experiment 1 was reflected in subjective ratings of gloss and friction: slippery objects are perceived to be glossier, and glossy objects are perceived to be more slippery.</p>
<p>Importantly, visual and haptic cues were uncorrelated across our experimental trials. Our observers did not, therefore, learn their visual-haptic associations in the laboratory. Rather, it seems that observers have learnt, from a lifetime of interactions with the environment, to associate, and thus partially integrate, visual gloss and haptic friction signals. That this association is learnt (rather than hard-wired) seems likely given previous demonstrations of relatively fast learning about the statistics of the environment
<xref ref-type="bibr" rid="b32">32</xref>
<xref ref-type="bibr" rid="b35">35</xref>
<xref ref-type="bibr" rid="b36">36</xref>
<xref ref-type="bibr" rid="b37">37</xref>
.</p>
<p>Is there an ecological basis for this integration? Our rating data suggest that the effect of haptic friction on perceived gloss is larger than vice versa (correlation between stimulus friction and perceived gloss:
<inline-formula id="d33e287">
<inline-graphic id="d33e288" xlink:href="srep21866-m7.jpg"></inline-graphic>
</inline-formula>
, compared to correlation between stimulus gloss and perceived friction:
<inline-formula id="d33e290">
<inline-graphic id="d33e291" xlink:href="srep21866-m8.jpg"></inline-graphic>
</inline-formula>
), although this difference did not reach significance. The visual image is inherently ambiguous, with visual gloss cues affected not just by the specular components of the surface reflectance function, but also by object shape, motion, and the illumination environment
<xref ref-type="bibr" rid="b7">7</xref>
<xref ref-type="bibr" rid="b38">38</xref>
<xref ref-type="bibr" rid="b39">39</xref>
<xref ref-type="bibr" rid="b40">40</xref>
<xref ref-type="bibr" rid="b41">41</xref>
<xref ref-type="bibr" rid="b42">42</xref>
. An optimal perceptual system should, therefore, exploit all available information. However, the physical relationships between gloss and friction are complex: decreasing surface roughness at the micro level (e.g. by polishing) can both increase gloss and decrease friction. At this scale, smoothness modulates friction by altering the contact area between surfaces
<xref ref-type="bibr" rid="b43">43</xref>
. However, friction and gloss can be unrelated in organic structures, such as the nanostructure that controls the low reflectance of moth eyes
<xref ref-type="bibr" rid="b44">44</xref>
or the glossiness of feathers
<xref ref-type="bibr" rid="b45">45</xref>
. Moreover, the predominant determinant of friction for many solid surfaces may not be roughness, but adhesive forces between thin adsorbed films on solid surfaces
<xref ref-type="bibr" rid="b43">43</xref>
.</p>
<p>One of the main behavioural advantages of learning the relationship between friction and gloss may be the identification of lubricant surface coatings. Whilst lubricants can be powdery and matte, they are more often water- or oil-based, and highly glossy. Observers appear to use gloss in assessing the slipperiness of a surface
<xref ref-type="bibr" rid="b46">46</xref>
<xref ref-type="bibr" rid="b47">47</xref>
– there are obvious advantages to identifying a wet, slippery floor. Thus, although friction is not a perfect predictor of gloss across all natural objects, the perceptual system appears to have an implicit understanding of the probabilistic relationship between these variables, and uses this to inform estimates of gloss.</p>
</sec>
<sec disp-level="1">
<title>Methods</title>
<sec disp-level="2">
<title>Apparatus and stimuli</title>
<p>Our experimental set-up (depicted in
<xref ref-type="fig" rid="f1">Fig. 1a</xref>
) allowed concurrent and spatially aligned visual-haptic stimulus presentation, centred at 57 cm from the observer’s eyes. Head position was maintained using a headrest, and observers wore an eye patch to eliminate binocular cues to shape and gloss
<xref ref-type="bibr" rid="b8">8</xref>
.</p>
<p>Stimulus objects were spherical meshes, deformed by 3D simplex noise
<xref ref-type="bibr" rid="b48">48</xref>
, to produce potato-like shapes. These objects were rendered for visual presentation using an unbiased, physically-based renderer (Octane Render; Otoy Inc.), under one of three complex light fields (Beach, Campus and Uffizi
<xref ref-type="bibr" rid="b49">49</xref>
, see
<xref ref-type="fig" rid="f1">Fig. 1B</xref>
). Stimuli subtended an average visual angle of 8.9°. Visually defined gloss was manipulated by varying the proportion of light reflected specularly, and the degree of specular scatter (micro-scale roughness). Additional examples of the visual stimuli can be found in the
<xref ref-type="supplementary-material" rid="S1">supplementary information</xref>
.</p>
<p>The same shapes were rendered haptically using the OpenHaptics toolkit (Geomagic, USA) and presented via a Phantom force feedback device. We manipulated how ‘rubbery’ an object felt by modulating compliance-related forces in the surface normal direction (i.e. how the object responds when it is poked or squashed), and in the tangential direction (how easy it is to slide your finger across the object’s surface: static and dynamic friction). More rubbery objects were more compliant (squashy) and had higher friction.
<xref ref-type="table" rid="t1">Table 1</xref>
shows the stimulus parameters for the uni-modal task of Experiment 1.</p>
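
A minimal sketch of such a force model, assuming a standard penalty-based rendering scheme: a spring-like force in the surface normal direction whose stiffness sets compliance, and a Coulomb-style tangential force proportional to the normal force (the supplementary information notes that the rendered frictional force is proportional to the normal force). The parameter values and contact-state logic below are illustrative assumptions, not the authors' OpenHaptics settings.

import numpy as np

def contact_force(penetration, tangential_vel, stiffness, mu_static, mu_dynamic,
                  v_stick=1e-3):
    # Penalty-based contact force for a haptic proxy.
    # penetration: depth of the fingertip inside the surface (mm; 0 = no contact)
    # tangential_vel: fingertip velocity along the surface (mm/s)
    # stiffness: N/mm; lower stiffness feels more compliant ('squashier')
    # mu_static / mu_dynamic: friction coefficients; higher feels 'stickier'
    if penetration <= 0:
        return 0.0, 0.0
    f_normal = stiffness * penetration               # spring-like restoring force
    mu = mu_static if abs(tangential_vel) < v_stick else mu_dynamic
    f_tangential = -mu * f_normal * np.sign(tangential_vel)  # opposes sliding
    return f_normal, f_tangential

# a 'rubbery' object: compliant (low stiffness) and high friction
print(contact_force(2.0, 5.0, stiffness=0.3, mu_static=0.9, mu_dynamic=0.7))
# a 'hard, slippery' object: stiff and low friction
print(contact_force(2.0, 5.0, stiffness=1.5, mu_static=0.2, mu_dynamic=0.1))
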
</sec>
<sec disp-level="2">
<title>Procedure: Experiment 1</title>
<sec disp-level="3">
<title>Uni-modal</title>
<p>For each participant we first established uni-modal sets of (i) visual and (ii) haptic stimuli that were equated in terms of discriminability, via an odd-one-out task with uni-modal stimuli (
<xref ref-type="fig" rid="f4">Fig. 4</xref>
). On each trial, three stimuli were displayed in succession (2500 msec for each stimulus): either two standard stimuli and one comparison stimulus, or one standard stimulus and two (parameter-matched) comparison stimuli. Observers reported (by using the Phantom to press a visual/haptic button rendered in the workspace) which stimulus was “made of something different”. To avoid simple image matching on visual-only trials, objects within a trial had different shapes, but all three were rendered under the same light field. In visual-only trials a virtual wall blocked haptic access to the stimuli. In haptic-only trials, a visual silhouette aided object localisation whilst eliminating visual cues to the object’s material. Observers completed 624 trials (12 comparison levels × 2 modalities × 26 repetitions) in random order, across 3 sessions of approximately 45 minutes each.</p>
</sec>
<sec disp-level="3">
<title>Bi-modal</title>
<p>Uni-modal discrimination data were used to create a set of 11 visual and 11 haptic stimulus levels that were (i) linearly spaced in JNDs and (ii) spanned a range of ±2 uni-modal JNDs (Just-Noticeable-Differences) around the visual and haptic standards (see
<xref ref-type="fig" rid="f4">Fig. 4b</xref>
, fewer stimulus levels have been shown in this schematic). Gaussian distributions were fitted separately to each observer’s visual and haptic uni-modal data (fits were constrained to have a mean of 0, but standard deviation varied across observers). As part of this fitting process, stimulus index (
<inline-formula id="d33e357">
<inline-graphic id="d33e358" xlink:href="srep21866-m9.jpg"></inline-graphic>
</inline-formula>
) was transformed according to Equation
<xref ref-type="disp-formula" rid="eq10">1</xref>
, selecting exponent
<italic>p</italic>
that minimised residual error in the subsequent Gaussian fit, effectively linearising the visual and haptic parameter scales in JND space.</p>
<p>
<disp-formula id="eq10">
<inline-graphic id="d33e368" xlink:href="srep21866-m10.jpg"></inline-graphic>
</disp-formula>
</p>
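
Equation 1 survives in this record only as an image, so its exact form cannot be recovered here. Assuming a signed power transform, a common choice for this kind of linearisation, the exponent search might look like the following sketch (with hypothetical uni-modal data):

import numpy as np
from scipy.optimize import minimize_scalar

def fit_residual(p, stim_index, error_rate, chance=2/3):
    # Residual of a zero-mean Gaussian fit to uni-modal discrimination data
    # after transforming the stimulus index with exponent p. The assumed form
    # s' = sign(s) * |s|**p stands in for the record's missing Equation 1.
    s = np.sign(stim_index) * np.abs(stim_index) ** p
    def sse(sigma):
        pred = chance * np.exp(-0.5 * (s / sigma) ** 2)  # Gaussian error falloff
        return np.sum((error_rate - pred) ** 2)
    return minimize_scalar(sse, bounds=(0.05, 5.0), method="bounded").fun

# hypothetical uni-modal data: stimulus indices and observed error rates
idx = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
err = np.array([0.05, 0.35, 0.55, 0.50, 0.30, 0.08])

best_p = minimize_scalar(lambda p: fit_residual(p, idx, err),
                         bounds=(0.2, 3.0), method="bounded").x
print(f"exponent p = {best_p:.2f}")
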
<p>Visual and haptic stimulus parameters were then combined in two different ways, to create Type A visual-haptic stimuli and trials, in which more matte-looking stimuli felt more rubbery, or Type B stimuli and trials, in which glossier stimuli felt more rubbery, as shown in
<xref ref-type="fig" rid="f3">Fig. 3B</xref>
. Each observer completed at least 10 blocks of 40 visual-haptic discrimination trials presented in a random order, making a 3IFC decision on each trial to identify the odd-one-out that was “made of something different”. (Due to a programming error, a subset of observers had one condition missing from initial blocks, and completed an additional 10 blocks.) The same instruction was given across all uni-modal and bi-modal trials in order to (i) make it clear that 3D shape was irrelevant to the task, and (ii) avoid biasing the observers to rely on one modality more than the other on bi-modal tasks. Each visual-haptic object was initially presented as a silhouette until the observer made haptic contact, from which point it was viewed and explored haptically for 2500 msec. Similarly to the uni-modal case, the odd-one-out could be either a standard or a comparison stimulus. Type A and B trials were randomly intermingled.</p>
</sec>
<sec disp-level="3">
<title>Integration and bi-modal discrimination performance</title>
<p>If observers have learnt that visual gloss and haptic rubberiness signals are correlated in the environment, the signals should be integrated. Integration will modify estimates of gloss and rubberiness, and the discriminability of visual-haptic stimuli that vary along these dimensions. Knowledge about the statistical relationship between sensory signals can be represented by a coupling prior
<xref ref-type="bibr" rid="b32">32</xref>
, where the width of the prior corresponds to the strength of the correlation, and the associated degree of integration.
<xref ref-type="fig" rid="f5">Figure 5</xref>
shows optimal sensory integration under different hypothetical coupling priors.</p>
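
The effect of a coupling prior can be made concrete with a short numerical sketch (Gaussian assumptions throughout; not the authors' code): the prior is a Gaussian ridge along the learnt visual-haptic axis, its precision acts only orthogonally to that axis, and the posterior precision is the sum of the likelihood and prior precisions.

import numpy as np

def posterior(likelihood_mean, sigma_like, sigma_coupling, axis):
    # Combine a 2D isotropic Gaussian likelihood (sd sigma_like) with a
    # Gaussian-ridge coupling prior of spread sigma_coupling around the line
    # through the origin along `axis`. Returns posterior mean and covariance.
    axis = np.asarray(axis, float)
    axis = axis / np.linalg.norm(axis)
    n = np.array([-axis[1], axis[0]])              # unit normal to the coupling axis
    P_like = np.eye(2) / sigma_like**2             # likelihood precision
    P_prior = np.outer(n, n) / sigma_coupling**2   # precision only across the ridge
    cov_post = np.linalg.inv(P_like + P_prior)
    mean_post = cov_post @ (P_like @ likelihood_mean)  # ridge prior has zero mean across-axis
    return mean_post, cov_post

# stimulus C_B: visually matte but slippery/hard, i.e. off the negative
# (glossiness vs. rubberiness) coupling axis
mean, cov = posterior(np.array([-1.0, -1.0]), sigma_like=1.0,
                      sigma_coupling=2.0, axis=[1.0, -1.0])
print("posterior mean:", mean)   # pulled toward the coupling axis
print("posterior covariance:\n", cov)
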
<p>The 2D Gaussian in the top left plot of
<xref ref-type="fig" rid="f5">Fig. 5</xref>
represents the sensory likelihood for a single object (comparison stimulus C
<sub>A</sub>
) that is visually matte and feels rubbery. If visual gloss signals are statistically unrelated to how an object feels, our perceptual system should have learnt a uniform (flat) coupling prior (
<xref ref-type="fig" rid="f5">Fig. 5</xref>
, top row) and these visual and haptic dimensions would be processed independently, with the prior having no effect on the final visual and haptic estimates represented by the posterior (right column).</p>
<p>Alternatively, humans may have learnt that matte objects are more likely to feel rubbery, whereas glossy objects more often feel hard and slippery (red boxes); this possibility is shown in rows 2–4, for a weak negative correlation (rows 2–3; the spread,
<inline-formula id="d33e397">
<inline-graphic id="d33e398" xlink:href="srep21866-m11.jpg"></inline-graphic>
</inline-formula>
, of the coupling prior shown is twice that of the uni-modal likelihoods) or a near perfect negative correlation (row 4). For a stimulus on the axis of the coupling prior (C
<sub>A</sub>
, row 2), integration does not change the mean visual and haptic estimates (likelihood and posterior distributions are aligned), but noise is reduced in the direction orthogonal to the prior. In this case, discriminating between this stimulus and the standard stimulus (S, blue circle) will be very slightly easier than if the two signals were processed independently, i.e. under a uniform coupling prior (row 1).</p>
<p>We can now consider a stimulus that is visually matte but does not feel rubbery – instead it feels hard and slippery like ice (Stimulus C
<sub>B</sub>
, row 3). Under the weak negative coupling prior, integration shifts the visual and haptic estimates towards the prior, and distinguishing this stimulus from the standard stimulus becomes harder. With the strongest coupling prior (row 4) the visual and haptic estimates are fully integrated; the posterior distributions for the stimuli C
<sub>B</sub>
and S – whose likelihoods were separated in the direction orthogonal to the prior – become superimposed and discrimination between them becomes impossible – they are visual-haptic metamers. In other words, a negative coupling prior, representing a negative correlation between gloss and rubberiness, will result in better performance (smaller discrimination thresholds) for Type A trials than for Type B trials.</p>
In contrast, we may have learnt a statistical relationship between signals such that glossy objects tend to feel more rubbery (green boxes). This situation is shown in the last two rows of Fig. 5. Given this positive coupling prior, integration results in poorer discrimination of $C_A$ from S, but slightly improved discrimination of $C_B$ from S, relative to independent processing of the two stimuli. In other words, a positive coupling prior will lead to larger thresholds for Type A than for Type B trials.
Figure 6 shows how discrimination performance for Type A (red) and Type B (green) stimuli is modulated by the strength of integration (i.e. the spread, $\sigma_{\mathrm{coupling}}$, of the coupling prior). Without integration (top left plot, $\sigma_{\mathrm{coupling}} = \infty$), the discrimination threshold (defined by 67% correct performance; dashed grey line) is the same for both trial types, and equal to 0.78 uni-modal JNDs. Note that in our paradigm we do not expect the usual two-cue advantage of $1/\sqrt{2}$ (relative to a uni-modal threshold of 1 JND). This is because our observers are performing a bivariate discrimination task: their visual-haptic estimates are subject to noise in two dimensions (as represented by the Gaussian blobs in Fig. 5). If our observers knew, on each trial, whether the stimuli differed along the positive or negative axis, and could discount noise in the orthogonal (irrelevant) direction, we would predict thresholds of $1/\sqrt{2}$ JNDs in the absence of integration.
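
The 0.78 JND baseline is straightforward to reproduce by simulation. The sketch below implements one plausible reading of the paradigm: on each trial the observer receives three noisy bivariate estimates and picks the interval whose estimate is furthest from the other two, and the uni-modal JND is defined empirically as the 1D separation giving 67% correct. The decision rule and these conventions are our assumptions, not the published model code.

import numpy as np

rng = np.random.default_rng(0)
N = 200_000   # simulated trials per point

def oddity_pcorrect(positions, noise_sd=1.0, n=N):
    """Proportion correct in a 3IFC odd-one-out task.

    positions: (3, k) true stimulus coordinates (k = 1 or 2 dimensions);
    the odd one is always row 0. The simulated observer adds isotropic
    Gaussian noise to each estimate and picks the interval whose estimate
    is furthest from the other two (summed Euclidean distance)."""
    pos = np.asarray(positions, dtype=float)
    est = pos[None] + rng.normal(0, noise_sd, (n, 3, pos.shape[1]))
    d = np.linalg.norm(est[:, :, None] - est[:, None, :], axis=-1)  # (n,3,3)
    picked = d.sum(axis=2).argmax(axis=1)
    return (picked == 0).mean()

def threshold(sep_fn, target=0.67, lo=0.1, hi=5.0, iters=20):
    """Bisect for the separation giving `target` proportion correct."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if oddity_pcorrect(sep_fn(mid)) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# 1D task defines the uni-modal JND (67% correct).
jnd = threshold(lambda d: [[d], [0.0], [0.0]])

# Bivariate task, no integration: the comparison differs by d in BOTH
# modalities (here along the negative diagonal; the positive diagonal
# gives the same answer by symmetry).
biv = threshold(lambda d: [[d, -d], [0.0, 0.0], [0.0, 0.0]])

print(f"uni-modal JND      : {jnd:.3f} (noise-sd units)")
print(f"bi-modal threshold : {biv / jnd:.2f} JNDs (text: ~0.78)")

Projecting the three estimates onto the known stimulus axis before deciding turns the task back into a 1D discrimination with a $\sqrt{2}$ larger signal, which is where the $1/\sqrt{2}$ JND prediction above comes from.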
For the negative coupling prior modelled in Fig. 6, as the spread of the coupling prior decreases, Type A thresholds decrease and Type B thresholds increase. The top right plot shows the prediction for a coupling prior whose spread is 1.75 times that of the uni-modal likelihoods. In this case the Type B threshold is around 1 JND, and the Type A threshold is close to 0.8 JNDs. This is comparable to our observers' thresholds (see Fig. 2c). Under full integration ($\sigma_{\mathrm{coupling}} = 0$, as in standard Bayesian integration models), as depicted in the bottom panel of Fig. 6, the Type A threshold is $1/\sqrt{2}$ JNDs and Type B stimuli become indistinguishable.
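
Combining the two sketches above gives a Fig. 6-style prediction: shrink each noisy estimate with the coupling prior before applying the same odd-one-out rule. Again, this is an illustration of the model as we read it (a MAP-then-decide observer), not the published code; with the prior spread set to 1.75 uni-modal noise units it should land near the quoted Type A and Type B thresholds.

# Continues the Monte Carlo sketch above (reuses np, rng, N and jnd).
def integrate(est, sigma_c, noise_sd=1.0):
    """MAP estimates under a Gaussian ridge prior along the negative
    diagonal: the along-ridge coordinate is kept, the orthogonal one
    is shrunk towards the ridge."""
    g, r = est[..., 0], est[..., 1]
    a = (g - r) / np.sqrt(2)
    u = (g + r) / np.sqrt(2) * sigma_c**2 / (sigma_c**2 + noise_sd**2)
    return np.stack([(a + u) / np.sqrt(2), (u - a) / np.sqrt(2)], axis=-1)

def pcorrect_coupled(positions, sigma_c, n=N):
    pos = np.asarray(positions, dtype=float)
    est = integrate(pos[None] + rng.normal(0, 1.0, (n, 3, 2)), sigma_c)
    d = np.linalg.norm(est[:, :, None] - est[:, None, :], axis=-1)
    return (d.sum(axis=2).argmax(axis=1) == 0).mean()

def threshold_coupled(sep_fn, sigma_c, target=0.67, lo=0.05, hi=5.0):
    for _ in range(20):                        # bisect on separation
        mid = 0.5 * (lo + hi)
        if pcorrect_coupled(sep_fn(mid), sigma_c) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

sigma_c = 1.75   # prior spread highlighted in the text, in noise-sd units
type_a = threshold_coupled(lambda d: [[d, -d], [0, 0], [0, 0]], sigma_c)
type_b = threshold_coupled(lambda d: [[d,  d], [0, 0], [0, 0]], sigma_c)
print(f"Type A: {type_a / jnd:.2f} JNDs   Type B: {type_b / jnd:.2f} JNDs")

Note why Type B suffers: shrinking the off-ridge coordinate reduces the Type B signal and its noise equally, but leaves the irrelevant along-ridge noise at full strength, so the odd-one-out decision degrades; in the limit of full integration the Type B signal vanishes entirely.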
<sec disp-level="2">
<title>Procedure, Experiment 2</title>
<p>Observers were presented with a single object on each trial to view and touch. Its visual gloss, haptic compliance, and haptic friction (as well as its shape) varied independently from trial to trial. As in Experiment 1, the visual stimulus appeared as a silhouette until the observer made haptic contact with it. After viewing and haptically exploring the stimulus for 2500 msec., the observer rated it on the dimensions of gloss (from ‘most matte’ to ‘most glossy’), compliance (‘most hard’ to ‘most squashy’) and friction (‘most slippery’ to ‘most sticky’). Responses were given by moving virtual visual-haptic markers along three slider bars, which had linearly spaced tick marks corresponding to steps of 0.1 on a scale of 0 to 1. Before these experimental trials, and at the beginning of each session, each observer was given unlimited time to explore three pairs of reference objects to acquaint themselves with the rating scales: (i) a pair of visual objects (no haptics) that represented the highest and lowest gloss values, labelled ‘most matte’ and ‘most glossy’, (ii) a pair of visual-haptic objects that were shown in silhouette, with mid-range friction, representing the most and least compliant values, labelled ‘hardest’ and ‘most squashy’ (iii) a pair of silhouetted objects, with mid-range compliance, with the lowest or highest friction values of the stimulus set, labelled ‘most slippery’ to ‘most sticky’. The lowest and highest parameter values were the same as those used in the odd-one-out experiment, and shown in
<xref ref-type="table" rid="t1">Table 1</xref>
.</p>
Six observers (2 authors, 4 naive) each completed 1458 trials (9 gloss levels × 9 compliance levels × 9 friction levels × 2 light-field illuminations: 'Beach' and 'Campus' from the Debevec set [49]), split across multiple sessions of around 25-35 minutes each, with a total participation time of approximately 2 hours. Both experiments were approved by the University of Southampton Ethics Committee and were conducted in accordance with the University of Southampton's policy on the ethical conduct of research and studies involving human participants. All participants gave informed written consent.
Additional Information

How to cite this article: Adams, W. J. et al. Touch influences perceived gloss. Sci. Rep. 6, 21866; doi: 10.1038/srep21866 (2016).
<sec sec-type="supplementary-material" id="S1">
<title>Supplementary Material</title>
<supplementary-material id="d33e45" content-type="local-data">
<caption>
<title>Supplementary Information</title>
</caption>
<media xlink:href="srep21866-s1.pdf"></media>
</supplementary-material>
</sec>
Acknowledgements

I.S.K. was supported by ESRC studentship ES/I904026/1. W.J.A. and E.W.G. were supported by EPSRC grant EP/K005952/1.
References

1. Wiebel, C. B., Valsecchi, M. & Gegenfurtner, K. R. The speed and accuracy of material recognition in natural images. Atten. Percept. Psycho. 75, 954-966, doi: 10.3758/s13414-013-0436-y (2013).
2. Chadwick, A. C. & Kentridge, R. W. The perception of gloss: a review. Vision Res. 109, 221-235, doi: 10.1016/j.visres.2014.10.026 (2015).
3. Beck, J. & Prazdny, S. Highlights and the perception of glossiness. Percept. Psychophys. 30, 407-410, doi: 10.3758/BF03206160 (1981).
4. Blake, A. & Bülthoff, H. Does the brain know the physics of specular reflection? Nature 343, 165-168, doi: 10.1038/343165a0 (1990).
5. Todd, J., Norman, J. & Mingolla, E. Lightness constancy in the presence of specular highlights. Psychol. Sci. 15, 33-39, doi: 10.1111/j.0963-7214.2004.01501006.x (2004).
6. Pellacini, F., Ferwerda, J. A. & Greenberg, D. P. Toward a psychophysically-based light reflection model for image synthesis. Conf. Proc. SIGGRAPH 2000, 55-64 (2000).
7. Fleming, R. W., Dror, R. O. & Adelson, E. H. Real-world illumination and the perception of surface reflectance properties. J. Vision 3, 347-368, doi: 10.1167/3.5.3 (2003).
8. Kerrigan, I. S. & Adams, W. J. Highlights, disparity, and perceived gloss with convex and concave surfaces. J. Vision 13, doi: 10.1167/13.1.9 (2013).
9. Okamoto, S., Nagano, H. & Yamada, Y. Psychophysical dimensions of tactile perception of textures. IEEE Trans. Haptics 6, 81-93, doi: 10.1109/ToH.2012.32 (2013).
10. Klatzky, R. L., Pawluk, D. & Peer, A. Haptic perception of material properties and implications for applications. Proc. IEEE 101, 2081-2092, doi: 10.1109/jproc.2013.2248691 (2013).
11. Bergmann Tiest, W. M. Tactual perception of material properties. Vision Res. 50, 2775-2782, doi: 10.1016/j.visres.2010.10.005 (2010).
12. Lederman, S. J. & Abbott, S. G. Texture perception: studies of intersensory organization using a discrepancy paradigm, and visual versus tactual psychophysics. J. Exp. Psychol. Human 7, 902-915, doi: 10.1037/0096-1523.7.4.902 (1981).
13. Lederman, S. J., Thorne, G. & Jones, B. Perception of texture by vision and touch: multidimensionality and intersensory integration. J. Exp. Psychol. Human 12, 169-180, doi: 10.1037//0096-1523.12.2.169 (1986).
14. Klatzky, R. L., Lederman, S. & Reed, C. There's more to touch than meets the eye: the salience of object attributes for haptics with and without vision. J. Exp. Psychol. Gen. 116, 356-369 (1987).
15. Jones, B. & O'Neil, S. Combining vision and touch in texture perception. Percept. Psychophys. 37, 66-72, doi: 10.3758/bf03207140 (1985).
16. Heller, M. A. Visual and tactual texture perception: intersensory cooperation. Percept. Psychophys. 31, 339-344, doi: 10.3758/bf03202657 (1982).
17. Guest, S. & Spence, C. What role does multisensory integration play in the visuotactile perception of texture? Int. J. Psychophysiol. 50, 63-80, doi: 10.1016/s0167-8760(03)00125-9 (2003).
18. Bergmann Tiest, W. M. & Kappers, A. M. L. Haptic and visual perception of roughness. Acta Psychol. 124, 177-189, doi: 10.1016/j.actpsy.2006.03.002 (2007).
19. Varadharajan, V., Klatzky, R., Unger, B., Swendsen, R. & Hollis, R. Haptic rendering and psychophysical evaluation of a virtual three-dimensional helical spring. Symp. Haptic Interfaces for Virtual Environ. Teleoper. Syst. 2008, Proc., 57-64 (2008).
20. Kuschel, M., Di Luca, M., Buss, M. & Klatzky, R. L. Combination and integration in the perception of visual-haptic compliance information. IEEE Trans. Haptics 3, 234-244, doi: 10.1109/ToH.2010.9 (2010).
21. Drewing, K., Ramisch, A. & Bayer, F. Haptic, visual and visuo-haptic softness judgments for objects with deformable surfaces. Joint Eurohaptics Conf. Symp. Haptic Interfaces Virtual Environ. Teleoper. Syst., Proc., 640-645, doi: 10.1109/whc.2009.4810828 (2009).
22. Cellini, C., Kaim, L. & Drewing, K. Visual and haptic integration in the estimation of softness of deformable objects. i-Percept. 4, 516-531, doi: 10.1068/i0598 (2013).
23. Di Luca, M. Perceived compliance in a pinch. Vision Res. 51, 961-967, doi: 10.1016/j.visres.2011.02.021 (2011).
24. Di Luca, M., Knoerlein, B., Ernst, M. O. & Harders, M. Effects of visual-haptic asynchronies and loading-unloading movements on compliance perception. Brain Res. Bull. 85, 245-259, doi: 10.1016/j.brainresbull.2010.02.009 (2011).
25. Ernst, M. O. & Banks, M. S. Humans integrate visual and haptic information in a statistically optimal fashion. Nature 415, 429-433, doi: 10.1038/415429a (2002).
26. Ernst, M. O., Banks, M. S. & Bülthoff, H. H. Touch can change visual slant perception. Nat. Neurosci. 3, 69-73, doi: 10.1038/71140 (2000).
27. Harshfield, S. P. & DeHardt, D. C. Weight judgment as a function of apparent density of objects. Psychon. Sci. 20, 365-366 (1970).
28. Buckingham, G., Cant, J. S. & Goodale, M. A. Living in a material world: how visual cues to material properties affect the way that we lift objects and perceive their weight. J. Neurophysiol. 102, 3111-3118, doi: 10.1152/jn.00515.2009 (2009).
29. Charpentier, A. Analyse expérimentale de quelques éléments de la sensation de poids [Experimental study of some aspects of weight perception]. Arch. Physiol. Normales Pathologiques 3, 122-135 (1891).
30. Murray, D. J., Ellis, R. R., Bandomir, C. A. & Ross, H. E. Charpentier (1891) on the size-weight illusion. Percept. Psychophys. 61, 1681-1685, doi: 10.3758/bf03213127 (1999).
31. Ernst, M. O. Perceptual learning: inverting the size-weight illusion. Curr. Biol. 19, R23-R25, doi: 10.1016/j.cub.2008.10.039 (2009).
32. Ernst, M. O. Learning to integrate arbitrary signals from vision and touch. J. Vision 7, doi: 10.1167/7.5.7 (2007).
33. Ernst, M. O. & Di Luca, M. In Sensory Cue Integration (eds Trommershauser, J., Kording, K. & Landy, M. S.) 224-250 (2012).
34. Wismeijer, D. A., Gegenfurtner, K. R. & Drewing, K. Learning from vision-to-touch is different than learning from touch-to-vision. Front. Integr. Neurosci. 6, 105, doi: 10.3389/fnint.2012.00105 (2012).
35. Adams, W., Graf, E. & Ernst, M. Experience can change the 'light-from-above' prior. Nat. Neurosci. 7, 1057-1058, doi: 10.1038/nn1312 (2004).
36. Kerrigan, I. S. & Adams, W. J. Learning different light prior distributions for different contexts. Cognition 127, 99-104, doi: 10.1016/j.cognition.2012.12.011 (2013).
37. Adams, W., Kerrigan, I. & Graf, E. Efficient visual recalibration from either visual or haptic feedback: the importance of being wrong. J. Neurosci. 30, 14745-14749, doi: 10.1523/JNEUROSCI.2749-10.2010 (2010).
38. Adelson, E., Rogowitz, B. & Pappas, T. On seeing stuff: the perception of materials by humans and machines. Proc. SPIE: Human Vis. Electron. Imag. VI 4299, 1-12, doi: 10.1117/12.429489 (2001).
39. Doerschner, K. et al. Visual motion and the perception of surface material. Curr. Biol. 21, 2010-2016, doi: 10.1016/j.cub.2011.10.036 (2011).
40. Fleming, R., Torralba, A. & Adelson, E. Specular reflections and the perception of shape. J. Vision 4, 798-820, doi: 10.1167/4.9.10 (2004).
41. Anderson, B. Visual perception of materials and surfaces. Curr. Biol. 21, R978-R983 (2011).
42. Adams, W. J. & Elder, J. H. Effects of specular highlights on perceived surface convexity. PLOS Comput. Biol. 10, doi: 10.1371/journal.pcbi.1003576 (2014).
43. Krim, J. Surface science and the atomic-scale origins of friction: what once was old is new again. Surf. Sci. 500, 741-758, doi: 10.1016/S0039-6028(01)01529-1 (2002).
44. Wilson, S. & Hutley, M. The optical properties of 'moth eye' antireflection surfaces. Opt. Acta 29, 993-1009 (1982).
45. Maia, R., D'Alba, L. & Shawkey, M. What makes a feather shine? A nanostructural basis for glossy black colours in feathers. Proc. Roy. Soc. B-Biol. Sci. 278, 1973-1980, doi: 10.1098/rspb.2010.1637 (2011).
46. Joh, A., Adolph, K., Campbell, M. & Eppler, M. Why walkers slip: shine is not a reliable cue for slippery ground. Percept. Psychophys. 68, 339-352, doi: 10.3758/BF03193681 (2006).
47. Lesch, M., Chang, W. & Chang, C. Visually based perceptions of slipperiness: underlying cues, consistency and relationship to coefficient of friction. Ergonomics 51, 1973-1983, doi: 10.1080/00140130802558979 (2008).
48. Perlin, K. Improving noise. ACM Trans. Graphic. 21, 681-682 (2002).
49. Debevec, P. Rendering synthetic objects into real scenes: bridging traditional and image-based graphics with global illumination and high dynamic range photography. Comp. Graph. Proc. SIGGRAPH 98, 189-198 (1998).
Author Contributions

W.J.A., I.S.K. and E.W.G. wrote the manuscript and contributed to experimental design. W.J.A. and I.S.K. conducted the experiments and analysed the data.
Figure 1. Experiment 1 stimuli. (a) The visual-haptic set-up and an example stimulus. (b) Stimuli for Type A trials lay along the negative diagonal, i.e. visual gloss and haptic rubberiness were negatively correlated. An example trial is shown to the right; see Methods for details. (c) The stimulus axis for Type B trials, with an example trial.

Figure 2. Experiment 1 predictions and results. (a) Hypothetical discrimination performance for Type A (red) and Type B (green) bi-modal stimuli (i) in the absence of integration (left), (ii) given integration driven by a negative gloss-rubberiness correlation (middle), or (iii) given integration driven by a positive correlation (right). (b) Error rates and fits for Type A and B trials for one naïve observer. (c) Left: averaged (N = 8) threshold data (mean ± 1 SE). Right: difference between Type A and Type B thresholds for all 8 observers.

Figure 3. Stimulus ratings, averaged across observers, for (a) perceived gloss, (b) perceived friction and (c) perceived compliance. Yellow indicates high ratings; blue indicates low ratings, as indicated by the legend. Dashed contour lines show stimulus parameter pairings that produce equal ratings, as determined by the optimal regression models. (d) Mean correlation between each stimulus parameter (gloss: 'Gl', friction: 'Fr' and compliance: 'Co') and each of the three rating scales (each rating type shown in a separate plot). Error bars give ±1 SE across observers. Asterisks show significant correlations, from one-sample t-tests against 0 (*p < 0.05, **p < 0.01, ***p < 0.001).

Figure 4. Visual and haptic uni-modal discrimination trials were used to measure each observer's uni-modal JNDs, and thus define the visual and haptic stimulus parameter space to be used for bi-modal trials.

Figure 5. Visual-haptic integration with different coupling priors. Each row shows a likelihood (left column: the information associated with a particular visual-haptic signal), a coupling prior (middle column) and their product, the posterior (right column). Coupling priors in different rows reflect (i) no relationship between visual gloss and haptic rubberiness (top row), (ii) a weak (rows 2-3) or strong (row 4) negative correlation, or (iii) a weak positive correlation (rows 5-6). The effect of different coupling priors can be seen for stimuli lying on the axis of the negative prior (such as $C_A$), or lying on the axis of the positive prior ($C_B$). For comparison, the blue ring represents a 'standard' stimulus: an object in the middle of the visual-haptic space.

Figure 6. Predicted discrimination performance, given a negative coupling prior. The proportion of correct trials is shown as a function of the distance between standard and comparison stimuli, in units of uni-modal JNDs. Each plot shows a different coupling prior, varying from infinitely broad (no integration) to infinitely narrow (full integration).

Table 1. Visual and haptic stimulus parameters. Shaded cells show the parameters of the standard stimuli used in both uni- and bi-modal trials.

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Pmc/Curation
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000650 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd -nk 000650 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Pmc
   |étape=   Curation
   |type=    RBID
   |clé=     PMC:4768155
   |texte=   Touch influences perceived gloss
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Curation/RBID.i   -Sk "pubmed:26915492" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024