Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Cross-Modal Sensory Integration of Visual-Tactile Motion Information: Instrument Design and Human Psychophysics

Internal identifier: 002499 (Pmc/Curation); previous: 002498; next: 002500


Authors: Yu-Cheng Pei [Taiwan]; Ting-Yu Chang; Tsung-Chi Lee; Sudipta Saha; Hsin-Yi Lai; Manuel Gomez-Ramirez; Shih-Wei Chou; Alice M. K. Wong

Source:

RBID : PMC:3715219

Abstract

Information obtained from multiple sensory modalities, such as vision and touch, is integrated to yield a holistic percept. As a haptic approach usually involves cross-modal sensory experiences, it is necessary to develop an apparatus that can characterize how a biological system integrates visual-tactile sensory information as well as how a robotic device infers object information emanating from both vision and touch. In the present study, we develop a novel visual-tactile cross-modal integration stimulator that consists of an LED panel to present visual stimuli and a tactile stimulator with three degrees of freedom that can present tactile motion stimuli with arbitrary motion direction, speed, and indentation depth in the skin. The apparatus can present cross-modal stimuli in which the spatial locations of visual and tactile stimulations are perfectly aligned. We presented visual-tactile stimuli in which the visual and tactile directions were either congruent or incongruent, and human observers reported the perceived visual direction of motion. Results showed that perceived direction of visual motion can be biased by the direction of tactile motion when visual signals are weakened. The results also showed that the visual-tactile motion integration follows the rule of temporal congruency of multi-modal inputs, a fundamental property known for cross-modal integration.


URL:
DOI: 10.3390/s130607212
PubMed: 23727955
PubMed Central: 3715219

Links to previous steps (curation, corpus...)


Links to Exploration step

PMC:3715219

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Cross-Modal Sensory Integration of Visual-Tactile Motion Information: Instrument Design and Human Psychophysics</title>
<author>
<name sortKey="Pei, Yu Cheng" sort="Pei, Yu Cheng" uniqKey="Pei Y" first="Yu-Cheng" last="Pei">Yu-Cheng Pei</name>
<affiliation>
<nlm:aff id="af1-sensors-13-07212"> Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan; E-Mails:
<email>taiwan.changty@gmail.com</email>
(T.-Y.C.);
<email>toasty1041@gmail.com</email>
(T.-C.L.);
<email>sudiptasaha49@yahoo.co.in</email>
(S.S.);
<email>happydry@ms36.hinet.net</email>
(H.-Y.L.);
<email>yesyesweiwei@gmail.com</email>
(S.-W.C.);
<email>walice@adm.cgmh.org.tw</email>
(A.M.K.W.)</nlm:aff>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="af2-sensors-13-07212"> Healthy Aging Research Center, Chang Gung University, No. 259, Wen-Hwa 1st Road, Taoyuan 333, Taiwan</nlm:aff>
<country xml:lang="fr">Taïwan</country>
<wicri:regionArea> Healthy Aging Research Center, Chang Gung University, No. 259, Wen-Hwa 1st Road, Taoyuan 333</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="af3-sensors-13-07212"> School of Medicine, Chang Gung University, No. 259, Wen-Hwa 1st Road, Taoyuan 333, Taiwan</nlm:aff>
<country xml:lang="fr">Taïwan</country>
<wicri:regionArea> School of Medicine, Chang Gung University, No. 259, Wen-Hwa 1st Road, Taoyuan 333</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Chang, Ting Yu" sort="Chang, Ting Yu" uniqKey="Chang T" first="Ting-Yu" last="Chang">Ting-Yu Chang</name>
<affiliation>
<nlm:aff id="af1-sensors-13-07212"> Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan; E-Mails:
<email>taiwan.changty@gmail.com</email>
(T.-Y.C.);
<email>toasty1041@gmail.com</email>
(T.-C.L.);
<email>sudiptasaha49@yahoo.co.in</email>
(S.S.);
<email>happydry@ms36.hinet.net</email>
(H.-Y.L.);
<email>yesyesweiwei@gmail.com</email>
(S.-W.C.);
<email>walice@adm.cgmh.org.tw</email>
(A.M.K.W.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Lee, Tsung Chi" sort="Lee, Tsung Chi" uniqKey="Lee T" first="Tsung-Chi" last="Lee">Tsung-Chi Lee</name>
<affiliation>
<nlm:aff id="af1-sensors-13-07212"> Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan; E-Mails:
<email>taiwan.changty@gmail.com</email>
(T.-Y.C.);
<email>toasty1041@gmail.com</email>
(T.-C.L.);
<email>sudiptasaha49@yahoo.co.in</email>
(S.S.);
<email>happydry@ms36.hinet.net</email>
(H.-Y.L.);
<email>yesyesweiwei@gmail.com</email>
(S.-W.C.);
<email>walice@adm.cgmh.org.tw</email>
(A.M.K.W.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Saha, Sudipta" sort="Saha, Sudipta" uniqKey="Saha S" first="Sudipta" last="Saha">Sudipta Saha</name>
<affiliation>
<nlm:aff id="af1-sensors-13-07212"> Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan; E-Mails:
<email>taiwan.changty@gmail.com</email>
(T.-Y.C.);
<email>toasty1041@gmail.com</email>
(T.-C.L.);
<email>sudiptasaha49@yahoo.co.in</email>
(S.S.);
<email>happydry@ms36.hinet.net</email>
(H.-Y.L.);
<email>yesyesweiwei@gmail.com</email>
(S.-W.C.);
<email>walice@adm.cgmh.org.tw</email>
(A.M.K.W.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Lai, Hsin Yi" sort="Lai, Hsin Yi" uniqKey="Lai H" first="Hsin-Yi" last="Lai">Hsin-Yi Lai</name>
<affiliation>
<nlm:aff id="af1-sensors-13-07212"> Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan; E-Mails:
<email>taiwan.changty@gmail.com</email>
(T.-Y.C.);
<email>toasty1041@gmail.com</email>
(T.-C.L.);
<email>sudiptasaha49@yahoo.co.in</email>
(S.S.);
<email>happydry@ms36.hinet.net</email>
(H.-Y.L.);
<email>yesyesweiwei@gmail.com</email>
(S.-W.C.);
<email>walice@adm.cgmh.org.tw</email>
(A.M.K.W.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Gomez Ramirez, Manuel" sort="Gomez Ramirez, Manuel" uniqKey="Gomez Ramirez M" first="Manuel" last="Gomez-Ramirez">Manuel Gomez-Ramirez</name>
<affiliation>
<nlm:aff id="af4-sensors-13-07212"> The Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, 3400 N. Charles Street 338 Krieger Hall, Baltimore, MD 21218, USA; E-Mail:
<email>gomezramirezm@jhu.edu</email>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Chou, Shih Wei" sort="Chou, Shih Wei" uniqKey="Chou S" first="Shih-Wei" last="Chou">Shih-Wei Chou</name>
<affiliation>
<nlm:aff id="af1-sensors-13-07212"> Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan; E-Mails:
<email>taiwan.changty@gmail.com</email>
(T.-Y.C.);
<email>toasty1041@gmail.com</email>
(T.-C.L.);
<email>sudiptasaha49@yahoo.co.in</email>
(S.S.);
<email>happydry@ms36.hinet.net</email>
(H.-Y.L.);
<email>yesyesweiwei@gmail.com</email>
(S.-W.C.);
<email>walice@adm.cgmh.org.tw</email>
(A.M.K.W.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Wong, Alice M K" sort="Wong, Alice M K" uniqKey="Wong A" first="Alice M. K." last="Wong">Alice M. K. Wong</name>
<affiliation>
<nlm:aff id="af1-sensors-13-07212"> Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan; E-Mails:
<email>taiwan.changty@gmail.com</email>
(T.-Y.C.);
<email>toasty1041@gmail.com</email>
(T.-C.L.);
<email>sudiptasaha49@yahoo.co.in</email>
(S.S.);
<email>happydry@ms36.hinet.net</email>
(H.-Y.L.);
<email>yesyesweiwei@gmail.com</email>
(S.-W.C.);
<email>walice@adm.cgmh.org.tw</email>
(A.M.K.W.)</nlm:aff>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">23727955</idno>
<idno type="pmc">3715219</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3715219</idno>
<idno type="RBID">PMC:3715219</idno>
<idno type="doi">10.3390/s130607212</idno>
<date when="2013">2013</date>
<idno type="wicri:Area/Pmc/Corpus">002499</idno>
<idno type="wicri:Area/Pmc/Curation">002499</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Cross-Modal Sensory Integration of Visual-Tactile Motion Information: Instrument Design and Human Psychophysics</title>
<author>
<name sortKey="Pei, Yu Cheng" sort="Pei, Yu Cheng" uniqKey="Pei Y" first="Yu-Cheng" last="Pei">Yu-Cheng Pei</name>
<affiliation>
<nlm:aff id="af1-sensors-13-07212"> Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan; E-Mails:
<email>taiwan.changty@gmail.com</email>
(T.-Y.C.);
<email>toasty1041@gmail.com</email>
(T.-C.L.);
<email>sudiptasaha49@yahoo.co.in</email>
(S.S.);
<email>happydry@ms36.hinet.net</email>
(H.-Y.L.);
<email>yesyesweiwei@gmail.com</email>
(S.-W.C.);
<email>walice@adm.cgmh.org.tw</email>
(A.M.K.W.)</nlm:aff>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="af2-sensors-13-07212"> Healthy Aging Research Center, Chang Gung University, No. 259, Wen-Hwa 1st Road, Taoyuan 333, Taiwan</nlm:aff>
<country xml:lang="fr">Taïwan</country>
<wicri:regionArea> Healthy Aging Research Center, Chang Gung University, No. 259, Wen-Hwa 1st Road, Taoyuan 333</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="af3-sensors-13-07212"> School of Medicine, Chang Gung University, No. 259, Wen-Hwa 1st Road, Taoyuan 333, Taiwan</nlm:aff>
<country xml:lang="fr">Taïwan</country>
<wicri:regionArea> School of Medicine, Chang Gung University, No. 259, Wen-Hwa 1st Road, Taoyuan 333</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Chang, Ting Yu" sort="Chang, Ting Yu" uniqKey="Chang T" first="Ting-Yu" last="Chang">Ting-Yu Chang</name>
<affiliation>
<nlm:aff id="af1-sensors-13-07212"> Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan; E-Mails:
<email>taiwan.changty@gmail.com</email>
(T.-Y.C.);
<email>toasty1041@gmail.com</email>
(T.-C.L.);
<email>sudiptasaha49@yahoo.co.in</email>
(S.S.);
<email>happydry@ms36.hinet.net</email>
(H.-Y.L.);
<email>yesyesweiwei@gmail.com</email>
(S.-W.C.);
<email>walice@adm.cgmh.org.tw</email>
(A.M.K.W.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Lee, Tsung Chi" sort="Lee, Tsung Chi" uniqKey="Lee T" first="Tsung-Chi" last="Lee">Tsung-Chi Lee</name>
<affiliation>
<nlm:aff id="af1-sensors-13-07212"> Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan; E-Mails:
<email>taiwan.changty@gmail.com</email>
(T.-Y.C.);
<email>toasty1041@gmail.com</email>
(T.-C.L.);
<email>sudiptasaha49@yahoo.co.in</email>
(S.S.);
<email>happydry@ms36.hinet.net</email>
(H.-Y.L.);
<email>yesyesweiwei@gmail.com</email>
(S.-W.C.);
<email>walice@adm.cgmh.org.tw</email>
(A.M.K.W.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Saha, Sudipta" sort="Saha, Sudipta" uniqKey="Saha S" first="Sudipta" last="Saha">Sudipta Saha</name>
<affiliation>
<nlm:aff id="af1-sensors-13-07212"> Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan; E-Mails:
<email>taiwan.changty@gmail.com</email>
(T.-Y.C.);
<email>toasty1041@gmail.com</email>
(T.-C.L.);
<email>sudiptasaha49@yahoo.co.in</email>
(S.S.);
<email>happydry@ms36.hinet.net</email>
(H.-Y.L.);
<email>yesyesweiwei@gmail.com</email>
(S.-W.C.);
<email>walice@adm.cgmh.org.tw</email>
(A.M.K.W.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Lai, Hsin Yi" sort="Lai, Hsin Yi" uniqKey="Lai H" first="Hsin-Yi" last="Lai">Hsin-Yi Lai</name>
<affiliation>
<nlm:aff id="af1-sensors-13-07212"> Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan; E-Mails:
<email>taiwan.changty@gmail.com</email>
(T.-Y.C.);
<email>toasty1041@gmail.com</email>
(T.-C.L.);
<email>sudiptasaha49@yahoo.co.in</email>
(S.S.);
<email>happydry@ms36.hinet.net</email>
(H.-Y.L.);
<email>yesyesweiwei@gmail.com</email>
(S.-W.C.);
<email>walice@adm.cgmh.org.tw</email>
(A.M.K.W.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Gomez Ramirez, Manuel" sort="Gomez Ramirez, Manuel" uniqKey="Gomez Ramirez M" first="Manuel" last="Gomez-Ramirez">Manuel Gomez-Ramirez</name>
<affiliation>
<nlm:aff id="af4-sensors-13-07212"> The Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, 3400 N. Charles Street 338 Krieger Hall, Baltimore, MD 21218, USA; E-Mail:
<email>gomezramirezm@jhu.edu</email>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Chou, Shih Wei" sort="Chou, Shih Wei" uniqKey="Chou S" first="Shih-Wei" last="Chou">Shih-Wei Chou</name>
<affiliation>
<nlm:aff id="af1-sensors-13-07212"> Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan; E-Mails:
<email>taiwan.changty@gmail.com</email>
(T.-Y.C.);
<email>toasty1041@gmail.com</email>
(T.-C.L.);
<email>sudiptasaha49@yahoo.co.in</email>
(S.S.);
<email>happydry@ms36.hinet.net</email>
(H.-Y.L.);
<email>yesyesweiwei@gmail.com</email>
(S.-W.C.);
<email>walice@adm.cgmh.org.tw</email>
(A.M.K.W.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Wong, Alice M K" sort="Wong, Alice M K" uniqKey="Wong A" first="Alice M. K." last="Wong">Alice M. K. Wong</name>
<affiliation>
<nlm:aff id="af1-sensors-13-07212"> Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan; E-Mails:
<email>taiwan.changty@gmail.com</email>
(T.-Y.C.);
<email>toasty1041@gmail.com</email>
(T.-C.L.);
<email>sudiptasaha49@yahoo.co.in</email>
(S.S.);
<email>happydry@ms36.hinet.net</email>
(H.-Y.L.);
<email>yesyesweiwei@gmail.com</email>
(S.-W.C.);
<email>walice@adm.cgmh.org.tw</email>
(A.M.K.W.)</nlm:aff>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Sensors (Basel, Switzerland)</title>
<idno type="eISSN">1424-8220</idno>
<imprint>
<date when="2013">2013</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>Information obtained from multiple sensory modalities, such as vision and touch, is integrated to yield a holistic percept. As a haptic approach usually involves cross-modal sensory experiences, it is necessary to develop an apparatus that can characterize how a biological system integrates visual-tactile sensory information as well as how a robotic device infers object information emanating from both vision and touch. In the present study, we develop a novel visual-tactile cross-modal integration stimulator that consists of an LED panel to present visual stimuli and a tactile stimulator with three degrees of freedom that can present tactile motion stimuli with arbitrary motion direction, speed, and indentation depth in the skin. The apparatus can present cross-modal stimuli in which the spatial locations of visual and tactile stimulations are perfectly aligned. We presented visual-tactile stimuli in which the visual and tactile directions were either congruent or incongruent, and human observers reported the perceived visual direction of motion. Results showed that perceived direction of visual motion can be biased by the direction of tactile motion when visual signals are weakened. The results also showed that the visual-tactile motion integration follows the rule of temporal congruency of multi-modal inputs, a fundamental property known for cross-modal integration.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Stein, B E" uniqKey="Stein B">B.E. Stein</name>
</author>
<author>
<name sortKey="Stanford, T R" uniqKey="Stanford T">T.R. Stanford</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Beauchamp, M S" uniqKey="Beauchamp M">M.S. Beauchamp</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ernst, M O" uniqKey="Ernst M">M.O. Ernst</name>
</author>
<author>
<name sortKey="Bulthoff, H H" uniqKey="Bulthoff H">H.H. Bulthoff</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mcgurk, H" uniqKey="Mcgurk H">H. McGurk</name>
</author>
<author>
<name sortKey="Macdonald, J" uniqKey="Macdonald J">J. MacDonald</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sekuler, R" uniqKey="Sekuler R">R. Sekuler</name>
</author>
<author>
<name sortKey="Sekuler, A B" uniqKey="Sekuler A">A.B. Sekuler</name>
</author>
<author>
<name sortKey="Lau, R" uniqKey="Lau R">R. Lau</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Guest, S" uniqKey="Guest S">S. Guest</name>
</author>
<author>
<name sortKey="Catmur, C" uniqKey="Catmur C">C. Catmur</name>
</author>
<author>
<name sortKey="Lloyd, D" uniqKey="Lloyd D">D. Lloyd</name>
</author>
<author>
<name sortKey="Spence, C" uniqKey="Spence C">C. Spence</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ernst, M O" uniqKey="Ernst M">M.O. Ernst</name>
</author>
<author>
<name sortKey="Banks, M S" uniqKey="Banks M">M.S. Banks</name>
</author>
<author>
<name sortKey="Bulthoff, H H" uniqKey="Bulthoff H">H.H. Bulthoff</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Spence, C" uniqKey="Spence C">C. Spence</name>
</author>
<author>
<name sortKey="Pavani, F" uniqKey="Pavani F">F. Pavani</name>
</author>
<author>
<name sortKey="Driver, J" uniqKey="Driver J">J. Driver</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kennett, S" uniqKey="Kennett S">S. Kennett</name>
</author>
<author>
<name sortKey="Spence, C" uniqKey="Spence C">C. Spence</name>
</author>
<author>
<name sortKey="Driver, J" uniqKey="Driver J">J. Driver</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hyv Rinen, J" uniqKey="Hyv Rinen J">J. Hyvärinen</name>
</author>
<author>
<name sortKey="Shelepin, Y" uniqKey="Shelepin Y">Y. Shelepin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hyv Rinen, J" uniqKey="Hyv Rinen J">J. Hyvärinen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Avillac, M" uniqKey="Avillac M">M. Avillac</name>
</author>
<author>
<name sortKey="Deneve, S" uniqKey="Deneve S">S. Deneve</name>
</author>
<author>
<name sortKey="Olivier, E" uniqKey="Olivier E">E. Olivier</name>
</author>
<author>
<name sortKey="Pouget, A" uniqKey="Pouget A">A. Pouget</name>
</author>
<author>
<name sortKey="Duhamel, J R" uniqKey="Duhamel J">J.R. Duhamel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Carter, O" uniqKey="Carter O">O. Carter</name>
</author>
<author>
<name sortKey="Konkle, T" uniqKey="Konkle T">T. Konkle</name>
</author>
<author>
<name sortKey="Wang, Q" uniqKey="Wang Q">Q. Wang</name>
</author>
<author>
<name sortKey="Hayward, V" uniqKey="Hayward V">V. Hayward</name>
</author>
<author>
<name sortKey="Moore, C" uniqKey="Moore C">C. Moore</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Konkle, T" uniqKey="Konkle T">T. Konkle</name>
</author>
<author>
<name sortKey="Wang, Q" uniqKey="Wang Q">Q. Wang</name>
</author>
<author>
<name sortKey="Hayward, V" uniqKey="Hayward V">V. Hayward</name>
</author>
<author>
<name sortKey="Moore, C I" uniqKey="Moore C">C.I. Moore</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bensmaia, S J" uniqKey="Bensmaia S">S.J. Bensmaia</name>
</author>
<author>
<name sortKey="Killebrew, J H" uniqKey="Killebrew J">J.H. Killebrew</name>
</author>
<author>
<name sortKey="Craig, J C" uniqKey="Craig J">J.C. Craig</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Blake, R" uniqKey="Blake R">R. Blake</name>
</author>
<author>
<name sortKey="Sobel, K V" uniqKey="Sobel K">K.V. Sobel</name>
</author>
<author>
<name sortKey="James, T W" uniqKey="James T">T.W. James</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Shore, D I" uniqKey="Shore D">D.I. Shore</name>
</author>
<author>
<name sortKey="Barnes, M E" uniqKey="Barnes M">M.E. Barnes</name>
</author>
<author>
<name sortKey="Spence, C" uniqKey="Spence C">C. Spence</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pei, Y C" uniqKey="Pei Y">Y.C. Pei</name>
</author>
<author>
<name sortKey="Hsiao, S S" uniqKey="Hsiao S">S.S. Hsiao</name>
</author>
<author>
<name sortKey="Bensmaia, S J" uniqKey="Bensmaia S">S.J. Bensmaia</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pei, Y C" uniqKey="Pei Y">Y.C. Pei</name>
</author>
<author>
<name sortKey="Hsiao, S S" uniqKey="Hsiao S">S.S. Hsiao</name>
</author>
<author>
<name sortKey="Craig, J C" uniqKey="Craig J">J.C. Craig</name>
</author>
<author>
<name sortKey="Bensmaia, S J" uniqKey="Bensmaia S">S.J. Bensmaia</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Killebrew, J H" uniqKey="Killebrew J">J.H. Killebrew</name>
</author>
<author>
<name sortKey="Bensmaia, S J" uniqKey="Bensmaia S">S.J. Bensmaia</name>
</author>
<author>
<name sortKey="Dammann, J F" uniqKey="Dammann J">J.F. Dammann</name>
</author>
<author>
<name sortKey="Denchev, P" uniqKey="Denchev P">P. Denchev</name>
</author>
<author>
<name sortKey="Hsiao, S S" uniqKey="Hsiao S">S.S. Hsiao</name>
</author>
<author>
<name sortKey="Craig, J C" uniqKey="Craig J">J.C. Craig</name>
</author>
<author>
<name sortKey="Johnson, K O" uniqKey="Johnson K">K.O. Johnson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gori, M" uniqKey="Gori M">M. Gori</name>
</author>
<author>
<name sortKey="Mazzilli, G" uniqKey="Mazzilli G">G. Mazzilli</name>
</author>
<author>
<name sortKey="Sandini, G" uniqKey="Sandini G">G. Sandini</name>
</author>
<author>
<name sortKey="Burr, D" uniqKey="Burr D">D. Burr</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Johnson, K O" uniqKey="Johnson K">K.O. Johnson</name>
</author>
<author>
<name sortKey="Phillips, J R" uniqKey="Phillips J">J.R. Phillips</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Essick, G K" uniqKey="Essick G">G.K. Essick</name>
</author>
<author>
<name sortKey="Whitsel, B L" uniqKey="Whitsel B">B.L. Whitsel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Meredith, M A" uniqKey="Meredith M">M.A. Meredith</name>
</author>
<author>
<name sortKey="Nemitz, J W" uniqKey="Nemitz J">J.W. Nemitz</name>
</author>
<author>
<name sortKey="Stein, B E" uniqKey="Stein B">B.E. Stein</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Holmes, N P" uniqKey="Holmes N">N.P. Holmes</name>
</author>
<author>
<name sortKey="Spence, C" uniqKey="Spence C">C. Spence</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ernst, M O" uniqKey="Ernst M">M.O. Ernst</name>
</author>
<author>
<name sortKey="Banks, M S" uniqKey="Banks M">M.S. Banks</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Meredith, M A" uniqKey="Meredith M">M.A. Meredith</name>
</author>
<author>
<name sortKey="Stein, B E" uniqKey="Stein B">B.E. Stein</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Meredith, M A" uniqKey="Meredith M">M.A. Meredith</name>
</author>
<author>
<name sortKey="Stein, B E" uniqKey="Stein B">B.E. Stein</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Morgan, M L" uniqKey="Morgan M">M.L. Morgan</name>
</author>
<author>
<name sortKey="Deangelis, G C" uniqKey="Deangelis G">G.C. DeAngelis</name>
</author>
<author>
<name sortKey="Angelaki, D E" uniqKey="Angelaki D">D.E. Angelaki</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fetsch, C R" uniqKey="Fetsch C">C.R. Fetsch</name>
</author>
<author>
<name sortKey="Pouget, A" uniqKey="Pouget A">A. Pouget</name>
</author>
<author>
<name sortKey="Deangelis, G C" uniqKey="Deangelis G">G.C. DeAngelis</name>
</author>
<author>
<name sortKey="Angelaki, D E" uniqKey="Angelaki D">D.E. Angelaki</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hillis, J M" uniqKey="Hillis J">J.M. Hillis</name>
</author>
<author>
<name sortKey="Watt, S J" uniqKey="Watt S">S.J. Watt</name>
</author>
<author>
<name sortKey="Landy, M S" uniqKey="Landy M">M.S. Landy</name>
</author>
<author>
<name sortKey="Banks, M S" uniqKey="Banks M">M.S. Banks</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jacobs, R A" uniqKey="Jacobs R">R.A. Jacobs</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Alais, D" uniqKey="Alais D">D. Alais</name>
</author>
<author>
<name sortKey="Burr, D" uniqKey="Burr D">D. Burr</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Butz, M V" uniqKey="Butz M">M.V. Butz</name>
</author>
<author>
<name sortKey="Thomaschke, R" uniqKey="Thomaschke R">R. Thomaschke</name>
</author>
<author>
<name sortKey="Linhardt, M J" uniqKey="Linhardt M">M.J. Linhardt</name>
</author>
<author>
<name sortKey="Herbort, O" uniqKey="Herbort O">O. Herbort</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hagen, M C" uniqKey="Hagen M">M.C. Hagen</name>
</author>
<author>
<name sortKey="Franzen, O" uniqKey="Franzen O">O. Franzen</name>
</author>
<author>
<name sortKey="Mcglone, F" uniqKey="Mcglone F">F. McGlone</name>
</author>
<author>
<name sortKey="Essick, G" uniqKey="Essick G">G. Essick</name>
</author>
<author>
<name sortKey="Dancer, C" uniqKey="Dancer C">C. Dancer</name>
</author>
<author>
<name sortKey="Pardo, J V" uniqKey="Pardo J">J.V. Pardo</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Beauchamp, M S" uniqKey="Beauchamp M">M.S. Beauchamp</name>
</author>
<author>
<name sortKey="Yasar, N E" uniqKey="Yasar N">N.E. Yasar</name>
</author>
<author>
<name sortKey="Kishan, N" uniqKey="Kishan N">N. Kishan</name>
</author>
<author>
<name sortKey="Ro, T" uniqKey="Ro T">T. Ro</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="De Gelder, B" uniqKey="De Gelder B">B. De Gelder</name>
</author>
<author>
<name sortKey="Bertelson, P" uniqKey="Bertelson P">P. Bertelson</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">Sensors (Basel)</journal-id>
<journal-id journal-id-type="iso-abbrev">Sensors (Basel)</journal-id>
<journal-title-group>
<journal-title>Sensors (Basel, Switzerland)</journal-title>
</journal-title-group>
<issn pub-type="epub">1424-8220</issn>
<publisher>
<publisher-name>Molecular Diversity Preservation International (MDPI)</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">23727955</article-id>
<article-id pub-id-type="pmc">3715219</article-id>
<article-id pub-id-type="doi">10.3390/s130607212</article-id>
<article-id pub-id-type="publisher-id">sensors-13-07212</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Cross-Modal Sensory Integration of Visual-Tactile Motion Information: Instrument Design and Human Psychophysics</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Pei</surname>
<given-names>Yu-Cheng</given-names>
</name>
<xref ref-type="aff" rid="af1-sensors-13-07212">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="af2-sensors-13-07212">
<sup>2</sup>
</xref>
<xref ref-type="aff" rid="af3-sensors-13-07212">
<sup>3</sup>
</xref>
<xref rid="c1-sensors-13-07212" ref-type="corresp">
<sup>*</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Chang</surname>
<given-names>Ting-Yu</given-names>
</name>
<xref ref-type="aff" rid="af1-sensors-13-07212">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Lee</surname>
<given-names>Tsung-Chi</given-names>
</name>
<xref ref-type="aff" rid="af1-sensors-13-07212">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Saha</surname>
<given-names>Sudipta</given-names>
</name>
<xref ref-type="aff" rid="af1-sensors-13-07212">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Lai</surname>
<given-names>Hsin-Yi</given-names>
</name>
<xref ref-type="aff" rid="af1-sensors-13-07212">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Gomez-Ramirez</surname>
<given-names>Manuel</given-names>
</name>
<xref ref-type="aff" rid="af4-sensors-13-07212">
<sup>4</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Chou</surname>
<given-names>Shih-Wei</given-names>
</name>
<xref ref-type="aff" rid="af1-sensors-13-07212">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Wong</surname>
<given-names>Alice M. K.</given-names>
</name>
<xref ref-type="aff" rid="af1-sensors-13-07212">
<sup>1</sup>
</xref>
</contrib>
</contrib-group>
<aff id="af1-sensors-13-07212">
<label>1</label>
Department of Physical Medicine and Rehabilitation, Chang Gung Memorial Hospital at Linkou, No. 5, Fushing St, Taoyuan 333, Taiwan; E-Mails:
<email>taiwan.changty@gmail.com</email>
(T.-Y.C.);
<email>toasty1041@gmail.com</email>
(T.-C.L.);
<email>sudiptasaha49@yahoo.co.in</email>
(S.S.);
<email>happydry@ms36.hinet.net</email>
(H.-Y.L.);
<email>yesyesweiwei@gmail.com</email>
(S.-W.C.);
<email>walice@adm.cgmh.org.tw</email>
(A.M.K.W.)</aff>
<aff id="af2-sensors-13-07212">
<label>2</label>
Healthy Aging Research Center, Chang Gung University, No. 259, Wen-Hwa 1st Road, Taoyuan 333, Taiwan</aff>
<aff id="af3-sensors-13-07212">
<label>3</label>
School of Medicine, Chang Gung University, No. 259, Wen-Hwa 1st Road, Taoyuan 333, Taiwan</aff>
<aff id="af4-sensors-13-07212">
<label>4</label>
The Zanvyl Krieger Mind/Brain Institute, Johns Hopkins University, 3400 N. Charles Street 338 Krieger Hall, Baltimore, MD 21218, USA; E-Mail:
<email>gomezramirezm@jhu.edu</email>
</aff>
<author-notes>
<corresp id="c1-sensors-13-07212">
<label>*</label>
Author to whom correspondence should be addressed; E-Mail:
<email>yspeii@adm.cgmh.org.tw</email>
; Tel.: +886-33281200 (ext. 8146); Fax: +886-33281200 (ext. 2667).</corresp>
</author-notes>
<pub-date pub-type="collection">
<month>6</month>
<year>2013</year>
</pub-date>
<pub-date pub-type="epub">
<day>31</day>
<month>5</month>
<year>2013</year>
</pub-date>
<volume>13</volume>
<issue>6</issue>
<fpage>7212</fpage>
<lpage>7223</lpage>
<history>
<date date-type="received">
<day>18</day>
<month>3</month>
<year>2013</year>
</date>
<date date-type="rev-recd">
<day>22</day>
<month>5</month>
<year>2013</year>
</date>
<date date-type="accepted">
<day>23</day>
<month>5</month>
<year>2013</year>
</date>
</history>
<permissions>
<copyright-statement>© 2013 by the authors; licensee MDPI, Basel, Switzerland.</copyright-statement>
<copyright-year>2013</copyright-year>
<license>
<license-p>
<pmc-comment>CREATIVE COMMONS</pmc-comment>
This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by/3.0/">http://creativecommons.org/licenses/by/3.0/</ext-link>
).</license-p>
</license>
</permissions>
<abstract>
<p>Information obtained from multiple sensory modalities, such as vision and touch, is integrated to yield a holistic percept. As a haptic approach usually involves cross-modal sensory experiences, it is necessary to develop an apparatus that can characterize how a biological system integrates visual-tactile sensory information as well as how a robotic device infers object information emanating from both vision and touch. In the present study, we develop a novel visual-tactile cross-modal integration stimulator that consists of an LED panel to present visual stimuli and a tactile stimulator with three degrees of freedom that can present tactile motion stimuli with arbitrary motion direction, speed, and indentation depth in the skin. The apparatus can present cross-modal stimuli in which the spatial locations of visual and tactile stimulations are perfectly aligned. We presented visual-tactile stimuli in which the visual and tactile directions were either congruent or incongruent, and human observers reported the perceived visual direction of motion. Results showed that perceived direction of visual motion can be biased by the direction of tactile motion when visual signals are weakened. The results also showed that the visual-tactile motion integration follows the rule of temporal congruency of multi-modal inputs, a fundamental property known for cross-modal integration.</p>
</abstract>
<kwd-group>
<kwd>visual-tactile integration</kwd>
<kwd>direction of motion</kwd>
<kwd>congruency</kwd>
<kwd>haptic approach</kwd>
<kwd>tactile stimulator</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec>
<label>1.</label>
<title>Introduction</title>
<p>For a biological system, perception often requires information emanating from sensors of multiple modalities, such as vision, touch and audition [
<xref rid="b1-sensors-13-07212" ref-type="bibr">1</xref>
<xref rid="b3-sensors-13-07212" ref-type="bibr">3</xref>
]. For example, the interaction between audition and vision determines perceived speech [
<xref rid="b4-sensors-13-07212" ref-type="bibr">4</xref>
] and perceived timing of collision [
<xref rid="b5-sensors-13-07212" ref-type="bibr">5</xref>
]. Similarly, sound can influence the perceived roughness of a touched surface [
<xref rid="b6-sensors-13-07212" ref-type="bibr">6</xref>
] and can also affect perceived surface slant [
<xref rid="b7-sensors-13-07212" ref-type="bibr">7</xref>
].</p>
<p>Touch and vision are similar in that both sensory signals derive from a sheet of sensor arrays: cutaneous receptors in the skin and photoreceptors in the retina. Indeed, touch and vision are intuitively integrated to yield a holistic percept of the environment around us [
<xref rid="b8-sensors-13-07212" ref-type="bibr">8</xref>
,
<xref rid="b9-sensors-13-07212" ref-type="bibr">9</xref>
]. It has been hypothesized that cross-modal integration is processed in cortical regions that receive both visual and tactile signals [
<xref rid="b10-sensors-13-07212" ref-type="bibr">10</xref>
<xref rid="b12-sensors-13-07212" ref-type="bibr">12</xref>
] and it is of interest to understand where and how in the brain this occurs. Recent human psychophysical experiments have shed some light on these questions. Moore
<italic>et al.</italic>
reported a close interaction between saccade directions and the processing of tactile motion [
<xref rid="b13-sensors-13-07212" ref-type="bibr">13</xref>
] as well as motion after-effect transfer between touch and vision [
<xref rid="b14-sensors-13-07212" ref-type="bibr">14</xref>
], implying a hard-wired connection between tactile and visual systems for motion processing. Bensmaia
<italic>et al.</italic>
[
<xref rid="b15-sensors-13-07212" ref-type="bibr">15</xref>
] found that the perceived speed of tactile motion is influenced by the speed of a concurrent visual-motion stimulus, again indicating the existence of cross-modal integration between touch and vision. Blake
<italic>et al.</italic>
developed a visual sphere with visual ambiguity in its direction of rotation and results showed that touching the sphere disambiguates the visual percept [
<xref rid="b16-sensors-13-07212" ref-type="bibr">16</xref>
]. Finally, Shore
<italic>et al.</italic>
investigated the temporal constraints of the visual-tactile cross-modal congruency effect in an experiment in which vibrotactile targets were presented to the index finger or thumb of either hand while visual distractors were presented from one of four possible locations. Participants made speeded discrimination responses regarding the spatial location of the vibrotactile targets while ignoring the visual distractors. The results showed that the cross-modal effects were significant when visual and vibrotactile stimuli occurred within 100 ms [
<xref rid="b17-sensors-13-07212" ref-type="bibr">17</xref>
].</p>
<p>Visual-tactile motion integration has been explored using several apparatuses, with tactile-array stimulators comprising a matrix of linear motors or actuators being the most sophisticated devices [
<xref rid="b13-sensors-13-07212" ref-type="bibr">13</xref>
<xref rid="b15-sensors-13-07212" ref-type="bibr">15</xref>
]. The spatiotemporal indentation pattern from a population of independently moving motors can create simulated motion with arbitrary stimulus directions and contours, a property that is suitable for a variety of haptic experiments [
<xref rid="b18-sensors-13-07212" ref-type="bibr">18</xref>
,
<xref rid="b19-sensors-13-07212" ref-type="bibr">19</xref>
]. However, current array stimulators with a motor-array arrangement that is dense enough to exceed the innervation density of peripheral afferents in the human fingerpad (such as the 400-probe stimulator) [
<xref rid="b20-sensors-13-07212" ref-type="bibr">20</xref>
] are expensive and bulky. Consequently, most researchers use rotator motors with one degree of freedom, in which a rotating object is touched and the subject reports the direction of rotation or discriminates the surface speed [
<xref rid="b16-sensors-13-07212" ref-type="bibr">16</xref>
,
<xref rid="b21-sensors-13-07212" ref-type="bibr">21</xref>
]. Although the use of rotator motors has been a well-established method in somatosensory research [
<xref rid="b22-sensors-13-07212" ref-type="bibr">22</xref>
,
<xref rid="b23-sensors-13-07212" ref-type="bibr">23</xref>
], the commonly used apparatuses with rotator motors are limited by having only one degree of freedom (restricting motion to two opposite directions such as clockwise and counterclockwise) and the inability to control the indentation depth into the finger. Moreover, spatially aligning the positions of visual and tactile stimuli is difficult as both video monitors and tactile stimulators occupy substantial space. Here, we develop a novel visual-tactile cross-modal integration apparatus that consists of a visual display for presenting visual stimuli and a tactile stimulator with three degrees of freedom for presenting tactile-motion stimuli in arbitrary directions, speeds, and indentation depths. Additionally, this apparatus can present cross-modal stimulation in which the spatial locations of visual and tactile stimuli are perfectly aligned. Tactile stimulus saliency can be modulated by controlling the indentation depth and visual stimulus saliency can be modulated by adjusting the level of superimposed noise presented on the visual display.</p>
<p>Using this apparatus, we presented visual-tactile motion stimuli in which the directions of motion were either congruent or incongruent between sensory modalities, and participants reported the perceived visual direction of motion. The magnitude of visual-tactile integration was then gauged by the degree to which the perceived visual direction was biased toward the tactile direction of motion. We hypothesize that the direction of perceived visual motion can be biased by the direction of tactile motion when visual signals are weakened. We also hypothesize that visual-tactile motion integration follows the rule of temporal congruency of multi-modal inputs: the effect of multi-sensory integration is most robust when sensory information from multiple modalities coincides [
<xref rid="b1-sensors-13-07212" ref-type="bibr">1</xref>
,
<xref rid="b24-sensors-13-07212" ref-type="bibr">24</xref>
,
<xref rid="b25-sensors-13-07212" ref-type="bibr">25</xref>
]. Accordingly, we predicted that the integration effect would peak when visual and tactile stimuli were presented simultaneously.</p>
</sec>
<sec>
<label>2.</label>
<title>Experimental Section</title>
<sec>
<label>2.1.</label>
<title>Development of the Tactile Motion Stimulator</title>
<p>To present tactile motion stimuli in specified directions, speeds, and indentation depths we developed a tactile motion stimulator (
<xref rid="f1-sensors-13-07212" ref-type="fig">Figure 1(A)</xref>
) with three degrees of freedom, controlled by three five-phase step motors (PK545-B, Oriental Motor Co. Ltd., Tokyo, Japan). Each step motor has a basic step angle of 0.72°, a precision that meets the needs of our experiments. One step motor rotates the stimulus drum for producing motion (
<xref rid="f1-sensors-13-07212" ref-type="fig">Figure 1(A-a)</xref>
). A second motor controls the arm for orienting the direction of stimulator motion (
<xref rid="f1-sensors-13-07212" ref-type="fig">Figure 1(A-b)</xref>
). The third step motor has a lead screw shaft (screw pitch, 1 mm; diameter, 9 mm; length, 150 mm) that translates rotational motion into linear motion, adjusting the vertical excursion of the stimulator to control the depth of indentation into the skin (
<xref rid="f1-sensors-13-07212" ref-type="fig">Figure 1(A-c)</xref>
). We used a programmable logic controller (PLC) to drive the step motors to the desired positions and movement speeds. The PLC was serially connected to a PC via an RS-232 port. In-house software written in Matlab (MathWorks Inc., Natick, MA, USA) was developed to communicate with the PLC.</p>
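As a rough illustration of this control chain, the following sketch sends position and speed commands to a PLC over RS-232. It is written in Python with the pyserial package rather than the authors' in-house Matlab code, and the command strings (ARM, IND, ROT), port name, and baud rate are hypothetical placeholders for whatever protocol the PLC actually expects.

import serial

def send_plc(plc, command):
    # Write one ASCII command line to the PLC and read its acknowledgement.
    plc.write((command + "\r\n").encode("ascii"))
    return plc.readline().decode("ascii").strip()

if __name__ == "__main__":
    # Port name and 9600 baud are assumptions; match the PLC's serial settings.
    with serial.Serial("COM1", baudrate=9600, timeout=1.0) as plc:
        send_plc(plc, "ARM 180")  # orient the arm for a 180 deg (leftward) motion direction
        send_plc(plc, "IND 1.0")  # indent the drum 1.0 mm into the skin
        send_plc(plc, "ROT 40")   # rotate the drum for a 40 mm/s surface speed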
</sec>
<sec>
<label>2.2.</label>
<title>Stimulus Drum</title>
<p>The surface of the stimulus drum was made from a grating whose orientation was orthogonal to the direction of surface motion. Specifically, the stimulus drum (a truncated ball) had a diameter of 160 mm and was engraved with a square-wave grating of 3.9 mm wavelength, 500 μm peak-to-peak amplitude, and a 0.4 duty cycle (ridge length/cycle length,
<xref rid="f1-sensors-13-07212" ref-type="fig">Figure 1(B)</xref>
). The drum was made of polyvinyl chloride and manufactured using Computer-Aided Design and Computer-Aided Manufacturing (CAD/CAM) to achieve high precision for these surface contours.</p>
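As a back-of-the-envelope check (ours, not the authors'), these drum and motor parameters fit together as follows: the 40 mm/s surface speed used in the experiments fixes the drum's rotation rate, the 0.72° step angle fixes the step rate, and the duty cycle fixes the ridge width.

import math

DRUM_DIAMETER_MM = 160.0
STEP_ANGLE_DEG = 0.72        # basic step angle of the PK545-B motor
SURFACE_SPEED_MM_S = 40.0    # surface speed used in the experiments

radius_mm = DRUM_DIAMETER_MM / 2.0
omega_deg_s = math.degrees(SURFACE_SPEED_MM_S / radius_mm)  # from v = omega * r
print(f"rotation: {omega_deg_s:.1f} deg/s; step rate: {omega_deg_s / STEP_ANGLE_DEG:.1f} steps/s")
print(f"ridge width: {0.4 * 3.9:.2f} mm")  # duty cycle x wavelength
# prints: rotation: 28.6 deg/s; step rate: 39.8 steps/s, then: ridge width: 1.56 mm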
</sec>
<sec>
<label>2.3.</label>
<title>Finger-Hand Holder</title>
<p>The participant's left finger was supported by a finger-hand holder and was positioned volar-side-downward upon the upper surface of the stimulus drum (
<xref rid="f1-sensors-13-07212" ref-type="fig">Figure 1(C)</xref>
). The finger-hand holder was made from thermoplastic material so that it could fit the finger-hand anthropometric properties of each participant.</p>
</sec>
<sec>
<label>2.4.</label>
<title>Experimental Setup for Visual-Tactile Experiment</title>
<p>We developed a setup that allows for spatially aligned presentation of visual and tactile stimulation (
<xref rid="f2-sensors-13-07212" ref-type="fig">Figure 2(A,B)</xref>
). The tactile stimulus was presented by the tactile stimulator, and the visual stimulus was a video displayed on the LED panel. A black board covered the entire setup; the participant was asked to place his left index finger on the stimulus drum to receive tactile stimulation and to look at the screen through a small eyepiece fixed on the black cover to receive visual stimulation. The LED panel was placed above the tactile stimulator. The visual stimuli were videos of the subject's own index finger viewed through mirror reflection (inspired by a setup proposed by Ernst and Banks [
<xref rid="b26-sensors-13-07212" ref-type="bibr">26</xref>
]), creating a visual experience as if the participant were looking at his own finger (
<xref rid="f2-sensors-13-07212" ref-type="fig">Figure 2(A)</xref>
). The assumption was that cross-modal integration would tend to occur when visual and tactile inputs were spatially matched. During the experiment, white noise was presented through earphones to avoid auditory cues that could arise from the sound of the motors.</p>
</sec>
<sec>
<label>2.5.</label>
<title>Generation of Visual Motion Stimuli</title>
<p>We first video-recorded a participant's own hand while the fingertip was presented with rightward (0°) or leftward (180°) motion. Use of the subject's own hand made the subsequent visual percepts more vivid. To eliminate possible cues elicited by the machine's shadow, the stimulus drum was visually presented through a specified aperture. The video was first converted to grayscale, and different levels of Gaussian noise (Gaussian noise level, 0, 0.1, 0.2, 0.3, 0.4, or 0.5) were superimposed on each frame (
<xref rid="f3-sensors-13-07212" ref-type="fig">Figure 3</xref>
).</p>
<p>The assumption was that superimposition of noise would degrade the signal-to-noise ratio in the visual signals, allowing us to explore how visual-tactile integration may depend on visual signal certainty. The Gaussian noise level is defined by the ratio between the standard deviation of superimposed Gaussian noise and the largest luminance difference observed in the original image (
<xref rid="FD1" ref-type="disp-formula">Equation (1)</xref>
):
<disp-formula id="FD1">
<label>(1)</label>
<mml:math id="mm1">
<mml:mrow>
<mml:mtext mathvariant="italic">Gaussian noise level</mml:mtext>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mtext>Standard deviation of superimposed noise</mml:mtext>
</mml:mrow>
<mml:mrow>
<mml:mtext>Largest luminance difference in original image</mml:mtext>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
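A minimal Python sketch of Equation (1), assuming 8-bit grayscale frames stored as NumPy arrays (the clipping range is our assumption; the paper does not state how out-of-range pixels were handled):

import numpy as np

def add_gaussian_noise(frame, noise_level, rng=None):
    # frame: 2-D grayscale array; noise_level: 0 to 0.5 as in the experiment.
    rng = rng or np.random.default_rng()
    luminance_range = float(frame.max() - frame.min())  # largest luminance difference
    sigma = noise_level * luminance_range               # Equation (1), solved for sigma
    noisy = frame + rng.normal(0.0, sigma, size=frame.shape)
    return np.clip(noisy, 0.0, 255.0)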
</sec>
<sec>
<label>2.6.</label>
<title>Subjects</title>
<p>Ten volunteer subjects (five males, five females), ranging from 24 to 40 years of age, were paid for their participation. Five participated in the visual certainty experiment and seven in the temporal congruency experiment. Informed consent was obtained from each subject and the protocol was approved by the Institutional Review Board of Human Research of the Chang Gung Medical Foundation.</p>
</sec>
<sec>
<label>2.7.</label>
<title>Visual Certainty Experiment</title>
<p>We performed two psychophysical experiments investigating the integration of visual and tactile signals in human observers. In the first experiment, subjects discriminated the direction of motion of visual stimuli when visual and tactile stimuli were simultaneously presented.</p>
<p>In a factorial design, the visual direction of motion was rightward (0°) or leftward (180°), the tactile direction of motion was rightward (0°) or leftward (180°), and the visual Gaussian noise level was zero, 0.1, 0.2, 0.3, 0.4, or 0.5. Both visual and tactile stimuli were defined in retinotopic (eye-centered) coordinates. Each stimulus condition was presented 10 times. The experiment was split into five blocks to allow subjects to rest, so that each block contained 48 trials (2 visual directions × 2 tactile directions × 6 noise levels × 2 repetitions). The surface-motion speed of the tactile stimulus was 40 mm/s and its indentation depth was 1 mm. In each trial, the visual-tactile motion was presented for 1 s and then the subject reported the direction of the visual stimulus by pressing one of two buttons on a computer mouse in a left-right two-alternative forced-choice design. The stimulus duration was the total indentation duration of the rotating drum, defined as the time from initial contact to the offset of indentation. The ramp-down period, defined as the time from initial contact to full indentation, and the ramp-up period, defined as the time from full indentation to lift-off, each lasted 0.15 s. The inter-trial interval was 1.6 s. It was hypothesized that the strength of visual-tactile motion integration could be gauged by the degree to which the perceived visual direction of motion is affected by the direction of tactile stimulus motion.</p>
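For concreteness, a hedged Python sketch of how such a factorial, blocked trial list could be generated; the shuffling scheme is our assumption, since the paper does not describe the trial ordering:

import itertools, random

VISUAL_DIRS = (0, 180)    # rightward, leftward (deg)
TACTILE_DIRS = (0, 180)
NOISE_LEVELS = (0, 0.1, 0.2, 0.3, 0.4, 0.5)
REPS_PER_BLOCK = 2        # 2 reps/block x 5 blocks = 10 per condition

def make_block():
    trials = [dict(visual=v, tactile=t, noise=n)
              for v, t, n in itertools.product(VISUAL_DIRS, TACTILE_DIRS, NOISE_LEVELS)
              for _ in range(REPS_PER_BLOCK)]
    random.shuffle(trials)
    return trials         # 48 randomly ordered trials

blocks = [make_block() for _ in range(5)]
assert sum(len(b) for b in blocks) == 240  # 24 conditions x 10 repetitions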
<p>As a control experiment, we also performed a visual-only experiment in which the moving visual stimulus was presented as stated above, while the tactile stimulus was static (no surface motion), with a constant indentation of 1 mm. Each stimulus condition was presented 10 times. The experiment was split into five blocks and each block contained 24 trials (2 visual directions × 6 noise levels × 2 repetitions). Thus, we could compare task performance with and without tactile motion stimulation. We first performed the visual-only experiment and then the visual-tactile experiment.</p>
</sec>
<sec>
<label>2.8.</label>
<title>Temporal Congruency Experiment</title>
<p>We then examined whether the results obtained in the previous experiment were compatible with the rule of temporal congruency of multi-modal inputs [
<xref rid="b1-sensors-13-07212" ref-type="bibr">1</xref>
]. We performed a direction-congruency experiment with a variety of discrepancies in stimulus-onset latency. We hypothesized that the integration effect would peak when the onsets of the visual and tactile stimuli were simultaneous. We presented visual-tactile stimuli identical to those used in the visual certainty experiment, except that the stimulus duration for each of the visual and tactile stimuli was 0.5 s and the stimulus onset asynchrony (SOA), defined as the onset latency between the tactile and visual stimuli (
<xref rid="FD2" ref-type="disp-formula">Equation (2)</xref>
), was −2, −1, −0.5, −0.25, 0, 0.25, 0.5, 1 or 2 s:
<disp-formula id="FD2">
<label>(2)</label>
<mml:math id="mm2">
<mml:mrow>
<mml:mtext mathvariant="italic">SOA</mml:mtext>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi>L</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">tactile</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>-</mml:mo>
<mml:msub>
<mml:mi>L</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">visual</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
where
<italic>L
<sub>tactile</sub>
</italic>
and
<italic>L
<sub>visual</sub>
</italic>
are the onset latencies of the tactile and visual stimuli, respectively. We first performed a pilot experiment to find, for each subject, the visual noise level that could induce a specific level of visual uncertainty. The Gaussian noise level was chosen to yield an accuracy between 0.6 and 0.7 in the visual-only condition, because the visual-tactile integration effect was robust within this noise range in the visual certainty experiment. Each stimulus condition was repeated 10 times. The experiment was split into 10 blocks and each block contained 36 trials (2 visual directions × 2 tactile directions × 9 SOAs).</p>
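The following Python sketch illustrates Equation (2) by converting each signed SOA into concrete visual and tactile onset times within a trial (anchoring the earlier stimulus at time zero is our convention; the paper does not specify it):

SOAS_S = (-2, -1, -0.5, -0.25, 0, 0.25, 0.5, 1, 2)

def onsets(soa, trial_start=0.0):
    # SOA = L_tactile - L_visual, so positive SOA means touch lags vision.
    visual_onset = trial_start + max(-soa, 0.0)
    tactile_onset = trial_start + max(soa, 0.0)
    assert abs((tactile_onset - visual_onset) - soa) < 1e-9
    return visual_onset, tactile_onset

for soa in SOAS_S:
    v, t = onsets(soa)
    print(f"SOA {soa:+.2f} s -> visual at {v:.2f} s, tactile at {t:.2f} s")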
</sec>
</sec>
<sec>
<label>3.</label>
<title>Results and Discussion</title>
<sec>
<label>3.1.</label>
<title>Results</title>
<sec>
<label>3.1.1.</label>
<title>Visual Certainty Experiment</title>
<p>We used the visual-tactile apparatus to perform a direction-congruency experiment with a variety of visual noise levels. In the visual-only condition, we found that the probability of choosing the veridical direction of visual motion (accuracy) peaked at zero noise, decreased monotonically as the noise level increased, and finally reached chance level (accuracy = 0.5) at the maximum level of visual noise (
<xref rid="f4-sensors-13-07212" ref-type="fig">Figure 4(A)</xref>
, green trace for data obtained from one subject;
<xref rid="f4-sensors-13-07212" ref-type="fig">Figure 4(B)</xref>
, green trace for data averaged across subjects). The same trend was also found in the direction-congruent (
<xref rid="f4-sensors-13-07212" ref-type="fig">Figure 4(A,B)</xref>
red trace) and incongruent conditions (
<xref rid="f4-sensors-13-07212" ref-type="fig">Figure 4(A,B)</xref>
blue trace). Most importantly, compared with the visual-only condition, accuracy was higher in the congruent condition and lower in the incongruent condition. Specifically, the perceived direction of visual motion was significantly biased toward the direction of tactile motion, indicating that visual and tactile motion information is integrated to yield a holistic percept (interaction effect in a repeated-measures ANOVA,
<italic>p</italic>
< 0.05). Results indicate that the perceived visual direction is biased toward the tactile direction, especially when the visual noise level is high, providing evidence of visual-tactile integration.</p>
</sec>
<sec>
<label>3.1.2.</label>
<title>Temporal Congruency Experiment</title>
<p>We then examined whether visual-tactile motion integration follows the rule of temporal congruency of multi-modal inputs. We performed the temporal congruency experiment with several SOAs and a fixed visual noise level. Across SOAs, the strength of visual-tactile integration, gauged by the degree to which the perceived direction of visual motion is biased toward the direction of tactile motion, peaked when the SOA was close to zero (simultaneous presentation) and gradually decreased as the SOA deviated from zero (asynchronous presentation) (
<xref rid="f5-sensors-13-07212" ref-type="fig">Figure 5(A)</xref>
, single subject;
<xref rid="f5-sensors-13-07212" ref-type="fig">Figure 5(B)</xref>
, averaged across subjects). An interaction effect was observed using a repeated-measures ANOVA (
<italic>p</italic>
< 0.001). Post-hoc analyses using paired t-tests showed that accuracy differed significantly between the congruent and incongruent conditions when the SOA was −0.5, −0.25, 0, 0.25, 0.5, or 1 s (
<italic>p</italic>
< 0.05). That is, visual-tactile integration peaks when visual and tactile stimuli are presented simultaneously, a finding that is compatible with the rule of temporal congruency of multi-modal inputs. Moreover, the rule of temporal congruency for visual-tactile motion integration tolerates a relatively wide range of SOAs, up to 1 s.</p>
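A Python sketch of this post-hoc comparison using SciPy; the accuracy arrays below are random placeholders standing in for the seven subjects' data, which we do not have:

import numpy as np
from scipy import stats

soas = np.array([-2, -1, -0.5, -0.25, 0, 0.25, 0.5, 1, 2])
# accuracy[subject, soa] for 7 subjects, congruent vs. incongruent conditions
congruent = np.random.default_rng(0).uniform(0.6, 0.9, size=(7, soas.size))
incongruent = np.random.default_rng(1).uniform(0.4, 0.7, size=(7, soas.size))

for i, soa in enumerate(soas):
    t_stat, p_val = stats.ttest_rel(congruent[:, i], incongruent[:, i])
    print(f"SOA {soa:+.2f} s: t = {t_stat:+.2f}, p = {p_val:.3f}")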
</sec>
</sec>
<sec>
<label>3.2.</label>
<title>Discussion</title>
<p>Here, we introduce a novel visual-tactile cross-modal stimulation apparatus that allows for simultaneous presentation of visual and tactile motion stimuli at aligned spatial locations. This apparatus can present different combinations of visual and tactile stimuli, varying in direction and speed, while avoiding the physical constraint inherent in the one-degree-of-freedom rotator tactile stimulator. Most importantly, the signal-to-noise ratio of the visual stimulus can be modulated so that properties of visual-tactile motion integration can be more accurately characterized. To our knowledge, no previous stimulator apparatus has accomplished this. Other tactile motion stimulation apparatuses cannot align visual and tactile motion stimulation or precisely control the indentation depth of the tactile stimulus. Using this apparatus, several properties of cross-modal integration, including inverse effectiveness [
<xref rid="b27-sensors-13-07212" ref-type="bibr">27</xref>
], temporal congruency [
<xref rid="b24-sensors-13-07212" ref-type="bibr">24</xref>
], and spatial congruency [
<xref rid="b28-sensors-13-07212" ref-type="bibr">28</xref>
] can be examined. Furthermore, the direction and speed constraints underlying cross-modal motion integration can be systematically characterized.</p>
<p>Results indicate that the perceived direction of visual motion can be biased toward the direction of tactile motion when visual signals are degraded by the superimposition of noise. This finding implies that the visual dominance of visual-tactile integration is adaptive: when visual signals are uncertain, the percept can instead be dominated by tactile inputs. It also highlights the importance of including the capability to adjust signal saliency for each modality when developing cross-modal integration stimulation apparatuses [
<xref rid="b29-sensors-13-07212" ref-type="bibr">29</xref>
]. Using this apparatus, visual saliency can be modulated by the magnitude of superimposed noise while tactile saliency can be adjusted by indentation depth or the dimension of the engraved texture on the stimulus drum. Indeed, Fetsch
<italic>et al.</italic>
found that the neural system employs an optimal strategy of weighting cues from each modality in proportion to cue reliability [
<xref rid="b30-sensors-13-07212" ref-type="bibr">30</xref>
], indicating the use of optimal probabilistic computation in neural circuits. That is, a modality whose signals are more salient will tend to be weighted more highly, a property that is consistent with our findings. This computation can be accounted for by Bayesian inference or maximum-likelihood estimation [
<xref rid="b26-sensors-13-07212" ref-type="bibr">26</xref>
,
<xref rid="b31-sensors-13-07212" ref-type="bibr">31</xref>
<xref rid="b33-sensors-13-07212" ref-type="bibr">33</xref>
].</p>
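<p>For reference, the maximum-likelihood (reliability-weighted) combination rule cited above can be written as follows, where the visual and tactile estimates are weighted in inverse proportion to their variances; the notation is ours, introduced for illustration:</p>
<disp-formula>
<tex-math><![CDATA[
\hat{s} = w_v \hat{s}_v + w_t \hat{s}_t, \qquad
w_v = \frac{1/\sigma_v^2}{1/\sigma_v^2 + 1/\sigma_t^2}, \qquad
w_t = \frac{1/\sigma_t^2}{1/\sigma_v^2 + 1/\sigma_t^2}
]]></tex-math>
</disp-formula>
<p>Under this rule the combined estimate is never less reliable than either cue alone, and the less noisy modality receives the higher weight, consistent with the saliency effects reported here.</p>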
<p>The results of the present study indicate that visual-tactile motion integration follows the rule of temporal congruency of multi-modal inputs. Temporal congruency is functionally relevant in that information from different senses occurring at approximately the same time most likely comes from the same physical event [
<xref rid="b25-sensors-13-07212" ref-type="bibr">25</xref>
]. Butz
<italic>et al.</italic>
[
<xref rid="b34-sensors-13-07212" ref-type="bibr">34</xref>
] investigated visual-tactile motion integration and observed that the range of SOAs yielding significant integration could span 1 s, similar to the findings of the present study. However, Shore
<italic>et al.</italic>
examined the effect of spatial attention on the judgment of the position of a vibrotactile stimulus and found that the integration effect was significant only when SOAs were within 100 ms [
<xref rid="b17-sensors-13-07212" ref-type="bibr">17</xref>
]. One possible explanation for this discrepancy is a difference in task structure: the present study and that of Butz
<italic>et al.</italic>
both investigated visual-tactile motion integration, whereas Shore
<italic>et al.</italic>
studied position discrimination. Another possibility is that the stimulus durations were considerably longer in the present study (500 ms) and in Butz's study (960 ms) than in Shore's study (10 ms).</p>
<p>Human brain imaging studies have shown robust increases in blood oxygen level-dependent (BOLD) signals in extrastriate visual cortex when human observers are presented with tactile stimuli. These areas include the middle temporal (MT) [
<xref rid="b16-sensors-13-07212" ref-type="bibr">16</xref>
,
<xref rid="b35-sensors-13-07212" ref-type="bibr">35</xref>
] and medial superior temporal (MST) [
<xref rid="b36-sensors-13-07212" ref-type="bibr">36</xref>
] cortices, which are well known for their specialized processing of visual motion. The present stimulus apparatus offers a unique opportunity to perform neurophysiological studies characterizing the functional relevance of these tactile-related BOLD signals in visual association areas. Finally, the apparatus can present different combinations of visual and tactile stimuli, varying in direction and speed. Although we did not use this feature here, future studies will use the apparatus to examine the speed and direction constraints underlying visual-tactile motion integration.</p>
<p>A haptic approach is usually applied in multi-sensory scenarios, so it is of utmost importance to characterize how biological systems process information to infer a holistic percept. Furthermore, this apparatus will help develop cross-modal inference algorithms that determine how robotic systems resolve conflicting visual and tactile sensory information in real-world scenarios [
<xref rid="b37-sensors-13-07212" ref-type="bibr">37</xref>
]. However, to date, no neurophysiological experiment using non-human primates has been performed to explore visual-tactile motion integration, mainly because of instrumentation limitations. The apparatus developed in the present study could be used for computational, psychophysical, and neurophysiological studies.</p>
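<p>As a sketch of such a cross-modal inference algorithm, the following Python fragment fuses conflicting visual and tactile motion-direction estimates by reliability weighting, applying the maximum-likelihood rule given above to unit direction vectors so that angular wrap-around is handled. The function name and example values are assumptions for illustration, not an implementation from the study.</p>
<preformat>
# Illustrative sketch: reliability-weighted fusion of conflicting motion directions.
import numpy as np

def fuse_motion_direction(theta_v, sigma_v, theta_t, sigma_t):
    """Fuse visual/tactile direction estimates (radians), each weighted by 1/variance."""
    w_v, w_t = 1.0 / sigma_v**2, 1.0 / sigma_t**2
    v = (w_v * np.array([np.cos(theta_v), np.sin(theta_v)])
         + w_t * np.array([np.cos(theta_t), np.sin(theta_t)]))
    return np.arctan2(v[1], v[0])

# Noisy vision (sigma_v = 1.0) conflicting with reliable touch (sigma_t = 0.2):
fused = fuse_motion_direction(0.0, 1.0, np.pi / 2, 0.2)
# 'fused' lies close to the tactile direction, mirroring the psychophysical bias.
</preformat>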
</sec>
</sec>
<sec sec-type="conclusions">
<label>4.</label>
<title>Conclusions</title>
<p>The present study introduces the design and demonstrates the validity of a novel visual-tactile motion integration apparatus that consists of a video display and a tactile stimulator with three degrees of freedom. Using this apparatus, we showed that the perceived direction of visual motion is biased by the direction of tactile motion when visual signals are weakened. Additionally, visual-tactile motion integration follows the rule of temporal congruency of multi-modal inputs. Further studies will be able to use this apparatus to investigate cross-modal motion integration mechanisms.</p>
</sec>
</body>
<back>
<ack>
<p>The authors wish to thank the National Science Council, Taiwan (NSC 100-2321-B-182A-002), the National Health Research Institutes, Taiwan (NHRI-EX101-10113EC), and the Chang Gung Medical Foundation (CMRPG590022G) for grant support. The authors also acknowledge technical support from Chien-Chun Pai MS, You-Ping Yang MS, and Yau-Chu Chen MS.</p>
</ack>
<notes>
<title>Conflict of Interest</title>
<p>The authors declare no conflict of interest.</p>
</notes>
<ref-list>
<title>References</title>
<ref id="b1-sensors-13-07212">
<label>1.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Stein</surname>
<given-names>B.E.</given-names>
</name>
<name>
<surname>Stanford</surname>
<given-names>T.R.</given-names>
</name>
</person-group>
<article-title>Multisensory integration: Current issues from the perspective of the single neuron</article-title>
<source>Nat. Rev. Neurosci.</source>
<year>2008</year>
<volume>9</volume>
<fpage>255</fpage>
<lpage>266</lpage>
<pub-id pub-id-type="pmid">18354398</pub-id>
</element-citation>
</ref>
<ref id="b2-sensors-13-07212">
<label>2.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Beauchamp</surname>
<given-names>M.S.</given-names>
</name>
</person-group>
<article-title>See me, hear me, touch me: Multisensory integration in lateral occipital-temporal cortex</article-title>
<source>Curr. Opin. Neurobiol.</source>
<year>2005</year>
<volume>15</volume>
<fpage>145</fpage>
<lpage>153</lpage>
<pub-id pub-id-type="pmid">15831395</pub-id>
</element-citation>
</ref>
<ref id="b3-sensors-13-07212">
<label>3.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ernst</surname>
<given-names>M.O.</given-names>
</name>
<name>
<surname>Bulthoff</surname>
<given-names>H.H.</given-names>
</name>
</person-group>
<article-title>Merging the senses into a robust percept</article-title>
<source>Trends Cogn. Sci.</source>
<year>2004</year>
<volume>8</volume>
<fpage>162</fpage>
<lpage>169</lpage>
<pub-id pub-id-type="pmid">15050512</pub-id>
</element-citation>
</ref>
<ref id="b4-sensors-13-07212">
<label>4.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>McGurk</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>MacDonald</surname>
<given-names>J.</given-names>
</name>
</person-group>
<article-title>Hearing lips and seeing voices</article-title>
<source>Nature</source>
<year>1976</year>
<volume>264</volume>
<fpage>746</fpage>
<lpage>748</lpage>
<pub-id pub-id-type="pmid">1012311</pub-id>
</element-citation>
</ref>
<ref id="b5-sensors-13-07212">
<label>5.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sekuler</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Sekuler</surname>
<given-names>A.B.</given-names>
</name>
<name>
<surname>Lau</surname>
<given-names>R.</given-names>
</name>
</person-group>
<article-title>Sound alters visual motion perception</article-title>
<source>Nature</source>
<year>1997</year>
<volume>385</volume>
<fpage>308</fpage>
<pub-id pub-id-type="pmid">9002513</pub-id>
</element-citation>
</ref>
<ref id="b6-sensors-13-07212">
<label>6.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Guest</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Catmur</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Lloyd</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Spence</surname>
<given-names>C.</given-names>
</name>
</person-group>
<article-title>Audiotactile interactions in roughness perception</article-title>
<source>Exp. Brain Res.</source>
<year>2002</year>
<volume>146</volume>
<fpage>161</fpage>
<lpage>171</lpage>
<pub-id pub-id-type="pmid">12195518</pub-id>
</element-citation>
</ref>
<ref id="b7-sensors-13-07212">
<label>7.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ernst</surname>
<given-names>M.O.</given-names>
</name>
<name>
<surname>Banks</surname>
<given-names>M.S.</given-names>
</name>
<name>
<surname>Bulthoff</surname>
<given-names>H.H.</given-names>
</name>
</person-group>
<article-title>Touch can change visual slant perception</article-title>
<source>Nat. Neurosci.</source>
<year>2000</year>
<volume>3</volume>
<fpage>69</fpage>
<lpage>73</lpage>
<pub-id pub-id-type="pmid">10607397</pub-id>
</element-citation>
</ref>
<ref id="b8-sensors-13-07212">
<label>8.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Spence</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Pavani</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Driver</surname>
<given-names>J.</given-names>
</name>
</person-group>
<article-title>Spatial constraints on visual-tactile cross-modal distractor congruency effects</article-title>
<source>Cogn. Affect Behav. Neurosci.</source>
<year>2004</year>
<volume>4</volume>
<fpage>148</fpage>
<lpage>169</lpage>
<pub-id pub-id-type="pmid">15460922</pub-id>
</element-citation>
</ref>
<ref id="b9-sensors-13-07212">
<label>9.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kennett</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Spence</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Driver</surname>
<given-names>J.</given-names>
</name>
</person-group>
<article-title>Visuo-tactile links in covert exogenous spatial attention remap across changes in unseen hand posture</article-title>
<source>Percept. Psychophys.</source>
<year>2002</year>
<volume>64</volume>
<fpage>1083</fpage>
<lpage>1094</lpage>
</element-citation>
</ref>
<ref id="b10-sensors-13-07212">
<label>10.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hyvärinen</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Shelepin</surname>
<given-names>Y.</given-names>
</name>
</person-group>
<article-title>Distribution of visual and somatic functions in the parietal associative area 7 of the monkey</article-title>
<source>Brain Res.</source>
<year>1979</year>
<volume>169</volume>
<fpage>561</fpage>
<lpage>564</lpage>
<pub-id pub-id-type="pmid">109170</pub-id>
</element-citation>
</ref>
<ref id="b11-sensors-13-07212">
<label>11.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hyvärinen</surname>
<given-names>J.</given-names>
</name>
</person-group>
<article-title>Regional distribution of functions in parietal association area 7 of the monkey</article-title>
<source>Brain Res.</source>
<year>1981</year>
<volume>206</volume>
<fpage>287</fpage>
<lpage>303</lpage>
<pub-id pub-id-type="pmid">7214136</pub-id>
</element-citation>
</ref>
<ref id="b12-sensors-13-07212">
<label>12.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Avillac</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Deneve</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Olivier</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Pouget</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Duhamel</surname>
<given-names>J.R.</given-names>
</name>
</person-group>
<article-title>Reference frames for representing visual and tactile locations in parietal cortex</article-title>
<source>Nat. Neurosci.</source>
<year>2005</year>
<volume>8</volume>
<fpage>941</fpage>
<lpage>949</lpage>
<pub-id pub-id-type="pmid">15951810</pub-id>
</element-citation>
</ref>
<ref id="b13-sensors-13-07212">
<label>13.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Carter</surname>
<given-names>O.</given-names>
</name>
<name>
<surname>Konkle</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>Q.</given-names>
</name>
<name>
<surname>Hayward</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Moore</surname>
<given-names>C.</given-names>
</name>
</person-group>
<article-title>Tactile rivalry demonstrated with an ambiguous apparent-motion quartet</article-title>
<source>Curr. Biol.</source>
<year>2008</year>
<volume>18</volume>
<fpage>1050</fpage>
<lpage>1054</lpage>
<pub-id pub-id-type="pmid">18635355</pub-id>
</element-citation>
</ref>
<ref id="b14-sensors-13-07212">
<label>14.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Konkle</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>Q.</given-names>
</name>
<name>
<surname>Hayward</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Moore</surname>
<given-names>C.I.</given-names>
</name>
</person-group>
<article-title>Motion aftereffects transfer between touch and vision</article-title>
<source>Curr. Biol.</source>
<year>2009</year>
<volume>19</volume>
<fpage>745</fpage>
<lpage>750</lpage>
<pub-id pub-id-type="pmid">19361996</pub-id>
</element-citation>
</ref>
<ref id="b15-sensors-13-07212">
<label>15.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bensmaia</surname>
<given-names>S.J.</given-names>
</name>
<name>
<surname>Killebrew</surname>
<given-names>J.H.</given-names>
</name>
<name>
<surname>Craig</surname>
<given-names>J.C.</given-names>
</name>
</person-group>
<article-title>Influence of visual motion on tactile motion perception</article-title>
<source>J. Neurophysiol.</source>
<year>2006</year>
<volume>96</volume>
<fpage>1625</fpage>
<lpage>1637</lpage>
<pub-id pub-id-type="pmid">16723415</pub-id>
</element-citation>
</ref>
<ref id="b16-sensors-13-07212">
<label>16.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Blake</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Sobel</surname>
<given-names>K.V.</given-names>
</name>
<name>
<surname>James</surname>
<given-names>T.W.</given-names>
</name>
</person-group>
<article-title>Neural synergy between kinetic vision and touch</article-title>
<source>Psychol. Sci.</source>
<year>2004</year>
<volume>15</volume>
<fpage>397</fpage>
<lpage>402</lpage>
<pub-id pub-id-type="pmid">15147493</pub-id>
</element-citation>
</ref>
<ref id="b17-sensors-13-07212">
<label>17.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Shore</surname>
<given-names>D.I.</given-names>
</name>
<name>
<surname>Barnes</surname>
<given-names>M.E.</given-names>
</name>
<name>
<surname>Spence</surname>
<given-names>C.</given-names>
</name>
</person-group>
<article-title>Temporal aspects of the visuotactile congruency effect</article-title>
<source>Neurosci. Lett.</source>
<year>2006</year>
<volume>392</volume>
<fpage>96</fpage>
<lpage>100</lpage>
<pub-id pub-id-type="pmid">16213655</pub-id>
</element-citation>
</ref>
<ref id="b18-sensors-13-07212">
<label>18.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pei</surname>
<given-names>Y.C.</given-names>
</name>
<name>
<surname>Hsiao</surname>
<given-names>S.S.</given-names>
</name>
<name>
<surname>Bensmaia</surname>
<given-names>S.J.</given-names>
</name>
</person-group>
<article-title>The tactile integration of local motion cues is analogous to its visual counterpart</article-title>
<source>Proc. Natl. Acad. Sci. USA</source>
<year>2008</year>
<volume>105</volume>
<fpage>8130</fpage>
<lpage>8135</lpage>
<pub-id pub-id-type="pmid">18524953</pub-id>
</element-citation>
</ref>
<ref id="b19-sensors-13-07212">
<label>19.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pei</surname>
<given-names>Y.C.</given-names>
</name>
<name>
<surname>Hsiao</surname>
<given-names>S.S.</given-names>
</name>
<name>
<surname>Craig</surname>
<given-names>J.C.</given-names>
</name>
<name>
<surname>Bensmaia</surname>
<given-names>S.J.</given-names>
</name>
</person-group>
<article-title>Neural mechanisms of tactile motion integration in somatosensory cortex</article-title>
<source>Neuron</source>
<year>2011</year>
<volume>69</volume>
<fpage>536</fpage>
<lpage>547</lpage>
<pub-id pub-id-type="pmid">21315263</pub-id>
</element-citation>
</ref>
<ref id="b20-sensors-13-07212">
<label>20.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Killebrew</surname>
<given-names>J.H.</given-names>
</name>
<name>
<surname>Bensmaia</surname>
<given-names>S.J.</given-names>
</name>
<name>
<surname>Dammann</surname>
<given-names>J.F.</given-names>
</name>
<name>
<surname>Denchev</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Hsiao</surname>
<given-names>S.S.</given-names>
</name>
<name>
<surname>Craig</surname>
<given-names>J.C.</given-names>
</name>
<name>
<surname>Johnson</surname>
<given-names>K.O.</given-names>
</name>
</person-group>
<article-title>A dense array stimulator to generate arbitrary spatio-temporal tactile stimuli</article-title>
<source>J. Neurosci. Meth.</source>
<year>2007</year>
<volume>161</volume>
<fpage>62</fpage>
<lpage>74</lpage>
</element-citation>
</ref>
<ref id="b21-sensors-13-07212">
<label>21.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gori</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Mazzilli</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Sandini</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Burr</surname>
<given-names>D.</given-names>
</name>
</person-group>
<article-title>Cross-sensory facilitation reveals neural interactions between visual and tactile motion in humans</article-title>
<source>Front. Psychol.</source>
<year>2011</year>
<pub-id pub-id-type="doi">10.3389/fpsyg.2011.00055</pub-id>
</element-citation>
</ref>
<ref id="b22-sensors-13-07212">
<label>22.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Johnson</surname>
<given-names>K.O.</given-names>
</name>
<name>
<surname>Phillips</surname>
<given-names>J.R.</given-names>
</name>
</person-group>
<article-title>A rotating drum stimulator for scanning embossed patterns and textures across the skin</article-title>
<source>J. Neurosci. Meth.</source>
<year>1988</year>
<volume>22</volume>
<fpage>221</fpage>
<lpage>231</lpage>
</element-citation>
</ref>
<ref id="b23-sensors-13-07212">
<label>23.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Essick</surname>
<given-names>G.K.</given-names>
</name>
<name>
<surname>Whitsel</surname>
<given-names>B.L.</given-names>
</name>
</person-group>
<article-title>Factors influencing cutaneous directional sensitivity: A correlative psychophysical and neurophysiological investigation</article-title>
<source>Brain Res.</source>
<year>1985</year>
<volume>357</volume>
<fpage>213</fpage>
<lpage>230</lpage>
<pub-id pub-id-type="pmid">3938308</pub-id>
</element-citation>
</ref>
<ref id="b24-sensors-13-07212">
<label>24.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Meredith</surname>
<given-names>M.A.</given-names>
</name>
<name>
<surname>Nemitz</surname>
<given-names>J.W.</given-names>
</name>
<name>
<surname>Stein</surname>
<given-names>B.E.</given-names>
</name>
</person-group>
<article-title>Determinants of multisensory integration in superior colliculus neurons. I. Temporal factors</article-title>
<source>J. Neurosci.</source>
<year>1987</year>
<volume>7</volume>
<fpage>3215</fpage>
<lpage>3229</lpage>
<pub-id pub-id-type="pmid">3668625</pub-id>
</element-citation>
</ref>
<ref id="b25-sensors-13-07212">
<label>25.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Holmes</surname>
<given-names>N.P.</given-names>
</name>
<name>
<surname>Spence</surname>
<given-names>C.</given-names>
</name>
</person-group>
<article-title>Multisensory integration: Space, time and superadditivity</article-title>
<source>Curr. Biol.</source>
<year>2005</year>
<volume>15</volume>
<fpage>R762</fpage>
<lpage>R764</lpage>
<pub-id pub-id-type="pmid">16169476</pub-id>
</element-citation>
</ref>
<ref id="b26-sensors-13-07212">
<label>26.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ernst</surname>
<given-names>M.O.</given-names>
</name>
<name>
<surname>Banks</surname>
<given-names>M.S.</given-names>
</name>
</person-group>
<article-title>Humans integrate visual and haptic information in a statistically optimal fashion</article-title>
<source>Nature</source>
<year>2002</year>
<volume>415</volume>
<fpage>429</fpage>
<lpage>433</lpage>
<pub-id pub-id-type="pmid">11807554</pub-id>
</element-citation>
</ref>
<ref id="b27-sensors-13-07212">
<label>27.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Meredith</surname>
<given-names>M.A.</given-names>
</name>
<name>
<surname>Stein</surname>
<given-names>B.E.</given-names>
</name>
</person-group>
<article-title>Interactions among converging sensory inputs in the superior colliculus</article-title>
<source>Science</source>
<year>1983</year>
<volume>221</volume>
<fpage>389</fpage>
<lpage>391</lpage>
<pub-id pub-id-type="pmid">6867718</pub-id>
</element-citation>
</ref>
<ref id="b28-sensors-13-07212">
<label>28.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Meredith</surname>
<given-names>M.A.</given-names>
</name>
<name>
<surname>Stein</surname>
<given-names>B.E.</given-names>
</name>
</person-group>
<article-title>Spatial factors determine the activity of multisensory neurons in cat superior colliculus</article-title>
<source>Brain Res.</source>
<year>1986</year>
<volume>365</volume>
<fpage>350</fpage>
<lpage>354</lpage>
<pub-id pub-id-type="pmid">3947999</pub-id>
</element-citation>
</ref>
<ref id="b29-sensors-13-07212">
<label>29.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Morgan</surname>
<given-names>M.L.</given-names>
</name>
<name>
<surname>DeAngelis</surname>
<given-names>G.C.</given-names>
</name>
<name>
<surname>Angelaki</surname>
<given-names>D.E.</given-names>
</name>
</person-group>
<article-title>Multisensory integration in macaque visual cortex depends on cue reliability</article-title>
<source>Neuron</source>
<year>2008</year>
<volume>59</volume>
<fpage>662</fpage>
<lpage>673</lpage>
<pub-id pub-id-type="pmid">18760701</pub-id>
</element-citation>
</ref>
<ref id="b30-sensors-13-07212">
<label>30.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fetsch</surname>
<given-names>C.R.</given-names>
</name>
<name>
<surname>Pouget</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>DeAngelis</surname>
<given-names>G.C.</given-names>
</name>
<name>
<surname>Angelaki</surname>
<given-names>D.E.</given-names>
</name>
</person-group>
<article-title>Neural correlates of reliability-based cue weighting during multisensory integration</article-title>
<source>Nat. Neurosci.</source>
<year>2012</year>
<volume>15</volume>
<fpage>146</fpage>
<lpage>154</lpage>
<pub-id pub-id-type="pmid">22101645</pub-id>
</element-citation>
</ref>
<ref id="b31-sensors-13-07212">
<label>31.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hillis</surname>
<given-names>J.M.</given-names>
</name>
<name>
<surname>Watt</surname>
<given-names>S.J.</given-names>
</name>
<name>
<surname>Landy</surname>
<given-names>M.S.</given-names>
</name>
<name>
<surname>Banks</surname>
<given-names>M.S.</given-names>
</name>
</person-group>
<article-title>Slant from texture and disparity cues: Optimal cue combination</article-title>
<source>J. Vis.</source>
<year>2004</year>
<volume>4</volume>
<fpage>967</fpage>
<lpage>992</lpage>
<pub-id pub-id-type="pmid">15669906</pub-id>
</element-citation>
</ref>
<ref id="b32-sensors-13-07212">
<label>32.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Jacobs</surname>
<given-names>R.A.</given-names>
</name>
</person-group>
<article-title>Optimal integration of texture and motion cues to depth</article-title>
<source>Vis. Res.</source>
<year>1999</year>
<volume>39</volume>
<fpage>3621</fpage>
<lpage>3629</lpage>
<pub-id pub-id-type="pmid">10746132</pub-id>
</element-citation>
</ref>
<ref id="b33-sensors-13-07212">
<label>33.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Alais</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Burr</surname>
<given-names>D.</given-names>
</name>
</person-group>
<article-title>The ventriloquist effect results from near-optimal bimodal integration</article-title>
<source>Curr. Biol.</source>
<year>2004</year>
<volume>14</volume>
<fpage>257</fpage>
<lpage>262</lpage>
<pub-id pub-id-type="pmid">14761661</pub-id>
</element-citation>
</ref>
<ref id="b34-sensors-13-07212">
<label>34.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Butz</surname>
<given-names>M.V.</given-names>
</name>
<name>
<surname>Thomaschke</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Linhardt</surname>
<given-names>M.J.</given-names>
</name>
<name>
<surname>Herbort</surname>
<given-names>O.</given-names>
</name>
</person-group>
<article-title>Remapping motion across modalities: Tactile rotations influence visual motion judgments</article-title>
<source>Exp. Brain Res.</source>
<year>2010</year>
<volume>207</volume>
<fpage>1</fpage>
<lpage>11</lpage>
<pub-id pub-id-type="pmid">20878396</pub-id>
</element-citation>
</ref>
<ref id="b35-sensors-13-07212">
<label>35.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hagen</surname>
<given-names>M.C.</given-names>
</name>
<name>
<surname>Franzen</surname>
<given-names>O.</given-names>
</name>
<name>
<surname>McGlone</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Essick</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Dancer</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Pardo</surname>
<given-names>J.V.</given-names>
</name>
</person-group>
<article-title>Tactile motion activates the human middle temporal/V5 (MT/V5) complex</article-title>
<source>Eur. J. Neurosci.</source>
<year>2002</year>
<volume>16</volume>
<fpage>957</fpage>
<lpage>964</lpage>
<pub-id pub-id-type="pmid">12372032</pub-id>
</element-citation>
</ref>
<ref id="b36-sensors-13-07212">
<label>36.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Beauchamp</surname>
<given-names>M.S.</given-names>
</name>
<name>
<surname>Yasar</surname>
<given-names>N.E.</given-names>
</name>
<name>
<surname>Kishan</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Ro</surname>
<given-names>T.</given-names>
</name>
</person-group>
<article-title>Human MST but not MT responds to tactile stimulation</article-title>
<source>J. Neurosci.</source>
<year>2007</year>
<volume>27</volume>
<fpage>8261</fpage>
<lpage>8267</lpage>
<pub-id pub-id-type="pmid">17670972</pub-id>
</element-citation>
</ref>
<ref id="b37-sensors-13-07212">
<label>37.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>De Gelder</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Bertelson</surname>
<given-names>P.</given-names>
</name>
</person-group>
<article-title>Multisensory integration, perception and ecological validity</article-title>
<source>Trends Cogn. Sci.</source>
<year>2003</year>
<volume>7</volume>
<fpage>460</fpage>
<lpage>467</lpage>
<pub-id pub-id-type="pmid">14550494</pub-id>
</element-citation>
</ref>
</ref-list>
</back>
<floats-group>
<fig id="f1-sensors-13-07212" position="float">
<label>Figure 1.</label>
<caption>
<p>The tactile motion stimulator. (
<bold>A</bold>
) The three step motors (a, b and c) control each of the three degrees of freedom. (
<bold>B</bold>
) The stimulus drum. (
<bold>C</bold>
) The finger-hand holder.</p>
</caption>
<graphic xlink:href="sensors-13-07212f1"></graphic>
</fig>
<fig id="f2-sensors-13-07212" position="float">
<label>Figure 2.</label>
<caption>
<p>The apparatus for characterizing visual-tactile motion integration. (
<bold>A</bold>
) The schematic diagram of the setup that uses a mirror to achieve spatially aligned presentation of visual and tactile stimulation. (
<bold>B</bold>
) Three views of the apparatus and the control module.</p>
</caption>
<graphic xlink:href="sensors-13-07212f2"></graphic>
</fig>
<fig id="f3-sensors-13-07212" position="float">
<label>Figure 3.</label>
<caption>
<p>Snapshots from video clips with superimposition of different levels of Gaussian noise: Zero (
<bold>A</bold>
), 0.1 (
<bold>B</bold>
), 0.2 (
<bold>C</bold>
), 0.3 (
<bold>D</bold>
), 0.4 (
<bold>E</bold>
) and 0.5 (
<bold>F</bold>
). As can be seen in the example images, contour information is degraded as noise level increases.</p>
</caption>
<graphic xlink:href="sensors-13-07212f3"></graphic>
</fig>
<fig id="f4-sensors-13-07212" position="float">
<label>Figure 4.</label>
<caption>
<p>Accuracy in judging the direction of the visual-motion stimulus (left or right) as a function of visual noise level in visual only (green trace), direction congruent (red trace), and direction incongruent (blue trace) conditions. (
<bold>A</bold>
) Data obtained from a single subject. (
<bold>B</bold>
) Data averaged across subjects. Error bars indicate the standard error of the mean. The results showed that the perceived direction of visual motion was biased toward the direction of tactile motion, especially when the visual noise level was high.</p>
</caption>
<graphic xlink:href="sensors-13-07212f4"></graphic>
</fig>
<fig id="f5-sensors-13-07212" position="float">
<label>Figure 5.</label>
<caption>
<p>Accuracy as a function of stimulus onset asynchrony (SOA) in direction congruent (red trace) and direction incongruent (blue trace) conditions. For negative SOAs, the tactile stimulus preceded the visual stimulus; for positive SOAs, the tactile stimulus followed the visual stimulus. (
<bold>A</bold>
) Data obtained from a single subject. (
<bold>B</bold>
) Data averaged across subjects. The results showed that the degree to which perceived direction of visual motion was biased toward that of tactile motion peaked when SOA was close to zero. (*:
<italic>p</italic>
< 0.05 between the direction congruent and incongruent conditions).</p>
</caption>
<graphic xlink:href="sensors-13-07212f5"></graphic>
</fig>
</floats-group>
</pmc>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Pmc/Curation
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 002499 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd -nk 002499 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Pmc
   |étape=   Curation
   |type=    RBID
   |clé=     PMC:3715219
   |texte=   Cross-Modal Sensory Integration of Visual-Tactile Motion Information: Instrument Design and Human Psychophysics
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Curation/RBID.i   -Sk "pubmed:23727955" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024