Exploration server on haptic devices


Exploratory Movement Generates Higher-Order Information That Is Sufficient for Accurate Perception of Scaled Egocentric Distance

Internal identifier: 000284 (Pmc/Curation); previous: 000283; next: 000285

Authors: Bruno Mantel [France]; Thomas A. Stoffregen [United States]; Alain Campbell [France]; Benoît G. Bardy [France]

Source:

RBID: PMC:4391914

Abstract

Body movement influences the structure of multiple forms of ambient energy, including optics and gravito-inertial force. Some researchers have argued that egocentric distance is derived from inferential integration of visual and non-visual stimulation. We suggest that accurate information about egocentric distance exists in perceptual stimulation as higher-order patterns that extend across optics and inertia. We formalize a pattern that specifies the egocentric distance of a stationary object across higher-order relations between optics and inertia. This higher-order parameter is created by self-generated movement of the perceiver in inertial space relative to the illuminated environment. For this reason, we placed minimal restrictions on the exploratory movements of our participants. We asked whether humans can detect and use the information available in this higher-order pattern. Participants judged whether a virtual object was within reach. We manipulated relations between body movement and the ambient structure of optics and inertia. Judgments were precise and accurate when the higher-order optical-inertial parameter was available. When only optic flow was available, judgments were poor. Our results reveal that participants perceived egocentric distance from the higher-order, optical-inertial consequences of their own exploratory activity. Analysis of participants’ movement trajectories revealed that self-selected movements were complex, and tended to optimize availability of the optical-inertial pattern that specifies egocentric distance. We argue that accurate information about egocentric distance exists in higher-order patterns of ambient energy, that self-generated movement can generate these higher-order patterns, and that these patterns can be detected and used to support perception of egocentric distance that is precise and accurate.


URL:
DOI: 10.1371/journal.pone.0120025
PubMed: 25856410
PubMed Central: 4391914

Links to previous steps (curation, corpus...)


Links to Exploration step

PMC:4391914

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Exploratory Movement Generates Higher-Order Information That Is Sufficient for Accurate Perception of Scaled Egocentric Distance</title>
<author>
<name sortKey="Mantel, Bruno" sort="Mantel, Bruno" uniqKey="Mantel B" first="Bruno" last="Mantel">Bruno Mantel</name>
<affiliation wicri:level="1">
<nlm:aff id="aff001">
<addr-line>Movement-to-Health Laboratory, EuroMov, Montpellier-1 University, Montpellier, France</addr-line>
</nlm:aff>
<country xml:lang="fr">France</country>
<wicri:regionArea>Movement-to-Health Laboratory, EuroMov, Montpellier-1 University, Montpellier</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="aff002">
<addr-line>Normandie Université, Caen, France</addr-line>
</nlm:aff>
<country xml:lang="fr">France</country>
<wicri:regionArea>Normandie Université, Caen</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="aff003">
<addr-line>Centre d’Etudes Sport et Actions Motrices, Université de Caen Basse-Normandie, Caen, France</addr-line>
</nlm:aff>
<country xml:lang="fr">France</country>
<wicri:regionArea>Centre d’Etudes Sport et Actions Motrices, Université de Caen Basse-Normandie, Caen</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Stoffregen, Thomas A" sort="Stoffregen, Thomas A" uniqKey="Stoffregen T" first="Thomas A." last="Stoffregen">Thomas A. Stoffregen</name>
<affiliation wicri:level="1">
<nlm:aff id="aff004">
<addr-line>Affordance Perception-Action Laboratory, University of Minnesota, Minneapolis, United States of America</addr-line>
</nlm:aff>
<country xml:lang="fr">États-Unis</country>
<wicri:regionArea>Affordance Perception-Action Laboratory, University of Minnesota, Minneapolis</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Campbell, Alain" sort="Campbell, Alain" uniqKey="Campbell A" first="Alain" last="Campbell">Alain Campbell</name>
<affiliation wicri:level="1">
<nlm:aff id="aff002">
<addr-line>Normandie Université, Caen, France</addr-line>
</nlm:aff>
<country xml:lang="fr">France</country>
<wicri:regionArea>Normandie Université, Caen</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="aff005">
<addr-line>UMR 6139 Laboratoire de Mathématiques Nicolas Oresme, Université de Caen-Basse Normandie &amp; CNRS, Caen, France</addr-line>
</nlm:aff>
<country xml:lang="fr">France</country>
<wicri:regionArea>UMR 6139 Laboratoire de Mathématiques Nicolas Oresme, Université de Caen-Basse Normandie &amp; CNRS, Caen</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Bardy, Benoit G" sort="Bardy, Benoit G" uniqKey="Bardy B" first="Benoît G." last="Bardy">Benoît G. Bardy</name>
<affiliation wicri:level="1">
<nlm:aff id="aff001">
<addr-line>Movement-to-Health Laboratory, EuroMov, Montpellier-1 University, Montpellier, France</addr-line>
</nlm:aff>
<country xml:lang="fr">France</country>
<wicri:regionArea>Movement-to-Health Laboratory, EuroMov, Montpellier-1 University, Montpellier</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="aff006">
<addr-line>Institut Universitaire de France, Paris, France</addr-line>
</nlm:aff>
<country xml:lang="fr">France</country>
<wicri:regionArea>Institut Universitaire de France, Paris</wicri:regionArea>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">25856410</idno>
<idno type="pmc">4391914</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4391914</idno>
<idno type="RBID">PMC:4391914</idno>
<idno type="doi">10.1371/journal.pone.0120025</idno>
<date when="2015">2015</date>
<idno type="wicri:Area/Pmc/Corpus">000284</idno>
<idno type="wicri:Area/Pmc/Curation">000284</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Exploratory Movement Generates Higher-Order Information That Is Sufficient for Accurate Perception of Scaled Egocentric Distance</title>
<author>
<name sortKey="Mantel, Bruno" sort="Mantel, Bruno" uniqKey="Mantel B" first="Bruno" last="Mantel">Bruno Mantel</name>
<affiliation wicri:level="1">
<nlm:aff id="aff001">
<addr-line>Movement-to-Health Laboratory, EuroMov, Montpellier-1 University, Montpellier, France</addr-line>
</nlm:aff>
<country xml:lang="fr">France</country>
<wicri:regionArea>Movement-to-Health Laboratory, EuroMov, Montpellier-1 University, Montpellier</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="aff002">
<addr-line>Normandie Université, Caen, France</addr-line>
</nlm:aff>
<country xml:lang="fr">France</country>
<wicri:regionArea>Normandie Université, Caen</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="aff003">
<addr-line>Centre d’Etudes Sport et Actions Motrices, Université de Caen Basse-Normandie, Caen, France</addr-line>
</nlm:aff>
<country xml:lang="fr">France</country>
<wicri:regionArea>Centre d’Etudes Sport et Actions Motrices, Université de Caen Basse-Normandie, Caen</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Stoffregen, Thomas A" sort="Stoffregen, Thomas A" uniqKey="Stoffregen T" first="Thomas A." last="Stoffregen">Thomas A. Stoffregen</name>
<affiliation wicri:level="1">
<nlm:aff id="aff004">
<addr-line>Affordance Perception-Action Laboratory, University of Minnesota, Minneapolis, United States of America</addr-line>
</nlm:aff>
<country xml:lang="fr">États-Unis</country>
<wicri:regionArea>Affordance Perception-Action Laboratory, University of Minnesota, Minneapolis</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Campbell, Alain" sort="Campbell, Alain" uniqKey="Campbell A" first="Alain" last="Campbell">Alain Campbell</name>
<affiliation wicri:level="1">
<nlm:aff id="aff002">
<addr-line>Normandie Université, Caen, France</addr-line>
</nlm:aff>
<country xml:lang="fr">France</country>
<wicri:regionArea>Normandie Université, Caen</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="aff005">
<addr-line>UMR 6139 Laboratoire de Mathématiques Nicolas Oresme, Université de Caen-Basse Normandie &amp; CNRS, Caen, France</addr-line>
</nlm:aff>
<country xml:lang="fr">France</country>
<wicri:regionArea>UMR 6139 Laboratoire de Mathématiques Nicolas Oresme, Université de Caen-Basse Normandie &amp; CNRS, Caen</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Bardy, Benoit G" sort="Bardy, Benoit G" uniqKey="Bardy B" first="Benoît G." last="Bardy">Benoît G. Bardy</name>
<affiliation wicri:level="1">
<nlm:aff id="aff001">
<addr-line>Movement-to-Health Laboratory, EuroMov, Montpellier-1 University, Montpellier, France</addr-line>
</nlm:aff>
<country xml:lang="fr">France</country>
<wicri:regionArea>Movement-to-Health Laboratory, EuroMov, Montpellier-1 University, Montpellier</wicri:regionArea>
</affiliation>
<affiliation wicri:level="1">
<nlm:aff id="aff006">
<addr-line>Institut Universitaire de France, Paris, France</addr-line>
</nlm:aff>
<country xml:lang="fr">France</country>
<wicri:regionArea>Institut Universitaire de France, Paris</wicri:regionArea>
</affiliation>
</author>
</analytic>
<series>
<title level="j">PLoS ONE</title>
<idno type="eISSN">1932-6203</idno>
<imprint>
<date when="2015">2015</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>Body movement influences the structure of multiple forms of ambient energy, including optics and gravito-inertial force. Some researchers have argued that egocentric distance is derived from inferential integration of visual and non-visual stimulation. We suggest that accurate information about egocentric distance exists in perceptual stimulation as higher-order patterns that extend across optics and inertia. We formalize a pattern that specifies the egocentric distance of a stationary object across higher-order relations between optics and inertia. This higher-order parameter is created by self-generated movement of the perceiver in inertial space relative to the illuminated environment. For this reason, we placed minimal restrictions on the exploratory movements of our participants. We asked whether humans can detect and use the information available in this higher-order pattern. Participants judged whether a virtual object was within reach. We manipulated relations between body movement and the ambient structure of optics and inertia. Judgments were precise and accurate when the higher-order optical-inertial parameter was available. When only optic flow was available, judgments were poor. Our results reveal that participants perceived egocentric distance from the higher-order, optical-inertial consequences of their own exploratory activity. Analysis of participants’ movement trajectories revealed that self-selected movements were complex, and tended to optimize availability of the optical-inertial pattern that specifies egocentric distance. We argue that accurate information about egocentric distance exists in higher-order patterns of ambient energy, that self-generated movement can generate these higher-order patterns, and that these patterns can be detected and used to support perception of egocentric distance that is precise and accurate.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Alais, D" uniqKey="Alais D">D Alais</name>
</author>
<author>
<name sortKey="Burr, D" uniqKey="Burr D">D Burr</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ernst, Mo" uniqKey="Ernst M">MO Ernst</name>
</author>
<author>
<name sortKey="Banks, Ms" uniqKey="Banks M">MS Banks</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Guillaud, E" uniqKey="Guillaud E">E Guillaud</name>
</author>
<author>
<name sortKey="Gauthier, G" uniqKey="Gauthier G">G Gauthier</name>
</author>
<author>
<name sortKey="Vercher, J L" uniqKey="Vercher J">J-L Vercher</name>
</author>
<author>
<name sortKey="Blouin, J" uniqKey="Blouin J">J Blouin</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hillis, Jm" uniqKey="Hillis J">JM Hillis</name>
</author>
<author>
<name sortKey="Ernst, Mo" uniqKey="Ernst M">MO Ernst</name>
</author>
<author>
<name sortKey="Banks, Ms" uniqKey="Banks M">MS Banks</name>
</author>
<author>
<name sortKey="Landy, Ms" uniqKey="Landy M">MS Landy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pouget, A" uniqKey="Pouget A">A Pouget</name>
</author>
<author>
<name sortKey="Deneve, S" uniqKey="Deneve S">S Deneve</name>
</author>
<author>
<name sortKey="Duhamel, J R" uniqKey="Duhamel J">J-R Duhamel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Stoffregen, Ta" uniqKey="Stoffregen T">TA Stoffregen</name>
</author>
<author>
<name sortKey="Bardy, Bg" uniqKey="Bardy B">BG Bardy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gibson, Jj" uniqKey="Gibson J">JJ Gibson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gibson, Jj" uniqKey="Gibson J">JJ Gibson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Michaels, Cf" uniqKey="Michaels C">CF Michaels</name>
</author>
<author>
<name sortKey="Carello, Cc" uniqKey="Carello C">CC Carello</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Turvey, Mt" uniqKey="Turvey M">MT Turvey</name>
</author>
<author>
<name sortKey="Shaw, Re" uniqKey="Shaw R">RE Shaw</name>
</author>
<author>
<name sortKey="Reed, Es" uniqKey="Reed E">ES Reed</name>
</author>
<author>
<name sortKey="Mace, Wm" uniqKey="Mace W">WM Mace</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sedgwick, Ha" uniqKey="Sedgwick H">HA Sedgwick</name>
</author>
<author>
<name sortKey="Boff, Kr" uniqKey="Boff K">KR Boff</name>
</author>
<author>
<name sortKey="Kaufman, L" uniqKey="Kaufman L">L Kaufman</name>
</author>
<author>
<name sortKey="Thomas, Jp" uniqKey="Thomas J">JP Thomas</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Panerai, F" uniqKey="Panerai F">F Panerai</name>
</author>
<author>
<name sortKey="Cornilleau Peres, V" uniqKey="Cornilleau Peres V">V Cornilleau-Pérès</name>
</author>
<author>
<name sortKey="Droulez, J" uniqKey="Droulez J">J Droulez</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Peh, C H" uniqKey="Peh C">C-H Peh</name>
</author>
<author>
<name sortKey="Panerai, F" uniqKey="Panerai F">F Panerai</name>
</author>
<author>
<name sortKey="Droulez, J" uniqKey="Droulez J">J Droulez</name>
</author>
<author>
<name sortKey="Cornilleau Peres, V" uniqKey="Cornilleau Peres V">V Cornilleau-Pérès</name>
</author>
<author>
<name sortKey="Cheong, L F" uniqKey="Cheong L">L-F Cheong</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gibson, Jj" uniqKey="Gibson J">JJ Gibson</name>
</author>
<author>
<name sortKey="Olum, P" uniqKey="Olum P">P Olum</name>
</author>
<author>
<name sortKey="Rosenblatt, F" uniqKey="Rosenblatt F">F Rosenblatt</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Johansson, G" uniqKey="Johansson G">G Johansson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Nakayama, K" uniqKey="Nakayama K">K Nakayama</name>
</author>
<author>
<name sortKey="Loomis, J" uniqKey="Loomis J">J Loomis</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bingham, Gp" uniqKey="Bingham G">GP Bingham</name>
</author>
<author>
<name sortKey="Stassen, Mg" uniqKey="Stassen M">MG Stassen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gogel, Wc" uniqKey="Gogel W">WC Gogel</name>
</author>
<author>
<name sortKey="Tietz, Jd" uniqKey="Tietz J">JD Tietz</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bingham, Gp" uniqKey="Bingham G">GP Bingham</name>
</author>
<author>
<name sortKey="Pagano, Cc" uniqKey="Pagano C">CC Pagano</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Watt, Sj" uniqKey="Watt S">SJ Watt</name>
</author>
<author>
<name sortKey="Bradshaw, Mf" uniqKey="Bradshaw M">MF Bradshaw</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gomer, Ja" uniqKey="Gomer J">JA Gomer</name>
</author>
<author>
<name sortKey="Dash, Ch" uniqKey="Dash C">CH Dash</name>
</author>
<author>
<name sortKey="Moore, Ks" uniqKey="Moore K">KS Moore</name>
</author>
<author>
<name sortKey="Pagano, Cc" uniqKey="Pagano C">CC Pagano</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wickelgren, Ea" uniqKey="Wickelgren E">EA Wickelgren</name>
</author>
<author>
<name sortKey="Mcconnell, Ds" uniqKey="Mcconnell D">DS Mcconnell</name>
</author>
<author>
<name sortKey="Bingham, Gr" uniqKey="Bingham G">GR Bingham</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gogel, Wc" uniqKey="Gogel W">WC Gogel</name>
</author>
<author>
<name sortKey="Tietz, Jd" uniqKey="Tietz J">JD Tietz</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Beall, Ac" uniqKey="Beall A">AC Beall</name>
</author>
<author>
<name sortKey="Loomis, Jm" uniqKey="Loomis J">JM Loomis</name>
</author>
<author>
<name sortKey="Philbeck, Jw" uniqKey="Philbeck J">JW Philbeck</name>
</author>
<author>
<name sortKey="Fikes, Tg" uniqKey="Fikes T">TG Fikes</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Eriksson, Es" uniqKey="Eriksson E">ES Eriksson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ferris, Sh" uniqKey="Ferris S">SH Ferris</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mantel, B" uniqKey="Mantel B">B Mantel</name>
</author>
<author>
<name sortKey="Bardy, Bg" uniqKey="Bardy B">BG Bardy</name>
</author>
<author>
<name sortKey="Stoffregen, Ta" uniqKey="Stoffregen T">TA Stoffregen</name>
</author>
<author>
<name sortKey="Cummins Sebree, S" uniqKey="Cummins Sebree S">S Cummins-Sebree</name>
</author>
<author>
<name sortKey="Riley, M" uniqKey="Riley M">M Riley</name>
</author>
<author>
<name sortKey="Shockley, K" uniqKey="Shockley K">K Shockley</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Poulton, Ec" uniqKey="Poulton E">EC Poulton</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Warren, Wh" uniqKey="Warren W">WH Warren</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Streit, M" uniqKey="Streit M">M Streit</name>
</author>
<author>
<name sortKey="Shockley, K" uniqKey="Shockley K">K Shockley</name>
</author>
<author>
<name sortKey="Riley, Ma" uniqKey="Riley M">MA Riley</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Runeson, S" uniqKey="Runeson S">S Runeson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Carello, C" uniqKey="Carello C">C Carello</name>
</author>
<author>
<name sortKey="Grosofsky, A" uniqKey="Grosofsky A">A Grosofsky</name>
</author>
<author>
<name sortKey="Reichel, Fd" uniqKey="Reichel F">FD Reichel</name>
</author>
<author>
<name sortKey="Solomon, Hy" uniqKey="Solomon H">HY Solomon</name>
</author>
<author>
<name sortKey="Turvey, Mt" uniqKey="Turvey M">MT Turvey</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Heft, H" uniqKey="Heft H">H Heft</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mark, Ls" uniqKey="Mark L">LS Mark</name>
</author>
<author>
<name sortKey="Nemeth, K" uniqKey="Nemeth K">K Nemeth</name>
</author>
<author>
<name sortKey="Gardner, D" uniqKey="Gardner D">D Gardner</name>
</author>
<author>
<name sortKey="Dainoff, Mj" uniqKey="Dainoff M">MJ Dainoff</name>
</author>
<author>
<name sortKey="Paasche, J" uniqKey="Paasche J">J Paasche</name>
</author>
<author>
<name sortKey="Duffy, M" uniqKey="Duffy M">M Duffy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Loomis, Jm" uniqKey="Loomis J">JM Loomis</name>
</author>
<author>
<name sortKey="Knapp, Jm" uniqKey="Knapp J">JM Knapp</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Creem Regehr, Sh" uniqKey="Creem Regehr S">SH Creem-Regehr</name>
</author>
<author>
<name sortKey="Willemsen, P" uniqKey="Willemsen P">P Willemsen</name>
</author>
<author>
<name sortKey="Gooch, Aa" uniqKey="Gooch A">AA Gooch</name>
</author>
<author>
<name sortKey="Thompson, Wb" uniqKey="Thompson W">WB Thompson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Crowell, Ja" uniqKey="Crowell J">JA Crowell</name>
</author>
<author>
<name sortKey="Banks, Ms" uniqKey="Banks M">MS Banks</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Marotta, J" uniqKey="Marotta J">J Marotta</name>
</author>
<author>
<name sortKey="Perrot, T" uniqKey="Perrot T">T Perrot</name>
</author>
<author>
<name sortKey="Nicolle, D" uniqKey="Nicolle D">D Nicolle</name>
</author>
<author>
<name sortKey="Servos, P" uniqKey="Servos P">P Servos</name>
</author>
<author>
<name sortKey="Goodale, Ma" uniqKey="Goodale M">MA Goodale</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ellard, Cg" uniqKey="Ellard C">CG Ellard</name>
</author>
<author>
<name sortKey="Goodale, Ma" uniqKey="Goodale M">MA Goodale</name>
</author>
<author>
<name sortKey="Timney, B" uniqKey="Timney B">B Timney</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Turvey, Mt" uniqKey="Turvey M">MT Turvey</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Turvey, Mt" uniqKey="Turvey M">MT Turvey</name>
</author>
<author>
<name sortKey="Fonseca, St" uniqKey="Fonseca S">ST Fonseca</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bruner, J" uniqKey="Bruner J">J Bruner</name>
</author>
<author>
<name sortKey="Kalnins, I" uniqKey="Kalnins I">I Kalnins</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gibson, Ej" uniqKey="Gibson E">EJ Gibson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Amazeen, El" uniqKey="Amazeen E">EL Amazeen</name>
</author>
<author>
<name sortKey="Tseng, Ph" uniqKey="Tseng P">PH Tseng</name>
</author>
<author>
<name sortKey="Valdez, Ab" uniqKey="Valdez A">AB Valdez</name>
</author>
<author>
<name sortKey="Vera, D" uniqKey="Vera D">D Vera</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Eriksson, Es" uniqKey="Eriksson E">ES Eriksson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Longuet Higgins, H" uniqKey="Longuet Higgins H">H Longuet-Higgins</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ullman, S" uniqKey="Ullman S">S Ullman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Stoffregen, Ta" uniqKey="Stoffregen T">TA Stoffregen</name>
</author>
<author>
<name sortKey="Bardy, Bg" uniqKey="Bardy B">BG Bardy</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Loomis, Jm" uniqKey="Loomis J">JM Loomis</name>
</author>
<author>
<name sortKey="Da Silva, Ja" uniqKey="Da Silva J">JA Da Silva</name>
</author>
<author>
<name sortKey="Fujita, N" uniqKey="Fujita N">N Fujita</name>
</author>
<author>
<name sortKey="Fukusima, Ss" uniqKey="Fukusima S">SS Fukusima</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gardner, Dl" uniqKey="Gardner D">DL Gardner</name>
</author>
<author>
<name sortKey="Mark, Ls" uniqKey="Mark L">LS Mark</name>
</author>
<author>
<name sortKey="Ward, Ja" uniqKey="Ward J">JA Ward</name>
</author>
<author>
<name sortKey="Edkins, H" uniqKey="Edkins H">H Edkins</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Yonas, A" uniqKey="Yonas A">A Yonas</name>
</author>
<author>
<name sortKey="Hartman, B" uniqKey="Hartman B">B Hartman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Solomon, Hy" uniqKey="Solomon H">HY Solomon</name>
</author>
<author>
<name sortKey="Turvey, Mt" uniqKey="Turvey M">MT Turvey</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Carello, C" uniqKey="Carello C">C Carello</name>
</author>
<author>
<name sortKey="Fitzpatrick, P" uniqKey="Fitzpatrick P">P Fitzpatrick</name>
</author>
<author>
<name sortKey="Turvey, Mt" uniqKey="Turvey M">MT Turvey</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mantel, B" uniqKey="Mantel B">B Mantel</name>
</author>
<author>
<name sortKey="Bardy, Bg" uniqKey="Bardy B">BG Bardy</name>
</author>
<author>
<name sortKey="Stoffregen, Ta" uniqKey="Stoffregen T">TA Stoffregen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Smets, Gj" uniqKey="Smets G">GJ Smets</name>
</author>
<author>
<name sortKey="Overbeeke, C" uniqKey="Overbeeke C">C Overbeeke</name>
</author>
<author>
<name sortKey="Stratmann, M" uniqKey="Stratmann M">M Stratmann</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wallace, G" uniqKey="Wallace G">G Wallace</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cornus, S" uniqKey="Cornus S">S Cornus</name>
</author>
<author>
<name sortKey="Montagne, G" uniqKey="Montagne G">G Montagne</name>
</author>
<author>
<name sortKey="Laurent, M" uniqKey="Laurent M">M Laurent</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Falmagne, J C" uniqKey="Falmagne J">J-C Falmagne</name>
</author>
<author>
<name sortKey="Boff, Kr" uniqKey="Boff K">KR Boff</name>
</author>
<author>
<name sortKey="Kaufman, L" uniqKey="Kaufman L">L Kaufman</name>
</author>
<author>
<name sortKey="Thomas, Jp" uniqKey="Thomas J">JP Thomas</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mark, Ls" uniqKey="Mark L">LS Mark</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Durgin, Fh" uniqKey="Durgin F">FH Durgin</name>
</author>
<author>
<name sortKey="Proffitt, Dr" uniqKey="Proffitt D">DR Proffitt</name>
</author>
<author>
<name sortKey="Olson, Tj" uniqKey="Olson T">TJ Olson</name>
</author>
<author>
<name sortKey="Reinke, Ks" uniqKey="Reinke K">KS Reinke</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Treutwein, B" uniqKey="Treutwein B">B Treutwein</name>
</author>
<author>
<name sortKey="Strasburger, H" uniqKey="Strasburger H">H Strasburger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wichmann, Fa" uniqKey="Wichmann F">FA Wichmann</name>
</author>
<author>
<name sortKey="Hill, Nj" uniqKey="Hill N">NJ Hill</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wichmann, Fa" uniqKey="Wichmann F">FA Wichmann</name>
</author>
<author>
<name sortKey="Hill, Nj" uniqKey="Hill N">NJ Hill</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bootsma, Rj" uniqKey="Bootsma R">RJ Bootsma</name>
</author>
<author>
<name sortKey="Bakker, Fc" uniqKey="Bakker F">FC Bakker</name>
</author>
<author>
<name sortKey="Van Snippenberg, Fej" uniqKey="Van Snippenberg F">FEJ van Snippenberg</name>
</author>
<author>
<name sortKey="Tdlohreg, Cw" uniqKey="Tdlohreg C">CW Tdlohreg</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Fisher, Ni" uniqKey="Fisher N">NI Fisher</name>
</author>
<author>
<name sortKey="Lewis, T" uniqKey="Lewis T">T Lewis</name>
</author>
<author>
<name sortKey="Embleton, Bjj" uniqKey="Embleton B">BJJ Embleton</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mardia, Kv" uniqKey="Mardia K">KV Mardia</name>
</author>
<author>
<name sortKey="Jupp, Pe" uniqKey="Jupp P">PE Jupp</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cliff, N" uniqKey="Cliff N">N Cliff</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">PLoS One</journal-id>
<journal-id journal-id-type="iso-abbrev">PLoS ONE</journal-id>
<journal-id journal-id-type="publisher-id">plos</journal-id>
<journal-id journal-id-type="pmc">plosone</journal-id>
<journal-title-group>
<journal-title>PLoS ONE</journal-title>
</journal-title-group>
<issn pub-type="epub">1932-6203</issn>
<publisher>
<publisher-name>Public Library of Science</publisher-name>
<publisher-loc>San Francisco, CA USA</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">25856410</article-id>
<article-id pub-id-type="pmc">4391914</article-id>
<article-id pub-id-type="doi">10.1371/journal.pone.0120025</article-id>
<article-id pub-id-type="publisher-id">PONE-D-14-32643</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Research Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Exploratory Movement Generates Higher-Order Information That Is Sufficient for Accurate Perception of Scaled Egocentric Distance</article-title>
<alt-title alt-title-type="running-head">Egocentric Distance in Optic-Inertial Patterns</alt-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Mantel</surname>
<given-names>Bruno</given-names>
</name>
<xref ref-type="aff" rid="aff001">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="aff002">
<sup>2</sup>
</xref>
<xref ref-type="aff" rid="aff003">
<sup>3</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Stoffregen</surname>
<given-names>Thomas A.</given-names>
</name>
<xref ref-type="aff" rid="aff004">
<sup>4</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Campbell</surname>
<given-names>Alain</given-names>
</name>
<xref ref-type="aff" rid="aff002">
<sup>2</sup>
</xref>
<xref ref-type="aff" rid="aff005">
<sup>5</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Bardy</surname>
<given-names>Benoît G.</given-names>
</name>
<xref ref-type="aff" rid="aff001">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="aff006">
<sup>6</sup>
</xref>
<xref rid="cor001" ref-type="corresp">*</xref>
</contrib>
</contrib-group>
<aff id="aff001">
<label>1</label>
<addr-line>Movement-to-Health Laboratory, EuroMov, Montpellier-1 University, Montpellier, France</addr-line>
</aff>
<aff id="aff002">
<label>2</label>
<addr-line>Normandie Université, Caen, France</addr-line>
</aff>
<aff id="aff003">
<label>3</label>
<addr-line>Centre d’Etudes Sport et Actions Motrices, Université de Caen Basse-Normandie, Caen, France</addr-line>
</aff>
<aff id="aff004">
<label>4</label>
<addr-line>Affordance Perception-Action Laboratory, University of Minnesota, Minneapolis, United States of America</addr-line>
</aff>
<aff id="aff005">
<label>5</label>
<addr-line>UMR 6139 Laboratoire de Mathématiques Nicolas Oresme, Université de Caen-Basse Normandie &amp; CNRS, Caen, France</addr-line>
</aff>
<aff id="aff006">
<label>6</label>
<addr-line>Institut Universitaire de France, Paris, France</addr-line>
</aff>
<contrib-group>
<contrib contrib-type="editor">
<name>
<surname>Ben Hamed</surname>
<given-names>Suliann</given-names>
</name>
<role>Academic Editor</role>
<xref ref-type="aff" rid="edit1"></xref>
</contrib>
</contrib-group>
<aff id="edit1">
<addr-line>Centre de Neuroscience Cognitive, FRANCE</addr-line>
</aff>
<author-notes>
<fn fn-type="conflict" id="coi001">
<p>
<bold>Competing Interests: </bold>
The authors have declared that no competing interests exist.</p>
</fn>
<fn fn-type="con" id="contrib001">
<p>Conceived and designed the experiments: BM TS BB. Performed the experiments: BM. Analyzed the data: BM TS BB. Contributed reagents/materials/analysis tools: BM. Wrote the paper: BM TS AC BB.</p>
</fn>
<corresp id="cor001">* E-mail:
<email>benoit.bardy@univ-montp1.fr</email>
</corresp>
</author-notes>
<pub-date pub-type="epub">
<day>9</day>
<month>4</month>
<year>2015</year>
</pub-date>
<pub-date pub-type="collection">
<year>2015</year>
</pub-date>
<volume>10</volume>
<issue>4</issue>
<elocation-id>e0120025</elocation-id>
<history>
<date date-type="received">
<day>5</day>
<month>8</month>
<year>2014</year>
</date>
<date date-type="accepted">
<day>19</day>
<month>1</month>
<year>2015</year>
</date>
</history>
<permissions>
<copyright-year>2015</copyright-year>
<copyright-holder>Mantel et al</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<license-p>This is an open access article distributed under the terms of the
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution License</ext-link>
, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited</license-p>
</license>
</permissions>
<self-uri content-type="pdf" xlink:type="simple" xlink:href="pone.0120025.pdf"></self-uri>
<abstract>
<p>Body movement influences the structure of multiple forms of ambient energy, including optics and gravito-inertial force. Some researchers have argued that egocentric distance is derived from inferential integration of visual and non-visual stimulation. We suggest that accurate information about egocentric distance exists in perceptual stimulation as higher-order patterns that extend across optics and inertia. We formalize a pattern that specifies the egocentric distance of a stationary object across higher-order relations between optics and inertia. This higher-order parameter is created by self-generated movement of the perceiver in inertial space relative to the illuminated environment. For this reason, we placed minimal restrictions on the exploratory movements of our participants. We asked whether humans can detect and use the information available in this higher-order pattern. Participants judged whether a virtual object was within reach. We manipulated relations between body movement and the ambient structure of optics and inertia. Judgments were precise and accurate when the higher-order optical-inertial parameter was available. When only optic flow was available, judgments were poor. Our results reveal that participants perceived egocentric distance from the higher-order, optical-inertial consequences of their own exploratory activity. Analysis of participants’ movement trajectories revealed that self-selected movements were complex, and tended to optimize availability of the optical-inertial pattern that specifies egocentric distance. We argue that accurate information about egocentric distance exists in higher-order patterns of ambient energy, that self-generated movement can generate these higher-order patterns, and that these patterns can be detected and used to support perception of egocentric distance that is precise and accurate.</p>
</abstract>
<funding-group>
<funding-statement>This research was funded by ENACTIVE, a network of Excellence of the European Union (FP6 -IST-022114). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.</funding-statement>
</funding-group>
<counts>
<fig-count count="7"></fig-count>
<table-count count="0"></table-count>
<page-count count="26"></page-count>
</counts>
<custom-meta-group>
<custom-meta id="data-availability">
<meta-name>Data Availability</meta-name>
<meta-value>All relevant data are within the paper and its Supporting Information files.</meta-value>
</custom-meta>
</custom-meta-group>
</article-meta>
<notes>
<title>Data Availability</title>
<p>All relevant data are within the paper and its Supporting Information files.</p>
</notes>
</front>
<body>
<sec sec-type="intro" id="sec001">
<title>Introduction</title>
<p>Animate movement alters the structure of multiple forms of ambient energy. Consider walking. As the feet strike the surface of support, this inertial contact alters the stimulation of pressure sensors in the skin, of receptors in the joints, and of stretch receptors in the muscles, as well as dynamic patterns of gravito-inertial force at the vestibular system. If the environment is illuminated, walking will also alter patterns of optic flow. The patterns that are created in optics and inertia are not identical; they are non-redundant. Traditionally, the existence of non-redundancy in patterns of simultaneous multimodal stimulation arising from animate movement has been interpreted within the epistemological assumptions of indirect perception [
<xref rid="pone.0120025.ref001" ref-type="bibr">1</xref>
<xref rid="pone.0120025.ref005" ref-type="bibr">5</xref>
]. In theories of indirect perception it is assumed that patterns of stimulation available to the perceiver bear an ambiguous relation to physical reality. If this is true, then accurate perception can occur only as a product of inferential processing within the nervous system. In the case of multimodal stimulation the required processing is assumed to entail some type of integration of disparate inputs from different senses. In the present contribution, we offer an interpretation that is consistent with the epistemological assumptions of direct perception. In theories of direct perception it is argued that patterns of stimulation available to the perceiver bear a unique, lawful relation to physical reality [
<xref rid="pone.0120025.ref006" ref-type="bibr">6</xref>
<xref rid="pone.0120025.ref010" ref-type="bibr">10</xref>
]. If this is true, and if perceivers are sensitive to the relevant patterns of stimulation, then sensory stimulation may be sufficient for accurate perception, such that inferential processing is not required. We argue that the epistemological assumptions of direct perception can apply to the multisensory consequences of animate movement [
<xref rid="pone.0120025.ref006" ref-type="bibr">6</xref>
].</p>
<p>In this article, we focus on the perception of scaled egocentric distance (sometimes referred to as absolute distance, e.g., [
<xref rid="pone.0120025.ref011" ref-type="bibr">11</xref>
]). Perception of egocentric distance tends to be very poor for stationary perceivers, but often is greatly improved when perceivers are allowed to move [
<xref rid="pone.0120025.ref012" ref-type="bibr">12</xref>
,
<xref rid="pone.0120025.ref013" ref-type="bibr">13</xref>
]. How might this superior performance be achieved? Traditional analyses of multisensory stimulation have focused exclusively on patterns that exist within individual forms of ambient energy, such as patterns of optic flow, patterns of acoustic stimulation, and so on. In such analyses, relations between different senses can exist only as products of internal processing. Taking a qualitatively different approach, we consider the possibility that accurate information may exist in patterns that extend across multiple forms of ambient energy. We quantify analytically an emergent, higher-order parameter that extends across optic and inertial energies and is related to egocentric distance. We report an experiment in which we manipulated the availability of this parameter independently of parameters that were available to individual perceptual systems. Our results are consistent with the hypothesis that participants detected and used the higher-order parameter, rather than applying internal processing to inputs derived from individual perceptual systems.</p>
<p>An important methodological feature of our study concerns the types of exploratory movement that are available to perceivers. In previous research on the perception of egocentric distance, experimental participants have typically been limited to self-generated movement in one or two dimensions. In the experiment that we report, participants were free to move in any direction(s) that they wished. This methodological factor is important because the higher-order parameter that we identify is generated by movement of the perceiver in three dimensions. For this reason, our study is the first empirical research in which it has been possible to evaluate the perceptual reality of the higher-order parameter.</p>
<sec id="sec002">
<title>Analysis of available information</title>
<p>Displacement of the head relative to the illuminated environment generates optic flow. Displacement of the head relative to the gravito-inertial force environment generates changes in ambient forces. Thus, self-generated head movements yield simultaneous stimulation of the visual, vestibular, and kinesthetic systems. Relations between patterns in optics and in gravito-inertial forces depend upon relations between head movements relative to the illuminated and gravito-inertial environments.</p>
<p>Early models of the information about egocentric distance available in optic flow were restricted to one-dimensional movements [
<xref rid="pone.0120025.ref011" ref-type="bibr">11</xref>
,
<xref rid="pone.0120025.ref014" ref-type="bibr">14</xref>
<xref rid="pone.0120025.ref016" ref-type="bibr">16</xref>
] and sometimes further limited in terms of direction, frequency, or amplitude of movement [
<xref rid="pone.0120025.ref017" ref-type="bibr">17</xref>
,
<xref rid="pone.0120025.ref018" ref-type="bibr">18</xref>
]. For example, when the point of observation moves along a rectilinear trajectory, the distance to a stationary environmental object can be expressed as a function of the (optical) angle between the axis of motion and the direction of the object, the (optical) speed at which this angle changes, and the (non-optical) velocity of the point of observation [
<xref rid="pone.0120025.ref014" ref-type="bibr">14</xref>
] (see also [
<xref rid="pone.0120025.ref011" ref-type="bibr">11</xref>
,
<xref rid="pone.0120025.ref016" ref-type="bibr">16</xref>
,
<xref rid="pone.0120025.ref017" ref-type="bibr">17</xref>
]):
<disp-formula id="pone.0120025.e001">
<alternatives>
<graphic xlink:href="pone.0120025.e001.jpg" id="pone.0120025.e001g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M1">
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
<mml:mi>sin</mml:mi>
<mml:mi>α</mml:mi>
</mml:mrow>
<mml:mover accent="true">
<mml:mi>α</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mfrac>
</mml:mrow>
</mml:math>
</alternatives>
<label>(1)</label>
</disp-formula>
</p>
<p>with
<inline-formula id="pone.0120025.e002">
<alternatives>
<graphic xlink:href="pone.0120025.e002.jpg" id="pone.0120025.e002g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M2">
<mml:mrow>
<mml:mi>v</mml:mi>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mover accent="true">
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mo></mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</alternatives>
</inline-formula>
. Below, we extend these analyses to include natural, unrestricted head movements. As a preliminary step, we first consider the case of arbitrary 2D movements performed within the plane of the object.</p>
</sec>
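The 1D relation above lends itself to a direct numerical check. The following sketch (ours, not part of the original article; all variable names are illustrative) simulates rectilinear motion of the point of observation past a stationary object and recovers the egocentric distance from Eq 1:

import numpy as np

# Sketch of Eq 1 (illustrative): D = v * sin(alpha) / alpha_dot.
# The point of observation translates along the x-axis at constant speed v,
# past a stationary object at (0, 3).
dt = 1e-4
t = np.arange(0.0, 2.0, dt)
v = 1.2                                    # non-optical linear speed (m/s)
obs = np.stack([v * t - 1.0, np.zeros_like(t)], axis=1)
obj = np.array([0.0, 3.0])

rel = obj - obs                            # from the point of observation to the object
D_true = np.linalg.norm(rel, axis=1)       # actual egocentric distance
alpha = np.arctan2(rel[:, 1], rel[:, 0])   # angle between motion axis and object direction
alpha_dot = np.gradient(alpha, dt)         # optical rate of change of alpha

D_est = v * np.sin(alpha) / alpha_dot      # Eq 1
print(np.max(np.abs(D_est - D_true)))      # small: agreement up to discretization error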
<sec id="sec003">
<title>Information about egocentric distance in the case of 2D head movements</title>
<p>Let
<inline-formula id="pone.0120025.e003">
<alternatives>
<graphic xlink:href="pone.0120025.e003.jpg" id="pone.0120025.e003g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M3">
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mover accent="true">
<mml:mi>i</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mo>,</mml:mo>
<mml:mover accent="true">
<mml:mi>j</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</alternatives>
</inline-formula>
be a mobile orthonormal basis lying within the plane of movement. The basis is always centered on the stationary object
<italic>O</italic>
but its orientation changes such that
<inline-formula id="pone.0120025.e004">
<alternatives>
<graphic xlink:href="pone.0120025.e004.jpg" id="pone.0120025.e004g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M4">
<mml:mover accent="true">
<mml:mi>i</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:math>
</alternatives>
</inline-formula>
is always pointing toward the point of observation
<italic>P</italic>
, and makes an angle
<italic>θ</italic>
with the axis of another basis whose direction is fixed relative to the object/earth (
<xref rid="pone.0120025.g001" ref-type="fig">Fig 1</xref>
). The velocity
<inline-formula id="pone.0120025.e005">
<alternatives>
<graphic xlink:href="pone.0120025.e005.jpg" id="pone.0120025.e005g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M5">
<mml:mover accent="true">
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:math>
</alternatives>
</inline-formula>
of the point of observation makes an angle
<italic>α</italic>
with the direction of the object and thus can be decomposed as
<disp-formula id="pone.0120025.e006">
<alternatives>
<graphic xlink:href="pone.0120025.e006.jpg" id="pone.0120025.e006g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M6">
<mml:mrow>
<mml:mover accent="true">
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mo>=</mml:mo>
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
<mml:mtext></mml:mtext>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>cos</mml:mi>
<mml:mi>α</mml:mi>
<mml:mtext></mml:mtext>
<mml:mover accent="true">
<mml:mi>i</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mo>+</mml:mo>
<mml:mi>sin</mml:mi>
<mml:mi>α</mml:mi>
<mml:mtext></mml:mtext>
<mml:mover accent="true">
<mml:mi>j</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</alternatives>
<label>(2)</label>
</disp-formula>
</p>
<fig id="pone.0120025.g001" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0120025.g001</object-id>
<label>Fig 1</label>
<caption>
<title>Optical and non-optical consequences of a movement executed within the plane of a stationary object.</title>
<p>The egocentric distance can be expressed as a function of directional parameters (
<italic>α</italic>
,
<italic>θ</italic>
) and linear parameters about head movements (
<italic>v</italic>
).</p>
</caption>
<graphic xlink:href="pone.0120025.g001"></graphic>
</fig>
<p>The velocity can also be expressed as a function of the distance
<italic>D</italic>
between the point of observation and the object,
<disp-formula id="pone.0120025.e007">
<alternatives>
<graphic xlink:href="pone.0120025.e007.jpg" id="pone.0120025.e007g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M7">
<mml:mrow>
<mml:mover accent="true">
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi>d</mml:mi>
<mml:mtext></mml:mtext>
<mml:mover accent="true">
<mml:mrow>
<mml:mi>O</mml:mi>
<mml:mi>P</mml:mi>
</mml:mrow>
<mml:mo stretchy="true"></mml:mo>
</mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mi>d</mml:mi>
<mml:mi>t</mml:mi>
</mml:mrow>
</mml:mfrac>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi>d</mml:mi>
<mml:mtext></mml:mtext>
<mml:mi>D</mml:mi>
<mml:mtext></mml:mtext>
<mml:mover accent="true">
<mml:mi>i</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mi>d</mml:mi>
<mml:mi>t</mml:mi>
</mml:mrow>
</mml:mfrac>
<mml:mo>=</mml:mo>
<mml:mover accent="true">
<mml:mi>D</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mtext></mml:mtext>
<mml:mover accent="true">
<mml:mi>i</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mo>+</mml:mo>
<mml:mi>D</mml:mi>
<mml:mtext></mml:mtext>
<mml:mover accent="true">
<mml:mi>θ</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mtext></mml:mtext>
<mml:mover accent="true">
<mml:mi>j</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:mrow>
</mml:math>
</alternatives>
<label>(3)</label>
</disp-formula>
</p>
<p>By combining Eqs
<xref rid="pone.0120025.e006" ref-type="disp-formula">2</xref>
and
<xref rid="pone.0120025.e007" ref-type="disp-formula">3</xref>
and projecting on unit axis
<inline-formula id="pone.0120025.e008">
<alternatives>
<graphic xlink:href="pone.0120025.e008.jpg" id="pone.0120025.e008g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M8">
<mml:mover accent="true">
<mml:mi>j</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:math>
</alternatives>
</inline-formula>
we obtain
<disp-formula id="pone.0120025.e009">
<alternatives>
<graphic xlink:href="pone.0120025.e009.jpg" id="pone.0120025.e009g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M9">
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
<mml:mtext></mml:mtext>
<mml:mi>sin</mml:mi>
<mml:mi>α</mml:mi>
</mml:mrow>
<mml:mover accent="true">
<mml:mi>θ</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mfrac>
</mml:mrow>
</mml:math>
</alternatives>
<label>(4)</label>
</disp-formula>
where
<italic>D</italic>
is the distance between the point of observation and the object,
<italic>v</italic>
is the norm of the linear velocity
<inline-formula id="pone.0120025.e010">
<alternatives>
<graphic xlink:href="pone.0120025.e010.jpg" id="pone.0120025.e010g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M10">
<mml:mover accent="true">
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:math>
</alternatives>
</inline-formula>
of the point of observation
<italic>P</italic>
,
<italic>α</italic>
is the angle between
<inline-formula id="pone.0120025.e011">
<alternatives>
<graphic xlink:href="pone.0120025.e011.jpg" id="pone.0120025.e011g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M11">
<mml:mover accent="true">
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:math>
</alternatives>
</inline-formula>
and the direction of the object relative to
<italic>P</italic>
, and
<inline-formula id="pone.0120025.e012">
<alternatives>
<graphic xlink:href="pone.0120025.e012.jpg" id="pone.0120025.e012g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M12">
<mml:mover accent="true">
<mml:mi>θ</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:math>
</alternatives>
</inline-formula>
is the angular velocity at which the direction of
<italic>P</italic>
changes relative to the object, or equivalently (by symmetry) the angular velocity at which the direction of the object changes relative to
<italic>P</italic>
(i.e., so-called apparent motion of the target).</p>
<p>The above analysis indicates that scaled egocentric distance is also specified, in optic-inertial stimulation, to a perceiver moving in 2D. In contrast to
<inline-formula id="pone.0120025.e013">
<alternatives>
<graphic xlink:href="pone.0120025.e013.jpg" id="pone.0120025.e013g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M13">
<mml:mover accent="true">
<mml:mi>α</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:math>
</alternatives>
</inline-formula>
in the 1D model (
<xref rid="pone.0120025.e001" ref-type="disp-formula">Eq 1</xref>
; [
<xref rid="pone.0120025.ref011" ref-type="bibr">11</xref>
,
<xref rid="pone.0120025.ref014" ref-type="bibr">14</xref>
<xref rid="pone.0120025.ref016" ref-type="bibr">16</xref>
]), in the 2D model, the optical velocity parameter
<inline-formula id="pone.0120025.e014">
<alternatives>
<graphic xlink:href="pone.0120025.e014.jpg" id="pone.0120025.e014g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M14">
<mml:mover accent="true">
<mml:mi>θ</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:math>
</alternatives>
</inline-formula>
is no longer defined relative to the direction toward which the point of observation is moving.</p>
</sec>
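As a numerical illustration of Eq 4 (again a sketch under our own conventions, not the authors' code), the same recovery works for an arbitrary smooth 2D path in the plane of the object, at every instant at which the direction of P relative to the object is actually rotating:

import numpy as np

# Sketch of Eq 4 (illustrative): D = v * sin(alpha) / theta_dot for an
# arbitrary smooth 2D head path in the plane of the stationary object.
dt = 1e-4
t = np.arange(0.0, 3.0, dt)
obs = np.stack([0.4 * np.sin(2 * np.pi * 0.7 * t),
                0.3 * np.cos(2 * np.pi * 1.1 * t) - 2.0], axis=1)
obj = np.zeros(2)                          # stationary object O at the origin

vel = np.gradient(obs, dt, axis=0)         # head velocity (non-optical)
v = np.linalg.norm(vel, axis=1)
r = obs - obj
D_true = np.linalg.norm(r, axis=1)
i_hat = r / D_true[:, None]                # unit vector from O toward P

theta = np.unwrap(np.arctan2(i_hat[:, 1], i_hat[:, 0]))  # orientation of the OP line
theta_dot = np.gradient(theta, dt)         # angular velocity of that direction

# Signed angle from i_hat to the velocity (its sine is invariant to 2*pi jumps)
alpha = np.arctan2(vel[:, 1], vel[:, 0]) - np.arctan2(i_hat[:, 1], i_hat[:, 0])
D_est = v * np.sin(alpha) / theta_dot      # Eq 4

mask = np.abs(theta_dot) > 0.1             # skip near-singular instants (theta_dot ~ 0)
print(np.percentile(np.abs(D_est[mask] - D_true[mask]), 95))  # small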
<sec id="sec004">
<title>Higher-order specification of egocentric distance in the case of 3D head movements</title>
<p>To extend
<xref rid="pone.0120025.e009" ref-type="disp-formula">Eq 4</xref>
to the case of unrestrained three-dimensional movements, a first intuitive approach is to consider that the plane which contains
<inline-formula id="pone.0120025.e015">
<alternatives>
<graphic xlink:href="pone.0120025.e015.jpg" id="pone.0120025.e015g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M15">
<mml:mover accent="true">
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:math>
</alternatives>
</inline-formula>
and
<inline-formula id="pone.0120025.e016">
<alternatives>
<graphic xlink:href="pone.0120025.e016.jpg" id="pone.0120025.e016g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M16">
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mover accent="true">
<mml:mi>i</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mo>,</mml:mo>
<mml:mover accent="true">
<mml:mi>j</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</alternatives>
</inline-formula>
is itself rotating relative to the earth (about
<italic>O</italic>
) and, accordingly, to introduce two more angles (
<italic>φ</italic>
and
<italic>ψ</italic>
) to characterize the orientation of that plane relative to an earth-fixed reference frame (see
<xref rid="pone.0120025.s001" ref-type="supplementary-material">S1 Fig</xref>
and
<xref rid="pone.0120025.s005" ref-type="supplementary-material">S1 Text</xref>
for details):
<disp-formula id="pone.0120025.e017">
<alternatives>
<graphic xlink:href="pone.0120025.e017.jpg" id="pone.0120025.e017g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M17">
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
<mml:mi>sin</mml:mi>
<mml:mi>α</mml:mi>
</mml:mrow>
<mml:mrow>
<mml:mover accent="true">
<mml:mi>θ</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mo>+</mml:mo>
<mml:mover accent="true">
<mml:mi>ψ</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mi>cos</mml:mi>
<mml:mi>φ</mml:mi>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
</alternatives>
<label>(5)</label>
</disp-formula>
</p>
<p>Although intuitive for describing three-dimensional movements, the decomposition of optical motion into three rotations (
<italic>θ</italic>
,
<italic>φ</italic>
and
<italic>ψ</italic>
) about arbitrary axes is not necessarily relevant for a perceiver interacting with his/her environment. In addition, the description obtained within that framework also indicates that the three angular parameters are related (see
<xref rid="pone.0120025.s005" ref-type="supplementary-material">S1 Text</xref>
):
<disp-formula id="pone.0120025.e018">
<alternatives>
<graphic xlink:href="pone.0120025.e018.jpg" id="pone.0120025.e018g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M18">
<mml:mrow>
<mml:mtext></mml:mtext>
<mml:mo></mml:mo>
<mml:mover accent="true">
<mml:mi>ψ</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mi>cos</mml:mi>
<mml:mi>θ</mml:mi>
<mml:mtext></mml:mtext>
<mml:mi>sin</mml:mi>
<mml:mi>ϕ</mml:mi>
<mml:mo>+</mml:mo>
<mml:mover accent="true">
<mml:mi>ϕ</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mi>sin</mml:mi>
<mml:mi>θ</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>0</mml:mn>
<mml:mo>,</mml:mo>
</mml:mrow>
</mml:math>
</alternatives>
<label>(6)</label>
</disp-formula>
suggesting that a more compact description can be reached. A first solution, which allows us to use two rotations instead of three, is to use a spherical coordinate system in place of the Cartesian one (see
<xref rid="pone.0120025.s001" ref-type="supplementary-material">S1B Fig</xref>
and
<xref rid="pone.0120025.s005" ref-type="supplementary-material">S1 Text</xref>
for details):
<disp-formula id="pone.0120025.e019">
<alternatives>
<graphic xlink:href="pone.0120025.e019.jpg" id="pone.0120025.e019g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M19">
<mml:mrow>
<mml:mrow>
<mml:mo>|</mml:mo>
<mml:mi>D</mml:mi>
<mml:mo>|</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
<mml:mrow>
<mml:mo>|</mml:mo>
<mml:mrow>
<mml:mtext></mml:mtext>
<mml:mi>s</mml:mi>
<mml:mi>i</mml:mi>
<mml:mi>n</mml:mi>
<mml:mtext></mml:mtext>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>α</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mtext></mml:mtext>
</mml:mrow>
<mml:mo>|</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:msqrt>
<mml:mrow>
<mml:mtext></mml:mtext>
<mml:msup>
<mml:mover accent="true">
<mml:mi>φ</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mn>2</mml:mn>
</mml:msup>
<mml:mi>c</mml:mi>
<mml:mi>o</mml:mi>
<mml:msup>
<mml:mi>s</mml:mi>
<mml:mn>2</mml:mn>
</mml:msup>
<mml:mi>δ</mml:mi>
<mml:mo>+</mml:mo>
<mml:msup>
<mml:mover accent="true">
<mml:mi>δ</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:msqrt>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
</alternatives>
<label>(7)</label>
</disp-formula>
</p>
<p>However, as above, decomposing angular motion about two arbitrary axes is not satisfactory. An alternative solution, which allows for a more parsimonious description, can be obtained from the definition of the cross-product:
<disp-formula id="pone.0120025.e020">
<alternatives>
<graphic xlink:href="pone.0120025.e020.jpg" id="pone.0120025.e020g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M20">
<mml:mrow>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mrow>
<mml:mtext></mml:mtext>
<mml:mover accent="true">
<mml:mi>i</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mo></mml:mo>
<mml:mover accent="true">
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mtext></mml:mtext>
</mml:mrow>
<mml:mo></mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
<mml:mrow>
<mml:mo>|</mml:mo>
<mml:mrow>
<mml:mtext></mml:mtext>
<mml:mi>sin</mml:mi>
<mml:mtext></mml:mtext>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>α</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mtext></mml:mtext>
</mml:mrow>
<mml:mo>|</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</alternatives>
<label>(8)</label>
</disp-formula>
</p>
<p>From
<xref rid="pone.0120025.e007" ref-type="disp-formula">Eq 3</xref>
we also have,
<disp-formula id="pone.0120025.e021">
<alternatives>
<graphic xlink:href="pone.0120025.e021.jpg" id="pone.0120025.e021g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M21">
<mml:mrow>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mrow>
<mml:mtext></mml:mtext>
<mml:mover accent="true">
<mml:mi>i</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mo></mml:mo>
<mml:mover accent="true">
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mtext></mml:mtext>
</mml:mrow>
<mml:mo></mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mrow>
<mml:mtext></mml:mtext>
<mml:mover accent="true">
<mml:mi>i</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mo></mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mover accent="true">
<mml:mi>D</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mtext></mml:mtext>
<mml:mover accent="true">
<mml:mi>i</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mo>+</mml:mo>
<mml:mi>D</mml:mi>
<mml:mtext></mml:mtext>
<mml:mfrac>
<mml:mrow>
<mml:mi>d</mml:mi>
<mml:mtext></mml:mtext>
<mml:mover accent="true">
<mml:mi>i</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mi>d</mml:mi>
<mml:mi>t</mml:mi>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mtext></mml:mtext>
</mml:mrow>
<mml:mo></mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mrow>
<mml:mtext></mml:mtext>
<mml:mover accent="true">
<mml:mi>i</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mo></mml:mo>
<mml:mi>D</mml:mi>
<mml:mtext></mml:mtext>
<mml:mfrac>
<mml:mrow>
<mml:mi>d</mml:mi>
<mml:mtext></mml:mtext>
<mml:mover accent="true">
<mml:mi>i</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mi>d</mml:mi>
<mml:mi>t</mml:mi>
</mml:mrow>
</mml:mfrac>
<mml:mtext></mml:mtext>
</mml:mrow>
<mml:mo></mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</alternatives>
<label>(9)</label>
</disp-formula>
</p>
<p>By combining Eqs
<xref rid="pone.0120025.e020" ref-type="disp-formula">8</xref>
and
<xref rid="pone.0120025.e021" ref-type="disp-formula">9</xref>
we can thus express the distance between a perceiver performing any 3D movement and a stationary object:
<disp-formula id="pone.0120025.e022">
<alternatives>
<graphic xlink:href="pone.0120025.e022.jpg" id="pone.0120025.e022g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M22">
<mml:mrow>
<mml:mrow>
<mml:mo>|</mml:mo>
<mml:mi>D</mml:mi>
<mml:mo>|</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
<mml:mrow>
<mml:mo>|</mml:mo>
<mml:mrow>
<mml:mtext></mml:mtext>
<mml:mi>sin</mml:mi>
<mml:mtext></mml:mtext>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>α</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mtext></mml:mtext>
</mml:mrow>
<mml:mo>|</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mi>Q</mml:mi>
</mml:mfrac>
</mml:mrow>
</mml:math>
</alternatives>
<label>(10)</label>
</disp-formula>
where
<italic>v</italic>
is the norm of the velocity of the point of observation
<italic>P</italic>
,
<italic>α</italic>
is the angle between the direction of movement and the direction of the object relative to
<italic>P</italic>
, and
<italic>Q</italic>
is the norm of a rotational vector
<inline-formula id="pone.0120025.e023">
<alternatives>
<graphic xlink:href="pone.0120025.e023.jpg" id="pone.0120025.e023g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M23">
<mml:mover accent="true">
<mml:mi>Ω</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:math>
</alternatives>
</inline-formula>
, characterizing the change in the direction of
<italic>P</italic>
relative to the object (or, symmetrically, of the object relative to
<italic>P</italic>
):
<disp-formula id="pone.0120025.e024">
<alternatives>
<graphic xlink:href="pone.0120025.e024.jpg" id="pone.0120025.e024g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M24">
<mml:mrow>
<mml:mi>Q</mml:mi>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mover accent="true">
<mml:mi>Ω</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mo></mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mrow>
<mml:mtext></mml:mtext>
<mml:mover accent="true">
<mml:mi>i</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mo></mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi>d</mml:mi>
<mml:mover accent="true">
<mml:mi>i</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:mrow>
<mml:mrow>
<mml:mi>d</mml:mi>
<mml:mi>t</mml:mi>
</mml:mrow>
</mml:mfrac>
<mml:mtext></mml:mtext>
</mml:mrow>
<mml:mo></mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</alternatives>
</disp-formula>
with
<inline-formula id="pone.0120025.e025">
<alternatives>
<graphic xlink:href="pone.0120025.e025.jpg" id="pone.0120025.e025g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M25">
<mml:mover accent="true">
<mml:mi>i</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:math>
</alternatives>
</inline-formula>
the unit vector of a mobile base, pointing from the object toward the point of observation (see
<xref rid="pone.0120025.g001" ref-type="fig">Fig 1</xref>
). Like \(\dot{\theta}\)
in the 2D model (
<xref rid="pone.0120025.e009" ref-type="disp-formula">Eq 4</xref>
), but unlike \(\dot{\alpha}\)
in the 1D model (
<xref rid="pone.0120025.e001" ref-type="disp-formula">Eq 1</xref>
; [
<xref rid="pone.0120025.ref011" ref-type="bibr">11</xref>
,
<xref rid="pone.0120025.ref014" ref-type="bibr">14</xref>
<xref rid="pone.0120025.ref016" ref-type="bibr">16</xref>
]),
<italic>Q</italic>
is not defined relative to the direction toward which the point of observation
<italic>P</italic>
is moving; rather it is the rate of change of the direction of
<italic>P</italic>
relative to the object (or symmetrically of the object relative to
<italic>P</italic>
). According to
<xref rid="pone.0120025.e022" ref-type="disp-formula">Eq 10</xref>
the distance
<italic>D</italic>
is a relational property of optical (
<italic>α</italic>
,
<italic>Q</italic>
) and non-optical (\(\vec{v}\)
) dimensions of the stimulation (
<italic>Q</italic>
can also be viewed as the speed at which the eye/head must counter-rotate to keep the object at the same location within the field of view). Eqs
<xref rid="pone.0120025.e001" ref-type="disp-formula">1</xref>
,
<xref rid="pone.0120025.e009" ref-type="disp-formula">4</xref>
, and
<xref rid="pone.0120025.e022" ref-type="disp-formula">10</xref>
can also be differentiated to express distance as a function of higher-order derivatives (acceleration, jerk, etc.). For example, the first two time derivatives of
<xref rid="pone.0120025.e022" ref-type="disp-formula">Eq 10</xref>
yield
<disp-formula id="pone.0120025.e029">
<alternatives>
<graphic xlink:href="pone.0120025.e029.jpg" id="pone.0120025.e029g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M29">
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mover accent="true">
<mml:mi>v</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mi>sin</mml:mi>
<mml:mi>α</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>v</mml:mi>
<mml:mi>cos</mml:mi>
<mml:mi>α</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mover accent="true">
<mml:mi>α</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mo></mml:mo>
<mml:mi>Q</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mover accent="true">
<mml:mi>Q</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mfrac>
</mml:mrow>
</mml:math>
</alternatives>
<label>(11)</label>
</disp-formula>
and
<disp-formula id="pone.0120025.e030">
<alternatives>
<graphic xlink:href="pone.0120025.e030.jpg" id="pone.0120025.e030g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M30">
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mover accent="true">
<mml:mi>v</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mi>sin</mml:mi>
<mml:mi>α</mml:mi>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mover accent="true">
<mml:mi>v</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mtext></mml:mtext>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mn>2</mml:mn>
<mml:mover accent="true">
<mml:mi>α</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mo></mml:mo>
<mml:mi>Q</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:mi>v</mml:mi>
<mml:mtext></mml:mtext>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mover accent="true">
<mml:mi>α</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mo></mml:mo>
<mml:mn>2</mml:mn>
<mml:mover accent="true">
<mml:mi>Q</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mi>cos</mml:mi>
<mml:mi>α</mml:mi>
<mml:mo></mml:mo>
<mml:mi>v</mml:mi>
<mml:mover accent="true">
<mml:mi>α</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mover accent="true">
<mml:mi>α</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mo>+</mml:mo>
<mml:mi>Q</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mi>sin</mml:mi>
<mml:mi>α</mml:mi>
</mml:mrow>
<mml:mover accent="true">
<mml:mi>Q</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
</mml:mfrac>
</mml:mrow>
</mml:math>
</alternatives>
<label>(12)</label>
</disp-formula>
</p>
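<p>To make Eq 10 concrete, the following minimal sketch (a synthetic trajectory with toy parameters, not our experimental pipeline; names and thresholds are illustrative assumptions) recovers the egocentric distance of a stationary target from sampled head kinematics:</p>
<preformat>
import numpy as np

dt = 0.001
t = np.arange(0.0, 2.0, dt)

# Synthetic 3D exploratory head trajectory (meters); the target sits at the origin.
P = np.stack([0.80 + 0.05 * np.sin(3 * t),
              0.10 * np.sin(2 * t),
              0.02 * np.cos(5 * t)], axis=1)

V = np.gradient(P, dt, axis=0)                        # head velocity (inertial)
v = np.linalg.norm(V, axis=1)                         # its norm, v

i_hat = P / np.linalg.norm(P, axis=1, keepdims=True)  # unit vector object -> P
di_dt = np.gradient(i_hat, dt, axis=0)
Q = np.linalg.norm(np.cross(i_hat, di_dt), axis=1)    # optical rotation rate Q

sin_alpha = np.linalg.norm(np.cross(V, i_hat), axis=1) / np.maximum(v, 1e-12)

D_est = v * sin_alpha / np.maximum(Q, 1e-12)          # Eq 10
D_true = np.linalg.norm(P, axis=1)

# Eq 10 is undefined where the head momentarily moves straight toward the
# target (Q ~ 0); mask those samples before comparing.
valid = Q > 1e-3
print(np.max(np.abs(D_est[valid] - D_true[valid])))   # ~0, up to discretization
</preformat>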
<p>Interestingly, Eqs
<xref rid="pone.0120025.e022" ref-type="disp-formula">10</xref>
<xref rid="pone.0120025.e030" ref-type="disp-formula">12</xref>
assume simpler forms for particular trajectory shapes. For example, if the perceiver’s movement is rectilinear or planar, then
<xref rid="pone.0120025.e022" ref-type="disp-formula">Eq 10</xref>
simplifies into
<xref rid="pone.0120025.e001" ref-type="disp-formula">Eq 1</xref>
and
<xref rid="pone.0120025.e009" ref-type="disp-formula">Eq 4</xref>
, respectively. The formal description also simplifies when the perceiver’s movement is orthogonal to the direction of the target (i.e., when movement is on a sphere centered on the target). Within optic flow, this type of movement generates translatory motion and perspective changes but no looming or receding motion. In that case,
<italic>α</italic>
= 90°, sin(
<italic>α</italic>
) = 1 and
<xref rid="pone.0120025.e022" ref-type="disp-formula">Eq 10</xref>
simplifies into:
<disp-formula id="pone.0120025.e031">
<alternatives>
<graphic xlink:href="pone.0120025.e031.jpg" id="pone.0120025.e031g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M31">
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mi>v</mml:mi>
<mml:mi>Q</mml:mi>
</mml:mfrac>
</mml:mrow>
</mml:math>
</alternatives>
<label>(13)</label>
</disp-formula>
and its first two time derivatives are
<disp-formula id="pone.0120025.e032">
<alternatives>
<graphic xlink:href="pone.0120025.e032.jpg" id="pone.0120025.e032g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M32">
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mover accent="true">
<mml:mi>v</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mover accent="true">
<mml:mi>Q</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
</mml:mfrac>
</mml:mrow>
</mml:math>
</alternatives>
<label>(14)</label>
</disp-formula>
<disp-formula id="pone.0120025.e033">
<alternatives>
<graphic xlink:href="pone.0120025.e033.jpg" id="pone.0120025.e033g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M33">
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mover accent="true">
<mml:mi>v</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
<mml:mover accent="true">
<mml:mi>Q</mml:mi>
<mml:mo>¨</mml:mo>
</mml:mover>
</mml:mfrac>
</mml:mrow>
</mml:math>
</alternatives>
<label>(15)</label>
</disp-formula>
</p>
<p>Importantly, these simplifications do not necessitate that the whole trajectory be rectilinear or orthogonal to the target, but only that it be at least locally (and approximately) rectilinear, planar, or orthogonal (e.g., tangential). For instance,
<xref rid="pone.0120025.e031" ref-type="disp-formula">Eq 13</xref>
provides a consistent approximation of the actual distance of the object even when the direction of head movement (relative to the target) deviates from pure orthogonality: if the deviation is ±15°, the specified distance will overestimate the actual distance by only 3.4% (i.e., 3.4 cm if the object is at 1 m, 6.8 cm if it is 2 m away, etc.). Whereas 2D movement yielded ambiguity, our analysis shows that 3D movement yields specification in an emergent, higher-order pattern. Thus, our analysis reveals that, in principle, movement of the observer generates patterns in ambient energy that are sufficient for accurate, non-inferential perception of egocentric distance.</p>
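<p>The size of this overestimation can be checked directly from Eq 10: using the orthogonal-movement estimate \(\hat{D} = v/Q\) when the true angle deviates from orthogonality by 15° gives</p>
\[ \hat{D} = \frac{v}{Q} = \frac{D}{\sin\alpha}, \qquad \alpha = 90^\circ \pm 15^\circ \;\Rightarrow\; \frac{\hat{D}}{D} = \frac{1}{\sin 75^\circ} \approx 1.035, \]
<p>i.e., an overestimation of roughly 3–4%, consistent with the figure quoted above.</p>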
</sec>
<sec id="sec005">
<title>Exploratory activity and the pickup of information</title>
<p>In Eqs
<xref rid="pone.0120025.e001" ref-type="disp-formula">1</xref>
,
<xref rid="pone.0120025.e009" ref-type="disp-formula">4</xref>
<xref rid="pone.0120025.e017" ref-type="disp-formula">5</xref>
,
<xref rid="pone.0120025.e019" ref-type="disp-formula">7</xref>
,
<xref rid="pone.0120025.e022" ref-type="disp-formula">10</xref>
<xref rid="pone.0120025.e030" ref-type="disp-formula">12</xref>
,
<xref rid="pone.0120025.e031" ref-type="disp-formula">13</xref>
<xref rid="pone.0120025.e033" ref-type="disp-formula">15</xref>
,
<italic>v</italic>
and its derivatives influence the stimulation of the vestibular and kinesthetic systems, while
<italic>α</italic>
,
<italic>θ</italic>
,
<italic>Q</italic>
and their derivatives influence the stimulation of the visual system. Determinate information about egocentric distance is available only in relations between the stimulation available to these perceptual systems. That is, the information is an emergent property that does not exist in the stimulation available to any individual perceptual system. The equations do not impose any particular metric. The unit in which they specify egocentric distance depends on the unit used to describe head kinematics (e.g., conventional or intrinsic).</p>
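<p>As a dimensional check (assuming conventional SI units for head kinematics): with \(v\) in m/s and \(Q\) in rad/s, Eq 10 returns distance in meters, since both \(\sin\alpha\) and radians are dimensionless:</p>
\[ [D] = \frac{[v]\,\left|\sin\alpha\right|}{[Q]} = \frac{\mathrm{m\,s^{-1}}}{\mathrm{rad\,s^{-1}}} = \mathrm{m}. \]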
<p>If the perceiver is stationary relative to the illuminated environment, the gravito-inertial environment or both, then Eqs
<xref rid="pone.0120025.e001" ref-type="disp-formula">1</xref>
,
<xref rid="pone.0120025.e001" ref-type="disp-formula">1</xref>
,
<xref rid="pone.0120025.e009" ref-type="disp-formula">4</xref>
<xref rid="pone.0120025.e017" ref-type="disp-formula">5</xref>
,
<xref rid="pone.0120025.e019" ref-type="disp-formula">7</xref>
,
<xref rid="pone.0120025.e022" ref-type="disp-formula">10</xref>
<xref rid="pone.0120025.e030" ref-type="disp-formula">12</xref>
,
<xref rid="pone.0120025.e031" ref-type="disp-formula">13</xref>
<xref rid="pone.0120025.e033" ref-type="disp-formula">15</xref>
are undefined or ineffective. Therefore, specification of egocentric distance requires not only optic flow, but also movements of the head and/or body relative to the gravito-inertial environment. The differences among equations further underline that the form of the intermodal pattern specifying distance depends on the characteristics of the movement performed by the perceiver. As a consequence, the perceptual skills required to perceive distance could vary as a function of the exploratory activity of the perceiver. For example, when the point of observation is moving at constant speed (i.e., linear acceleration
<inline-formula id="pone.0120025.e034">
<alternatives>
<graphic xlink:href="pone.0120025.e034.jpg" id="pone.0120025.e034g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M34">
<mml:mrow>
<mml:mover accent="true">
<mml:mi>v</mml:mi>
<mml:mo>˙</mml:mo>
</mml:mover>
<mml:mo>=</mml:mo>
<mml:mn>0</mml:mn>
</mml:mrow>
</mml:math>
</alternatives>
</inline-formula>
), some equations simplify (e.g., Eqs
<xref rid="pone.0120025.e029" ref-type="disp-formula">11</xref>
<xref rid="pone.0120025.e030" ref-type="disp-formula">12</xref>
, dedicated to any 3D movements) while some others are ineffective (e.g.,
<xref rid="pone.0120025.e032" ref-type="disp-formula">Eq 14</xref>
, dedicated to movements orthogonal to the direction of the target). Hence, if a perceiver is not sensitive to linear acceleration, then moving at constant speed can allow him/her to exploit Eqs
<xref rid="pone.0120025.e029" ref-type="disp-formula">11</xref>
<xref rid="pone.0120025.e030" ref-type="disp-formula">12</xref>
to perceive distance. On the other hand, if the perceiver is sensitive to linear acceleration, then moving at constant speed would not change his/her ability to exploit the Eqs
<xref rid="pone.0120025.e029" ref-type="disp-formula">11</xref>
<xref rid="pone.0120025.e030" ref-type="disp-formula">12</xref>
, but would prevent him/her from exploiting
<xref rid="pone.0120025.e032" ref-type="disp-formula">Eq 14</xref>
. In a similar way, if the perceiver’s movement is roughly orthogonal to the direction of the target, then the description of available information about distance simplifies into Eqs
<xref rid="pone.0120025.e031" ref-type="disp-formula">13</xref>
<xref rid="pone.0120025.e033" ref-type="disp-formula">15</xref>
. In these equations, egocentric distance is no longer a function of parameter
<italic>α</italic>
and its derivatives. In other words, with such movements, distance perception no longer depends on the perceiver’s sensitivity to the direction of the target relative to his/her direction of movement.</p>
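<p>A minimal numerical sketch of this point (toy values, not experimental data): on a target-centered sphere traversed at constant speed, Eq 13 still specifies the distance, whereas Eq 14 degenerates into an undefined 0/0 ratio:</p>
<preformat>
import numpy as np

# Toy values: movement on a target-centered sphere (alpha = 90 deg) at constant speed.
D_true, v = 0.8, 0.3        # meters and meters/second (hypothetical)
Q = v / D_true              # Eq 13 holds: D = v / Q
v_dot, Q_dot = 0.0, 0.0     # constant speed on the sphere: both rates vanish

print(v / Q)                                         # 0.8 -> Eq 13 specifies D
with np.errstate(invalid="ignore"):
    print(np.float64(v_dot) / np.float64(Q_dot))     # nan -> Eq 14 is ineffective
</preformat>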
<p>Outside the laboratory, situations in which an actor is completely static or moving at constant velocity are rare. However, we can predict a failure to perceive scaled egocentric distance if the actor’s movement, and the resulting apparent optical motion of the object, are too slow (or have too little acceleration, jerk, etc.) relative to the object’s distance. More generally, the formal analysis suggests that particular movements could ease (or impede) information pickup.</p>
<p>Existing experimental work on the perception of egocentric distance has systematically restricted participants to rectilinear movements. The majority of these studies have evaluated the role of head translations in two directions: either toward the target, e.g., [
<xref rid="pone.0120025.ref013" ref-type="bibr">13</xref>
,
<xref rid="pone.0120025.ref019" ref-type="bibr">19</xref>
<xref rid="pone.0120025.ref022" ref-type="bibr">22</xref>
], or in the orthogonal direction (laterally) [
<xref rid="pone.0120025.ref012" ref-type="bibr">12</xref>
,
<xref rid="pone.0120025.ref018" ref-type="bibr">18</xref>
,
<xref rid="pone.0120025.ref022" ref-type="bibr">22</xref>
<xref rid="pone.0120025.ref024" ref-type="bibr">24</xref>
]. Other studies have examined slightly diagonal translation [
<xref rid="pone.0120025.ref025" ref-type="bibr">25</xref>
] or rotation of the head about its longitudinal axis [
<xref rid="pone.0120025.ref026" ref-type="bibr">26</xref>
]. In many cases, experimenters have instructed participants to perform “regular, repetitive”, “oscillatory” or “rhythmic” movements, e.g., [
<xref rid="pone.0120025.ref019" ref-type="bibr">19</xref>
,
<xref rid="pone.0120025.ref022" ref-type="bibr">22</xref>
,
<xref rid="pone.0120025.ref026" ref-type="bibr">26</xref>
], and have restricted exploratory movements to specific amplitudes and frequencies, e.g., [
<xref rid="pone.0120025.ref012" ref-type="bibr">12</xref>
,
<xref rid="pone.0120025.ref013" ref-type="bibr">13</xref>
,
<xref rid="pone.0120025.ref018" ref-type="bibr">18</xref>
,
<xref rid="pone.0120025.ref023" ref-type="bibr">23</xref>
]. These restrictions on movement may have been convenient analytically, but are difficult to justify in terms of natural behavior outside the laboratory (except when the point of observation moves together with an aircraft, as in [
<xref rid="pone.0120025.ref014" ref-type="bibr">14</xref>
]). Moreover, as we underlined, artificial restrictions on exploratory behavior necessarily constrain the nature and availability of emergent intermodal information about egocentric distance. It is worth noting that
<xref rid="pone.0120025.e001" ref-type="disp-formula">Eq 1</xref>
, which pertains to rectilinear movements, applies identically to rectilinear movements in any direction, whether along the line of sight, perpendicular to it, or any intermediate direction.</p>
<p>In the experiment reported below, we allowed participants to move freely, provided only that they remained seated. Permitting free movement made it possible for us (i) to evaluate humans’ ability to perceive egocentric distance when the form in which information is available is not artificially restricted by the experimenter and (ii) to analyze participants’ self-selected exploratory activity.</p>
</sec>
<sec id="sec006">
<title>Experiment</title>
<p>We formalized above an intermodal invariant,
<italic>Ii</italic>
, specifying the egocentric distance of a static object for a moving perceiver. The fact that information is available in the structure of ambient energy does not imply that it is actually detected. To evaluate whether
<italic>Ii</italic>
can be picked up and used by humans, we conducted the following experiment. Seated participants were asked to judge (yes/no) whether a visible object was within reach. We manipulated the relation between optics and haptics/inertia that was available to participants during the judgment task. We used a virtual environment system in which an optical display could be updated in real time on the basis of data about displacement of the head in the gravito-inertial space. In addition to manipulating
<italic>Ii</italic>
, the virtual setup allowed us to control all other potential sources of information about egocentric distance.
<xref rid="pone.0120025.g002" ref-type="fig">Fig 2</xref>
illustrates the experimental setup (A) and the experimental conditions (B), and shows a screen capture of the participant’s view during the experiment (C). We hypothesized that egocentric distance would be perceived accurately when the natural congruence between sources was preserved, that is, when Eqs
<xref rid="pone.0120025.e001" ref-type="disp-formula">1</xref>
,
<xref rid="pone.0120025.e001" ref-type="disp-formula">1</xref>
,
<xref rid="pone.0120025.e009" ref-type="disp-formula">4</xref>
<xref rid="pone.0120025.e017" ref-type="disp-formula">5</xref>
,
<xref rid="pone.0120025.e019" ref-type="disp-formula">7</xref>
,
<xref rid="pone.0120025.e022" ref-type="disp-formula">10</xref>
<xref rid="pone.0120025.e030" ref-type="disp-formula">12</xref>
,
<xref rid="pone.0120025.e031" ref-type="disp-formula">13</xref>
<xref rid="pone.0120025.e033" ref-type="disp-formula">15</xref>
were defined. Hereafter, this intermodal condition is called the Movement condition (
<xref rid="pone.0120025.g002" ref-type="fig">Fig 2B1</xref>
) because participants were allowed to freely explore the scene by moving their head relative to the virtual object prior to giving their judgment. We also hypothesized that performance in the Movement condition would contrast with the performance in two control conditions in which optic flow was presented alone (i.e., in the absence of coordinated inertial stimulation). In the Stationary condition, participants were stationary while looking at a static display of the object (
<xref rid="pone.0120025.g002" ref-type="fig">Fig 2B2</xref>
). In the Playback condition, participants also remained still, but the display of the object was driven by their own movements recorded during earlier (Movement) trials (
<xref rid="pone.0120025.g002" ref-type="fig">Fig 2B3</xref>
).</p>
<fig id="pone.0120025.g002" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0120025.g002</object-id>
<label>Fig 2</label>
<caption>
<title>Experimental design.</title>
<p>(A) Setup. The participant’s head position and orientation were sampled and used in real time to drive the display of an HMD so as to depict a stationary virtual target. (B) Experimental conditions (see text for details). (C) Screen capture. The virtual target was the only visible element on the HMD screen.</p>
</caption>
<graphic xlink:href="pone.0120025.g002"></graphic>
</fig>
<p>Participants were presented with targets at 17 different distances. To control for potential range effects, e.g., [
<xref rid="pone.0120025.ref027" ref-type="bibr">27</xref>
,
<xref rid="pone.0120025.ref028" ref-type="bibr">28</xref>
], participants were randomly split into two groups. For the Near group, target distances ranged from 23 to 135% of the actual maximum reachable distance (MR
<sub>A</sub>
, measured prior to the experiment for each participant, see
<xref rid="sec017" ref-type="sec">Material and Methods</xref>
section). For the Far group, distances ranged from 72 to 184%. Our primary dependent variables were the accuracy and precision of the judgments about whether stimulus objects were within reach. These were derived from the psychometric curves fitted to the data and were respectively indicators of the constant and variable error in judgments. We also analyzed exploratory head movements in different conditions as an indicator of whether and how participants exploited Eqs
<xref rid="pone.0120025.e001" ref-type="disp-formula">1</xref>
,
<xref rid="pone.0120025.e009" ref-type="disp-formula">4</xref>
,
<xref rid="pone.0120025.e022" ref-type="disp-formula">10</xref>
<xref rid="pone.0120025.e030" ref-type="disp-formula">12</xref>
,
<xref rid="pone.0120025.e031" ref-type="disp-formula">13</xref>
<xref rid="pone.0120025.e033" ref-type="disp-formula">15</xref>
.</p>
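<p>As an illustration of how such psychometric parameters can be derived, here is a minimal sketch (simulated judgments from a hypothetical observer, not our data) that fits a logistic curve and extracts the slope (precision) and the 50% point, i.e., the perceived maximum reachable distance MR<sub>P</sub> (accuracy):</p>
<preformat>
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated observer (hypothetical parameters, illustration only):
d = np.tile(np.linspace(0.23, 1.35, 17), 5)      # distances, proportion of MR_A
p_yes = 1.0 / (1.0 + np.exp(4.0 * (d - 1.0)))    # true "reachable" probability
y = rng.binomial(1, p_yes)                       # yes/no judgments

# Logistic psychometric curve: P(yes) = logit^-1(b0 + b1 * d)
fit = sm.GLM(y, sm.add_constant(d), family=sm.families.Binomial()).fit()
b0, b1 = fit.params
print(f"slope (precision): {b1:.2f}")
print(f"MR_P, 50% point (accuracy): {-b0 / b1:.2f} x MR_A")
</preformat>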
</sec>
</sec>
<sec sec-type="results" id="sec007">
<title>Results</title>
<sec id="sec008">
<title>Reachability judgments</title>
<p>The deviance test, assessing goodness-of-fit, showed that all the global regressions were good summaries of the corresponding group data, thus ensuring that the derived parameters were relevant. The global regression curves for all experimental conditions (Movement, Stationary and Playback) are plotted in
<xref rid="pone.0120025.g003" ref-type="fig">Fig 3A</xref>
, showing the proportion of “yes” judgments as a function of increasing simulated distance. Individual regressions were used to compare performance across conditions. Individual fits that failed the deviance test were not included in the analyses. In the Stationary condition, the frequent absence of consistency in judgments yielded only 7 reliable individual fits. Means and standard deviations of slopes and perceived maximum reachable distance (MR
<sub>P</sub>
) for significant individual fits are given in
<xref rid="pone.0120025.s003" ref-type="supplementary-material">S1 Table</xref>
. Because of the non-normal distribution of slopes and absolute errors from individual regression across conditions, we compared them with Wilcoxon non-parametric tests.</p>
<fig id="pone.0120025.g003" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0120025.g003</object-id>
<label>Fig 3</label>
<caption>
<title>Reachability judgments.</title>
<p>(A) Global regression curves (lines) and means (symbols) for both groups of participants in Movement (blue), Stationary (violet) and Playback conditions (green). The percentage of positive responses is plotted as a function of target distance (expressed in proportion of the actual maximum reachable distance, MR
<sub>A</sub>
). (B) Perceived maximum reachable distance MR
<sub>P</sub>
for each group (Near and Far, respectively dark and light blue dots) in the Movement condition. The vertical grey bars mark the median of the set of tested distances for each group of participant (the range of tested distances is indicated by the grey brackets and dotted lines). The vertical orange bar represents MR
<sub>A</sub>
for both groups. In each group, MR
<sub>P</sub>
is biased toward the median (black arrows) but simultaneously attracted toward MR
<sub>A</sub>
, as expected (brown arrows).</p>
</caption>
<graphic xlink:href="pone.0120025.g003"></graphic>
</fig>
<p>In the Movement condition, the slopes obtained from the global regression were -1.76 and -1.33, respectively for the Near and Far groups (
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= -2.15, -1.45 and
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= -1.68, -1.09). This meant that when the target’s distance increased from -20% to +20% of MR
<sub>A</sub>
(i.e., approximately -12 cm to +12 cm), reachability judgments in the Movement condition dropped from 78% to 26% “yes” (i.e., reachable). By contrast, these slopes were 4 times shallower (-0.45 and -0.34) in the Playback condition (
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= -0.60, -0.26 and
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= -0.47, -0.19, for the Near and Far groups), and more than 60 times shallower (-0.03 and -0.01) in the Stationary condition (
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= -0.17, 0.00 and
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= -0.15, 0.00). Wilcoxon tests confirmed that the precision of judgments was significantly higher in the Movement condition than in the two other conditions: Movement vs. Stationary (
<italic>Z</italic>
(
<italic>N</italic>
= 7) = 2.20, p < .05,
<italic>d</italic>
<sub>
<italic>w</italic>
</sub>
= -0.71), and Movement vs. Playback (
<italic>Z</italic>
(
<italic>N</italic>
= 13) = 3.18, p < .005,
<italic>d</italic>
<sub>
<italic>w</italic>
</sub>
= -1.00). Conversely, the difference between Stationary and Playback was not significant (
<italic>Z</italic>
(
<italic>N</italic>
= 7) = 1.18, ns,
<italic>d</italic>
<sub>
<italic>w</italic>
</sub>
= -0.71). Altogether, these results indicate that reachability judgments in the Movement condition were far more precise than those in the Playback and Stationary conditions.</p>
<p>In the Movement condition, participants from the Near group underestimated their actual reaching capabilities, while participants in the Far group overestimated theirs. Each shift was in the direction of the middle of the set of tested distances (
<xref rid="pone.0120025.g003" ref-type="fig">Fig 3B</xref>
, black arrows), suggesting a classic centering bias, that is, a bias toward the middle of the testing interval, e.g., [
<xref rid="pone.0120025.ref028" ref-type="bibr">28</xref>
]. Despite the absence of prior training or knowledge, participants appear to have calibrated their judgments relative to the sets of distances, as if they were roughly balancing the number of their positive and negative judgments. Nevertheless, the perceived maximum reachable distance MR
<sub>P</sub>
was attracted by the actual maximum reachable distance MR
<sub>A</sub>
(
<xref rid="pone.0120025.g003" ref-type="fig">Fig 3B</xref>
, brown arrows). When averaged over participants, the absolute error of judgments in the Movement condition was 20.3% of MR
<sub>A</sub>
(
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= 11.4, 29.1), which corresponded to 12.5 cm. By contrast, the mean absolute errors in the Stationary and Playback conditions (114.6% and 111.1%, respectively) were more than 5 times larger (
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= 51.7, 177.6 and
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= 0.0, 234.5). This difference between Movement and the two other conditions was confirmed by Wilcoxon tests conducted on individual absolute error values,
<italic>Z</italic>
(
<italic>N</italic>
= 7) = 2.37,
<italic>p</italic>
< .05,
<italic>d</italic>
<sub>
<italic>w</italic>
</sub>
= 1.00 and
<italic>Z</italic>
(
<italic>N</italic>
= 13) = 2.27,
<italic>p</italic>
< .05,
<italic>d</italic>
<sub>
<italic>w</italic>
</sub>
= 0.38, respectively for comparison with Stationary and Playback. The difference between Stationary and Playback was not significant (
<italic>Z</italic>
(
<italic>N</italic>
= 7) = 0.68,
<italic>ns</italic>
,
<italic>d</italic>
<sub>
<italic>w</italic>
</sub>
= -0.43). As predicted, judgments made when
<italic>Ii</italic>
was available (Movement condition) were more accurate than when it was not (Stationary and Playback conditions).</p>
</sec>
<sec id="sec009">
<title>Confidence ratings</title>
<p>In addition to judging whether the target was reachable, participants rated how confident they were in their judgments, from 1 (low) to 5 (high). Following previous studies, e.g., [
<xref rid="pone.0120025.ref029" ref-type="bibr">29</xref>
], we hypothesized that confidence ratings would offer a converging indicator of the participants’ ability to perceive whether the targets were within reach or not. We expected that, when distance information was available (i.e., in the Movement condition), participants would be more confident about their judgments when the target was very near or very far (unambiguous situation) than when the target was close to the reachable/not reachable boundary (ambiguous situation). Conversely, we anticipated that no such trend would emerge in the two other conditions. This U-shape hypothesis can also be understood from a dynamical systems perspective [
<xref rid="pone.0120025.ref030" ref-type="bibr">30</xref>
]: in that case, the actual maximum reachable distance corresponds to a transition point between two stable regimes, which represent two different action modes (e.g., reaching with the arm only vs. reaching with the arm plus leaning the torso forward). From that point of view, the lower confidence of participants would result from the greater susceptibility to noise and the increased variability of the order parameter that such nonlinear systems exhibit at regime boundaries.</p>
<p>The data are summarized in
<xref rid="pone.0120025.g004" ref-type="fig">Fig 4</xref>
. As expected, when plotted as a function of target distance, ratings exhibited U-shaped curves in the Movement condition (with an additional slight asymmetry between near and far judgments), whereas the curves were flat in the Stationary condition and exhibited a constant decrease in the Playback condition. To quantify these trends, we fitted 2
<sup>nd</sup>
order polynomials to the rating curves for each group and in each condition (full equations,
<italic>R
<sup>2</sup>
</italic>
and parameters statistics are given in
<xref rid="pone.0120025.s004" ref-type="supplementary-material">S2 Table</xref>
). The parameters associated with the
<italic>x
<sup>2</sup>
</italic>
terms (quantifying the ‘openness of the U’) were 3.18 and 2.15 for the Near and Far groups in the Movement condition, but did not exceed 0.51 in the two other conditions. Among the six regressions, these parameters were significant only for the Movement condition.</p>
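<p>A minimal sketch of this quantification (hypothetical ratings, not our data): the ‘openness of the U’ is simply the quadratic coefficient of a 2<sup>nd</sup>-order polynomial fitted to the rating curve:</p>
<preformat>
import numpy as np

# Hypothetical U-shaped confidence profile (illustration only):
x = np.linspace(0.23, 1.35, 17)     # target distance, proportion of MR_A
r = 3.5 + 2.5 * (x - 1.0) ** 2      # mean rating, lowest near the boundary

c2, c1, c0 = np.polyfit(x, r, 2)    # 2nd-order polynomial fit
print(f"x^2 coefficient ('openness of the U'): {c2:.2f}")
</preformat>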
<fig id="pone.0120025.g004" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0120025.g004</object-id>
<label>Fig 4</label>
<caption>
<title>Mean confidence ratings expressed as a function of target distance.</title>
<p>The rating scale ranged from 1 (lowest) to 5 (highest). (A) Near Group. (B) Far Group. Distance on abscissa axis is expressed in proportion of the actual maximum reachable distance, MR
<sub>A</sub>
. Dotted lines represent second order polynomial regressions (see text and
<xref rid="pone.0120025.s004" ref-type="supplementary-material">S2 Table</xref>
for details).</p>
</caption>
<graphic xlink:href="pone.0120025.g004"></graphic>
</fig>
</sec>
<sec id="sec010">
<title>Exploratory movements</title>
<p>We also analyzed the exploratory movements that participants used to create information about distance in the Movement condition. The mean amplitude of head displacement along the principal axis of movement was 1.7 cm (
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= 1.6, 1.9) in the Playback condition, and 1.4 cm (
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= 1.3, 1.5) in the Stationary condition, as compared with 26.5 cm (
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= 25.9, 27.1) in the Movement condition. For this reason, and because in the two conditions without movement the display was not driven by participants’ movement, we limited our analysis of exploratory movement to the Movement condition.</p>
<p>As we have seen, formal descriptions of
<italic>Ii</italic>
(e.g., Eqs 1, 4, 10–12, 13–15) depend on the characteristics of the perceiver’s movement (e.g., 1D, 2D or 3D; orthogonal to the direction of the target). Previous empirical investigations systematically restricted exploratory behavior in terms of dimension (1D) and direction (lateral or toward the target). As our participants were not constrained to any specific trajectory, we first wondered whether they would spontaneously use 1D displacements and, if so, whether these movements would be oriented toward either of the two directions usually tested in the literature (lateral or toward the target). To that end, we analyzed the distribution of instantaneous directions of movement during each trial. The eigenvalues of the orientation matrix provide a measure of the variance in movement direction explained by each of the corresponding eigenvectors. When averaged over participants and trials, the three eigenvalues were respectively 68.1% (
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= 67.6, 68.6), 23.6% (
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= 23.1, 24.1) and 8.3% (
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= 8.2, 8.5). That the second and smallest eigenvalues together accounted for more than 30% of the total variability indicates that movements were not merely one-dimensional (rectilinear). For each trial, the eigenvector associated with the largest eigenvalue provided a measure of the principal direction about which the head trajectory was organized. As illustrated in the frequency plot in
<xref rid="pone.0120025.g005" ref-type="fig">Fig 5</xref>
, these principal directions were more often oriented toward the target (red dot) than perpendicular to it (orange dot). However, they did not merely head straight toward the target; rather, they were spread over a wide range of diagonal directions, which differed among participants and, to a smaller extent, across trials for the same participant. Hence, as can be seen in
<xref rid="pone.0120025.g005" ref-type="fig">Fig 5</xref>
from the comparison between the green cloud and the red and orange dots, our participants’ self-chosen patterns of movement contrasted with the two single directions usually imposed in the literature, e.g., [
<xref rid="pone.0120025.ref012" ref-type="bibr">12</xref>
<xref rid="pone.0120025.ref014" ref-type="bibr">14</xref>
,
<xref rid="pone.0120025.ref018" ref-type="bibr">18</xref>
<xref rid="pone.0120025.ref023" ref-type="bibr">23</xref>
].</p>
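<p>For readers who wish to reproduce this analysis on their own recordings, here is a minimal sketch of the orientation-matrix computation (our implementation details may differ):</p>
<preformat>
import numpy as np

def principal_direction(positions, dt):
    """Orientation-matrix analysis of instantaneous movement directions.

    positions: (N, 3) head trajectory sampled at interval dt.
    Returns eigenvalues (fractions of directional variance, largest first)
    and the corresponding eigenvectors as columns.
    """
    V = np.gradient(positions, dt, axis=0)
    U = V / np.maximum(np.linalg.norm(V, axis=1, keepdims=True), 1e-12)
    T = U.T @ U / len(U)           # 3x3 orientation matrix; eigenvalues sum to 1
    w, vecs = np.linalg.eigh(T)    # ascending order
    order = np.argsort(w)[::-1]
    return w[order], vecs[:, order]
</preformat>
<p>The eigenvector returned first is the trial’s principal direction; its eigenvalue is the fraction of directional variance it explains.</p>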
<fig id="pone.0120025.g005" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0120025.g005</object-id>
<label>Fig 5</label>
<caption>
<title>Frequency plot of principal directions of movement for all trials and participants in Movement condition.</title>
<p>The principal direction corresponded to the eigenvector associated with the largest eigenvalue of the trial orientation matrix. A principal direction was computed for each trial, but only those for which the largest eigenvalue exceeded .5 and the ratio of the second to the largest eigenvalue was less than .8 were included in the plot (1147 out of the 1190 trials). The red and orange dots represent the two directions usually tested in the literature: toward the target (AP) and laterally (ML), respectively.</p>
</caption>
<graphic xlink:href="pone.0120025.g005"></graphic>
</fig>
<p>Although the equations we formalized specify egocentric distance for any amplitude of movement and any magnitude of head speed (or acceleration, jerk, etc.), the perceiver’s ability to pick up
<italic>Ii</italic>
could impose specific constraints on exploratory movement. In particular, for a given movement of the perceiver, the resulting speed (and acceleration, jerk, etc.) at which the direction of the object changes relative to the eye decreases as the distance to the (virtual) target increases. A limit case would be that of objects on the horizon, which are so far away that the change in their direction relative to the eye induced by head movements is marginal (and imperceptible). To use
<italic>Ii</italic>
, a perceiver must maintain its salience within stimulation and thus has to move in a way that sustains a certain minimal amount of angular speed, acceleration and/or jerk in optic flow. To do so, when the distance to the object increases, the perceiver can increase his/her movement amplitude (when it brings the head closer to the target, a larger movement amplitude yields higher optical speed, acceleration, and jerk for a given head movement), speed, acceleration, and/or jerk, depending on whether he/she is using first-, second- and/or third-order information.</p>
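<p>To make the scaling explicit: inverting Eq 13 shows that keeping the optical rotation rate above some pickup threshold requires head speed to grow in proportion to target distance. A toy sketch (the threshold value is a hypothetical assumption):</p>
<preformat>
# Hypothetical pickup threshold on the optical rotation rate Q (rad/s):
Q_MIN = 0.05

for D in (0.5, 1.0, 2.0):        # target distance in meters
    v_required = Q_MIN * D       # Eq 13 inverted: v = Q * D
    print(f"D = {D:.1f} m -> tangential head speed >= {v_required:.3f} m/s")
</preformat>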
<p>
<xref rid="pone.0120025.g006" ref-type="fig">Fig 6</xref>
shows the average movement amplitude (A) and the average norms of instantaneous velocity and acceleration (B and C) as a function of target distance. The sampled head movement signal was too noisy to permit jerk analyses. As expected, and in line with previous studies, e.g., [
<xref rid="pone.0120025.ref012" ref-type="bibr">12</xref>
], linear regressions confirmed that the amplitude of head movements (i.e., the range along the principal axis) increased as the simulated distance of the object increased,
<italic>R
<sup>2</sup>
</italic>
= .893, slope = 0.255,
<italic>p</italic>
< .001 and
<italic>R
<sup>2</sup>
</italic>
= .711, slope = 0.105,
<italic>p</italic>
< .001, respectively for the Near and Far groups. In addition, the linear regressions performed on the average norm of instantaneous head velocity revealed that participants also increased the speed at which they were moving as the distance of the simulated object increased,
<italic>R
<sup>2</sup>
</italic>
= .945, slope = 0.083,
<italic>p</italic>
< .001 and
<italic>R
<sup>2</sup>
</italic>
= .860, slope = 0.054,
<italic>p</italic>
< .001. Similarly, the regressions indicated that the average norm of instantaneous head acceleration increased as the distance of the simulated object increased,
<italic>R
<sup>2</sup>
</italic>
= .804, slope = 0.244,
<italic>p</italic>
< .001 and
<italic>R
<sup>2</sup>
</italic>
= .815, slope = 0.242,
<italic>p</italic>
< .001.</p>
<fig id="pone.0120025.g006" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0120025.g006</object-id>
<label>Fig 6</label>
<caption>
<title>Amplitude, instantaneous speed, and acceleration of head movements in the Movement condition.</title>
<p>Mean values with standard errors are plotted as a function of initial target distance, expressed in proportion of the actual maximum reachable distance, MR
<sub>A</sub>
. (A) Mean range along principal axis. (B) Mean norm of instantaneous velocity. (C) Mean norm of instantaneous acceleration.</p>
</caption>
<graphic xlink:href="pone.0120025.g006"></graphic>
</fig>
</sec>
<sec id="sec011">
<title>Movements, information, and performance</title>
<p>We also applied our equations to real movement data sampled in the Movement condition for all trials and participants. As expected,
<xref rid="pone.0120025.e022" ref-type="disp-formula">Eq 10</xref>
(dedicated to 3D movements) provided accurate information about distance 100% of the time. Interestingly, the analyses also revealed that participants moved such that the simplified equation dedicated to movements orthogonal to the direction of the target (i.e., on a target-centered sphere; Eq 13) specified the actual egocentric distance (±5 cm) 28.7% of the time (
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= 22.8, 31.6). In contrast, the equation dedicated to rectilinear movements (
<xref rid="pone.0120025.e001" ref-type="disp-formula">Eq 1</xref>
; e.g., [
<xref rid="pone.0120025.ref014" ref-type="bibr">14</xref>
,
<xref rid="pone.0120025.ref016" ref-type="bibr">16</xref>
]) specified the actual egocentric distance only 3.5% of the time (
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
= 2.3, 4.0).
<xref rid="pone.0120025.g007" ref-type="fig">Fig 7</xref>
illustrates these results using data from one participant from the Near group in one representative Movement trial. Head trajectory (seen from above) is plotted on the left side (A). The top graph on the right represents the information about egocentric distance generated by the participant during the trial, as specified by Eqs
<xref rid="pone.0120025.e022" ref-type="disp-formula">10</xref>
,
<xref rid="pone.0120025.e001" ref-type="disp-formula">1</xref>
and
<xref rid="pone.0120025.e031" ref-type="disp-formula">13</xref>
(B). The bottom right graph (C) shows the evolution of optic and inertial parameters of
<italic>Ii</italic>
(
<xref rid="pone.0120025.e022" ref-type="disp-formula">Eq 10</xref>
) over the same period. Taken together, plots B and C underline that distance information is available neither in optics nor in inertia alone, but only in the intermodal relation across the two.</p>
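<p>A sketch of how such proportions can be computed from sampled trajectories (assuming per-sample distance estimates from Eqs 1, 10 or 13, obtained as in the sketch following Eq 12):</p>
<preformat>
import numpy as np

def fraction_within(D_est, D_true, tol=0.05):
    """Proportion of samples where an estimator matches the actual
    egocentric distance to within tol meters (here +/- 5 cm)."""
    D_est, D_true = np.asarray(D_est), np.asarray(D_true)
    valid = np.isfinite(D_est)
    return np.mean(np.abs(D_est[valid] - D_true[valid]) <= tol)
</preformat>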
<fig id="pone.0120025.g007" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0120025.g007</object-id>
<label>Fig 7</label>
<caption>
<title>Distance information created through movement during one representative trial in the Movement condition.</title>
<p>Data from participant Near 2, trial 33. (A) Head trajectory (bird’s eye view). The scale is in meters. The green cylinder indicates the location and size of the virtual target. (B) Evolution over time of the actual distance of the target and of the distance specified by Eqs
<xref rid="pone.0120025.e022" ref-type="disp-formula">10</xref>
,
<xref rid="pone.0120025.e001" ref-type="disp-formula">1</xref>
and
<xref rid="pone.0120025.e031" ref-type="disp-formula">13</xref>
. The curve for the actual distance is not visible on the plot because of the perfect overlap with the distance specified by
<xref rid="pone.0120025.e022" ref-type="disp-formula">Eq 10</xref>
). (C) Evolution over time of the optic and inertial components of <italic>Ii</italic> as formalized in
<xref rid="pone.0120025.e022" ref-type="disp-formula">Eq 10</xref>
.</p>
</caption>
<graphic xlink:href="pone.0120025.g007"></graphic>
</fig>
</sec>
</sec>
<sec sec-type="conclusions" id="sec012">
<title>Discussion</title>
<p>Behavior causes simultaneous, inter-related changes in the structure of multiple forms of ambient energy. These emergent interrelations can constitute higher-order information that differs qualitatively from patterns that exist in individual forms of ambient energy [
<xref rid="pone.0120025.ref006" ref-type="bibr">6</xref>
]. Previous research has shown that information about an object’s heaviness exists in (and can be perceived from) the relation across optical and inertial consequences of wielding the object [
<xref rid="pone.0120025.ref031" ref-type="bibr">31</xref>
]. In this article, we formalized a property of potential sensory stimulation,
<italic>Ii</italic>
, that is deterministically related to the egocentric distance of objects.
<italic>Ii</italic>
extends across two forms of ambient energy, optics and inertia. In an experiment, we manipulated the availability of
<italic>Ii</italic>
independent of the availability of patterns in individual forms of ambient energy, and we investigated its influence on participants’ ability to judge whether a virtual object was within reach.</p>
<p>The results revealed that judgments of reachability were more precise (steeper slopes) and more accurate (lower absolute error) when information about egocentric distance was available through
<italic>Ii</italic>
(the Movement condition) than when it was not (the Playback and Stationary conditions). We begin this section by discussing some aspects of the results that, we argue, arise from methodological factors.</p>
<sec id="sec013">
<title>Judgments in the Playback condition</title>
<p>The smooth changes in judgments observed in the Playback condition were unexpected, given the ambiguity of pure optic flow relative to scale. These changes might occur if participants scaled optics using stored or memorized qualities of their previous movements [
<xref rid="pone.0120025.ref012" ref-type="bibr">12</xref>
]. An alternative interpretation can be derived by considering exploratory movements and their consequences for optic flow. Theoretically, a given optic flow can correspond to an infinite number of combinations of target distance and perceiver movement. In practice, however, constraints specific to the animal (e.g., biomechanical), the environment (e.g., gravity), or the task (e.g., remaining seated) reduce this infinity to a smaller set of ecologically valid combinations [
<xref rid="pone.0120025.ref032" ref-type="bibr">32</xref>
]. In our experiment, participants increased their movement amplitude, speed, acceleration, and jerk as the depicted distance of the virtual target increased across trials. However, this regulation did not fully compensate for the concurrent increase in the distance at which the target was depicted. As a result, the average speed, acceleration and jerk at which the direction of the target changed relative to the perceiver (i.e., in optic flow) decreased as a function of target distance (see
<xref rid="pone.0120025.s002" ref-type="supplementary-material">S2 Fig</xref>
). Thus, pure optic flow provided information about the relative distance of targets across trials. The shallow slopes in the Playback condition suggest that, in this condition, participants might have ranked targets in terms of relative distance (i.e., closer vs. farther) rather than according to their scaled distance. In any event, given that optic flow was identical in the Playback and Movement conditions, the better performance observed in the Movement condition can be explained only by the presence and use of patterns in ambient arrays above and beyond optic flow.</p>
</sec>
<sec id="sec014">
<title>Influence of centering bias on judgments</title>
<p>The accuracy of judgments appeared to be affected by a so-called centering bias, e.g., [
<xref rid="pone.0120025.ref027" ref-type="bibr">27</xref>
,
<xref rid="pone.0120025.ref028" ref-type="bibr">28</xref>
]. By separating participants’ actual maximum reachable distance (MR
<sub>A</sub>
) from the middle of the set of depicted distances, we were able to assess independently the influence of each of these factors. Such independent control of the investigated variable (i.e., MR
<sub>A</sub>
) and methodological factors (i.e., the symmetry of the range of targets around MR
<sub>A</sub>
) is rare in psychophysical experiments, e.g., [
<xref rid="pone.0120025.ref012" ref-type="bibr">12</xref>
,
<xref rid="pone.0120025.ref013" ref-type="bibr">13</xref>
,
<xref rid="pone.0120025.ref027" ref-type="bibr">27</xref>
]. Thus, some of the effects observed in previous studies may reflect confounding of these factors. Here in our Movement condition, despite the concurrent influence of the bias, the perceived maximum reachable distance (MR
<sub>P</sub>
) was consistently attracted toward the actual boundary (MR
<sub>A</sub>
), confirming that scaled distance was perceived. Overall, the constant error in that condition was only 6.5 cm on average, replicating the overestimation of perceived reachable distance in real environments reported in several studies, e.g., [
<xref rid="pone.0120025.ref019" ref-type="bibr">19</xref>
,
<xref rid="pone.0120025.ref033" ref-type="bibr">33</xref>
<xref rid="pone.0120025.ref035" ref-type="bibr">35</xref>
] and the additional compression of distances generally observed in virtual environments, e.g., [
<xref rid="pone.0120025.ref036" ref-type="bibr">36</xref>
,
<xref rid="pone.0120025.ref037" ref-type="bibr">37</xref>
].</p>
</sec>
<sec id="sec015">
<title>Exploratory movements provide higher-order information about egocentric distance</title>
<p>Movement facilitates perception. In our experiment, movements were necessary to generate emergent, higher-order patterns that provided unambiguous information about egocentric distance. The formal analysis of these patterns underlined that the information available to a perceiver is contingent on the characteristics of his/her exploratory movement (see Analysis of Available Information). Our participants were free to move their eyes, head, and torso. We argue that it is for this reason that the present study provides the first evidence that exploratory activity can be sufficient for the perception of egocentric distance.</p>
<p>Previous studies investigating egocentric distance perception from optic flow have restricted participants to one-dimensional oscillatory movements in a specific direction [
<xref rid="pone.0120025.ref012" ref-type="bibr">12</xref>
,
<xref rid="pone.0120025.ref013" ref-type="bibr">13</xref>
,
<xref rid="pone.0120025.ref018" ref-type="bibr">18</xref>
<xref rid="pone.0120025.ref025" ref-type="bibr">25</xref>
]. With a few exceptions [
<xref rid="pone.0120025.ref025" ref-type="bibr">25</xref>
], they focused on two directions: toward the object or laterally. Generally this restriction was adopted because these directions were assumed to generate two different optical patterns (radial flow and motion parallax, respectively). By contrast, in ordinary life a forward head movement always entails a downward movement and thus generates motion parallax (up/down) in addition to radial optic flow [
<xref rid="pone.0120025.ref022" ref-type="bibr">22</xref>
]. A similar analysis also applies to side-to-side head movements, which, by modifying head-object distance, also generate radial flow patterns in addition to motion parallax (only a curvilinear movement orthogonal to the direction of the object can generate pure motion parallax). When distance perception was compared across these two types of movement in those earlier studies, participants exhibited better distance estimates when moving toward the target than when moving laterally [
<xref rid="pone.0120025.ref013" ref-type="bibr">13</xref>
,
<xref rid="pone.0120025.ref022" ref-type="bibr">22</xref>
]. In our experiment, the majority of principal directions about which movements were organized (85.6%) were localized in the hemifield centered on the direction of the target (i.e., [-45°, +45°] in azimuth). However, in agreement with our model, which does not suggest any advantage of moving in a particular absolute direction (e.g., Eqs
<xref rid="pone.0120025.e001" ref-type="disp-formula">1</xref>
,
<xref rid="pone.0120025.e009" ref-type="disp-formula">4</xref>
<xref rid="pone.0120025.e017" ref-type="disp-formula">5</xref>
,
<xref rid="pone.0120025.e019" ref-type="disp-formula">7</xref>
,
<xref rid="pone.0120025.e022" ref-type="disp-formula">10</xref>
<xref rid="pone.0120025.e030" ref-type="disp-formula">12</xref>
,
<xref rid="pone.0120025.e031" ref-type="disp-formula">13</xref>
<xref rid="pone.0120025.e033" ref-type="disp-formula">15</xref>
; see also [
<xref rid="pone.0120025.ref014" ref-type="bibr">14</xref>
,
<xref rid="pone.0120025.ref016" ref-type="bibr">16</xref>
]), within this hemifield participants did not favor a unique direction of movement. Rather, the principal direction varied across participants and, to a smaller extent, across trials for individual participants.</p>
<p>Our equations do not simplify for any particular absolute direction of movement; however, they undergo a qualitative change when the angle
<italic>α</italic>
between heading (i.e., the direction of movement) and the direction of the target is kept constant. For example, when the point of observation follows a curvilinear path at constant distance from the target, specified distance no longer depends upon the instantaneous direction of the target relative to heading (Eqs
<xref rid="pone.0120025.e031" ref-type="disp-formula">13</xref>
<xref rid="pone.0120025.e033" ref-type="disp-formula">15</xref>
). Such simplified patterns could turn out to be particularly useful in the presence of a visual scene that lacks rich optical structure, such as when a single object is presented in an otherwise empty visual field (as in many experiments, including ours). Indeed, this impoverished optical structure reduces available information about heading and impairs its perception by humans [
<xref rid="pone.0120025.ref038" ref-type="bibr">38</xref>
], thereby limiting available information about object-relative heading. In our experiment, approximately 30% of the time participants generated accurate distance information through movements that were orthogonal to the direction of the object (Eqs
<xref rid="pone.0120025.e031" ref-type="disp-formula">13</xref>
<xref rid="pone.0120025.e033" ref-type="disp-formula">15</xref>
). This tendency varied among participants. Participants who were more likely to move in this way (i.e., generating information through these simplified equations) tended to exhibit greater accuracy in their reachability judgments (see
<xref rid="pone.0120025.s006" ref-type="supplementary-material">S2 Text</xref>
for details). Thus, these participants may have exploited these simplified patterns as a means to ease or improve the generation and/or pickup of information.</p>
<p>Individuals with only one eye have been observed to utilize larger and more rapid head movements than persons for whom monocular vision is temporary [
<xref rid="pone.0120025.ref039" ref-type="bibr">39</xref>
]. Before jumping over a gap, Mongolian gerbils move their head with amplitude and velocity that are correlated with the size of the gap [
<xref rid="pone.0120025.ref040" ref-type="bibr">40</xref>
]. Our experiment extended this finding to humans. Participants increased the amplitude, speed, and acceleration of their head movements for more distant targets. By doing so, participants tended to minimize the decrease in the (angular) amplitude, speed, and acceleration at which the direction of the object changed relative to their head (i.e., the optical parameters in Eqs
<xref rid="pone.0120025.e001" ref-type="disp-formula">1</xref>
<xref rid="pone.0120025.e033" ref-type="disp-formula">15</xref>
). Interestingly, if participants effectively attempted to preserve a certain amount of optical kinematics, the need to increase head kinematics with the target’s distance was probably heightened by our experimental apparatus, which altered the spatiotemporal resolution of optic flow. In particular, the angular resolution of optic flow was constrained by the size of pixels on the HMD screen, the distance of the screen relative to the eye, and the lens that lay in between.</p>
<p>Altogether, our results suggest that participants engaged in types of exploratory activity that would tend to increase or preserve the salience of
<italic>Ii</italic>
. Interestingly, this shaping of exploratory behavior by informational constraints occurred despite the fact that participants had no training with our virtual environment and were not given any practice trials or feedback about their performance. Within this perspective, perception influenced action, which influenced perception, in a continuous process that unfolded until the desired outcome was achieved (e.g., information about egocentric distance; cf. [
<xref rid="pone.0120025.ref041" ref-type="bibr">41</xref>
,
<xref rid="pone.0120025.ref042" ref-type="bibr">42</xref>
]). In a similar fashion, infants as young as 5 to 12 weeks old have been observed to increase their sucking rate when the clarity (i.e., focus) of the visual scene is made contingent on sucking rate [
<xref rid="pone.0120025.ref043" ref-type="bibr">43</xref>
], see also [
<xref rid="pone.0120025.ref044" ref-type="bibr">44</xref>
]. In adults, it has been proposed that optical information could serve to enhance and facilitate the detection of rotational inertia, e.g., [
<xref rid="pone.0120025.ref031" ref-type="bibr">31</xref>
,
<xref rid="pone.0120025.ref045" ref-type="bibr">45</xref>
].</p>
</sec>
<sec id="sec016">
<title>Conclusion</title>
<p>By themselves, the optical consequences of a perceiver’s movement are not sufficient to specify egocentric distance [
<xref rid="pone.0120025.ref011" ref-type="bibr">11</xref>
,
<xref rid="pone.0120025.ref015" ref-type="bibr">15</xref>
,
<xref rid="pone.0120025.ref017" ref-type="bibr">17</xref>
,
<xref rid="pone.0120025.ref046" ref-type="bibr">46</xref>
<xref rid="pone.0120025.ref048" ref-type="bibr">48</xref>
]. Information about egocentric distance requires that optic flow be scaled. In our experiment, optic flow might have been scaled as a fact of ambient energy through a higher order pattern comprising haptic/gravito-inertial stimulation, as we suggest. Alternatively, optic flow might have been scaled via an efferent copy of motor commands. The empirical findings of the present study cannot resolve this debate. Whatever the interpretation may be, information about egocentric distance that exists in
<italic>Ii</italic>
has at least two important characteristics. First, this information is intermodal in a novel sense: The information exists as a pattern in ambient energy prior to stimulation of any receptor systems. The distance information that participants exploited is not available in any of its constitutive modal patterns (e.g., those available separately to the visual, kinaesthetic, and vestibular systems), but only in a higher order pattern that extends across these forms of ambient energy (see
<xref rid="pone.0120025.g007" ref-type="fig">Fig 7</xref>
). Second, this information is generated by the animal through its exploratory activity [
<xref rid="pone.0120025.ref007" ref-type="bibr">7</xref>
]. By moving, a perceiver reveals higher order, invariant structures that are the consequences of his/her movement, and these structures specify the dynamics of the perceiver-environment system that gives rise to them. The emergent, higher order relations across different forms of ambient energy are not due to chance [
<xref rid="pone.0120025.ref006" ref-type="bibr">6</xref>
,
<xref rid="pone.0120025.ref049" ref-type="bibr">49</xref>
]. For example, there is a tripartite causal relation between changes in head position relative to the environment (which structure haptic/gravito-inertial stimulation), changes in the object’s angular direction relative to the head (which structure optic flow), and changes in head-object distance. Consequently, the higher order invariant pattern specifying egocentric distance is a parameter that becomes available to the animal as a consequence of its movement. It does not need to be inferred. Altogether, this underlines that there is more than spatio-temporal redundancy or coherence across the dimensions of stimulation (optics, acoustics, haptics, etc.): In addition, there is information. Our results suggest that this emergent, higher order information is sufficient for perception of egocentric distance and that humans are sensitive to this information and can use it to perceive egocentric distance.</p>
</sec>
</sec>
<sec sec-type="materials|methods" id="sec017">
<title>Materials and Methods</title>
<sec id="sec018">
<title>Participants</title>
<p>Thirteen undergraduate students from the University of Minnesota and one of the authors (6 females, 8 males) volunteered to participate in the experiment. Students received academic credits for their participation. Participants ranged in age from 18 to 42 years (
<italic>M</italic>
= 23,
<italic>SD</italic>
= 5.6), and in height from 155 to 193 cm (
<italic>M</italic>
= 172.9,
<italic>SD</italic>
= 11.6). Their arm length, measured from the acromion to the tip of the index finger, ranged from 61 to 81 cm (
<italic>M</italic>
= 72.4,
<italic>SD</italic>
= 5.6). All participants had normal or corrected-to-normal vision and all but one were naïve to the purpose of the experiment.</p>
</sec>
<sec id="sec019">
<title>Experimental task</title>
<p>Our response measure differed from that used in some previous studies. Some studies, e.g., [
<xref rid="pone.0120025.ref012" ref-type="bibr">12</xref>
,
<xref rid="pone.0120025.ref013" ref-type="bibr">13</xref>
,
<xref rid="pone.0120025.ref024" ref-type="bibr">24</xref>
] required participants to report their percepts using numerical scales (e.g., in inches, “between 50 and 100 cm”). Such responses are problematic because they confound perceptual accuracy with the participants’ ability to convert perception into numbers [
<xref rid="pone.0120025.ref019" ref-type="bibr">19</xref>
,
<xref rid="pone.0120025.ref034" ref-type="bibr">34</xref>
,
<xref rid="pone.0120025.ref036" ref-type="bibr">36</xref>
]. Following other studies, e.g., [
<xref rid="pone.0120025.ref019" ref-type="bibr">19</xref>
,
<xref rid="pone.0120025.ref050" ref-type="bibr">50</xref>
] we used a response measure that did not require such a conversion.</p>
<p>Previous studies have shown that adults exhibit refined knowledge of whether an object is within reach, and even of the particular type of reaching (arm-only, leaning at the waist, taking a step, and so on) that will be optimal to reach a given object at a given distance [
<xref rid="pone.0120025.ref033" ref-type="bibr">33</xref>
,
<xref rid="pone.0120025.ref035" ref-type="bibr">35</xref>
,
<xref rid="pone.0120025.ref051" ref-type="bibr">51</xref>
]. Even infants can do this: Yonas and Hartman [
<xref rid="pone.0120025.ref052" ref-type="bibr">52</xref>
] found that 5-month-old infants regularly reached toward objects that were less than one arm length distant, but did not reach for objects that were more than one arm length away. In this work, we focus on the patterns within stimulation that could support such perception. Presumably, the prospective perception of whether an object is within reach requires relational information about the egocentric distance of the object and the distance that can be reached by the actor/perceiver (which depends on arm length, posture, tool characteristics, etc.; e.g., [
<xref rid="pone.0120025.ref053" ref-type="bibr">53</xref>
<xref rid="pone.0120025.ref055" ref-type="bibr">55</xref>
]). We do not claim (nor disclaim) that egocentric distance information is relevant for the
<italic>online</italic>
regulation of reaching movements. Our concern is with affordance perception, the role of which is to support the
<italic>prospective</italic>
control of actions (e.g., movement selection). We hypothesize that the intermodal pattern we identified and formalized can provide information about the former aspect (i.e., the distance of the targeted object) and thus that manipulating this information would influence participants’ perception of whether the object is within reach. Participants were asked to judge verbally (yes/no) whether they could reach a stationary virtual target, whose distance was varied across trials. In addition, they were asked to rate verbally how confident they were about their reachability judgments, using a 1–5 scale with 5 being the most confident. “Reaching” meant “reaching with your finger while extending your preferred arm, without twisting shoulders or leaning forward”. Our definition of reaching corresponded to a 1-degree-of-freedom reach in the terminology used in previous studies [
<xref rid="pone.0120025.ref033" ref-type="bibr">33</xref>
,
<xref rid="pone.0120025.ref035" ref-type="bibr">35</xref>
].</p>
</sec>
<sec id="sec020">
<title>Apparatus</title>
<p>The set up is illustrated in
<xref rid="pone.0120025.g002" ref-type="fig">Fig 2A</xref>
. Participants were seated on a height-adjustable office chair, in a dark room. They were presented with stationary virtual objects at eye height through an HMD (Visette Pro, Cyberminds, Netherlands). Of the two LCD matrices of the HMD (640 × 480 pixels each, 60 Hz), only the one facing the (self-reported) preferred eye was turned on, yielding a monocular field of 60° × 46°. The simulated object was a green and blue coin-like cylinder displayed in an otherwise empty (black) visual field (
<xref rid="pone.0120025.g002" ref-type="fig">Fig 2C</xref>
). Before each trial, the size of the target was determined as a function of its distance such that its angular size was always 9° at the beginning of the trial (the thickness of the target also was set proportionally). To depict a stationary virtual object at a given distance beyond the HMD screen, the participant’s position and orientation were sampled with a 6-dof electromagnetic sensor (Flock of Birds, Ascension, Burlington, VT, USA) and used to drive in real time the display of the object on the screen of the HMD (
<xref rid="pone.0120025.g002" ref-type="fig">Fig 2B1</xref>
;
<xref rid="pone.0120025.s007" ref-type="supplementary-material">S1 Video</xref>
). The sensor was attached to the HMD, above the eyes, centered (
<xref rid="pone.0120025.g002" ref-type="fig">Fig 2A</xref>
). Although improved in recent decades, e.g. [
<xref rid="pone.0120025.ref056" ref-type="bibr">56</xref>
], the principle used to depict a virtual object at a distance traces back at least to Wallace [
<xref rid="pone.0120025.ref057" ref-type="bibr">57</xref>
]. With our apparatus, many visual cues to distance were absent, and most others were neutralized (i.e., they were influenced by the eye’s distance relative to the screen and not to the virtual object). The invariant pattern across the inertial and optical consequences of participants’ movement was the only remaining source of information about the egocentric distance of the simulated target. We were able to manipulate that pattern, either by reproducing the relation existing in the real world, or by simply breaking it. A second electromagnetic sensor was used (located on a real target) for the preliminary measurement of the actual maximal reachable distance (MR
<sub>A</sub>
; see below).</p>
</sec>
<sec id="sec021">
<title>Design and procedure</title>
<p>All participants gave their written informed consent to participate in the study. The protocol was approved by the Institutional Review Board of the University of Minnesota where the data were collected. Before the beginning of the experiment, we measured for each participant the maximal distance from which they could actually reach an object (MR
<sub>A</sub>
), by having them perform real reaching actions (with their arm only, see Task section above). The MR
<sub>A</sub>
was measured from the preferred eye to the target, using the method of limits [
<xref rid="pone.0120025.ref058" ref-type="bibr">58</xref>
<xref rid="pone.0120025.ref060" ref-type="bibr">60</xref>
]. Participants were presented with real targets at eye level and asked to reach them using their arm only. After each reach, the target was pulled back centimeter-by-centimeter (measured with the electromagnetic sensors) to increase the egocentric distance. This procedure was repeated until the target was too far to be reached for two consecutive trials. The operation was then restarted with decreasing distances, starting from unreachable targets and continuing until the participant could reach two consecutive targets. MR
<sub>A</sub>
was defined as the mean of the last distance judged reachable in the ascending series and the first distance judged reachable in the descending series. The obtained values for MR
<sub>A</sub>
ranged from 50 to 72 cm (
<italic>M</italic>
= 62.1,
<italic>SD</italic>
= 5.3).</p>
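<p>For concreteness, the MR<sub>A</sub> estimate reduces to a simple average of the two transition distances. The sketch below is ours, for illustration only; the variable names are hypothetical and both distances are assumed to be measured in centimeters from the preferred eye:</p>
<preformat><![CDATA[
def estimate_mr_a(last_reachable_ascending_cm, first_reachable_descending_cm):
    """Method-of-limits estimate of the actual maximum reachable distance
    (MR_A): mean of the last distance judged reachable in the ascending
    series and the first judged reachable in the descending series."""
    return (last_reachable_ascending_cm + first_reachable_descending_cm) / 2.0

# Example: last ascending success at 63 cm, first descending success at 61 cm
print(estimate_mr_a(63.0, 61.0))  # -> 62.0, close to the reported group mean
]]></preformat>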
<p>The judgment session involved three experimental conditions (
<xref rid="pone.0120025.g002" ref-type="fig">Fig 2B</xref>
) in which we manipulated the availability of the different components of the invariant relation specifying egocentric distance (i.e., in Eqs.
<xref rid="pone.0120025.e001" ref-type="disp-formula">1</xref>
,
<xref rid="pone.0120025.e009" ref-type="disp-formula">4</xref>
,
<xref rid="pone.0120025.e022" ref-type="disp-formula">10</xref>
<xref rid="pone.0120025.e030" ref-type="disp-formula">12</xref>
,
<xref rid="pone.0120025.e031" ref-type="disp-formula">13</xref>
<xref rid="pone.0120025.e033" ref-type="disp-formula">15</xref>
). In the Movement condition, participants were encouraged to freely explore the scene by moving their head relative to the virtual object prior to giving their judgment. The closed-loop updating system was activated, such that the information described in
<xref rid="pone.0120025.e022" ref-type="disp-formula">Eq 10</xref>
(and related forms) was available when the perceiver moved. Accordingly, we expected that participants would give precise and accurate reachability judgments. To assess performance in this main experimental condition, we designed two control conditions. In the Stationary condition, participants were instructed not to move while looking at the target. They were not physically restrained, but the closed-loop updating display was deactivated, such that changes in haptic/gravito-inertial stimulation were minimized and not related to the display of the object. In the Playback condition, participants were also instructed to remain still, but they were presented with a moving display of the target. The movements of the displayed target were driven by previously recorded movements of the same participant played back from earlier trials (see
<xref rid="pone.0120025.s007" ref-type="supplementary-material">S1 Video</xref>
). Thus, the Playback condition provided optical stimulation that was driven by body movement, but the relation between head movements and optics was open loop [
<xref rid="pone.0120025.ref012" ref-type="bibr">12</xref>
,
<xref rid="pone.0120025.ref013" ref-type="bibr">13</xref>
,
<xref rid="pone.0120025.ref056" ref-type="bibr">56</xref>
,
<xref rid="pone.0120025.ref061" ref-type="bibr">61</xref>
].</p>
<p>Each judgment followed the same procedure. Participants opened their eyes on a “Go” signal, took as much time as they wished and then said “yes” or “no” to indicate whether they judged the virtual target to be within reach, followed by a number between 1 and 5 for the confidence rating. As soon as they gave their responses, the simulation was turned off (dark screen) and participants were asked to close their eyes until the beginning of the next trial.</p>
<p>Targets were presented at 17 different distances, calculated for each participant on the basis of his/her MR
<sub>A</sub>
. Participants were randomly split into Near and Far groups. For the Near Group, target distances ranged from 23% to 135% of each participant’s MR
<sub>A</sub>
, while they ranged from 72% to 184% for the Far Group. The motivation for having two groups was to control for a potential centering bias, that is, a potential influence on the perceived maximum reachable distance, MR
<sub>P</sub>
, of the localization of MR
<sub>A</sub>
within the tested interval [
<xref rid="pone.0120025.ref027" ref-type="bibr">27</xref>
,
<xref rid="pone.0120025.ref028" ref-type="bibr">28</xref>
]. There were five trials at each distance. All trials were randomized within condition, then grouped into blocks of 17 trials of the same condition. All blocks were then randomized, with the only restriction that at least one block of Movement trials be present before the first block of Playback trials, in order to feed the database of recorded head movements. Participants performed a total of 255 trials (15 blocks of 17 trials), which were divided into two sessions of 45 to 60 minutes each, conducted on different days.</p>
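<p>As an illustration, the per-participant target grid can be generated as below. Even spacing across the stated range is an assumption on our part; the text specifies only the endpoints and the number of distances:</p>
<preformat><![CDATA[
import numpy as np

def target_distances(mr_a_cm, group="near"):
    """17 simulated target distances, expressed as proportions of MR_A
    (Near group: 23-135%; Far group: 72-184%)."""
    lo, hi = (0.23, 1.35) if group == "near" else (0.72, 1.84)
    return np.linspace(lo, hi, 17) * mr_a_cm

print(target_distances(62.0, "near"))  # e.g., for a participant with MR_A = 62 cm
]]></preformat>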
</sec>
<sec id="sec022">
<title>Analysis of judgment data</title>
<p>In psychophysical experiments, a typical response curve exhibits a clear transition from a majority of “yes” (e.g., reachable) to a majority of “no” (e.g., not reachable) judgments as the stimulus (e.g., simulated distance) increases or decreases. Ideally, transitions are located around an expected threshold. To analyze this transition, we fitted psychometric functions to the percentage of positive responses expressed as a function of the object’s distance [
<xref rid="pone.0120025.ref059" ref-type="bibr">59</xref>
,
<xref rid="pone.0120025.ref062" ref-type="bibr">62</xref>
<xref rid="pone.0120025.ref064" ref-type="bibr">64</xref>
] using the psignifit toolbox (version 2.5.6 for Matlab
<ext-link ext-link-type="uri" xlink:href="http://bootstrap-software.org/psignifit/">http://bootstrap-software.org/psignifit/</ext-link>
):
<disp-formula id="pone.0120025.e035">
<alternatives>
<graphic xlink:href="pone.0120025.e035.jpg" id="pone.0120025.e035g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M35">
<mml:mrow>
<mml:mi>Ψ</mml:mi>
<mml:mtext></mml:mtext>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>θ</mml:mi>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>π</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>π</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi>π</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mn>1</mml:mn>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi>π</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi>π</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mtext></mml:mtext>
<mml:mi>F</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>θ</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</alternatives>
<label>(16)</label>
</disp-formula>
where the probability
<italic>Ψ</italic>
that the participant gives a positive answer is defined as a function of the probability
<italic>π</italic>
<sub>
<italic>l</italic>
</sub>
that she/he answers independently of the stimulus intensity
<italic>x</italic>
(
<italic>miss rate</italic>
), the stimulus-independent probability (or
<italic>guess rate</italic>
)
<italic>π</italic>
<sub>
<italic>c</italic>
</sub>
that she/he gives a positive response, and the probability
<italic>F</italic>
(
<italic>x</italic>
, θ) of positively evaluating the stimulus. Following Wichmann and Hill, we did not fix the values of π
<sub>
<italic>l</italic>
</sub>
and π
<sub>
<italic>c</italic>
</sub>
<italic>a priori</italic>
. Rather, we constrained their values to lie within a reasonable interval (i.e., [0, 0.05]). In accordance with previous work on affordance judgments, e.g., [
<xref rid="pone.0120025.ref058" ref-type="bibr">58</xref>
,
<xref rid="pone.0120025.ref065" ref-type="bibr">65</xref>
], we used a logistic function for
<italic>F</italic>
. We derived two indicators of performance from
<italic>F</italic>
: the abscissa of the point where positive and negative judgments were balanced and the value of the slope at this point. The first provided a measure of the perceived maximum reachable distance (MR
<sub>P</sub>
) and was used to quantify judgment accuracy through absolute error. The second indicator was our measure of the precision of judgments (i.e., consistency). Goodness-of-fit was assessed with the deviance test based on Monte-Carlo simulations described in Wichmann and Hill [
<xref rid="pone.0120025.ref063" ref-type="bibr">63</xref>
] and confidence intervals for each parameter were found by the BCa bootstrap method implemented in psignifit, based on 4000 simulations [
<xref rid="pone.0120025.ref064" ref-type="bibr">64</xref>
]. Altogether, these psychophysical tools permitted a precise analysis of the accuracy and precision of judgments, as well as control over the relevance and consistency of these dependent variables. We modelled the data for each participant in each condition (individual fits) and for all participants in each condition (global fits). We used results from individual fits to compare experimental conditions, while global fits provided good descriptions of the results at group level.</p>
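<p>For readers without access to psignifit, the sketch below fits Eq 16 with a logistic core and derives the two performance indicators. It is a minimal stand-in under stated assumptions, not the authors’ analysis code, and the judgment data shown are hypothetical:</p>
<preformat><![CDATA[
import numpy as np
from scipy.optimize import curve_fit

def psychometric(x, alpha, beta, pi_c, pi_l):
    """Eq 16 with a logistic core F(x; alpha, beta)."""
    F = 1.0 / (1.0 + np.exp(-(x - alpha) / beta))
    return pi_c + (1.0 - pi_c - pi_l) * F

# Hypothetical judgment data: simulated distance (cm) vs. proportion of
# "reachable" responses (illustrative values only).
x = np.array([40.0, 45.0, 50.0, 55.0, 60.0, 65.0, 70.0, 75.0, 80.0])
y = np.array([1.00, 1.00, 0.95, 0.90, 0.55, 0.20, 0.05, 0.00, 0.00])

# "Reachable" responses decrease with distance, hence the negative beta.
# The [0, 0.05] bounds on pi_c and pi_l follow the interval given in the text.
p0 = (62.0, -3.0, 0.01, 0.01)
bounds = ([0.0, -np.inf, 0.0, 0.0], [np.inf, np.inf, 0.05, 0.05])
(alpha, beta, pi_c, pi_l), _ = curve_fit(psychometric, x, y, p0=p0, bounds=bounds)

# The balance point of F (50% positive judgments) estimates MR_P; the slope
# of the fitted curve at that point measures judgment precision (consistency).
mr_p = alpha
slope = (1.0 - pi_c - pi_l) / (4.0 * beta)  # derivative of Eq 16 at x = alpha
print(f"MR_P = {mr_p:.1f} cm, slope at MR_P = {slope:.3f} per cm")
]]></preformat>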
</sec>
<sec id="sec023">
<title>Analysis of movement data</title>
<p>To attenuate the noise of the sensor, movement data were filtered using a two-way low-pass Butterworth filter (4th order, 12 Hz cutoff frequency). From the filtered data we computed for each trial the principal direction of movement, the amplitude of movement (range) along this preferred axis, and the average norm of instantaneous velocity, acceleration, and jerk. The principal direction of movement was computed from the distribution of instantaneous directions of movement by taking the eigenvector corresponding to the largest eigenvalue of the orientation matrix (also called scatter matrix when normalized by sample size; cf. [
<xref rid="pone.0120025.ref066" ref-type="bibr">66</xref>
], p.162, and [
<xref rid="pone.0120025.ref067" ref-type="bibr">67</xref>
], p.233). The eigenvalues provided a measure of the proportion of variance in movement direction explained by each of the eigenvectors.</p>
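<p>A minimal sketch of this movement analysis, assuming head positions are supplied as an (N, 3) array sampled at <italic>fs</italic> Hz (the sampling rate is not stated in this section), might read:</p>
<preformat><![CDATA[
import numpy as np
from scipy.signal import butter, filtfilt

def principal_direction(pos, fs):
    """pos: (N, 3) array of head positions sampled at fs Hz."""
    b, a = butter(4, 12.0 / (fs / 2.0))   # 4th-order low-pass, 12 Hz cutoff
    pos = filtfilt(b, a, pos, axis=0)     # two-way (zero-phase) filtering
    v = np.diff(pos, axis=0) * fs         # instantaneous velocity
    speed = np.linalg.norm(v, axis=1)
    mask = speed > 1e-6                   # drop samples with negligible motion
    u = v[mask] / speed[mask][:, None]    # unit movement directions
    t_matrix = (u.T @ u) / len(u)         # orientation (scatter) matrix
    evals, evecs = np.linalg.eigh(t_matrix)  # eigenvalues in ascending order
    # Dominant eigenvector = principal direction of movement; its eigenvalue
    # is the proportion of directional variance it explains.
    return evecs[:, -1], evals[-1]
]]></preformat>
<p>Because the direction vectors are unit length, the eigenvalues of the orientation matrix sum to 1, so the largest eigenvalue directly gives the proportion of directional variance captured by the principal direction.</p>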
<p>Additionally, we analyzed exploratory movements in the light of available intermodal information as we formalized it through
<xref rid="pone.0120025.e022" ref-type="disp-formula">Eq 10</xref>
and related forms. Using head movement data, we calculated for each participant the instantaneous egocentric distances that were specified at each point of the trajectory by
<xref rid="pone.0120025.e022" ref-type="disp-formula">Eq 10</xref>
(which applies to 3D movements), by
<xref rid="pone.0120025.e001" ref-type="disp-formula">Eq 1</xref>
(which applies to 1D movements), and by
<xref rid="pone.0120025.e031" ref-type="disp-formula">Eq 13</xref>
(which applies to movements orthogonal to the direction of the target, e.g., tangential to a target-centered sphere). Using these, for each equation we computed the percentage of points (i.e., the amount of time) for which the difference between specified and actual distance was less than 5 cm. This provided a measure of the amount of time during which each equation provided accurate information about distance (given each participant’s exploratory motion).</p>
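<p>The resulting accuracy measure is a simple thresholded time fraction. In the sketch below, <italic>specified_cm</italic> stands for the hypothetical per-sample distances obtained by evaluating Eq 10, 1, or 13 along the recorded trajectory (the equations themselves are given earlier in the paper):</p>
<preformat><![CDATA[
import numpy as np

def accurate_time_percentage(specified_cm, actual_cm, tol_cm=5.0):
    """Percentage of trajectory samples for which the specified distance
    deviates from the actual distance by less than tol_cm."""
    specified_cm = np.asarray(specified_cm, dtype=float)
    return 100.0 * np.mean(np.abs(specified_cm - actual_cm) < tol_cm)
]]></preformat>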
</sec>
<sec id="sec024">
<title>Inferential statistics</title>
<p>Except where otherwise indicated, we used an alpha level of .05 for all inferential tests. We also report effect sizes for each test using partial η
<sup>2</sup>
for ANOVA (noted pη
<sup>2</sup>
) and Cliff’s
<italic>dw</italic>
and
<italic>d</italic>
statistic for nonparametric tests [
<xref rid="pone.0120025.ref068" ref-type="bibr">68</xref>
]. In repeated measures designs, the
<italic>dw</italic>
statistic is the proportion of participants who change in one direction minus the proportion who change in the opposite direction. For independent samples, the
<italic>d</italic>
statistic indicates the proportion of scores from one population that are higher than those from the other, minus the reverse proportion [
<xref rid="pone.0120025.ref068" ref-type="bibr">68</xref>
], p.495.
<italic>dw</italic>
and
<italic>d</italic>
vary from 1 (all scores greater at the second test or in the second population) to -1 (all scores smaller at the second test/population).</p>
</sec>
</sec>
<sec sec-type="supplementary-material" id="sec025">
<title>Supporting Information</title>
<supplementary-material content-type="local-data" id="pone.0120025.s001">
<label>S1 Fig</label>
<caption>
<title>Optical and non-optical consequences of a 3D movement relative to a stationary object.</title>
<p>(A) Using a Cartesian coordinate system, the egocentric distance can be expressed as a function of directional parameters (
<italic>α</italic>
,
<italic>θ</italic>
) describing the motion of the point of observation in the plane defined by
<italic>O</italic>
and
<inline-formula id="pone.0120025.e036">
<alternatives>
<graphic xlink:href="pone.0120025.e036.jpg" id="pone.0120025.e036g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M36">
<mml:mover accent="true">
<mml:mi>v</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
</mml:math>
</alternatives>
</inline-formula>
(a), two more directional parameters
<italic>φ</italic>
and
<italic>ψ</italic>
(b, c) characterizing the orientation of that plane relative to an earth-fixed reference frame
<inline-formula id="pone.0120025.e037">
<alternatives>
<graphic xlink:href="pone.0120025.e037.jpg" id="pone.0120025.e037g" position="anchor" mimetype="image" orientation="portrait"></graphic>
<mml:math id="M37">
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mover accent="true">
<mml:mi>x</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mn>0</mml:mn>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mover accent="true">
<mml:mi>y</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mn>0</mml:mn>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mover accent="true">
<mml:mi>z</mml:mi>
<mml:mo></mml:mo>
</mml:mover>
<mml:mn>0</mml:mn>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</alternatives>
</inline-formula>
, and linear parameters about head movements (
<italic>v</italic>
). (B) Using a spherical coordinate system, the egocentric distance can be expressed as a function of directional parameters (Φ,
<italic>δ</italic>
) and linear parameters about head movements (
<italic>v</italic>
) (see
<xref rid="pone.0120025.s005" ref-type="supplementary-material">S1 Text</xref>
for details).</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0120025.s001.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0120025.s002">
<label>S2 Fig</label>
<caption>
<title>Kinematics of the direction of the object relative to the point of observation.</title>
<p>Optical parameters in Eqs
<xref rid="pone.0120025.e022" ref-type="disp-formula">10</xref>
<xref rid="pone.0120025.e029" ref-type="disp-formula">11</xref>
: (A) Parameter
<italic>Q</italic>
in deg.s
<sup>-1</sup>
. (B) Parameter
<italic>Q</italic>
in deg.s
<sup>-2</sup>
. In the two panels, average instantaneous values are plotted as a function of the distance at which the target was simulated at the beginning of the trial (expressed as a proportion of the actual maximum reachable distance MR
<sub>A</sub>
).</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0120025.s002.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0120025.s003">
<label>S1 Table</label>
<caption>
<title>Perceived maximum reachable distance (MR
<sub>P</sub>
) and slope derived from judgment curves.</title>
<p>The mean values (
<italic>M</italic>
) and confidence interval (
<italic>CI</italic>
<sub>.
<italic>95</italic>
</sub>
) of MR
<sub>P</sub>
and slope are calculated from significant individual fits in each experimental condition (Movement, Stationary, Playback). The
<italic>N</italic>
values indicate the number of participants included in the analyses (i.e., those for whom the deviance test assessing goodness of fit was significant).</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0120025.s003.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0120025.s004">
<label>S2 Table</label>
<caption>
<title>Second order polynomial regressions fitted to confidence ratings.</title>
<p>The values obtained for each parameter of the equation (
<italic>Ax
<sup>2</sup>
+ Bx + C</italic>
), the determination coefficient of the fit (
<italic>R
<sup>2</sup>
</italic>
) and the
<italic>p</italic>
statistics associated with each parameter of the regression are shown for each group of participants in each experimental condition.</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0120025.s004.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0120025.s005">
<label>S1 Text</label>
<caption>
<title>Mathematical derivation of Eqs
<xref rid="pone.0120025.e017" ref-type="disp-formula">5</xref>
and
<xref rid="pone.0120025.e019" ref-type="disp-formula">7</xref>
.</title>
<p>These equations describe the information about scaled egocentric distance available in the intermodal consequences of 3D movements, using Cartesian and spherical coordinates, respectively.</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0120025.s005.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0120025.s006">
<label>S2 Text</label>
<caption>
<title>Influence of the amount of information generated on performance.</title>
<p>Comparison of the accuracy of reachability judgments between participants who (according to
<xref rid="pone.0120025.e031" ref-type="disp-formula">Eq 13</xref>
) most and least often generated accurate information about distance by moving orthogonal to the direction of the target.</p>
<p>(PDF)</p>
</caption>
<media xlink:href="pone.0120025.s006.pdf">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0120025.s007">
<label>S1 Video</label>
<caption>
<title>Video showing what was displayed on the screen of the HMD worn by participants.</title>
<p>The display of the object is driven by previously recorded movements played back from one representative trial of the Movement condition (participant Far 1, trial 12).</p>
<p>(MOV)</p>
</caption>
<media xlink:href="pone.0120025.s007.mov">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
</sec>
</body>
<back>
<ack>
<p>The authors would like to thank Perrine Guerin for creating the CG illustration of the experimental setup and the teaching faculty of the School of Kinesiology, University of Minnesota, for their help in recruiting experimental participants.</p>
</ack>
<ref-list>
<title>References</title>
<ref id="pone.0120025.ref001">
<label>1</label>
<mixed-citation publication-type="journal">
<name>
<surname>Alais</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Burr</surname>
<given-names>D</given-names>
</name>
.
<article-title>The ventriloquist effect results from near-optimal bimodal integration</article-title>
.
<source>Curr Biol</source>
.
<year>2004</year>
;
<volume>14</volume>
:
<fpage>257</fpage>
<lpage>262</lpage>
.
<pub-id pub-id-type="pmid">14761661</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref002">
<label>2</label>
<mixed-citation publication-type="journal">
<name>
<surname>Ernst</surname>
<given-names>MO</given-names>
</name>
,
<name>
<surname>Banks</surname>
<given-names>MS</given-names>
</name>
.
<article-title>Humans integrate visual and haptic information in a statistically optimal fashion</article-title>
.
<source>Nature</source>
.
<year>2002</year>
;
<volume>415</volume>
:
<fpage>429</fpage>
<lpage>433</lpage>
.
<pub-id pub-id-type="pmid">11807554</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref003">
<label>3</label>
<mixed-citation publication-type="journal">
<name>
<surname>Guillaud</surname>
<given-names>E</given-names>
</name>
,
<name>
<surname>Gauthier</surname>
<given-names>G</given-names>
</name>
,
<name>
<surname>Vercher</surname>
<given-names>J-L</given-names>
</name>
,
<name>
<surname>Blouin</surname>
<given-names>J</given-names>
</name>
.
<article-title>Fusion of visuo-ocular and vestibular signals in arm motor control</article-title>
.
<source>J Neurophysiol</source>
.
<year>2006</year>
;
<volume>95</volume>
:
<fpage>1134</fpage>
<lpage>1146</lpage>
.
<pub-id pub-id-type="pmid">16221749</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref004">
<label>4</label>
<mixed-citation publication-type="journal">
<name>
<surname>Hillis</surname>
<given-names>JM</given-names>
</name>
,
<name>
<surname>Ernst</surname>
<given-names>MO</given-names>
</name>
,
<name>
<surname>Banks</surname>
<given-names>MS</given-names>
</name>
,
<name>
<surname>Landy</surname>
<given-names>MS</given-names>
</name>
.
<article-title>Combining sensory information: Mandatory fusion within, but not between, senses</article-title>
.
<source>Science</source>
.
<year>2002</year>
;
<volume>298</volume>
:
<fpage>1627</fpage>
<lpage>1630</lpage>
.
<pub-id pub-id-type="pmid">12446912</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref005">
<label>5</label>
<mixed-citation publication-type="journal">
<name>
<surname>Pouget</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Deneve</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Duhamel</surname>
<given-names>J-R</given-names>
</name>
.
<article-title>A computational perspective on the neural basis of multisensory spatial representations</article-title>
.
<source>Nat Rev Neurosci</source>
.
<year>2002</year>
;
<volume>3</volume>
:
<fpage>741</fpage>
<lpage>747</lpage>
.
<pub-id pub-id-type="pmid">12209122</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref006">
<label>6</label>
<mixed-citation publication-type="journal">
<name>
<surname>Stoffregen</surname>
<given-names>TA</given-names>
</name>
,
<name>
<surname>Bardy</surname>
<given-names>BG</given-names>
</name>
.
<article-title>On specification and the senses</article-title>
.
<source>Behav Brain Sci</source>
.
<year>2001</year>
;
<volume>24</volume>
:
<fpage>195</fpage>
<lpage>261</lpage>
.
<pub-id pub-id-type="pmid">11530542</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref007">
<label>7</label>
<mixed-citation publication-type="book">
<name>
<surname>Gibson</surname>
<given-names>JJ</given-names>
</name>
.
<source>The senses considered as perceptual systems</source>
.
<publisher-loc>Boston, MA, USA</publisher-loc>
:
<publisher-name>Houghton-Mifflin</publisher-name>
;
<year>1966</year>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref008">
<label>8</label>
<mixed-citation publication-type="book">
<name>
<surname>Gibson</surname>
<given-names>JJ</given-names>
</name>
.
<source>The ecological approach to visual perception</source>
.
<publisher-loc>Boston, MA, USA</publisher-loc>
:
<publisher-name>Houghton-Mifflin</publisher-name>
;
<year>1979</year>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref009">
<label>9</label>
<mixed-citation publication-type="book">
<name>
<surname>Michaels</surname>
<given-names>CF</given-names>
</name>
,
<name>
<surname>Carello</surname>
<given-names>CC</given-names>
</name>
.
<source>Direct Perception</source>
.
<publisher-loc>Upper Saddle River, NJ, USA</publisher-loc>
:
<publisher-name>Prentice Hall</publisher-name>
;
<year>1981</year>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref010">
<label>10</label>
<mixed-citation publication-type="journal">
<name>
<surname>Turvey</surname>
<given-names>MT</given-names>
</name>
,
<name>
<surname>Shaw</surname>
<given-names>RE</given-names>
</name>
,
<name>
<surname>Reed</surname>
<given-names>ES</given-names>
</name>
,
<name>
<surname>Mace</surname>
<given-names>WM</given-names>
</name>
.
<article-title>Ecological laws of perceiving and acting: In reply to Fodor and Pylyshyn (1981)</article-title>
.
<source>Cognition</source>
.
<year>1981</year>
;
<volume>9</volume>
:
<fpage>237</fpage>
<lpage>304</lpage>
.
<pub-id pub-id-type="pmid">7197604</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref011">
<label>11</label>
<mixed-citation publication-type="book">
<name>
<surname>Sedgwick</surname>
<given-names>HA</given-names>
</name>
.
<chapter-title>Space perception</chapter-title>
In:
<name>
<surname>Boff</surname>
<given-names>KR</given-names>
</name>
,
<name>
<surname>Kaufman</surname>
<given-names>L</given-names>
</name>
,
<name>
<surname>Thomas</surname>
<given-names>JP</given-names>
</name>
, editors.
<source>Handbook of perception and human performance, Vol 1: Sensory processes and perception</source>
.
<publisher-loc>New York, NY, USA</publisher-loc>
:
<publisher-name>Wiley</publisher-name>
;
<year>1986</year>
pp.
<fpage>21.1</fpage>
<lpage>21.57</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref012">
<label>12</label>
<mixed-citation publication-type="journal">
<name>
<surname>Panerai</surname>
<given-names>F</given-names>
</name>
,
<name>
<surname>Cornilleau-Pérès</surname>
<given-names>V</given-names>
</name>
,
<name>
<surname>Droulez</surname>
<given-names>J</given-names>
</name>
.
<article-title>Contribution of extraretinal signals to the scaling of object distance during self-motion</article-title>
.
<source>Percept Psychophys</source>
.
<year>2002</year>
;
<volume>64</volume>
:
<fpage>717</fpage>
<lpage>731</lpage>
.
<pub-id pub-id-type="pmid">12201331</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref013">
<label>13</label>
<mixed-citation publication-type="journal">
<name>
<surname>Peh</surname>
<given-names>C-H</given-names>
</name>
,
<name>
<surname>Panerai</surname>
<given-names>F</given-names>
</name>
,
<name>
<surname>Droulez</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Cornilleau-Pérès</surname>
<given-names>V</given-names>
</name>
,
<name>
<surname>Cheong</surname>
<given-names>L-F</given-names>
</name>
.
<article-title>Absolute distance perception during in-depth head movement: Calibrating optic flow with extra-retinal information</article-title>
.
<source>Vision Res</source>
.
<year>2002</year>
;
<volume>42</volume>
:
<fpage>1991</fpage>
<lpage>2003</lpage>
.
<pub-id pub-id-type="pmid">12160571</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref014">
<label>14</label>
<mixed-citation publication-type="journal">
<name>
<surname>Gibson</surname>
<given-names>JJ</given-names>
</name>
,
<name>
<surname>Olum</surname>
<given-names>P</given-names>
</name>
,
<name>
<surname>Rosenblatt</surname>
<given-names>F</given-names>
</name>
.
<article-title>Parallax and perspective during aircraft landings</article-title>
.
<source>Am J Psychol</source>
.
<year>1955</year>
;
<volume>68</volume>
:
<fpage>372</fpage>
<lpage>385</lpage>
.
<pub-id pub-id-type="pmid">13248971</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref015">
<label>15</label>
<mixed-citation publication-type="journal">
<name>
<surname>Johansson</surname>
<given-names>G</given-names>
</name>
.
<article-title>Monocular movement parallax and near-space perception</article-title>
.
<source>Perception</source>
.
<year>1973</year>
;
<volume>2</volume>
:
<fpage>136</fpage>
<lpage>145</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref016">
<label>16</label>
<mixed-citation publication-type="journal">
<name>
<surname>Nakayama</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Loomis</surname>
<given-names>J</given-names>
</name>
.
<article-title>Optical velocity patterns, velocity-sensitive neurons, and space perception: A hypothesis</article-title>
.
<source>Perception</source>
.
<year>1974</year>
;
<volume>3</volume>
:
<fpage>63</fpage>
<lpage>80</lpage>
.
<pub-id pub-id-type="pmid">4444922</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref017">
<label>17</label>
<mixed-citation publication-type="journal">
<name>
<surname>Bingham</surname>
<given-names>GP</given-names>
</name>
,
<name>
<surname>Stassen</surname>
<given-names>MG</given-names>
</name>
.
<article-title>Monocular egocentric distance information generated by head movement</article-title>
.
<source>Ecol Psychol</source>
.
<year>1994</year>
;
<volume>6</volume>
:
<fpage>219</fpage>
<lpage>238</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref018">
<label>18</label>
<mixed-citation publication-type="journal">
<name>
<surname>Gogel</surname>
<given-names>WC</given-names>
</name>
,
<name>
<surname>Tietz</surname>
<given-names>JD</given-names>
</name>
.
<article-title>Absolute motion parallax and the specific distance tendency</article-title>
.
<source>Percept Psychophys</source>
.
<year>1973</year>
;
<volume>13</volume>
:
<fpage>284</fpage>
<lpage>292</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref019">
<label>19</label>
<mixed-citation publication-type="journal">
<name>
<surname>Bingham</surname>
<given-names>GP</given-names>
</name>
,
<name>
<surname>Pagano</surname>
<given-names>CC</given-names>
</name>
.
<article-title>The necessity of a perception–action approach to definite distance perception: Monocular distance perception to guide reaching</article-title>
.
<source>J Exp Psychol Hum Percept Perform</source>
.
<year>1998</year>
;
<volume>24</volume>
:
<fpage>145</fpage>
<lpage>168</lpage>
.
<pub-id pub-id-type="pmid">9483825</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref020">
<label>20</label>
<mixed-citation publication-type="journal">
<name>
<surname>Watt</surname>
<given-names>SJ</given-names>
</name>
,
<name>
<surname>Bradshaw</surname>
<given-names>MF</given-names>
</name>
.
<article-title>The visual control of reaching and grasping: Binocular disparity and motion parallax</article-title>
.
<source>J Exp Psychol Hum Percept Perform</source>
.
<year>2003</year>
;
<volume>29</volume>
:
<fpage>404</fpage>
<lpage>415</lpage>
.
<pub-id pub-id-type="pmid">12760624</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref021">
<label>21</label>
<mixed-citation publication-type="journal">
<name>
<surname>Gomer</surname>
<given-names>JA</given-names>
</name>
,
<name>
<surname>Dash</surname>
<given-names>CH</given-names>
</name>
,
<name>
<surname>Moore</surname>
<given-names>KS</given-names>
</name>
,
<name>
<surname>Pagano</surname>
<given-names>CC</given-names>
</name>
.
<article-title>Using radial outflow to provide depth information during teleoperation</article-title>
.
<source>Presence</source>
.
<year>2009</year>
;
<volume>18</volume>
:
<fpage>304</fpage>
<lpage>320</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref022">
<label>22</label>
<mixed-citation publication-type="journal">
<name>
<surname>Wickelgren</surname>
<given-names>EA</given-names>
</name>
,
<name>
<surname>Mcconnell</surname>
<given-names>DS</given-names>
</name>
,
<name>
<surname>Bingham</surname>
<given-names>GR</given-names>
</name>
.
<article-title>Reaching measures of monocular distance perception: Forward versus side-to-side head movements and haptic feedback</article-title>
.
<source>Percept Psychophys</source>
.
<year>2000</year>
;
<volume>62</volume>
:
<fpage>1051</fpage>
<lpage>1059</lpage>
.
<pub-id pub-id-type="pmid">10997049</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref023">
<label>23</label>
<mixed-citation publication-type="journal">
<name>
<surname>Gogel</surname>
<given-names>WC</given-names>
</name>
,
<name>
<surname>Tietz</surname>
<given-names>JD</given-names>
</name>
.
<article-title>A comparison of oculomotor and motion parallax cues of egocentric distance</article-title>
.
<source>Vision Res</source>
.
<year>1979</year>
;
<volume>19</volume>
:
<fpage>1161</fpage>
<lpage>1170</lpage>
.
<pub-id pub-id-type="pmid">550575</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref024">
<label>24</label>
<mixed-citation publication-type="journal">
<name>
<surname>Beall</surname>
<given-names>AC</given-names>
</name>
,
<name>
<surname>Loomis</surname>
<given-names>JM</given-names>
</name>
,
<name>
<surname>Philbeck</surname>
<given-names>JW</given-names>
</name>
,
<name>
<surname>Fikes</surname>
<given-names>TG</given-names>
</name>
.
<article-title>Absolute motion parallax weakly determines visual scale in real and virtual environments</article-title>
.
<source>IS&T/SPIE’s Symposium on Electronic Imaging: Science & Technology. International Society for Optics and Photonics</source>
;
<year>1995</year>
pp.
<fpage>288</fpage>
<lpage>297</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref025">
<label>25</label>
<mixed-citation publication-type="journal">
<name>
<surname>Eriksson</surname>
<given-names>ES</given-names>
</name>
.
<article-title>Movement parallax during locomotion</article-title>
.
<source>Percept Psychophys</source>
.
<year>1974</year>
;
<volume>16</volume>
:
<fpage>197</fpage>
<lpage>200</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref026">
<label>26</label>
<mixed-citation publication-type="journal">
<name>
<surname>Ferris</surname>
<given-names>SH</given-names>
</name>
.
<article-title>Motion parallax and absolute distance</article-title>
.
<source>J Exp Psychol</source>
.
<year>1972</year>
;
<volume>95</volume>
:
<fpage>258</fpage>
<lpage>263</lpage>
.
<pub-id pub-id-type="pmid">5071906</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref027">
<label>27</label>
<mixed-citation publication-type="book">
<name>
<surname>Mantel</surname>
<given-names>B</given-names>
</name>
,
<name>
<surname>Bardy</surname>
<given-names>BG</given-names>
</name>
,
<name>
<surname>Stoffregen</surname>
<given-names>TA</given-names>
</name>
.
<chapter-title>Critical boundaries and median values in affordance perception</chapter-title>
In:
<name>
<surname>Cummins-Sebree</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Riley</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Shockley</surname>
<given-names>K</given-names>
</name>
, editors.
<source>Studies in perception and action IX: Fourteenth International Conference on Perception and Action</source>
.
<publisher-loc>Mahwah, NJ, USA</publisher-loc>
:
<publisher-name>Erlbaum</publisher-name>
;
<year>2007</year>
pp.
<fpage>222</fpage>
<lpage>225</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref028">
<label>28</label>
<mixed-citation publication-type="book">
<name>
<surname>Poulton</surname>
<given-names>EC</given-names>
</name>
.
<source>Bias in quantifying judgements</source>
.
<publisher-loc>Hove, UK</publisher-loc>
:
<publisher-name>Lawrence Erlbaum Associates</publisher-name>
;
<year>1989</year>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref029">
<label>29</label>
<mixed-citation publication-type="journal">
<name>
<surname>Warren</surname>
<given-names>WH</given-names>
</name>
.
<article-title>Perceiving affordances: visual guidance of stair climbing</article-title>
.
<source>J Exp Psychol Hum Percept Perform</source>
.
<year>1984</year>
;
<volume>10</volume>
:
<fpage>683</fpage>
<lpage>703</lpage>
.
<pub-id pub-id-type="pmid">6238127</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref030">
<label>30</label>
<mixed-citation publication-type="other">Stappers PJ. Scaling the visual consequences of active head movements: A study of active perceivers and spatial technology [Internet]. Ph.D. Thesis, Delft University. 1992. Available:
<ext-link ext-link-type="uri" xlink:href="http://repository.tudelft.nl/view/ir/uuid%3A09cd1a02-cb65-454c-bd95-0515962ca94f/">http://repository.tudelft.nl/view/ir/uuid%3A09cd1a02-cb65-454c-bd95-0515962ca94f/</ext-link>
. Accessed 2014 Dec 15.</mixed-citation>
</ref>
<ref id="pone.0120025.ref031">
<label>31</label>
<mixed-citation publication-type="journal">
<name>
<surname>Streit</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Shockley</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Riley</surname>
<given-names>MA</given-names>
</name>
.
<article-title>Rotational inertia and multimodal heaviness perception</article-title>
.
<source>Psychon Bull Rev</source>
.
<year>2007</year>
;
<volume>14</volume>
:
<fpage>1001</fpage>
<lpage>1006</lpage>
.
<pub-id pub-id-type="pmid">18087973</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref032">
<label>32</label>
<mixed-citation publication-type="journal">
<name>
<surname>Runeson</surname>
<given-names>S</given-names>
</name>
.
<article-title>The distorted room illusion, equivalent configurations, and the specificity of static optic arrays</article-title>
.
<source>J Exp Psychol Hum Percept Perform</source>
.
<year>1988</year>
;
<volume>14</volume>
:
<fpage>295</fpage>
<lpage>304</lpage>
.
<pub-id pub-id-type="pmid">2967881</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref033">
<label>33</label>
<mixed-citation publication-type="journal">
<name>
<surname>Carello</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Grosofsky</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Reichel</surname>
<given-names>FD</given-names>
</name>
,
<name>
<surname>Solomon</surname>
<given-names>HY</given-names>
</name>
,
<name>
<surname>Turvey</surname>
<given-names>MT</given-names>
</name>
.
<article-title>Visually perceiving what is reachable</article-title>
.
<source>Ecol Psychol</source>
.
<year>1989</year>
;
<volume>1</volume>
:
<fpage>27</fpage>
<lpage>54</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref034">
<label>34</label>
<mixed-citation publication-type="journal">
<name>
<surname>Heft</surname>
<given-names>H</given-names>
</name>
.
<article-title>A methodological note on overestimates of reaching distance: Distinguishing between perceptual and analytical judgments</article-title>
.
<source>Ecol Psychol</source>
.
<year>1993</year>
;
<volume>5</volume>
:
<fpage>255</fpage>
<lpage>271</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref035">
<label>35</label>
<mixed-citation publication-type="journal">
<name>
<surname>Mark</surname>
<given-names>LS</given-names>
</name>
,
<name>
<surname>Nemeth</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Gardner</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Dainoff</surname>
<given-names>MJ</given-names>
</name>
,
<name>
<surname>Paasche</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Duffy</surname>
<given-names>M</given-names>
</name>
,
<etal>et al</etal>
.
<article-title>Postural dynamics and the preferred critical boundary for visually guided reaching</article-title>
.
<source>J Exp Psychol Hum Percept Perform</source>
.
<year>1997</year>
;
<volume>23</volume>
:
<fpage>1365</fpage>
<lpage>1379</lpage>
.
<pub-id pub-id-type="pmid">9336957</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref036">
<label>36</label>
<mixed-citation publication-type="book">
<name>
<surname>Loomis</surname>
<given-names>JM</given-names>
</name>
,
<name>
<surname>Knapp</surname>
<given-names>JM</given-names>
</name>
.
<chapter-title>Visual perception of egocentric distance in real and virtual environments</chapter-title>
<source>Virtual and adaptive environments</source>
.
<publisher-loc>Mahwah, NJ, USA</publisher-loc>
:
<publisher-name>Lawrence Erlbaum Associates</publisher-name>
;
<year>2003</year>
pp.
<fpage>21</fpage>
<lpage>46</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref037">
<label>37</label>
<mixed-citation publication-type="journal">
<name>
<surname>Creem-Regehr</surname>
<given-names>SH</given-names>
</name>
,
<name>
<surname>Willemsen</surname>
<given-names>P</given-names>
</name>
,
<name>
<surname>Gooch</surname>
<given-names>AA</given-names>
</name>
,
<name>
<surname>Thompson</surname>
<given-names>WB</given-names>
</name>
.
<article-title>The influence of restricted viewing conditions on egocentric distance perception: Implications for real and virtual environments</article-title>
.
<source>Perception</source>
.
<year>2005</year>
;
<volume>34</volume>
:
<fpage>191</fpage>
<lpage>204</lpage>
.
<pub-id pub-id-type="pmid">15832569</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref038">
<label>38</label>
<mixed-citation publication-type="journal">
<name>
<surname>Crowell</surname>
<given-names>JA</given-names>
</name>
,
<name>
<surname>Banks</surname>
<given-names>MS</given-names>
</name>
.
<article-title>Perceiving heading with different retinal regions and types of optic flow</article-title>
.
<source>Percept Psychophys</source>
.
<year>1993</year>
;
<volume>53</volume>
:
<fpage>325</fpage>
<lpage>337</lpage>
.
<pub-id pub-id-type="pmid">8483696</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref039">
<label>39</label>
<mixed-citation publication-type="journal">
<name>
<surname>Marotta</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Perrot</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Nicolle</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Servos</surname>
<given-names>P</given-names>
</name>
,
<name>
<surname>Goodale</surname>
<given-names>MA</given-names>
</name>
.
<article-title>Adapting to monocular vision: Grasping with one eye</article-title>
.
<source>Exp Brain Res</source>
.
<year>1995</year>
;
<volume>104</volume>
:
<fpage>107</fpage>
<lpage>114</lpage>
.
<pub-id pub-id-type="pmid">7621928</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref040">
<label>40</label>
<mixed-citation publication-type="journal">
<name>
<surname>Ellard</surname>
<given-names>CG</given-names>
</name>
,
<name>
<surname>Goodale</surname>
<given-names>MA</given-names>
</name>
,
<name>
<surname>Timney</surname>
<given-names>B</given-names>
</name>
.
<article-title>Distance estimation in the Mongolian gerbil: The role of dynamic depth cues</article-title>
.
<source>Behav Brain Res</source>
.
<year>1984</year>
;
<volume>14</volume>
:
<fpage>29</fpage>
<lpage>39</lpage>
.
<pub-id pub-id-type="pmid">6518079</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref041">
<label>41</label>
<mixed-citation publication-type="journal">
<name>
<surname>Turvey</surname>
<given-names>MT</given-names>
</name>
.
<article-title>Action and perception at the level of synergies</article-title>
.
<source>Hum Mov Sci</source>
.
<year>2007</year>
;
<volume>26</volume>
:
<fpage>657</fpage>
<lpage>697</lpage>
.
<pub-id pub-id-type="pmid">17604860</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref042">
<label>42</label>
<mixed-citation publication-type="journal">
<name>
<surname>Turvey</surname>
<given-names>MT</given-names>
</name>
,
<name>
<surname>Fonseca</surname>
<given-names>ST</given-names>
</name>
.
<article-title>The medium of haptic perception: A tensegrity hypothesis</article-title>
.
<source>J Mot Behav</source>
.
<year>2014</year>
;
<volume>46</volume>
:
<fpage>143</fpage>
<lpage>187</lpage>
.
<comment>doi:
<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1080/00222895.2013.798252">10.1080/00222895.2013.798252</ext-link>
</comment>
<pub-id pub-id-type="pmid">24628057</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref043">
<label>43</label>
<mixed-citation publication-type="journal">
<name>
<surname>Bruner</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Kalnins</surname>
<given-names>I</given-names>
</name>
.
<article-title>The coordination of visual observation and instrumental behavior in early infancy</article-title>
.
<source>Perception</source>
.
<year>1973</year>
;
<volume>2</volume>
:
<fpage>307</fpage>
<lpage>314</lpage>
.
<pub-id pub-id-type="pmid">4794126</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref044">
<label>44</label>
<mixed-citation publication-type="journal">
<name>
<surname>Gibson</surname>
<given-names>EJ</given-names>
</name>
.
<article-title>Exploratory behavior in the development of perceiving, acting, and the acquiring of knowledge</article-title>
.
<source>Annu Rev Psychol</source>
.
<year>1988</year>
;
<volume>39</volume>
:
<fpage>1</fpage>
<lpage>41</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref045">
<label>45</label>
<mixed-citation publication-type="journal">
<name>
<surname>Amazeen</surname>
<given-names>EL</given-names>
</name>
,
<name>
<surname>Tseng</surname>
<given-names>PH</given-names>
</name>
,
<name>
<surname>Valdez</surname>
<given-names>AB</given-names>
</name>
,
<name>
<surname>Vera</surname>
<given-names>D</given-names>
</name>
.
<article-title>Perceived heaviness is influenced by the style of lifting</article-title>
.
<source>Ecol Psychol</source>
.
<year>2011</year>
;
<volume>23</volume>
:
<fpage>1</fpage>
<lpage>18</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref046">
<label>46</label>
<mixed-citation publication-type="journal">
<name>
<surname>Eriksson</surname>
<given-names>ES</given-names>
</name>
.
<article-title>Distance perception and the ambiguity of visual stimulation: A theoretical note</article-title>
.
<source>Percept Psychophys</source>
.
<year>1973</year>
;
<volume>13</volume>
:
<fpage>379</fpage>
<lpage>381</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref047">
<label>47</label>
<mixed-citation publication-type="journal">
<name>
<surname>Longuet-Higgins</surname>
<given-names>H</given-names>
</name>
.
<article-title>Visual motion ambiguity</article-title>
.
<source>Vision Res</source>
.
<year>1986</year>
;
<volume>26</volume>
:
<fpage>181</fpage>
<lpage>183</lpage>
.
<pub-id pub-id-type="pmid">3716210</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref048">
<label>48</label>
<mixed-citation publication-type="book">
<name>
<surname>Ullman</surname>
<given-names>S</given-names>
</name>
.
<source>The interpretation of visual motion</source>
.
<publisher-loc>Cambridge, MA, USA</publisher-loc>
:
<publisher-name>MIT Press</publisher-name>
;
<year>1979</year>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref049">
<label>49</label>
<mixed-citation publication-type="journal">
<name>
<surname>Stoffregen</surname>
<given-names>TA</given-names>
</name>
,
<name>
<surname>Bardy</surname>
<given-names>BG</given-names>
</name>
.
<article-title>Theory testing and the global array</article-title>
.
<source>Behav Brain Sci</source>
.
<year>2004</year>
;
<volume>27</volume>
:
<fpage>892</fpage>
<lpage>900</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref050">
<label>50</label>
<mixed-citation publication-type="journal">
<name>
<surname>Loomis</surname>
<given-names>JM</given-names>
</name>
,
<name>
<surname>Da Silva</surname>
<given-names>JA</given-names>
</name>
,
<name>
<surname>Fujita</surname>
<given-names>N</given-names>
</name>
,
<name>
<surname>Fukusima</surname>
<given-names>SS</given-names>
</name>
.
<article-title>Visual space perception and visually directed action</article-title>
.
<source>J Exp Psychol Hum Percept Perform</source>
.
<year>1992</year>
;
<volume>18</volume>
:
<fpage>906</fpage>
<lpage>921</lpage>
.
<pub-id pub-id-type="pmid">1431754</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref051">
<label>51</label>
<mixed-citation publication-type="journal">
<name>
<surname>Gardner</surname>
<given-names>DL</given-names>
</name>
,
<name>
<surname>Mark</surname>
<given-names>LS</given-names>
</name>
,
<name>
<surname>Ward</surname>
<given-names>JA</given-names>
</name>
,
<name>
<surname>Edkins</surname>
<given-names>H</given-names>
</name>
.
<article-title>How do task characteristics affect the transitions between seated and standing reaches?</article-title>
<source>Ecol Psychol</source>
.
<year>2001</year>
;
<volume>13</volume>
:
<fpage>245</fpage>
<lpage>274</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref052">
<label>52</label>
<mixed-citation publication-type="journal">
<name>
<surname>Yonas</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Hartman</surname>
<given-names>B</given-names>
</name>
.
<article-title>Perceiving the affordance of contact in four and five-month-old infants</article-title>
.
<source>Child Dev</source>
.
<year>1993</year>
;
<volume>64</volume>
:
<fpage>298</fpage>
<lpage>308</lpage>
.
<pub-id pub-id-type="pmid">8436036</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref053">
<label>53</label>
<mixed-citation publication-type="journal">
<name>
<surname>Solomon</surname>
<given-names>HY</given-names>
</name>
,
<name>
<surname>Turvey</surname>
<given-names>MT</given-names>
</name>
.
<article-title>Haptically perceiving the distances reachable with hand-held objects</article-title>
.
<source>J Exp Psychol Hum Percept Perform</source>
.
<year>1988</year>
;
<volume>14</volume>
:
<fpage>404</fpage>
<lpage>427</lpage>
.
<pub-id pub-id-type="pmid">2971770</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref054">
<label>54</label>
<mixed-citation publication-type="journal">
<name>
<surname>Carello</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Fitzpatrick</surname>
<given-names>P</given-names>
</name>
,
<name>
<surname>Turvey</surname>
<given-names>MT</given-names>
</name>
.
<article-title>Haptic probing: Perceiving the length of a probe and the distance of a surface probed</article-title>
.
<source>Percept Psychophys</source>
.
<year>1992</year>
;
<volume>51</volume>
:
<fpage>580</fpage>
<lpage>598</lpage>
.
<pub-id pub-id-type="pmid">1620570</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref055">
<label>55</label>
<mixed-citation publication-type="journal">
<name>
<surname>Mantel</surname>
<given-names>B</given-names>
</name>
,
<name>
<surname>Bardy</surname>
<given-names>BG</given-names>
</name>
,
<name>
<surname>Stoffregen</surname>
<given-names>TA</given-names>
</name>
.
<article-title>Multimodal perception of reachability expressed through locomotion</article-title>
.
<source>Ecol Psychol</source>
.
<year>2010</year>
;
<volume>22</volume>
:
<fpage>192</fpage>
<lpage>211</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref056">
<label>56</label>
<mixed-citation publication-type="journal">
<name>
<surname>Smets</surname>
<given-names>GJ</given-names>
</name>
,
<name>
<surname>Overbeeke</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Stratmann</surname>
<given-names>M</given-names>
</name>
.
<article-title>Depth on a flat screen</article-title>
.
<source>Percept Mot Skills</source>
.
<year>1987</year>
;
<volume>64</volume>
:
<fpage>1023</fpage>
<lpage>1034</lpage>
.
<pub-id pub-id-type="pmid">3627907</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref057">
<label>57</label>
<mixed-citation publication-type="journal">
<name>
<surname>Wallace</surname>
<given-names>G</given-names>
</name>
.
<article-title>Visual scanning in the desert locust Schistocerca gregaria Forskål</article-title>
.
<source>J Exp Biol</source>
.
<year>1959</year>
;
<volume>36</volume>
:
<fpage>512</fpage>
<lpage>525</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref058">
<label>58</label>
<mixed-citation publication-type="journal">
<name>
<surname>Cornus</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Montagne</surname>
<given-names>G</given-names>
</name>
,
<name>
<surname>Laurent</surname>
<given-names>M</given-names>
</name>
.
<article-title>Perception of a stepping-across affordance</article-title>
.
<source>Ecol Psychol</source>
.
<year>1999</year>
;
<volume>11</volume>
:
<fpage>249</fpage>
<lpage>267</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref059">
<label>59</label>
<mixed-citation publication-type="book">
<name>
<surname>Falmagne</surname>
<given-names>J-C</given-names>
</name>
.
<chapter-title>Psychophysical measurement and theory</chapter-title>
In:
<name>
<surname>Boff</surname>
<given-names>KR</given-names>
</name>
,
<name>
<surname>Kaufman</surname>
<given-names>L</given-names>
</name>
,
<name>
<surname>Thomas</surname>
<given-names>JP</given-names>
</name>
, editors.
<source>Handbook of perception and human performance, Vol 1: Sensory processes and perception</source>
.
<publisher-loc>New York, NY, USA</publisher-loc>
:
<publisher-name>Wiley</publisher-name>
;
<year>1986</year>
pp.
<fpage>1.1</fpage>
<lpage>1.66</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref060">
<label>60</label>
<mixed-citation publication-type="journal">
<name>
<surname>Mark</surname>
<given-names>LS</given-names>
</name>
.
<article-title>Eyeheight-scaled information about affordances: A study of sitting and stair climbing</article-title>
.
<source>J Exp Psychol Hum Percept Perform</source>
.
<year>1987</year>
;
<volume>13</volume>
:
<fpage>361</fpage>
<lpage>370</lpage>
.
<pub-id pub-id-type="pmid">2958585</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref061">
<label>61</label>
<mixed-citation publication-type="journal">
<name>
<surname>Durgin</surname>
<given-names>FH</given-names>
</name>
,
<name>
<surname>Proffitt</surname>
<given-names>DR</given-names>
</name>
,
<name>
<surname>Olson</surname>
<given-names>TJ</given-names>
</name>
,
<name>
<surname>Reinke</surname>
<given-names>KS</given-names>
</name>
.
<article-title>Comparing depth from motion with depth from binocular disparity</article-title>
.
<source>J Exp Psychol Hum Percept Perform</source>
.
<year>1995</year>
;
<volume>21</volume>
:
<fpage>679</fpage>
<lpage>699</lpage>
.
<pub-id pub-id-type="pmid">7790841</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref062">
<label>62</label>
<mixed-citation publication-type="journal">
<name>
<surname>Treutwein</surname>
<given-names>B</given-names>
</name>
,
<name>
<surname>Strasburger</surname>
<given-names>H</given-names>
</name>
.
<article-title>Fitting the psychometric function</article-title>
.
<source>Percept Psychophys</source>
.
<year>1999</year>
;
<volume>61</volume>
:
<fpage>87</fpage>
<lpage>106</lpage>
.
<pub-id pub-id-type="pmid">10070202</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref063">
<label>63</label>
<mixed-citation publication-type="journal">
<name>
<surname>Wichmann</surname>
<given-names>FA</given-names>
</name>
,
<name>
<surname>Hill</surname>
<given-names>NJ</given-names>
</name>
.
<article-title>The psychometric function: I. Fitting, sampling, and goodness of fit</article-title>
.
<source>Percept Psychophys</source>
.
<year>2001</year>
;
<volume>63</volume>
:
<fpage>1293</fpage>
<lpage>1313</lpage>
.
<pub-id pub-id-type="pmid">11800458</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref064">
<label>64</label>
<mixed-citation publication-type="journal">
<name>
<surname>Wichmann</surname>
<given-names>FA</given-names>
</name>
,
<name>
<surname>Hill</surname>
<given-names>NJ</given-names>
</name>
.
<article-title>The psychometric function: II. Bootstrap-based confidence intervals and sampling</article-title>
.
<source>Percept Psychophys</source>
.
<year>2001</year>
;
<volume>63</volume>
:
<fpage>1314</fpage>
<lpage>1329</lpage>
.
<pub-id pub-id-type="pmid">11800459</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref065">
<label>65</label>
<mixed-citation publication-type="journal">
<name>
<surname>Bootsma</surname>
<given-names>RJ</given-names>
</name>
,
<name>
<surname>Bakker</surname>
<given-names>FC</given-names>
</name>
,
<name>
<surname>van Snippenberg</surname>
<given-names>FEJ</given-names>
</name>
,
<name>
<surname>Tdlohreg</surname>
<given-names>CW</given-names>
</name>
.
<article-title>The effects of anxiety on perceiving the reachability of passing objects</article-title>
.
<source>Ecol Psychol</source>
.
<year>1992</year>
;
<volume>4</volume>
:
<fpage>1</fpage>
<lpage>16</lpage>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref066">
<label>66</label>
<mixed-citation publication-type="book">
<name>
<surname>Fisher</surname>
<given-names>NI</given-names>
</name>
,
<name>
<surname>Lewis</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Embleton</surname>
<given-names>BJJ</given-names>
</name>
.
<source>Statistical analysis of spherical data [Internet]</source>
.
<publisher-loc>Cambridge, UK</publisher-loc>
:
<publisher-name>Cambridge University Press</publisher-name>
;
<year>1993</year>
Available:
<comment>doi:
<ext-link ext-link-type="uri" xlink:href="http://dx.doi.org/10.1017/CBO9780511623059">10.1017/CBO9780511623059</ext-link>
</comment>
</mixed-citation>
</ref>
<ref id="pone.0120025.ref067">
<label>67</label>
<mixed-citation publication-type="book">
<name>
<surname>Mardia</surname>
<given-names>KV</given-names>
</name>
,
<name>
<surname>Jupp</surname>
<given-names>PE</given-names>
</name>
.
<source>Directional statistics</source>
.
<publisher-loc>Chichester, UK</publisher-loc>
:
<publisher-name>John Wiley &amp; Sons</publisher-name>
;
<year>2000</year>
.</mixed-citation>
</ref>
<ref id="pone.0120025.ref068">
<label>68</label>
<mixed-citation publication-type="journal">
<name>
<surname>Cliff</surname>
<given-names>N</given-names>
</name>
.
<article-title>Dominance statistics: Ordinal analyses to answer ordinal questions</article-title>
.
<source>Psychol Bull</source>
.
<year>1993</year>
;
<volume>114</volume>
:
<fpage>494</fpage>
<lpage>509</lpage>
.</mixed-citation>
</ref>
</ref-list>
</back>
</pmc>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Pmc/Curation
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000284 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd -nk 000284 | SxmlIndent | more
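
For repeated lookups, the extraction above can be wrapped in a small shell function. This is a minimal sketch, not part of Dilib itself: the function name showRecord and its key argument are illustrative, and only the HfdSelect -h/-nk flags and SxmlIndent come from the commands shown above.

# Minimal sketch: reusable wrapper around the extraction shown above.
# showRecord and "$1" are illustrative names; the HfdSelect/SxmlIndent
# invocation is copied verbatim from this page.
showRecord () {
    HfdSelect -h "$EXPLOR_STEP/biblio.hfd" -nk "$1" | SxmlIndent | more
}

showRecord 000284   # internal identifier of this record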

To put a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Pmc
   |étape=   Curation
   |type=    RBID
   |clé=     PMC:4391914
   |texte=   Exploratory Movement Generates Higher-Order Information That Is Sufficient for Accurate Perception of Scaled Egocentric Distance
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Curation/RBID.i   -Sk "pubmed:25856410" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 
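
The same pipeline can be applied to several records in one pass. A minimal sketch, assuming each record is indexed in RBID.i under the same "pubmed:" prefix; the loop and the variable name pmid are illustrative, and only the PubMed identifier of this record (25856410) is used here.

# Minimal sketch: run the wiki-generation pipeline above for a list
# of PubMed identifiers (extend the list as needed).
for pmid in 25856410 ; do
    HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Curation/RBID.i -Sk "pubmed:$pmid" \
        | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd \
        | NlmPubMed2Wicri -a HapticV1
done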

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024