Exploration server on haptic devices

Beaming into the Rat World: Enabling Real-Time Interaction between Rat and Human Each at Their Own Scale

Internal identifier: 002318 (Ncbi/Merge); previous: 002317; next: 002319

Authors: Jean-Marie Normand [Spain]; Maria V. Sanchez-Vives [Spain]; Christian Waechter [Germany]; Elias Giannopoulos [Spain]; Bernhard Grosswindhager [Austria]; Bernhard Spanlang [Spain]; Christoph Guger [Austria]; Gudrun Klinker [Germany]; Mandayam A. Srinivasan [United States, United Kingdom]; Mel Slater [Spain, United Kingdom]

Source:

RBID: PMC:3485138

Abstract

Immersive virtual reality (IVR) typically generates the illusion in participants that they are in the displayed virtual scene where they can experience and interact in events as if they were really happening. Teleoperator (TO) systems place people at a remote physical destination embodied as a robotic device, and where typically participants have the sensation of being at the destination, with the ability to interact with entities there. In this paper, we show how to combine IVR and TO to allow a new class of application. The participant in the IVR is represented in the destination by a physical robot (TO) and simultaneously the remote place and entities within it are represented to the participant in the IVR. Hence, the IVR participant has a normal virtual reality experience, but where his or her actions and behaviour control the remote robot and can therefore have physical consequences. Here, we show how such a system can be deployed to allow a human and a rat to operate together, but the human interacting with the rat on a human scale, and the rat interacting with the human on the rat scale. The human is represented in a rat arena by a small robot that is slaved to the human’s movements, whereas the tracked rat is represented to the human in the virtual reality by a humanoid avatar. We describe the system and also a study that was designed to test whether humans can successfully play a game with the rat. The results show that the system functioned well and that the humans were able to interact with the rat to fulfil the tasks of the game. This system opens up the possibility of new applications in the life sciences involving participant observation of and interaction with animals but at human scale.


URL:
DOI: 10.1371/journal.pone.0048331
PubMed: 23118987
PubMed Central: 3485138

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Beaming into the Rat World: Enabling Real-Time Interaction between Rat and Human Each at Their Own Scale</title>
<author>
<name sortKey="Normand, Jean Marie" sort="Normand, Jean Marie" uniqKey="Normand J" first="Jean-Marie" last="Normand">Jean-Marie Normand</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<addr-line>EVENT Lab, Faculty of Psychology, University of Barcelona, Spain</addr-line>
</nlm:aff>
<country xml:lang="fr">Espagne</country>
<wicri:regionArea>EVENT Lab, Faculty of Psychology, University of Barcelona</wicri:regionArea>
<wicri:noRegion>University of Barcelona</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Sanchez Vives, Maria V" sort="Sanchez Vives, Maria V" uniqKey="Sanchez Vives M" first="Maria V." last="Sanchez-Vives">Maria V. Sanchez-Vives</name>
<affiliation wicri:level="3">
<nlm:aff id="aff2">
<addr-line>Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain</addr-line>
</nlm:aff>
<country xml:lang="fr">Espagne</country>
<wicri:regionArea>Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona</wicri:regionArea>
<placeName>
<settlement type="city">Barcelone</settlement>
<region nuts="2" type="region">Catalogne</region>
</placeName>
</affiliation>
<affiliation wicri:level="3">
<nlm:aff id="aff3">
<addr-line>Institut d’Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain</addr-line>
</nlm:aff>
<country xml:lang="fr">Espagne</country>
<wicri:regionArea>Institut d’Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona</wicri:regionArea>
<placeName>
<settlement type="city">Barcelone</settlement>
<region nuts="2" type="region">Catalogne</region>
</placeName>
</affiliation>
</author>
<author>
<name sortKey="Waechter, Christian" sort="Waechter, Christian" uniqKey="Waechter C" first="Christian" last="Waechter">Christian Waechter</name>
<affiliation wicri:level="4">
<nlm:aff id="aff4">
<addr-line>Fachbereich Informatik, Technische Universität München, Munich, Germany</addr-line>
</nlm:aff>
<country xml:lang="fr">Allemagne</country>
<wicri:regionArea>Fachbereich Informatik, Technische Universität München, Munich</wicri:regionArea>
<placeName>
<region type="land" nuts="1">Bavière</region>
<region type="district" nuts="2">District de Haute-Bavière</region>
<settlement type="city">Munich</settlement>
</placeName>
<orgName type="university">Université technique de Munich</orgName>
</affiliation>
</author>
<author>
<name sortKey="Giannopoulos, Elias" sort="Giannopoulos, Elias" uniqKey="Giannopoulos E" first="Elias" last="Giannopoulos">Elias Giannopoulos</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<addr-line>EVENT Lab, Faculty of Psychology, University of Barcelona, Spain</addr-line>
</nlm:aff>
<country xml:lang="fr">Espagne</country>
<wicri:regionArea>EVENT Lab, Faculty of Psychology, University of Barcelona</wicri:regionArea>
<wicri:noRegion>University of Barcelona</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Grosswindhager, Bernhard" sort="Grosswindhager, Bernhard" uniqKey="Grosswindhager B" first="Bernhard" last="Grosswindhager">Bernhard Grosswindhager</name>
<affiliation wicri:level="1">
<nlm:aff id="aff5">
<addr-line>Guger Technologies (g.tec), Schiedlberg, Austria</addr-line>
</nlm:aff>
<country xml:lang="fr">Autriche</country>
<wicri:regionArea>Guger Technologies (g.tec), Schiedlberg</wicri:regionArea>
<wicri:noRegion>Schiedlberg</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Spanlang, Bernhard" sort="Spanlang, Bernhard" uniqKey="Spanlang B" first="Bernhard" last="Spanlang">Bernhard Spanlang</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<addr-line>EVENT Lab, Faculty of Psychology, University of Barcelona, Spain</addr-line>
</nlm:aff>
<country xml:lang="fr">Espagne</country>
<wicri:regionArea>EVENT Lab, Faculty of Psychology, University of Barcelona</wicri:regionArea>
<wicri:noRegion>University of Barcelona</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Guger, Christoph" sort="Guger, Christoph" uniqKey="Guger C" first="Christoph" last="Guger">Christoph Guger</name>
<affiliation wicri:level="1">
<nlm:aff id="aff5">
<addr-line>Guger Technologies (g.tec), Schiedlberg, Austria</addr-line>
</nlm:aff>
<country xml:lang="fr">Autriche</country>
<wicri:regionArea>Guger Technologies (g.tec), Schiedlberg</wicri:regionArea>
<wicri:noRegion>Schiedlberg</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Klinker, Gudrun" sort="Klinker, Gudrun" uniqKey="Klinker G" first="Gudrun" last="Klinker">Gudrun Klinker</name>
<affiliation wicri:level="4">
<nlm:aff id="aff4">
<addr-line>Fachbereich Informatik, Technische Universität München, Munich, Germany</addr-line>
</nlm:aff>
<country xml:lang="fr">Allemagne</country>
<wicri:regionArea>Fachbereich Informatik, Technische Universität München, Munich</wicri:regionArea>
<placeName>
<region type="land" nuts="1">Bavière</region>
<region type="district" nuts="2">District de Haute-Bavière</region>
<settlement type="city">Munich</settlement>
</placeName>
<orgName type="university">Université technique de Munich</orgName>
</affiliation>
</author>
<author>
<name sortKey="Srinivasan, Mandayam A" sort="Srinivasan, Mandayam A" uniqKey="Srinivasan M" first="Mandayam A." last="Srinivasan">Mandayam A. Srinivasan</name>
<affiliation wicri:level="2">
<nlm:aff id="aff6">
<addr-line>The Touch Lab, Research Laboratory of Electronics and Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts, United States of America</addr-line>
</nlm:aff>
<country xml:lang="fr">États-Unis</country>
<wicri:regionArea>The Touch Lab, Research Laboratory of Electronics and Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts</wicri:regionArea>
<placeName>
<region type="state">Massachusetts</region>
</placeName>
</affiliation>
<affiliation wicri:level="3">
<nlm:aff id="aff7">
<addr-line>Department of Computer Science, University College London, London, United Kingdom</addr-line>
</nlm:aff>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea>Department of Computer Science, University College London, London</wicri:regionArea>
<placeName>
<settlement type="city">Londres</settlement>
<region type="country">Angleterre</region>
<region type="région" nuts="1">Grand Londres</region>
</placeName>
</affiliation>
</author>
<author>
<name sortKey="Slater, Mel" sort="Slater, Mel" uniqKey="Slater M" first="Mel" last="Slater">Mel Slater</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<addr-line>EVENT Lab, Faculty of Psychology, University of Barcelona, Spain</addr-line>
</nlm:aff>
<country xml:lang="fr">Espagne</country>
<wicri:regionArea>EVENT Lab, Faculty of Psychology, University of Barcelona</wicri:regionArea>
<wicri:noRegion>University of Barcelona</wicri:noRegion>
</affiliation>
<affiliation wicri:level="3">
<nlm:aff id="aff2">
<addr-line>Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain</addr-line>
</nlm:aff>
<country xml:lang="fr">Espagne</country>
<wicri:regionArea>Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona</wicri:regionArea>
<placeName>
<settlement type="city">Barcelone</settlement>
<region nuts="2" type="region">Catalogne</region>
</placeName>
</affiliation>
<affiliation wicri:level="3">
<nlm:aff id="aff7">
<addr-line>Department of Computer Science, University College London, London, United Kingdom</addr-line>
</nlm:aff>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea>Department of Computer Science, University College London, London</wicri:regionArea>
<placeName>
<settlement type="city">Londres</settlement>
<region type="country">Angleterre</region>
<region type="région" nuts="1">Grand Londres</region>
</placeName>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">23118987</idno>
<idno type="pmc">3485138</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3485138</idno>
<idno type="RBID">PMC:3485138</idno>
<idno type="doi">10.1371/journal.pone.0048331</idno>
<date when="2012">2012</date>
<idno type="wicri:Area/Pmc/Corpus">002231</idno>
<idno type="wicri:Area/Pmc/Curation">002231</idno>
<idno type="wicri:Area/Pmc/Checkpoint">001842</idno>
<idno type="wicri:Area/Ncbi/Merge">002318</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Beaming into the Rat World: Enabling Real-Time Interaction between Rat and Human Each at Their Own Scale</title>
<author>
<name sortKey="Normand, Jean Marie" sort="Normand, Jean Marie" uniqKey="Normand J" first="Jean-Marie" last="Normand">Jean-Marie Normand</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<addr-line>EVENT Lab, Faculty of Psychology, University of Barcelona, Spain</addr-line>
</nlm:aff>
<country xml:lang="fr">Espagne</country>
<wicri:regionArea>EVENT Lab, Faculty of Psychology, University of Barcelona</wicri:regionArea>
<wicri:noRegion>University of Barcelona</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Sanchez Vives, Maria V" sort="Sanchez Vives, Maria V" uniqKey="Sanchez Vives M" first="Maria V." last="Sanchez-Vives">Maria V. Sanchez-Vives</name>
<affiliation wicri:level="3">
<nlm:aff id="aff2">
<addr-line>Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain</addr-line>
</nlm:aff>
<country xml:lang="fr">Espagne</country>
<wicri:regionArea>Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona</wicri:regionArea>
<placeName>
<settlement type="city">Barcelone</settlement>
<region nuts="2" type="region">Catalogne</region>
</placeName>
</affiliation>
<affiliation wicri:level="3">
<nlm:aff id="aff3">
<addr-line>Institut d’Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain</addr-line>
</nlm:aff>
<country xml:lang="fr">Espagne</country>
<wicri:regionArea>Institut d’Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona</wicri:regionArea>
<placeName>
<settlement type="city">Barcelone</settlement>
<region nuts="2" type="region">Catalogne</region>
</placeName>
</affiliation>
</author>
<author>
<name sortKey="Waechter, Christian" sort="Waechter, Christian" uniqKey="Waechter C" first="Christian" last="Waechter">Christian Waechter</name>
<affiliation wicri:level="4">
<nlm:aff id="aff4">
<addr-line>Fachbereich Informatik, Technische Universität München, Munich, Germany</addr-line>
</nlm:aff>
<country xml:lang="fr">Allemagne</country>
<wicri:regionArea>Fachbereich Informatik, Technische Universität München, Munich</wicri:regionArea>
<placeName>
<region type="land" nuts="1">Bavière</region>
<region type="district" nuts="2">District de Haute-Bavière</region>
<settlement type="city">Munich</settlement>
</placeName>
<orgName type="university">Université technique de Munich</orgName>
</affiliation>
</author>
<author>
<name sortKey="Giannopoulos, Elias" sort="Giannopoulos, Elias" uniqKey="Giannopoulos E" first="Elias" last="Giannopoulos">Elias Giannopoulos</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<addr-line>EVENT Lab, Faculty of Psychology, University of Barcelona, Spain</addr-line>
</nlm:aff>
<country xml:lang="fr">Espagne</country>
<wicri:regionArea>EVENT Lab, Faculty of Psychology, University of Barcelona</wicri:regionArea>
<wicri:noRegion>University of Barcelona</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Grosswindhager, Bernhard" sort="Grosswindhager, Bernhard" uniqKey="Grosswindhager B" first="Bernhard" last="Grosswindhager">Bernhard Grosswindhager</name>
<affiliation wicri:level="1">
<nlm:aff id="aff5">
<addr-line>Guger Technologies (g.tec), Schiedlberg, Austria</addr-line>
</nlm:aff>
<country xml:lang="fr">Autriche</country>
<wicri:regionArea>Guger Technologies (g.tec), Schiedlberg</wicri:regionArea>
<wicri:noRegion>Schiedlberg</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Spanlang, Bernhard" sort="Spanlang, Bernhard" uniqKey="Spanlang B" first="Bernhard" last="Spanlang">Bernhard Spanlang</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<addr-line>EVENT Lab, Faculty of Psychology, University of Barcelona, Spain</addr-line>
</nlm:aff>
<country xml:lang="fr">Espagne</country>
<wicri:regionArea>EVENT Lab, Faculty of Psychology, University of Barcelona</wicri:regionArea>
<wicri:noRegion>University of Barcelona</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Guger, Christoph" sort="Guger, Christoph" uniqKey="Guger C" first="Christoph" last="Guger">Christoph Guger</name>
<affiliation wicri:level="1">
<nlm:aff id="aff5">
<addr-line>Guger Technologies (g.tec), Schiedlberg, Austria</addr-line>
</nlm:aff>
<country xml:lang="fr">Autriche</country>
<wicri:regionArea>Guger Technologies (g.tec), Schiedlberg</wicri:regionArea>
<wicri:noRegion>Schiedlberg</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Klinker, Gudrun" sort="Klinker, Gudrun" uniqKey="Klinker G" first="Gudrun" last="Klinker">Gudrun Klinker</name>
<affiliation wicri:level="4">
<nlm:aff id="aff4">
<addr-line>Fachbereich Informatik, Technische Universität München, Munich, Germany</addr-line>
</nlm:aff>
<country xml:lang="fr">Allemagne</country>
<wicri:regionArea>Fachbereich Informatik, Technische Universität München, Munich</wicri:regionArea>
<placeName>
<region type="land" nuts="1">Bavière</region>
<region type="district" nuts="2">District de Haute-Bavière</region>
<settlement type="city">Munich</settlement>
</placeName>
<orgName type="university">Université technique de Munich</orgName>
</affiliation>
</author>
<author>
<name sortKey="Srinivasan, Mandayam A" sort="Srinivasan, Mandayam A" uniqKey="Srinivasan M" first="Mandayam A." last="Srinivasan">Mandayam A. Srinivasan</name>
<affiliation wicri:level="2">
<nlm:aff id="aff6">
<addr-line>The Touch Lab, Research Laboratory of Electronics and Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts, United States of America</addr-line>
</nlm:aff>
<country xml:lang="fr">États-Unis</country>
<wicri:regionArea>The Touch Lab, Research Laboratory of Electronics and Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts</wicri:regionArea>
<placeName>
<region type="state">Massachusetts</region>
</placeName>
</affiliation>
<affiliation wicri:level="3">
<nlm:aff id="aff7">
<addr-line>Department of Computer Science, University College London, London, United Kingdom</addr-line>
</nlm:aff>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea>Department of Computer Science, University College London, London</wicri:regionArea>
<placeName>
<settlement type="city">Londres</settlement>
<region type="country">Angleterre</region>
<region type="région" nuts="1">Grand Londres</region>
</placeName>
</affiliation>
</author>
<author>
<name sortKey="Slater, Mel" sort="Slater, Mel" uniqKey="Slater M" first="Mel" last="Slater">Mel Slater</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">
<addr-line>EVENT Lab, Faculty of Psychology, University of Barcelona, Spain</addr-line>
</nlm:aff>
<country xml:lang="fr">Espagne</country>
<wicri:regionArea>EVENT Lab, Faculty of Psychology, University of Barcelona</wicri:regionArea>
<wicri:noRegion>University of Barcelona</wicri:noRegion>
</affiliation>
<affiliation wicri:level="3">
<nlm:aff id="aff2">
<addr-line>Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain</addr-line>
</nlm:aff>
<country xml:lang="fr">Espagne</country>
<wicri:regionArea>Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona</wicri:regionArea>
<placeName>
<settlement type="city">Barcelone</settlement>
<region nuts="2" type="region">Catalogne</region>
</placeName>
</affiliation>
<affiliation wicri:level="3">
<nlm:aff id="aff7">
<addr-line>Department of Computer Science, University College London, London, United Kingdom</addr-line>
</nlm:aff>
<country xml:lang="fr">Royaume-Uni</country>
<wicri:regionArea>Department of Computer Science, University College London, London</wicri:regionArea>
<placeName>
<settlement type="city">Londres</settlement>
<region type="country">Angleterre</region>
<region type="région" nuts="1">Grand Londres</region>
</placeName>
</affiliation>
</author>
</analytic>
<series>
<title level="j">PLoS ONE</title>
<idno type="eISSN">1932-6203</idno>
<imprint>
<date when="2012">2012</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>Immersive virtual reality (IVR) typically generates the illusion in participants that they are in the displayed virtual scene where they can experience and interact in events as if they were really happening. Teleoperator (TO) systems place people at a remote physical destination embodied as a robotic device, and where typically participants have the sensation of being at the destination, with the ability to interact with entities there. In this paper, we show how to combine IVR and TO to allow a new class of application. The participant in the IVR is represented in the destination by a physical robot (TO) and simultaneously the remote place and entities within it are represented to the participant in the IVR. Hence, the IVR participant has a normal virtual reality experience, but where his or her actions and behaviour control the remote robot and can therefore have physical consequences. Here, we show how such a system can be deployed to allow a human and a rat to operate together, but the human interacting with the rat on a human scale, and the rat interacting with the human on the rat scale. The human is represented in a rat arena by a small robot that is slaved to the human’s movements, whereas the tracked rat is represented to the human in the virtual reality by a humanoid avatar. We describe the system and also a study that was designed to test whether humans can successfully play a game with the rat. The results show that the system functioned well and that the humans were able to interact with the rat to fulfil the tasks of the game. This system opens up the possibility of new applications in the life sciences involving participant observation of and interaction with animals but at human scale.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Brooks Jr, F" uniqKey="Brooks Jr F">F Brooks Jr</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Rizzo, A" uniqKey="Rizzo A">A Rizzo</name>
</author>
<author>
<name sortKey="Kim, G" uniqKey="Kim G">G Kim</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Slater, M" uniqKey="Slater M">M Slater</name>
</author>
<author>
<name sortKey="Antley, A" uniqKey="Antley A">A Antley</name>
</author>
<author>
<name sortKey="Davison, A" uniqKey="Davison A">A Davison</name>
</author>
<author>
<name sortKey="Swapp, D" uniqKey="Swapp D">D Swapp</name>
</author>
<author>
<name sortKey="Guger, C" uniqKey="Guger C">C Guger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Rovira, A" uniqKey="Rovira A">A Rovira</name>
</author>
<author>
<name sortKey="Swapp, D" uniqKey="Swapp D">D Swapp</name>
</author>
<author>
<name sortKey="Spanlang, B" uniqKey="Spanlang B">B Spanlang</name>
</author>
<author>
<name sortKey="Slater, M" uniqKey="Slater M">M Slater</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Prabhat, A" uniqKey="Prabhat A">A Prabhat</name>
</author>
<author>
<name sortKey="Katzourin, M" uniqKey="Katzourin M">M Katzourin</name>
</author>
<author>
<name sortKey="Wharton, K" uniqKey="Wharton K">K Wharton</name>
</author>
<author>
<name sortKey="Slater, M" uniqKey="Slater M">M Slater</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ferreira, A" uniqKey="Ferreira A">A Ferreira</name>
</author>
<author>
<name sortKey="Mavroidis, C" uniqKey="Mavroidis C">C Mavroidis</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Zhao, Q" uniqKey="Zhao Q">Q Zhao</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sanchez Vives, Mv" uniqKey="Sanchez Vives M">MV Sanchez-Vives</name>
</author>
<author>
<name sortKey="Slater, M" uniqKey="Slater M">M Slater</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mondada, F" uniqKey="Mondada F">F Mondada</name>
</author>
<author>
<name sortKey="Bonani, M" uniqKey="Bonani M">M Bonani</name>
</author>
<author>
<name sortKey="Raemy, X" uniqKey="Raemy X">X Raemy</name>
</author>
<author>
<name sortKey="Pugh, J" uniqKey="Pugh J">J Pugh</name>
</author>
<author>
<name sortKey="Cianci, C" uniqKey="Cianci C">C Cianci</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Tecchia, F" uniqKey="Tecchia F">F Tecchia</name>
</author>
<author>
<name sortKey="Carrozzino, M" uniqKey="Carrozzino M">M Carrozzino</name>
</author>
<author>
<name sortKey="Bacinelli, S" uniqKey="Bacinelli S">S Bacinelli</name>
</author>
<author>
<name sortKey="Rossi, F" uniqKey="Rossi F">F Rossi</name>
</author>
<author>
<name sortKey="Vercelli, D" uniqKey="Vercelli D">D Vercelli</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gillies, M" uniqKey="Gillies M">M Gillies</name>
</author>
<author>
<name sortKey="Spanlang, B" uniqKey="Spanlang B">B Spanlang</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Pustka, D" uniqKey="Pustka D">D Pustka</name>
</author>
<author>
<name sortKey="Huber, M" uniqKey="Huber M">M Huber</name>
</author>
<author>
<name sortKey="Waechter, C" uniqKey="Waechter C">C Waechter</name>
</author>
<author>
<name sortKey="Echtler, F" uniqKey="Echtler F">F Echtler</name>
</author>
<author>
<name sortKey="Keitler, P" uniqKey="Keitler P">P Keitler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Valle, Fp" uniqKey="Valle F">FP Valle</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Brudzynski, Sm" uniqKey="Brudzynski S">SM Brudzynski</name>
</author>
<author>
<name sortKey="Krol, S" uniqKey="Krol S">S Krol</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Alger, Jm" uniqKey="Alger J">JM Alger</name>
</author>
<author>
<name sortKey="Alger, Sf" uniqKey="Alger S">SF Alger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Brandt, K" uniqKey="Brandt K">K Brandt</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Takanishi, A" uniqKey="Takanishi A">A Takanishi</name>
</author>
<author>
<name sortKey="Aoki, T" uniqKey="Aoki T">T Aoki</name>
</author>
<author>
<name sortKey="Ito, M" uniqKey="Ito M">M Ito</name>
</author>
<author>
<name sortKey="Ohkawa, Y" uniqKey="Ohkawa Y">Y Ohkawa</name>
</author>
<author>
<name sortKey="Yamaguchi, J" uniqKey="Yamaguchi J">J Yamaguchi</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Wood, Rj" uniqKey="Wood R">RJ Wood</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bluff, La" uniqKey="Bluff L">LA Bluff</name>
</author>
<author>
<name sortKey="Rutz, C" uniqKey="Rutz C">C Rutz</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Talwar, Sk" uniqKey="Talwar S">SK Talwar</name>
</author>
<author>
<name sortKey="Xu, S" uniqKey="Xu S">S Xu</name>
</author>
<author>
<name sortKey="Hawley, Es" uniqKey="Hawley E">ES Hawley</name>
</author>
<author>
<name sortKey="Weiss, Sa" uniqKey="Weiss S">SA Weiss</name>
</author>
<author>
<name sortKey="Moxon, Ka" uniqKey="Moxon K">KA Moxon</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Benford, S" uniqKey="Benford S">S Benford</name>
</author>
<author>
<name sortKey="Greenhalgh, C" uniqKey="Greenhalgh C">C Greenhalgh</name>
</author>
<author>
<name sortKey="Rodden, T" uniqKey="Rodden T">T Rodden</name>
</author>
<author>
<name sortKey="Pycock, J" uniqKey="Pycock J">J Pycock</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kim, J" uniqKey="Kim J">J Kim</name>
</author>
<author>
<name sortKey="Kim, H" uniqKey="Kim H">H Kim</name>
</author>
<author>
<name sortKey="Tay, Bk" uniqKey="Tay B">BK Tay</name>
</author>
<author>
<name sortKey="Muniyandi, M" uniqKey="Muniyandi M">M Muniyandi</name>
</author>
<author>
<name sortKey="Srinivasan, Ma" uniqKey="Srinivasan M">MA Srinivasan</name>
</author>
</analytic>
</biblStruct>
<biblStruct></biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lincoln, P" uniqKey="Lincoln P">P Lincoln</name>
</author>
<author>
<name sortKey="Welch, G" uniqKey="Welch G">G Welch</name>
</author>
<author>
<name sortKey="Nashel, A" uniqKey="Nashel A">A Nashel</name>
</author>
<author>
<name sortKey="Ilie, A" uniqKey="Ilie A">A Ilie</name>
</author>
<author>
<name sortKey="Fuchs, H" uniqKey="Fuchs H">H Fuchs</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Perez Marcos, D" uniqKey="Perez Marcos D">D Perez-Marcos</name>
</author>
<author>
<name sortKey="Solazzi, M" uniqKey="Solazzi M">M Solazzi</name>
</author>
<author>
<name sortKey="Steptoe, W" uniqKey="Steptoe W">W Steptoe</name>
</author>
<author>
<name sortKey="Oyekoya, O" uniqKey="Oyekoya O">O Oyekoya</name>
</author>
<author>
<name sortKey="Frisoli, A" uniqKey="Frisoli A">A Frisoli</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">PLoS One</journal-id>
<journal-id journal-id-type="iso-abbrev">PLoS ONE</journal-id>
<journal-id journal-id-type="publisher-id">plos</journal-id>
<journal-id journal-id-type="pmc">plosone</journal-id>
<journal-title-group>
<journal-title>PLoS ONE</journal-title>
</journal-title-group>
<issn pub-type="epub">1932-6203</issn>
<publisher>
<publisher-name>Public Library of Science</publisher-name>
<publisher-loc>San Francisco, USA</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">23118987</article-id>
<article-id pub-id-type="pmc">3485138</article-id>
<article-id pub-id-type="publisher-id">PONE-D-12-17877</article-id>
<article-id pub-id-type="doi">10.1371/journal.pone.0048331</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Research Article</subject>
</subj-group>
<subj-group subj-group-type="Discipline-v2">
<subject>Engineering</subject>
<subj-group>
<subject>Human Factors Engineering</subject>
<subj-group>
<subject>Man Computer Interface</subject>
<subj-group>
<subject>Virtual Reality</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group>
<subject>Mechanical Engineering</subject>
<subj-group>
<subject>Robotics</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group subj-group-type="Discipline-v2">
<subject>Biology</subject>
<subj-group>
<subject>Anatomy and Physiology</subject>
<subj-group>
<subject>Musculoskeletal System</subject>
<subj-group>
<subject>Robotics</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group>
<subject>Bioethics</subject>
<subj-group>
<subject>Animal Studies</subject>
</subj-group>
</subj-group>
<subj-group>
<subject>Computational Biology</subject>
<subj-group>
<subject>Computational Neuroscience</subject>
</subj-group>
</subj-group>
<subj-group>
<subject>Ecology</subject>
<subj-group>
<subject>Community Ecology</subject>
<subj-group>
<subject>Species Interactions</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group>
<subject>Evolutionary Biology</subject>
<subj-group>
<subject>Animal Behavior</subject>
</subj-group>
</subj-group>
<subj-group>
<subject>Neuroscience</subject>
<subj-group>
<subject>Animal Cognition</subject>
<subject>Behavioral Neuroscience</subject>
<subject>Computational Neuroscience</subject>
</subj-group>
</subj-group>
<subj-group>
<subject>Model Organisms</subject>
<subj-group>
<subject>Animal Models</subject>
<subj-group>
<subject>Rat</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group>
<subject>Zoology</subject>
<subj-group>
<subject>Animal Behavior</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group subj-group-type="Discipline-v2">
<subject>Science Policy</subject>
<subj-group>
<subject>Bioethics</subject>
<subj-group>
<subject>Animal Studies</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group subj-group-type="Discipline-v2">
<subject>Computer Science</subject>
<subj-group>
<subject>Computer Applications</subject>
</subj-group>
<subj-group>
<subject>Computer Modeling</subject>
</subj-group>
<subj-group>
<subject>Computerized Simulations</subject>
</subj-group>
<subj-group>
<subject>Computing Methods</subject>
<subj-group>
<subject>Computer Animation</subject>
<subject>Computer Graphics</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group subj-group-type="Discipline-v2">
<subject>Veterinary Science</subject>
<subj-group>
<subject>Animal Management</subject>
<subj-group>
<subject>Animal Behavior</subject>
</subj-group>
</subj-group>
<subj-group>
<subject>Animal Types</subject>
<subj-group>
<subject>Laboratory Animals</subject>
</subj-group>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Beaming into the Rat World: Enabling Real-Time Interaction between Rat and Human Each at Their Own Scale</article-title>
<alt-title alt-title-type="running-head">Beaming into the Rat World</alt-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Normand</surname>
<given-names>Jean-Marie</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Sanchez-Vives</surname>
<given-names>Maria V.</given-names>
</name>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
<xref ref-type="aff" rid="aff3">
<sup>3</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Waechter</surname>
<given-names>Christian</given-names>
</name>
<xref ref-type="aff" rid="aff4">
<sup>4</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Giannopoulos</surname>
<given-names>Elias</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Grosswindhager</surname>
<given-names>Bernhard</given-names>
</name>
<xref ref-type="aff" rid="aff5">
<sup>5</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Spanlang</surname>
<given-names>Bernhard</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Guger</surname>
<given-names>Christoph</given-names>
</name>
<xref ref-type="aff" rid="aff5">
<sup>5</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Klinker</surname>
<given-names>Gudrun</given-names>
</name>
<xref ref-type="aff" rid="aff4">
<sup>4</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Srinivasan</surname>
<given-names>Mandayam A.</given-names>
</name>
<xref ref-type="aff" rid="aff6">
<sup>6</sup>
</xref>
<xref ref-type="aff" rid="aff7">
<sup>7</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Slater</surname>
<given-names>Mel</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
<xref ref-type="aff" rid="aff7">
<sup>7</sup>
</xref>
<xref ref-type="corresp" rid="cor1">
<sup>*</sup>
</xref>
</contrib>
</contrib-group>
<aff id="aff1">
<label>1</label>
<addr-line>EVENT Lab, Faculty of Psychology, University of Barcelona, Spain</addr-line>
</aff>
<aff id="aff2">
<label>2</label>
<addr-line>Institució Catalana de Recerca i Estudis Avançats (ICREA), Barcelona, Spain</addr-line>
</aff>
<aff id="aff3">
<label>3</label>
<addr-line>Institut d’Investigacions Biomèdiques August Pi i Sunyer (IDIBAPS), Barcelona, Spain</addr-line>
</aff>
<aff id="aff4">
<label>4</label>
<addr-line>Fachbereich Informatik, Technische Universität München, Munich, Germany</addr-line>
</aff>
<aff id="aff5">
<label>5</label>
<addr-line>Guger Technologies (g.tec), Schiedlberg, Austria</addr-line>
</aff>
<aff id="aff6">
<label>6</label>
<addr-line>The Touch Lab, Research Laboratory of Electronics and Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts, United States of America</addr-line>
</aff>
<aff id="aff7">
<label>7</label>
<addr-line>Department of Computer Science, University College London, London, United Kingdom</addr-line>
</aff>
<contrib-group>
<contrib contrib-type="editor">
<name>
<surname>de Polavieja</surname>
<given-names>Gonzalo G.</given-names>
</name>
<role>Editor</role>
<xref ref-type="aff" rid="edit1"></xref>
</contrib>
</contrib-group>
<aff id="edit1">
<addr-line>Cajal Institute, Consejo Superior de Investigaciones Científicas, Spain</addr-line>
</aff>
<author-notes>
<corresp id="cor1">* E-mail:
<email>melslater@ub.edu</email>
</corresp>
<fn fn-type="conflict">
<p>
<bold>Competing Interests: </bold>
The authors in the paper who are employed by the company Guger Technologies are Bernhard Grosswindhager and Christoph Guger. The main business of that company is brain-computer interfaces (
<ext-link ext-link-type="uri" xlink:href="http://www.gtec.at">www.gtec.at</ext-link>
). In the work described in this paper, these authors were responsible for implementing the robot controller. Dr Guger, the director of Guger Technologies, has sent the corresponding author an email stating that there is “no conflict of interest with the publication as it was done for research purposes.” There is a small commercial relationship between Guger Technologies and the University of Barcelona (UB). UB licenses to Guger Technologies a system that controls a virtual character that can be moved by the company’s brain-computer interface system. This has nothing to do with the work described in the present paper. Taking into account all of the above this does not alter the authors' adherence to all the PLOS ONE policies on sharing data and materials.</p>
</fn>
<fn fn-type="con">
<p>Conceived and designed the experiments: MS MVSV MAS. Performed the experiments: J-MN EG. Analyzed the data: MS. Wrote the paper: MS MVSV J-MN GK CG MAS. Programming the environment: J-MN CW BG BS.</p>
</fn>
</author-notes>
<pub-date pub-type="collection">
<year>2012</year>
</pub-date>
<pub-date pub-type="epub">
<day>31</day>
<month>10</month>
<year>2012</year>
</pub-date>
<volume>7</volume>
<issue>10</issue>
<elocation-id>e48331</elocation-id>
<history>
<date date-type="received">
<day>15</day>
<month>6</month>
<year>2012</year>
</date>
<date date-type="accepted">
<day>24</day>
<month>9</month>
<year>2012</year>
</date>
</history>
<permissions>
<copyright-year>2012</copyright-year>
<copyright-holder>Normand et al</copyright-holder>
<license>
<license-p>This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.</license-p>
</license>
</permissions>
<abstract>
<p>Immersive virtual reality (IVR) typically generates the illusion in participants that they are in the displayed virtual scene where they can experience and interact in events as if they were really happening. Teleoperator (TO) systems place people at a remote physical destination embodied as a robotic device, and where typically participants have the sensation of being at the destination, with the ability to interact with entities there. In this paper, we show how to combine IVR and TO to allow a new class of application. The participant in the IVR is represented in the destination by a physical robot (TO) and simultaneously the remote place and entities within it are represented to the participant in the IVR. Hence, the IVR participant has a normal virtual reality experience, but where his or her actions and behaviour control the remote robot and can therefore have physical consequences. Here, we show how such a system can be deployed to allow a human and a rat to operate together, but the human interacting with the rat on a human scale, and the rat interacting with the human on the rat scale. The human is represented in a rat arena by a small robot that is slaved to the human’s movements, whereas the tracked rat is represented to the human in the virtual reality by a humanoid avatar. We describe the system and also a study that was designed to test whether humans can successfully play a game with the rat. The results show that the system functioned well and that the humans were able to interact with the rat to fulfil the tasks of the game. This system opens up the possibility of new applications in the life sciences involving participant observation of and interaction with animals but at human scale.</p>
</abstract>
<funding-group>
<funding-statement>This study was funded by the European Commission through the European Union projects PRESENCCIA FP6-027731, IMMERSENCE FP6-027141, BEAMING FP7-248620, MicroNanoTeleHaptics (ERC 247401) and TRAVERSE (ERC 227985). The European FP6 and FP7 projects' URL is
<ext-link ext-link-type="uri" xlink:href="http://cordis.europa.eu/home_en.html">http://cordis.europa.eu/home_en.html</ext-link>
and the European Research Council's is
<ext-link ext-link-type="uri" xlink:href="http://erc.europa.eu/">http://erc.europa.eu/</ext-link>
. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.</funding-statement>
</funding-group>
<counts>
<page-count count="10"></page-count>
</counts>
</article-meta>
</front>
<body>
<sec id="s1">
<title>Introduction</title>
<p>The potential for immersive virtual reality remains largely untapped, and although the promise and excitement that it generated in the early 1990s have waned, it is an extremely powerful technology with applications that range far beyond those that have hitherto been developed. These have included simulation and training
<xref ref-type="bibr" rid="pone.0048331-BrooksJr1">[1]</xref>
, therapy and rehabilitation
<xref ref-type="bibr" rid="pone.0048331-Rizzo1">[2]</xref>
, simulation of social situations in experimental studies
<xref ref-type="bibr" rid="pone.0048331-Slater1">[3]</xref>
,
<xref ref-type="bibr" rid="pone.0048331-Rovira1">[4]</xref>
and many others of a similar type. The vast majority of applications operate at human scale, except when virtual reality has been used for data visualisation, for example of data obtained from a confocal microscope
<xref ref-type="bibr" rid="pone.0048331-Prabhat1">[5]</xref>
or for manipulation at the nanoscale
<xref ref-type="bibr" rid="pone.0048331-Ferreira1">[6]</xref>
. Virtual reality still requires significant technical and conceptual advances
<xref ref-type="bibr" rid="pone.0048331-Zhao1">[7]</xref>
but such advances will come through novel applications that spur further technical and scientific research. In particular, when combined with teleoperation, virtual reality can open up a new class of applications such as the one considered in this paper.</p>
<p>Immersive virtual reality (IVR) and teleoperator (TO) systems provide the technical means for instantaneously transferring a person into a different place. An IVR system places people into a computer-generated environment where they can use their body normally for perception and interact with virtual objects, and with representations of other humans. Such virtual reality systems can be used to give people the illusion of being in the place depicted by the environment where they tend to behave as if what they were experiencing were real
<xref ref-type="bibr" rid="pone.0048331-SanchezVives1">[8]</xref>
. With TO an operator can have the sense of being physically in a remote real place, embodied there as a robot – seeing through the eyes of the robot whose actions are slaved to the motor actions of the operator. There the operator can, for example, operate remote machinery, collect samples, and so on.</p>
<p>When we combine IVR with TO we open up a new class of application where the human participant operates in a virtual (possibly transformed) representation of a remote physical space in which there are other live beings that may exist and act on an entirely different scale to humans. In particular here we show how to use IVR and TO to create a system that allows humans, and in principle, the smallest of animals or insects to interact together at the same scale. The fundamental idea is that the human participant is in an IVR system interacting with a virtual character (avatar) representing a remote animal. The animal is tracked in its physical space. The tracking information from the animal is relayed to the IVR and controls the actions of the avatar that represents it. The VR is scaled so that movements of the animals are mapped into appropriate changes in position of their avatar representations on a human scale. From the point of view of the humans there is a VR in which other live beings are represented with which they can interact.</p>
<p>We have so far described the setup from the human point of view. But how do the animals interact with the human, since the animals themselves are not in a virtual environment but in their own habitat without any special displays? The answer is that just as the animals are tracked and this information controls the movements of their virtual representations, so the humans are tracked and this controls the movements of a robotic device that is located within the animal habitat. Hence when the human, for example, moves close to the representation of the animal in the virtual environment, so the robot moves close to the corresponding animal in the physical habitat. There is a proportional mapping between the spatial relationships and orientations of the robot with respect to the animal in the physical space, and the human with respect to the animal’s avatar representation in the virtual reality. Both animals and humans experience their environment at their own scales. We call this process ‘beaming’ since the human in effect digitally beams a physical representation of him- or herself into the animal environment.</p>
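The proportional mapping just described is straightforward to express in code. The following minimal sketch is not taken from the paper: the virtual room size and all names are illustrative assumptions. It magnifies rat-arena coordinates to human scale for the avatar, and shrinks the tracked human position back down to arena scale as a target for the robot.

ARENA_SIZE = 0.8   # metres: side of the square rat arena (80 cm, see below)
ROOM_SIZE = 8.0    # metres: assumed side of the virtual room
SCALE = ROOM_SIZE / ARENA_SIZE   # 10x magnification in this example

def rat_to_avatar(x_arena, y_arena):
    """Map a tracked rat position (arena frame) to its avatar position (VR frame)."""
    return x_arena * SCALE, y_arena * SCALE

def human_to_robot(x_vr, y_vr):
    """Map the tracked human position (VR frame) to a robot target (arena frame)."""
    return x_vr / SCALE, y_vr / SCALE

Because the two functions are inverses of each other, the spatial relationship between human and avatar in the VR remains proportional to that between robot and rat in the arena, which is exactly the invariant the paragraph above describes.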
<p>We describe an example of such a system that enables people to beam into a rat arena and interact with the rat at human scale, while the rat interacts with the human on the rat scale. In our particular application, a humanoid avatar represented the rat in virtual reality, and a small robot in the rat open arena represented the human. The human and rat played a game together as an example of the type of interaction that is straightforward to achieve in such a system. The purpose was (a) to test the overall system performance during an interactive game played between person and rat; (b) to examine how the rat reacted to the robotic device; and (c) to examine how the human participants accepted the setup and played the game, and indeed whether it was possible to play the game at all.</p>
<fig id="pone-0048331-g001" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0048331.g001</object-id>
<label>Figure 1</label>
<caption>
<title>The rat arena and robot device.</title>
<p>(a) Two of the pictures on the wall can be seen, and the frame on which a webcam was mounted for tracking purposes. (b) The e-puck robot protected by a purpose-made armour. For tracking purposes, a typical Augmented Reality marker was attached on top of the armour. The plastic platform in front was used to hold the food (strawberry jelly) for the rat. (c) Left-hand side: view of the robot and rat for tracking. Right-hand side: result of the threshold used to detect the rat in the image.</p>
</caption>
<graphic xlink:href="pone.0048331.g001"></graphic>
</fig>
</sec>
<sec sec-type="materials|methods" id="s2">
<title>Materials and Methods</title>
<sec id="s2a">
<title>Ethics Statement</title>
<p>The study was approved by the Ethics Committee of the Hospital Clinic (Barcelona, Spain) under the regulations of the Autonomous Government of Catalonia and following the guidelines of the European Communities Council (86/609/EEC). Participants gave written informed consent.</p>
<fig id="pone-0048331-g002" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0048331.g002</object-id>
<label>Figure 2</label>
<caption>
<title>Simplified hardware and software architectures, and dataflow of the experiment.</title>
</caption>
<graphic xlink:href="pone.0048331.g002"></graphic>
</fig>
</sec>
<sec id="s2b">
<title>The Human-side Experimental Setup</title>
<p>A head-tracked, wide field of view head-mounted display (HMD) was used. The HMD was an NVIS nVisor SX111 with a field of view of 76°×64° per eye, resulting in a total FOV of 111°, and a resolution of 1280×1024 pixels per eye displayed at 60 Hz. Head tracking was performed by a 6-DOF Intersense IS-900 device.</p>
<fig id="pone-0048331-g003" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0048331.g003</object-id>
<label>Figure 3</label>
<caption>
<title>Screenshot of the virtual environment.</title>
<p>Three of the four posters are visible in the image as well as the two avatars representing both the participant and the rat.</p>
</caption>
<graphic xlink:href="pone.0048331.g003"></graphic>
</fig>
<p>Due to the head tracking, the participant could turn his or her head and body in any direction, and physically walk a pace or two. However, to move through the VR a hand-held Intersense Wand was used. The participant could press a button on the Wand to move forward and backward at a speed constrained by the maximum speed of the robot in the rat arena. The rotation of the head tracker was used to change the direction of locomotion within the IVR and consequently of the robot’s movement.</p>
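As a concrete illustration of this locomotion scheme, here is a sketch under assumptions (the names are not the XVR API): the head tracker’s yaw sets the direction of travel, a Wand button moves the viewpoint forward or backward, and the speed is capped so that the slaved robot, whose maximum speed in the arena is 12.9 cm/s, can keep up after the change of scale.

import math

ROBOT_MAX_SPEED = 0.129   # m/s in the arena frame (e-puck maximum)
SCALE = 10.0              # assumed arena-to-VR magnification
VR_MAX_SPEED = ROBOT_MAX_SPEED * SCALE

def update_position(pos, yaw_rad, button, dt):
    """Advance the VR position one frame. button is +1 (forward), -1 (backward) or 0."""
    step = button * VR_MAX_SPEED * dt
    return (pos[0] + step * math.cos(yaw_rad),
            pos[1] + step * math.sin(yaw_rad))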
<fig id="pone-0048331-g004" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0048331.g004</object-id>
<label>Figure 4</label>
<caption>
<title>Movement of rats and humans (a) Rat A with (b) corresponding participant, (c) Rat B with (d) corresponding participant.</title>
<p>Axes are in metres, and all movements are measured in the rat arena. Hence the human movements are those of the slaved robot.</p>
</caption>
<graphic xlink:href="pone.0048331.g004"></graphic>
</fig>
</sec>
<sec id="s2c">
<title>The Rat-side Experimental Setup</title>
<p>There was an open arena, a small robot and two webcams. The rat open arena was an 80 cm×80 cm×60 cm (width×length×height) box, with some pictures on the inside walls (
<xref ref-type="fig" rid="pone-0048331-g001">Figure 1a</xref>
). The rat was free to move anywhere in the box. Also inside the open arena was an e-puck® robot
<xref ref-type="bibr" rid="pone.0048331-Mondada1">[9]</xref>
(
<xref ref-type="fig" rid="pone-0048331-g001">Figure 1b</xref>
). The movements of the human in the VR were mapped to movements of this robot in real-time (
<xref ref-type="supplementary-material" rid="pone.0048331.s001">Text S1</xref>
). The e-puck has a size of 70 mm (diameter) by 50 mm (height), weighs 150 g and moves at a maximum speed of 12.9 cm/s. A small (65 mm×65 mm) marker was placed on top of the robot in order to facilitate camera-based tracking of its position and to prevent potential errors due to the presence of the rat in the cage. The robot was also encased in a special handmade wooden armour to protect it from potential damage by the rat. The dimensions of the robot within the armour were 70 mm (height) and 90 mm (diameter). A food support was attached to the armour in order to train the rat to follow the robot; the diameter including the food support was 120 mm.</p>
<fig id="pone-0048331-g005" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0048331.g005</object-id>
<label>Figure 5</label>
<caption>
<title>Scatter diagram of the proportion of time that the rat was within a radius of 20 cm from the arena centre by the number of rat points over all participants, for both rats.</title>
<p>The number of rat points is the number of collisions between rat and robot that occurred away from the correct poster for the human to obtain a point. The Pearson correlation is significant for each rat separately (Rat A: r = 0.70, P<0.04; Rat B: r = 0.82, P<0.008).</p>
</caption>
<graphic xlink:href="pone.0048331.g005"></graphic>
</fig>
<p>Two webcams were mounted above the open arena for tracking, looking down into the arena from a top-view perspective. The first was used only for tracking (of both the rat and the robot), while the second was also used to convey video information to the human participant at various times during the game. A single webcam would have been enough to perform both tracking and video streaming, but at the cost of high CPU usage on the computer.</p>
</sec>
<sec id="s2d">
<title>Overall Software Framework</title>
<p>Three computers were used, each playing a different role and streaming a different type of data (
<xref ref-type="fig" rid="pone-0048331-g002">Figure 2</xref>
). The three computers involved (two at the rat site and one at the human participant’s site) served the following functions:</p>
<list list-type="bullet">
<list-item>
<p>The first was dedicated to the tracking and control of the robot and tracking of the rat.</p>
</list-item>
<list-item>
<p>The second was dedicated to video streaming from the rat open arena to the HMD machine.</p>
</list-item>
<list-item>
<p>The third was dedicated to the management of the IVR (HMD display of the virtual environment and video from the rat site, tracking of the participant).</p>
</list-item>
</list>
<p>At the participant’s site, where the VR was displayed in the HMD, the software platform used was XVR
<xref ref-type="bibr" rid="pone.0048331-Tecchia1">[10]</xref>
. XVR provided the framework to handle all the display and the networking activities related to streaming data, over the network arrangement of the various connected peers. The hardware accelerated library for character animation (HALCA)
<xref ref-type="bibr" rid="pone.0048331-Gillies1">[11]</xref>
was used for display and real-time animation of human characters.</p>
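The record does not detail the wire protocol, but the division of labour implies a steady stream of pose updates from the tracking machine at the rat site to the IVR machine. The sketch below shows one plausible form of such streaming; the endpoint, transport and message layout are hypothetical, since the actual system used XVR’s own networking facilities.

import json
import socket

IVR_HOST, IVR_PORT = "ivr.example.org", 9000   # hypothetical endpoint

def send_poses(sock, rat_pose, robot_pose):
    """Send one tracking update (rat and robot poses) to the IVR machine."""
    msg = json.dumps({"rat": rat_pose, "robot": robot_pose}).encode()
    sock.sendto(msg, (IVR_HOST, IVR_PORT))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_poses(sock,
           {"x": 0.40, "y": 0.22, "yaw": 1.3},   # rat pose: arena metres, radians
           {"x": 0.10, "y": 0.55, "yaw": 0.0})   # robot pose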
<fig id="pone-0048331-g006" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0048331.g006</object-id>
<label>Figure 6</label>
<caption>
<title>Distance between rat and robot by rat position.</title>
<p>The vertical axis is the distance between the rat and robot corresponding to the position of the rat on the horizontal plane representing the rat arena. (a) All 9 participants for rat A over trial 1, where the participants knew that the avatar represented a rat. (b) The same participants for rat A over trial 2, where participants thought that the avatar represented a remote human. (c) All 9 participants for rat B over trial 1. (d) The same participants over trial 2 for rat B.</p>
</caption>
<graphic xlink:href="pone.0048331.g006"></graphic>
</fig>
<p>At the rat site, the laptop dedicated to the tracking and robot control used MATLAB and Simulink (for the robot) and the Ubitrack framework
<xref ref-type="bibr" rid="pone.0048331-Pustka1">[12]</xref>
for the tracking. The second laptop ran the application dedicated to video streaming, as well as a Skype chat through which the two experimenters (one at the rat site and one at the participant’s site) kept in contact to ensure the smooth progress of the experiment.</p>
<fig id="pone-0048331-g007" orientation="portrait" position="float">
<object-id pub-id-type="doi">10.1371/journal.pone.0048331.g007</object-id>
<label>Figure 7</label>
<caption>
<title>Waveform of distance between Rat A and the robot device for an arbitrary participant.</title>
</caption>
<graphic xlink:href="pone.0048331.g007"></graphic>
</fig>
</sec>
<sec id="s2e">
<title>The Virtual Reality</title>
<p>The VR displayed to the participant consisted of a closed 3D room with posters on the walls replicating the situation in the arena. The rat and the participant were each represented by an avatar (
<xref ref-type="fig" rid="pone-0048331-g003">Figure 3</xref>
) and were animated via the HALCA library. The XVR framework was used to display the VR stereoscopically to the participant in the HMD and to combine the various data flows (tracking, video, etc.) and devices together. The position of the avatar representing the rat was computed based on the tracking data received from the laptop located at the rat site. A walking animation was used to move this character from one position to another in order to maintain plausibility of the movements of the avatar. The participant controlled the position of his or her avatar by using head turns to orient and a button press on the Wand to move through the environment.</p>
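One way to realise the walking animation described above, sketched here under assumptions (this is not HALCA’s API): rather than teleporting the avatar to each newly tracked position, move it toward the target at a plausible walking speed, switching between walk and idle animations depending on the remaining distance.

WALK_SPEED = 1.2    # m/s, assumed avatar walking speed at human scale
THRESHOLD = 0.05    # metres; below this distance the avatar stands idle

def step_avatar(avatar_pos, target_pos, dt):
    """Move the avatar one frame toward the target; return (new position, animation)."""
    dx, dy = target_pos[0] - avatar_pos[0], target_pos[1] - avatar_pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist < THRESHOLD:
        return avatar_pos, "idle"
    step = min(WALK_SPEED * dt, dist)
    return ((avatar_pos[0] + step * dx / dist,
             avatar_pos[1] + step * dy / dist), "walk")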
</sec>
<sec id="s2f">
<title>Tracking in the Rat Arena</title>
<p>The rat and the robot in the open arena were tracked using a vision-based tracking system. The system used a single camera mounted on top of the cage looking down into it, thus providing a bird’s-eye view. Two different tracking algorithms were implemented to estimate the trajectories and orientations of the rat and the robot, since the two differ greatly in shape and behaviour.</p>
<p>Due to the cylindrical shape of the robot we were able to attach a typical rectangular black-and-white pattern to its flat top surface. A marker-tracking algorithm, a well-researched technique in the computer vision community, was used to identify the position and orientation of the robot, each in three degrees of freedom. The centre of the marker was associated with the centre of the robot, since the marker was itself mounted at the centre. The orientation between the robot and the marker was estimated by a short registration procedure.</p>
<p>Two points on the rat were of interest: the major position being the body, and the subsidiary position the head, for orientation. The first step in tracking made use of the already known position of the robot, including its known extensions (i.e. the plastic platform used as food support), in order to exclude the space it occupied from the possible space of the rat. To estimate the rat’s body position, the rat’s shape and outline were isolated in the current image through segmentation. The rat’s body position was then computed by searching for the global maximum of pixel intensities within that shape and outline.</p>
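<p>As an illustration only (not the authors’ implementation), a minimal Python sketch of this step is given below, assuming a greyscale top-view frame in which the rat appears brighter than the floor; the threshold value is a hypothetical placeholder.</p>
<preformat>
import numpy as np

def rat_body_position(frame, robot_mask, threshold=100):
    """Estimate the rat's body position in a greyscale top-view frame.

    frame: 2-D uint8 image; robot_mask: boolean mask of the pixels occupied
    by the robot and its food platform, excluded from the search.
    """
    masked = frame.copy()
    masked[robot_mask] = 0              # exclude the robot's footprint
    masked[masked <= threshold] = 0     # crude segmentation of the rat's outline
    row, col = np.unravel_index(np.argmax(masked), masked.shape)
    return int(col), int(row)           # (x, y) in image coordinates
</preformat>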
<p>Estimating the rat’s head position was slightly more complicated. Since the camera saw the rat from a top-view perspective, we could make use of the fact that the shape of the rat’s nose is triangular, and therefore relatively straightforward to detect. Once the nose position is known, the rat’s head position can easily be estimated. Accordingly, a visual pattern matching approach was used to detect the rat’s nose position (rotated images of a rat’s nose were used as templates). The best matching position was chosen as the rat’s nose position and used to estimate the head position. To avoid jerkiness from one frame to another, an exponential moving average was applied to the head positions estimated in the current and previous frames.</p>
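<p>A hedged sketch of these two steps follows, using OpenCV’s normalised cross-correlation for the template search and a simple exponential moving average for the smoothing; the template set and the smoothing factor alpha are assumptions, not values from the study.</p>
<preformat>
import numpy as np
import cv2  # OpenCV, one of several libraries providing template matching

def nose_position(frame, templates):
    """Return the best-matching location over a set of rotated nose templates."""
    best_score, best_loc = -1.0, (0, 0)
    for templ in templates:
        result = cv2.matchTemplate(frame, templ, cv2.TM_CCOEFF_NORMED)
        _, score, _, loc = cv2.minMaxLoc(result)
        if score > best_score:
            best_score, best_loc = score, loc
    return best_loc  # top-left corner of the best match, in image coordinates

def smooth(prev_pos, new_pos, alpha=0.5):
    """Exponential moving average over successive head-position estimates."""
    return alpha * np.asarray(new_pos) + (1.0 - alpha) * np.asarray(prev_pos)
</preformat>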
<p>The tracked body position of the rat was used to position the avatar in the virtual reality space, and the orientation was used to determine the forward-facing direction of the avatar. Although relatively simple, the methods for estimating the rat’s body and head positions proved efficient and robust.</p>
<p>Further technical aspects of the robot control, video and data streaming are discussed in
<xref ref-type="supplementary-material" rid="pone.0048331.s001">Text S1</xref>
.</p>
</sec>
<sec id="s2g">
<title>Interaction between Person and Rat</title>
<p>We tested our setup with a simple game that people could play with the rat. A video of all the phases is shown in
<xref ref-type="supplementary-material" rid="pone.0048331.s005">Video S1</xref>
. The participants entered the IVR through the HMD, holding the tracked Wand device in their dominant hand. There were two rats, located in an animal care facility twelve kilometres from the IVR laboratory. Network communications between the two sites allowed sharing of the state of both the rat and the person, so that the computer programs were able to maintain the IVR and the physical environment in consistent states. The robot was slaved to the location and orientation of the tracked human. The rats had earlier been trained to follow the robot in order to get the food (jelly) on an attached tray (
<xref ref-type="supplementary-material" rid="pone.0048331.s001">Text S1</xref>
).</p>
<p>The participants were 7 men and 11 women from the University of Barcelona campus. Their mean age was 23±2 (S.D.) years. They were non-experts in computer programming, had little or no experience with virtual reality, and rarely played computer games (
<xref ref-type="supplementary-material" rid="pone.0048331.s001">Text S1</xref>
).</p>
<p>Nine participants were assigned to one rat and the other nine to the other rat, so that within a single period of laboratory availability two participants could experience the system in succession, one with each rat.</p>
</sec>
<sec id="s2h">
<title>The Scenario</title>
<p>The 80 cm×80 cm×60 cm (width×length×height) rat open arena had a different picture on each of its 4 walls (a computer mouse, the face of Mickey Mouse, a poster from the movie Ratatouille, a picture of a real rat with a piece of cheese,
<xref ref-type="fig" rid="pone-0048331-g001">Figure 1a</xref>
). The VR was a room of the same proportions as the cage, 3.2 m×3.2 m×3 m (width×length×height), and with the same pictures on the walls in the same places (
<xref ref-type="fig" rid="pone-0048331-g003">Figure 3</xref>
).</p>
<p>Upon arrival at the virtual reality laboratory the participant was given an information sheet to read that outlined procedures as part of the written informed consent process (see also
<xref ref-type="supplementary-material" rid="pone.0048331.s001">Text S1</xref>
regarding the issue of excluding participants with animal phobia and further details of the procedures). Each session (completing paperwork, training and playing the game) took approximately 30 minutes, and the participants were paid 10€ for their time.</p>
<p>Participants then donned the HMD, held the Wand in their dominant hand, and were instructed to look around the scene and describe what they saw. A training period followed, in which they learned to navigate the environment using the Wand. Next, in the remote animal care facility, the rat and robot were placed into the cage and the whole system was started (rat tracking, robot activation, tracking and display), at which point the participant could see the avatar representing the rat in the IVR. In order for the participants to understand that they were actually interacting with a remote rat, and the relationship between their own movements in the IVR and the robot movements in the rat arena, the experimenter switched the view in the HMD several times between the VR and a bird’s-eye video stream of the rat cage containing the rat and the robot device. Finally, a simple procedure was carried out to convince the participants that what they were seeing in the video of the rat arena was live and that the VR represented it (
<xref ref-type="supplementary-material" rid="pone.0048331.s001">Text S1</xref>
).</p>
<p>The interaction between the rat and the person was designed as a game that lasted for 5 minutes. The participants were told that they would win a point whenever they were close enough to their opponent avatar while standing by the ‘correct’ poster, and that success would be signified by a bell ring. The game was played in a series of rounds, and in each round the point-winning poster changed; the participant was not told which poster was the correct one, except for the very first round. They were told that they would lose a point to the opponent (signified by a horn sound) whenever they were close to the avatar but situated anywhere except by the correct poster. The purpose of this was to encourage the participant to move around the virtual room and to engage their opponent avatar.</p>
<p>The minimum distance between rat and robot for the human to gain a point was set to 10 cm in the rat open arena coordinates. This threshold was motivated by the size of the armour encompassing the robot and the imprecision of the tracked rat position. The minimum distance between the participant and the correct poster on the wall was set to 28 cm.</p>
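<p>The scoring rule can be summarised by a short sketch (hypothetical function names; for simplicity we also assume here that both thresholds are expressed in arena coordinates):</p>
<preformat>
import math

CONTACT_RADIUS = 0.10   # 10 cm: rat-robot distance needed for an encounter
POSTER_RADIUS = 0.28    # 28 cm: maximum distance from the correct poster

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def score_event(rat_pos, robot_pos, correct_poster_pos):
    """Return +1 (bell), -1 (horn) or 0 for one tracked sample, all in metres."""
    if distance(rat_pos, robot_pos) > CONTACT_RADIUS:
        return 0        # no encounter, nothing happens
    if distance(robot_pos, correct_poster_pos) <= POSTER_RADIUS:
        return +1       # encounter while standing by the correct poster
    return -1           # encounter anywhere else
</preformat>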
<p>Two such games were played by each person. In the second game participants were in the same virtual room with the virtual character. However, this time the switch to the video view showed a woman waving at them (a bird’s-eye view from approximately 4 metres high), and near her was a small humanoid robot. It was explained that everything was the same as before, except that their opponent was now a remote human, and that the humanoid robot that they could see was their own representation. In reality this video had been pre-recorded, there was no remote human participant, and during this second phase of the experiment the rat again controlled the avatar in the virtual environment. The purpose of this second trial was simply to see whether the behaviour or attitudes of the participants changed when their opponent was believed to be a rat compared to when it was believed to be a human. This second game also lasted 5 minutes under the same conditions as the first. After removing the HMD, participants were interviewed, debriefed about the purpose of the experiment, and paid.</p>
</sec>
</sec>
<sec id="s3">
<title>Results</title>
<sec id="s3a">
<title>System Performance</title>
<p>A number of measures were used to evaluate the performance of the system in terms of network performance, video streaming latency and robot command latency. The software architecture of the experiment was distributed over three machines at the two physical sites, both connected via the internal network of the University of Barcelona (
<xref ref-type="fig" rid="pone-0048331-g002">Figure 2</xref>
). A ‘ping’ command issued between the distant computers, which measures the time between sending and receiving back 32 bytes of data, showed a negligible delay (<1 ms). The video stream required sending 640×480 pixel RGB video between two distant computers; the measured latency between a frame sent from the video streaming laptop and its receipt on the IVR computer was 120 ms (±20 ms). Finally, the measured delay of the robot command stream between the computer responsible for tracking and the one running the virtual reality displays was 150 ms (±20 ms). This delay corresponded to sending a command via the UDP protocol from the IVR computer, receiving this command on the tracking computer in the MATLAB software, and processing the command before finally sending it to the robot via the Bluetooth protocol. The Bluetooth protocol itself induced a delay of up to 20 ms. The human participants in virtual space and the rat and robot device in the physical space of the open arena were tracked at a sampling rate of 30 Hz.</p>
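<p>For reference, a round-trip delay of the kind reported above can be measured with a few lines of Python, assuming an echo service is listening at the far end (the host and port would be placeholders):</p>
<preformat>
import socket
import time

def udp_round_trip(host, port, payload=b"x" * 32, timeout=1.0):
    """Measure one UDP round trip in seconds, ping-style (32-byte payload)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        t0 = time.perf_counter()
        sock.sendto(payload, (host, port))
        sock.recvfrom(1024)             # wait for the echoed payload
        return time.perf_counter() - t0
    finally:
        sock.close()
</preformat>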
<p>Since there is no gold-standard algorithm against which to compare the accuracy of our system, we can only provide the algorithm’s runtime: approximately 10 ms for the calculation of the rat’s major (body) position and 20 ms for the estimation of the head position and viewing direction, on an Intel Core2 Duo CPU at 2.50 GHz. The marker-based robot tracking is very efficient, and its runtime is negligible compared to that of the rat tracking.</p>
<p>Putting everything together, the tracking process took roughly 30 ms per frame, comprising both robot and rat tracking (body position, head position and head orientation).</p>
</sec>
<sec id="s3b">
<title>Movement Distributions</title>
<p>Both rats showed typical navigational patterns, staying close to the walls most of the time, with occasional forays towards the centre. This is a typical rodent behaviour referred to as thigmotaxis, which is enhanced by illumination
<xref ref-type="bibr" rid="pone.0048331-Valle1">[13]</xref>
, as was the case in our experiments.
<xref ref-type="fig" rid="pone-0048331-g004">Figure 4</xref>
shows movements over the whole period of an arbitrarily selected trial for both rats, together with the movements of the corresponding participants. The rats tended to gravitate towards the edges and corners, whereas the humans covered the central area more, in order to entice the rat towards the centres of the walls (where the posters were located).</p>
<p>The rats were trained to follow the robot in search of a reward, and thus the principal reason for a rat to depart from the thigmotactic pattern of remaining close to walls and corners was most probably the presence of the robot. This can be seen in
<xref ref-type="supplementary-material" rid="pone.0048331.s006">Video S2</xref>
, which shows 6 typical sequences of the movements of rat and robot.</p>
<p>We obtained all of the (x, y) positions of each of the two rats during all the trials, using a sampling interval of 0.2 s following
<xref ref-type="bibr" rid="pone.0048331-Brudzynski1">[14]</xref>
. The proportion of time that the tracked centre of the rat’s body (excluding the tail) was within a radius of 20 cm of the centre of the arena was computed. The rats were approximately 18–20 cm in length and 5–6 cm in width. Hence a radius of 20 cm within the 80×80 cm
<sup>2</sup>
arena delimits a region quite distant from the edges. We counted the number of contacts between the rat and the robot that occurred away from the correct poster, referred to as ‘rat points’ (since the humans only obtained a point when the collision occurred near the correct poster).
<xref ref-type="fig" rid="pone-0048331-g005">Figure 5</xref>
shows the number of rat points against the proportion of time that the rat was in this central region, over all participants and for both rats (first trials only). There is a linear relationship between the two (Pearson r = 0.71, P<0.001): the greater the time that the rat spent in the centre, the greater the number of collisions with the robot. Since the participants knew that they would lose a point in the game if a collision occurred that was not under a poster, it is likely that such encounters were due to the rat following the robot rather than to the actions of the human. A similar result holds for a radius of 15 cm, and even with a radius of 10 cm the relationship is still significant for rat A (r = 0.89, P<0.0015).</p>
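<p>The central-region statistic is straightforward to compute; a minimal sketch (hypothetical names; positions in metres) is:</p>
<preformat>
import numpy as np

def central_proportion(xy, radius=0.20, centre=(0.40, 0.40)):
    """Proportion of 0.2 s samples with the rat within radius of the arena centre.

    xy: array of shape (n, 2) of tracked (x, y) body positions in metres.
    """
    d = np.linalg.norm(xy - np.asarray(centre), axis=1)
    return float(np.mean(d < radius))

# The reported correlation across participants is then, for example:
# r = np.corrcoef(central_proportions, rat_points)[0, 1]
</preformat>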
<p>Was the game played? Corresponding to each (x, y) position was the distance between the robot and the rat at that moment (itself directly proportional to the distance between the human participant and the avatar representing the rat in the VR). We divided the arena floor into a 5×5 grid and found the mean distance between rat and robot for each grid cell over all the participants. We were interested to see whether any pattern could be found indicating that the movements were not just random, and that the game was indeed played.
<xref ref-type="fig" rid="pone-0048331-g006">Figure 6</xref>
shows the resulting graphs.</p>
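<p>A minimal sketch of this binning step (hypothetical names; positions and distances in metres) is:</p>
<preformat>
import numpy as np

def grid_mean_distance(xy, dist, arena_size=0.80, n=5):
    """Mean rat-robot distance per cell of an n x n grid over the arena floor.

    xy: (m, 2) rat positions; dist: (m,) rat-robot distances at those samples.
    """
    # Map each position to a cell index in [0, n-1] along each axis.
    idx = np.clip((xy / arena_size * n).astype(int), 0, n - 1)
    means = np.full((n, n), np.nan)
    for i in range(n):
        for j in range(n):
            in_cell = (idx[:, 0] == i) & (idx[:, 1] == j)
            if in_cell.any():
                means[i, j] = dist[in_cell].mean()
    return means
</preformat>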
<p>The figure shows that the distance between rat and robot (human) was greatest when the rat was in its starting corner or an adjacent corner. The graphs also show minima where the posters were located indicating that the game was being successfully played. This is most pronounced in the case of
<xref ref-type="fig" rid="pone-0048331-g006">Figure 6</xref>
(a) and least pronounced for
<xref ref-type="fig" rid="pone-0048331-g006">Figure 6</xref>
(b), which corresponded to trials in which the participants believed that they were playing against a human opponent. However, in almost all cases the mean distances near the posters are significantly less than the mean overall distance between the rat and robot taken over the whole time period. This can be shown by calculating the normal z-statistic for comparison of a sample mean with a population mean, here taking the population mean to be the mean distance over the whole time period for a particular rat and trial. These overall means are 0.37 m and 0.39 m for Rat A for trials 1 and 2 respectively, and 0.38 m and 0.40 m in the case of Rat B. For Rat A in trial 1 the four regions in the 5×5 grid corresponding to the positions of the posters have |z| >4 for all but one, and similarly for trial 2 all |z| >3.6 except for (the same) one. For Rat B all |z| >6.6 for trial 1, and all |z| >10.7 for trial 2. This strongly suggests that the distances around the posters differed markedly from the overall distance.</p>
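<p>The z-statistic used above is the standard one for comparing a sample mean with a (here, empirically fixed) population mean; as a sketch:</p>
<preformat>
import numpy as np

def z_statistic(cell_distances, overall_mean):
    """z = (sample mean - population mean) / standard error of the sample mean."""
    n = len(cell_distances)
    sample_mean = np.mean(cell_distances)
    std_err = np.std(cell_distances, ddof=1) / np.sqrt(n)
    return (sample_mean - overall_mean) / std_err
</preformat>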
<p>The time varying distance between the rat and the robot representing the human is illustrated in
<xref ref-type="fig" rid="pone-0048331-g007">Figure 7</xref>
which shows the plot of the distance between the rat and the robot (human participant) over the 5-minute period of the experimental trial, for the same rat (A) and participant as in
<xref ref-type="fig" rid="pone-0048331-g004">Figure 4</xref>
(a, b). This is typical of all such plots representing the dynamics of movement of both rat and human as they approached each other and moved away again. The evidence suggests that the distances between rat and human tended to be slightly greater in trial 2 than in trial 1. In trial 2 participants believed that their opponent was a human. This change in distance could be due to that belief, and therefore to the desire of the human participants to follow the rules of proxemics, that is, to keep a socially acceptable distance from their opponent; or it could be due to the game being played less successfully in the second trial than in the first. In fact the total number of points scored by participants in the second trial was about half that scored in the first. This may have been because the rats were tired or satiated, or because the humans believed that they were playing against another real human and adjusted their behaviour accordingly. The evidence regarding this issue is considered and weighed in
<xref ref-type="supplementary-material" rid="pone.0048331.s004">Text S4</xref>
.</p>
</sec>
</sec>
<sec id="s4">
<title>Discussion</title>
<p>Since this is a newly developed system it is interesting to consider possible applications. In contrast to existing ethological studies of animals, for example of cats
<xref ref-type="bibr" rid="pone.0048331-Alger1">[15]</xref>
and horses
<xref ref-type="bibr" rid="pone.0048331-Brandt1">[16]</xref>
, life science investigators could obtain an entirely different view of animal behaviour by seeing the animals at human scale, even represented as humans. This would offer the possibility of participant-observational study of animal behaviour, and generally of animal communities, in a way never before possible. Such changes of view may offer quite new insights.</p>
<p>It might be thought that rats would generally not behave normally with robots in their vicinity. However, robots have been placed in rat arenas before, as part of the quest to develop a robot that is rat-like in its behaviour. For example, in one system
<xref ref-type="bibr" rid="pone.0048331-Takanishi1">[17]</xref>
a robot that emulated some rat-like behaviour was placed in an open arena with a rat. An experimental study concluded that the robot influenced the rat’s behaviour in an appropriate way. Ultimately the authors wished to create robots that would interact with humans; working with rats, however, provided a simplified setting in which to understand the relationships that may develop between animal and robot. Other work has had a similar motivation
<xref ref-type="bibr" rid="pone.0048331-Ishii1">[18]</xref>
, where the rat and robot developed a symbiotic relationship over many hours, and where the robot could learn to manipulate the behaviour of the rat.</p>
<p>More generally, there is an increasing amount of work that seeks to understand animal behaviour in order to engineer robots, and then tests those robots by having them interact with the animals that they emulate; for example, an ‘animat’, a robot that navigates like a rat
<xref ref-type="bibr" rid="pone.0048331-Ball1">[19]</xref>
. The flow of understanding is two-way: such animal-based robots can in turn shed light on animal behaviour and cognition.</p>
<p>To our knowledge there has never been a system where a physical device operating in a rat environment acts as a surrogate representation of a human operating in an equivalent virtual environment. Some specific computational requirements are discussed in
<xref ref-type="supplementary-material" rid="pone.0048331.s002">Text S2</xref>
, but in general the system components needed are: (a) an IVR system that can track the movements of a human participant; (b) a device, located in the animal space, that can be slaved to the actions of the human - a teleoperation system; (c) tracking of the animals in their space, with the tracking information relayed to control avatars in the virtual environment; (d) a network capable of real-time distribution of data between the human and animal sites; (e) a virtual model of the remote (animal) locale. As an example, this type of system could even be used to allow interaction between humans and birds or flying insects. Flying robots already exist
<xref ref-type="bibr" rid="pone.0048331-Wood1">[20]</xref>
so that (b) would be supported. Moreover, it is possible to track, for example, birds
<xref ref-type="bibr" rid="pone.0048331-Bluff1">[21]</xref>
so that (c) would also be supported. Also in relation to (b), another instance of this type of system could replace the robotic device with a real rat whose movements are controlled remotely through brain stimulation
<xref ref-type="bibr" rid="pone.0048331-Talwar1">[22]</xref>
.</p>
<p>In the paragraph above we have extended the idea beyond a single animal, which requires the capability to track multiple animals simultaneously and thereby control multiple avatars. The same could likewise be extended to multiple human participants (further technical details are discussed in
<xref ref-type="supplementary-material" rid="pone.0048331.s003">Text S3</xref>
). Virtual reality has previously been used for communication between multiple participants, where people in remote places meet in a virtual environment shared by all. In such applications each participant, perhaps separated from the others by thousands of kilometres, uses their own virtual reality system; they can see and talk to life-sized representations of one another and carry out tasks together
<xref ref-type="bibr" rid="pone.0048331-Benford1">[23]</xref>
. This is facilitated by Internet network protocols that distribute the data between the various systems, and each system is responsible for displaying the virtual environment from the viewpoint of its particular participant. This has even been achieved with haptic interaction between the remote participants
<xref ref-type="bibr" rid="pone.0048331-Kim1">[24]</xref>
,
<xref ref-type="bibr" rid="pone.0048331-Tachi1">[25]</xref>
. However, what is different in our system is that the human is represented in the animal environment through a physical surrogate. In shared virtual environments all participants are in a virtual reality system. In our case only the human is in such a system, whereas the animals are located in their own physical environment without any need for virtual displays.</p>
<p>The conjunction of immersive virtual reality with teleoperator systems supports a class of application that would be very hard to achieve through any other means. The virtual environment acts as a unifying medium through which participants who operate at quite different scales can be brought together, and their appearance changed as appropriate to the demands of the application. Although we have applied this technology to interaction between humans and animals, primarily for use in the life sciences, the very same idea could be used, for example, to realise human to remote-human interaction, with examples of such remote communication described in
<xref ref-type="bibr" rid="pone.0048331-Lincoln1">[26]</xref>
,
<xref ref-type="bibr" rid="pone.0048331-PerezMarcos1">[27]</xref>
.</p>
</sec>
<sec sec-type="supplementary-material" id="s5">
<title>Supporting Information</title>
<supplementary-material content-type="local-data" id="pone.0048331.s001">
<label>Text S1</label>
<caption>
<p>
<bold>Supporting procedures and methods.</bold>
A number of procedures and methods are described in detail, including rat training, robot control, video and data streaming, and experimental procedures.</p>
<p>(DOCX)</p>
</caption>
<media xlink:href="pone.0048331.s001.docx" mimetype="application" mime-subtype="msword">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0048331.s002">
<label>Text S2</label>
<caption>
<p>
<bold>Computational and network requirements.</bold>
This describes the technical computational requirements to execute the system described.</p>
<p>(DOCX)</p>
</caption>
<media xlink:href="pone.0048331.s002.docx" mimetype="application" mime-subtype="msword">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0048331.s003">
<label>Text S3</label>
<caption>
<p>
<bold>Multiple participants and animals.</bold>
This describes what would be needed to extend the system to cater for multiple human and animal participants.</p>
<p>(DOCX)</p>
</caption>
<media xlink:href="pone.0048331.s003.docx" mimetype="application" mime-subtype="msword">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0048331.s004">
<label>Text S4</label>
<caption>
<p>
<bold>Distance distributions in trials 1 and 2.</bold>
This presents further analysis of the distances between the rat and human participants, and in particular there is a comparison between trials 1 and 2.</p>
<p>(DOCX)</p>
</caption>
<media xlink:href="pone.0048331.s004.docx" mimetype="application" mime-subtype="msword">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0048331.s005">
<label>Video S1</label>
<caption>
<p>
<bold>A human participant interacts with the rat represented as a virtual human character in immersive virtual reality.</bold>
</p>
<p>(MP4)</p>
</caption>
<media xlink:href="pone.0048331.s005.mp4" mimetype="video" mime-subtype="mp4">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pone.0048331.s006">
<label>Video S2</label>
<caption>
<p>
<bold>The first 200 seconds of rat and robot movements for 6 participant trials.</bold>
The rat is represented by the blue square and path, and the human by the red circle and path. Note that the square representing the rat and the circle representing the robot are much smaller than they would be if drawn to scale; hence the videos under-represent the closeness of the rat and robot. The video timing is not real-time.</p>
<p>(MP4)</p>
</caption>
<media xlink:href="pone.0048331.s006.mp4" mimetype="video" mime-subtype="mp4">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
</sec>
</body>
<back>
<ack>
<p>We thank Alvaro Gimeno from the animal care facility in Bellvitge and Sílvia Aliagas for the animal training.</p>
</ack>
<ref-list>
<title>References</title>
<ref id="pone.0048331-BrooksJr1">
<label>1</label>
<mixed-citation publication-type="journal">
<name>
<surname>Brooks Jr</surname>
<given-names>F</given-names>
</name>
(
<year>1999</year>
)
<article-title>What's real about virtual reality?</article-title>
<source>Computer Graphics and Applications, IEEE</source>
<volume>19</volume>
:
<fpage>16</fpage>
<lpage>27</lpage>
</mixed-citation>
</ref>
<ref id="pone.0048331-Rizzo1">
<label>2</label>
<mixed-citation publication-type="journal">
<name>
<surname>Rizzo</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Kim</surname>
<given-names>G</given-names>
</name>
(
<year>2005</year>
)
<article-title>A SWOT analysis of the field of virtual reality rehabilitation and therapy</article-title>
.
<source>PRESENCE: Teleoperators and Virtual Environments</source>
<volume>14</volume>
:
<fpage>119</fpage>
<lpage>146</lpage>
</mixed-citation>
</ref>
<ref id="pone.0048331-Slater1">
<label>3</label>
<mixed-citation publication-type="journal">
<name>
<surname>Slater</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Antley</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Davison</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Swapp</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Guger</surname>
<given-names>C</given-names>
</name>
,
<etal>et al</etal>
(
<year>2006</year>
)
<article-title>A virtual reprise of the Stanley Milgram obedience experiments</article-title>
.
<source>PLoS ONE</source>
<volume>1</volume>
:
<fpage>e39</fpage>
doi:10.1371/journal.pone.0000039.
<pub-id pub-id-type="pmid">17183667</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0048331-Rovira1">
<label>4</label>
<mixed-citation publication-type="journal">
<name>
<surname>Rovira</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Swapp</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Spanlang</surname>
<given-names>B</given-names>
</name>
,
<name>
<surname>Slater</surname>
<given-names>M</given-names>
</name>
(
<year>2009</year>
)
<article-title>The use of virtual reality in the study of people's responses to violent incidents</article-title>
.
<source>Frontiers in Behavioral Neuroscience</source>
<volume>3</volume>
:
<fpage>59</fpage>
doi:10.3389/neuro.08.059.2009.
<pub-id pub-id-type="pmid">20076762</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0048331-Prabhat1">
<label>5</label>
<mixed-citation publication-type="journal">
<name>
<surname>Prabhat</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Katzourin</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Wharton</surname>
<given-names>K</given-names>
</name>
,
<name>
<surname>Slater</surname>
<given-names>M</given-names>
</name>
(
<year>2008</year>
)
<article-title>A Comparative Study of Desktop, Fishtank, and Cave Systems for the Exploration of Volume Rendered Confocal Data Sets</article-title>
.
<source>IEEE Transactions on Visualization & Computer Graphics</source>
<volume>14</volume>
:
<fpage>551</fpage>
<lpage>563</lpage>
<pub-id pub-id-type="pmid">18461742</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0048331-Ferreira1">
<label>6</label>
<mixed-citation publication-type="journal">
<name>
<surname>Ferreira</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Mavroidis</surname>
<given-names>C</given-names>
</name>
(
<year>2006</year>
)
<article-title>Virtual reality and haptics for nanorobotics</article-title>
.
<source>Robotics & Automation Magazine, IEEE</source>
<volume>13</volume>
:
<fpage>78</fpage>
<lpage>92</lpage>
</mixed-citation>
</ref>
<ref id="pone.0048331-Zhao1">
<label>7</label>
<mixed-citation publication-type="journal">
<name>
<surname>Zhao</surname>
<given-names>Q</given-names>
</name>
(
<year>2011</year>
)
<article-title>10 scientific problems in virtual reality</article-title>
.
<source>Communications of the ACM</source>
<volume>54</volume>
:
<fpage>116</fpage>
<lpage>118</lpage>
</mixed-citation>
</ref>
<ref id="pone.0048331-SanchezVives1">
<label>8</label>
<mixed-citation publication-type="journal">
<name>
<surname>Sanchez-Vives</surname>
<given-names>MV</given-names>
</name>
,
<name>
<surname>Slater</surname>
<given-names>M</given-names>
</name>
(
<year>2005</year>
)
<article-title>From Presence to Consciousness through Virtual Reality</article-title>
.
<source>Nature Reviews Neuroscience</source>
<volume>6</volume>
:
<fpage>332</fpage>
<lpage>339</lpage>
</mixed-citation>
</ref>
<ref id="pone.0048331-Mondada1">
<label>9</label>
<mixed-citation publication-type="journal">
<name>
<surname>Mondada</surname>
<given-names>F</given-names>
</name>
,
<name>
<surname>Bonani</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Raemy</surname>
<given-names>X</given-names>
</name>
,
<name>
<surname>Pugh</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Cianci</surname>
<given-names>C</given-names>
</name>
,
<etal>et al</etal>
(
<year>2009</year>
)
<article-title>The e-puck, a robot designed for education in engineering</article-title>
.
<source>Proceedings of the 9th Conference on Autonomous Robot Systems and Competitions</source>
<volume>1</volume>
:
<fpage>59</fpage>
<lpage>65</lpage>
</mixed-citation>
</ref>
<ref id="pone.0048331-Tecchia1">
<label>10</label>
<mixed-citation publication-type="journal">
<name>
<surname>Tecchia</surname>
<given-names>F</given-names>
</name>
,
<name>
<surname>Carrozzino</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Bacinelli</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Rossi</surname>
<given-names>F</given-names>
</name>
,
<name>
<surname>Vercelli</surname>
<given-names>D</given-names>
</name>
,
<etal>et al</etal>
(
<year>2010</year>
)
<article-title>A Flexible Framework for Wide-Spectrum VR Development</article-title>
.
<source>PRESENCE: Teleoperators and Virtual Environments</source>
<volume>19</volume>
:
<fpage>302</fpage>
<lpage>312</lpage>
</mixed-citation>
</ref>
<ref id="pone.0048331-Gillies1">
<label>11</label>
<mixed-citation publication-type="journal">
<name>
<surname>Gillies</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Spanlang</surname>
<given-names>B</given-names>
</name>
(
<year>2010</year>
)
<article-title>Comparing and evaluating real-time character engines for virtual environments</article-title>
.
<source>PRESENCE: Teleoperators and Virtual Environments</source>
<volume>19</volume>
:
<fpage>95</fpage>
<lpage>117</lpage>
</mixed-citation>
</ref>
<ref id="pone.0048331-Pustka1">
<label>12</label>
<mixed-citation publication-type="journal">
<name>
<surname>Pustka</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Huber</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Waechter</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Echtler</surname>
<given-names>F</given-names>
</name>
,
<name>
<surname>Keitler</surname>
<given-names>P</given-names>
</name>
,
<etal>et al</etal>
(
<year>2011</year>
)
<article-title>Automatic configuration of Pervasive Sensor Networks for Augmented Reality</article-title>
.
<source>IEEE Pervasive Computing</source>
<volume>10</volume>
:
<fpage>68</fpage>
<lpage>79</lpage>
</mixed-citation>
</ref>
<ref id="pone.0048331-Valle1">
<label>13</label>
<mixed-citation publication-type="journal">
<name>
<surname>Valle</surname>
<given-names>FP</given-names>
</name>
(
<year>1970</year>
)
<article-title>Effects of strain, sex, and illumination on open-field behavior of rats</article-title>
.
<source>The American Journal of Psychology</source>
<volume>83</volume>
:
<fpage>103</fpage>
<lpage>111</lpage>
<pub-id pub-id-type="pmid">5465190</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0048331-Brudzynski1">
<label>14</label>
<mixed-citation publication-type="journal">
<name>
<surname>Brudzynski</surname>
<given-names>SM</given-names>
</name>
,
<name>
<surname>Krol</surname>
<given-names>S</given-names>
</name>
(
<year>1997</year>
)
<article-title>Analysis of locomotor activity in the rat: parallelism index, a new measure of locomotor exploratory pattern</article-title>
.
<source>Physiology & Behavior</source>
<volume>62</volume>
:
<fpage>635</fpage>
<lpage>642</lpage>
<pub-id pub-id-type="pmid">9272676</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0048331-Alger1">
<label>15</label>
<mixed-citation publication-type="journal">
<name>
<surname>Alger</surname>
<given-names>JM</given-names>
</name>
,
<name>
<surname>Alger</surname>
<given-names>SF</given-names>
</name>
(
<year>1999</year>
)
<article-title>Cat culture, human culture: An ethnographic study of a cat shelter</article-title>
.
<source>Society and Animals</source>
<volume>7</volume>
:
<fpage>199</fpage>
<lpage>218</lpage>
</mixed-citation>
</ref>
<ref id="pone.0048331-Brandt1">
<label>16</label>
<mixed-citation publication-type="journal">
<name>
<surname>Brandt</surname>
<given-names>K</given-names>
</name>
(
<year>2004</year>
)
<article-title>A language of their own: An interactionist approach to human-horse communication</article-title>
.
<source>Society and Animals</source>
<volume>12</volume>
:
<fpage>299</fpage>
<lpage>316</lpage>
</mixed-citation>
</ref>
<ref id="pone.0048331-Takanishi1">
<label>17</label>
<mixed-citation publication-type="journal">
<name>
<surname>Takanishi</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Aoki</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Ito</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Ohkawa</surname>
<given-names>Y</given-names>
</name>
,
<name>
<surname>Yamaguchi</surname>
<given-names>J</given-names>
</name>
(
<year>1998</year>
)
<article-title>Interaction between creature and robot: development of an experiment system for rat and rat robot interaction</article-title>
.
<source>IEEE/RSJ International Conference on Intelligent Robots and Systems</source>
<volume>3</volume>
:
<fpage>1975</fpage>
<lpage>1980</lpage>
</mixed-citation>
</ref>
<ref id="pone.0048331-Ishii1">
<label>18</label>
<mixed-citation publication-type="other">Ishii H, Aoki T, Moribe K, Nakasuji M, Miwa H,
<etal>et al</etal>
. (2003) Interactive experiments between creature and robot as a basic research for coexistence between human and robot. The 12th IEEE International Workshop on Robot and Interactive Communication: 347–352.</mixed-citation>
</ref>
<ref id="pone.0048331-Ball1">
<label>19</label>
<mixed-citation publication-type="other">Ball D, Heath S, Milford M, Wyeth G, Wiles J (2010) A navigating rat animat. The 12th International Conference on the Synthesis and Simulation of Living Systems: 804–811.</mixed-citation>
</ref>
<ref id="pone.0048331-Wood1">
<label>20</label>
<mixed-citation publication-type="journal">
<name>
<surname>Wood</surname>
<given-names>RJ</given-names>
</name>
(
<year>2008</year>
)
<article-title>The first takeoff of a biologically inspired at-scale robotic insect</article-title>
.
<source>IEEE Transactions on Robotics</source>
<volume>24</volume>
:
<fpage>341</fpage>
<lpage>347</lpage>
</mixed-citation>
</ref>
<ref id="pone.0048331-Bluff1">
<label>21</label>
<mixed-citation publication-type="journal">
<name>
<surname>Bluff</surname>
<given-names>LA</given-names>
</name>
,
<name>
<surname>Rutz</surname>
<given-names>C</given-names>
</name>
(
<year>2008</year>
)
<article-title>A quick guide to video-tracking birds</article-title>
.
<source>Biology Letters</source>
<volume>4</volume>
:
<fpage>319</fpage>
<pub-id pub-id-type="pmid">18430668</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0048331-Talwar1">
<label>22</label>
<mixed-citation publication-type="journal">
<name>
<surname>Talwar</surname>
<given-names>SK</given-names>
</name>
,
<name>
<surname>Xu</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Hawley</surname>
<given-names>ES</given-names>
</name>
,
<name>
<surname>Weiss</surname>
<given-names>SA</given-names>
</name>
,
<name>
<surname>Moxon</surname>
<given-names>KA</given-names>
</name>
,
<etal>et al</etal>
(
<year>2002</year>
)
<article-title>Behavioural neuroscience: Rat navigation guided by remote control</article-title>
.
<source>Nature</source>
<volume>417</volume>
:
<fpage>37</fpage>
<lpage>38</lpage>
<pub-id pub-id-type="pmid">11986657</pub-id>
</mixed-citation>
</ref>
<ref id="pone.0048331-Benford1">
<label>23</label>
<mixed-citation publication-type="journal">
<name>
<surname>Benford</surname>
<given-names>S</given-names>
</name>
,
<name>
<surname>Greenhalgh</surname>
<given-names>C</given-names>
</name>
,
<name>
<surname>Rodden</surname>
<given-names>T</given-names>
</name>
,
<name>
<surname>Pycock</surname>
<given-names>J</given-names>
</name>
(
<year>2001</year>
)
<article-title>Collaborative virtual environments</article-title>
.
<source>Communications of the ACM</source>
<volume>44</volume>
:
<fpage>79</fpage>
<lpage>85</lpage>
</mixed-citation>
</ref>
<ref id="pone.0048331-Kim1">
<label>24</label>
<mixed-citation publication-type="journal">
<name>
<surname>Kim</surname>
<given-names>J</given-names>
</name>
,
<name>
<surname>Kim</surname>
<given-names>H</given-names>
</name>
,
<name>
<surname>Tay</surname>
<given-names>BK</given-names>
</name>
,
<name>
<surname>Muniyandi</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Srinivasan</surname>
<given-names>MA</given-names>
</name>
,
<etal>et al</etal>
(
<year>2004</year>
)
<article-title>Transatlantic touch: A study of haptic collaboration over long distance</article-title>
.
<source>PRESENCE: Teleoperators and Virtual Environments</source>
<volume>13</volume>
:
<fpage>328</fpage>
<lpage>337</lpage>
</mixed-citation>
</ref>
<ref id="pone.0048331-Tachi1">
<label>25</label>
<mixed-citation publication-type="other">Tachi S (2009) Telexistence. River Edge, NJ: World Scientific Pub Co Inc.</mixed-citation>
</ref>
<ref id="pone.0048331-Lincoln1">
<label>26</label>
<mixed-citation publication-type="journal">
<name>
<surname>Lincoln</surname>
<given-names>P</given-names>
</name>
,
<name>
<surname>Welch</surname>
<given-names>G</given-names>
</name>
,
<name>
<surname>Nashel</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Ilie</surname>
<given-names>A</given-names>
</name>
,
<name>
<surname>Fuchs</surname>
<given-names>H</given-names>
</name>
(
<year>2009</year>
)
<article-title>Animatronic Shader Lamps Avatars</article-title>
.
<source>ISMAR 2009 8th IEEE International Symposium on Mixed and Augmented Reality</source>
<volume>2009</volume>
:
<fpage>27</fpage>
<lpage>33</lpage>
</mixed-citation>
</ref>
<ref id="pone.0048331-PerezMarcos1">
<label>27</label>
<mixed-citation publication-type="journal">
<name>
<surname>Perez-Marcos</surname>
<given-names>D</given-names>
</name>
,
<name>
<surname>Solazzi</surname>
<given-names>M</given-names>
</name>
,
<name>
<surname>Steptoe</surname>
<given-names>W</given-names>
</name>
,
<name>
<surname>Oyekoya</surname>
<given-names>O</given-names>
</name>
,
<name>
<surname>Frisoli</surname>
<given-names>A</given-names>
</name>
,
<etal>et al</etal>
(
<year>2012</year>
)
<article-title>A fully immersive set-up for remote interaction and neurorehabilitation based on virtual body ownership</article-title>
.
<source>Front Neur</source>
<volume>3</volume>
:
<fpage>110</fpage>
doi:10.3389/fneur.2012.00110.</mixed-citation>
</ref>
</ref-list>
</back>
</pmc>
<affiliations>
<list>
<country>
<li>Allemagne</li>
<li>Autriche</li>
<li>Espagne</li>
<li>Royaume-Uni</li>
<li>États-Unis</li>
</country>
<region>
<li>Angleterre</li>
<li>Bavière</li>
<li>Catalogne</li>
<li>District de Haute-Bavière</li>
<li>Grand Londres</li>
<li>Massachusetts</li>
</region>
<settlement>
<li>Barcelone</li>
<li>Londres</li>
<li>Munich</li>
</settlement>
<orgName>
<li>Université technique de Munich</li>
</orgName>
</list>
<tree>
<country name="Espagne">
<noRegion>
<name sortKey="Normand, Jean Marie" sort="Normand, Jean Marie" uniqKey="Normand J" first="Jean-Marie" last="Normand">Jean-Marie Normand</name>
</noRegion>
<name sortKey="Giannopoulos, Elias" sort="Giannopoulos, Elias" uniqKey="Giannopoulos E" first="Elias" last="Giannopoulos">Elias Giannopoulos</name>
<name sortKey="Sanchez Vives, Maria V" sort="Sanchez Vives, Maria V" uniqKey="Sanchez Vives M" first="Maria V." last="Sanchez-Vives">Maria V. Sanchez-Vives</name>
<name sortKey="Sanchez Vives, Maria V" sort="Sanchez Vives, Maria V" uniqKey="Sanchez Vives M" first="Maria V." last="Sanchez-Vives">Maria V. Sanchez-Vives</name>
<name sortKey="Slater, Mel" sort="Slater, Mel" uniqKey="Slater M" first="Mel" last="Slater">Mel Slater</name>
<name sortKey="Slater, Mel" sort="Slater, Mel" uniqKey="Slater M" first="Mel" last="Slater">Mel Slater</name>
<name sortKey="Spanlang, Bernhard" sort="Spanlang, Bernhard" uniqKey="Spanlang B" first="Bernhard" last="Spanlang">Bernhard Spanlang</name>
</country>
<country name="Allemagne">
<region name="Bavière">
<name sortKey="Waechter, Christian" sort="Waechter, Christian" uniqKey="Waechter C" first="Christian" last="Waechter">Christian Waechter</name>
</region>
<name sortKey="Klinker, Gudrun" sort="Klinker, Gudrun" uniqKey="Klinker G" first="Gudrun" last="Klinker">Gudrun Klinker</name>
</country>
<country name="Autriche">
<noRegion>
<name sortKey="Grosswindhager, Bernhard" sort="Grosswindhager, Bernhard" uniqKey="Grosswindhager B" first="Bernhard" last="Grosswindhager">Bernhard Grosswindhager</name>
</noRegion>
<name sortKey="Guger, Christoph" sort="Guger, Christoph" uniqKey="Guger C" first="Christoph" last="Guger">Christoph Guger</name>
</country>
<country name="États-Unis">
<region name="Massachusetts">
<name sortKey="Srinivasan, Mandayam A" sort="Srinivasan, Mandayam A" uniqKey="Srinivasan M" first="Mandayam A." last="Srinivasan">Mandayam A. Srinivasan</name>
</region>
</country>
<country name="Royaume-Uni">
<region name="Angleterre">
<name sortKey="Srinivasan, Mandayam A" sort="Srinivasan, Mandayam A" uniqKey="Srinivasan M" first="Mandayam A." last="Srinivasan">Mandayam A. Srinivasan</name>
</region>
<name sortKey="Slater, Mel" sort="Slater, Mel" uniqKey="Slater M" first="Mel" last="Slater">Mel Slater</name>
</country>
</tree>
</affiliations>
</record>

Pour manipuler ce document sous Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Ncbi/Merge
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 002318 | SxmlIndent | more

Ou

HfdSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd -nk 002318 | SxmlIndent | more

Pour mettre un lien sur cette page dans le réseau Wicri

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Ncbi
   |étape=   Merge
   |type=    RBID
   |clé=     PMC:3485138
   |texte=   Beaming into the Rat World: Enabling Real-Time Interaction between Rat and Human Each at Their Own Scale
}}

Pour générer des pages wiki

HfdIndexSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/RBID.i   -Sk "pubmed:23118987" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024