Exploration server on haptic devices

Warning: this site is under development!
Warning: this site was generated automatically from raw corpora.
The information has therefore not been validated.

Real-time recording and classification of eye movements in an immersive virtual environment

Internal identifier: 001489 (Pmc/Curation); previous: 001488; next: 001490


Authors: Gabriel Diaz [United States]; Joseph Cooper [United States]; Dmitry Kit [United States]; Mary Hayhoe [United States]

Source:

RBID: PMC:3795427

Abstract

Despite the growing popularity of virtual reality environments, few laboratories are equipped to investigate eye movements within these environments. This primer is intended to reduce the time and effort required to incorporate eye-tracking equipment into a virtual reality environment. We discuss issues related to the initial startup and provide algorithms necessary for basic analysis. Algorithms are provided for the calculation of gaze angle within a virtual world using a monocular eye-tracker in a three-dimensional environment. In addition, we provide algorithms for the calculation of the angular distance between the gaze and a relevant virtual object and for the identification of fixations, saccades, and pursuit eye movements. Finally, we provide tools that temporally synchronize gaze data and the visual stimulus and enable real-time assembly of a video-based record of the experiment using the Quicktime MOV format, available at http://sourceforge.net/p/utdvrlibraries/. This record contains the visual stimulus, the gaze cursor, and associated numerical data and can be used for data exportation, visual inspection, and validation of calculated gaze movements.


URL:
DOI: 10.1167/13.12.5
PubMed: 24113087
PubMed Central: 3795427

Links to previous steps (curation, corpus, ...)


Links to Exploration step

PMC:3795427

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Real-time recording and classification of eye movements in an immersive virtual environment</title>
<author>
<name sortKey="Diaz, Gabriel" sort="Diaz, Gabriel" uniqKey="Diaz G" first="Gabriel" last="Diaz">Gabriel Diaz</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">Center for Perceptual Systems, University of Texas Austin, Austin, TX, USA</nlm:aff>
<country xml:lang="fr">États-Unis</country>
<wicri:regionArea>Center for Perceptual Systems, University of Texas Austin, Austin, TX</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Cooper, Joseph" sort="Cooper, Joseph" uniqKey="Cooper J" first="Joseph" last="Cooper">Joseph Cooper</name>
<affiliation wicri:level="1">
<nlm:aff id="aff2">Department of Computer Science, University of Texas Austin, Austin, TX, USA</nlm:aff>
<country xml:lang="fr">États-Unis</country>
<wicri:regionArea>Department of Computer Science, University of Texas Austin, Austin, TX</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Kit, Dmitry" sort="Kit, Dmitry" uniqKey="Kit D" first="Dmitry" last="Kit">Dmitry Kit</name>
<affiliation wicri:level="1">
<nlm:aff id="aff2">Department of Computer Science, University of Texas Austin, Austin, TX, USA</nlm:aff>
<country xml:lang="fr">États-Unis</country>
<wicri:regionArea>Department of Computer Science, University of Texas Austin, Austin, TX</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Hayhoe, Mary" sort="Hayhoe, Mary" uniqKey="Hayhoe M" first="Mary" last="Hayhoe">Mary Hayhoe</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">Center for Perceptual Systems, University of Texas Austin, Austin, TX, USA</nlm:aff>
<country xml:lang="fr">États-Unis</country>
<wicri:regionArea>Center for Perceptual Systems, University of Texas Austin, Austin, TX</wicri:regionArea>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">24113087</idno>
<idno type="pmc">3795427</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3795427</idno>
<idno type="RBID">PMC:3795427</idno>
<idno type="doi">10.1167/13.12.5</idno>
<date when="2013">2013</date>
<idno type="wicri:Area/Pmc/Corpus">001489</idno>
<idno type="wicri:Area/Pmc/Curation">001489</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Real-time recording and classification of eye movements in an immersive virtual environment</title>
<author>
<name sortKey="Diaz, Gabriel" sort="Diaz, Gabriel" uniqKey="Diaz G" first="Gabriel" last="Diaz">Gabriel Diaz</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">Center for Perceptual Systems, University of Texas Austin, Austin, TX, USA</nlm:aff>
<country xml:lang="fr">États-Unis</country>
<wicri:regionArea>Center for Perceptual Systems, University of Texas Austin, Austin, TX</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Cooper, Joseph" sort="Cooper, Joseph" uniqKey="Cooper J" first="Joseph" last="Cooper">Joseph Cooper</name>
<affiliation wicri:level="1">
<nlm:aff id="aff2">Department of Computer Science, University of Texas Austin, Austin, TX, USA</nlm:aff>
<country xml:lang="fr">États-Unis</country>
<wicri:regionArea>Department of Computer Science, University of Texas Austin, Austin, TX</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Kit, Dmitry" sort="Kit, Dmitry" uniqKey="Kit D" first="Dmitry" last="Kit">Dmitry Kit</name>
<affiliation wicri:level="1">
<nlm:aff id="aff2">Department of Computer Science, University of Texas Austin, Austin, TX, USA</nlm:aff>
<country xml:lang="fr">États-Unis</country>
<wicri:regionArea>Department of Computer Science, University of Texas Austin, Austin, TX</wicri:regionArea>
</affiliation>
</author>
<author>
<name sortKey="Hayhoe, Mary" sort="Hayhoe, Mary" uniqKey="Hayhoe M" first="Mary" last="Hayhoe">Mary Hayhoe</name>
<affiliation wicri:level="1">
<nlm:aff id="aff1">Center for Perceptual Systems, University of Texas Austin, Austin, TX, USA</nlm:aff>
<country xml:lang="fr">États-Unis</country>
<wicri:regionArea>Center for Perceptual Systems, University of Texas Austin, Austin, TX</wicri:regionArea>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Journal of Vision</title>
<idno type="eISSN">1534-7362</idno>
<imprint>
<date when="2013">2013</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>Despite the growing popularity of virtual reality environments, few laboratories are equipped to investigate eye movements within these environments. This primer is intended to reduce the time and effort required to incorporate eye-tracking equipment into a virtual reality environment. We discuss issues related to the initial startup and provide algorithms necessary for basic analysis. Algorithms are provided for the calculation of gaze angle within a virtual world using a monocular eye-tracker in a three-dimensional environment. In addition, we provide algorithms for the calculation of the angular distance between the gaze and a relevant virtual object and for the identification of fixations, saccades, and pursuit eye movements. Finally, we provide tools that temporally synchronize gaze data and the visual stimulus and enable real-time assembly of a video-based record of the experiment using the Quicktime MOV format, available at
<uri xlink:type="simple" xlink:href="http://sourceforge.net/p/utdvrlibraries/">http://sourceforge.net/p/utdvrlibraries/</uri>
. This record contains the visual stimulus, the gaze cursor, and associated numerical data and can be used for data exportation, visual inspection, and validation of calculated gaze movements.</p>
</div>
</front>
</TEI>
<pmc article-type="research-article">
<pmc-comment>The publisher of this article does not allow downloading of the full text in XML form.</pmc-comment>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">J Vis</journal-id>
<journal-id journal-id-type="iso-abbrev">J Vis</journal-id>
<journal-id journal-id-type="hwp">jov</journal-id>
<journal-id journal-id-type="pmc">jov</journal-id>
<journal-id journal-id-type="publisher-id">JOV</journal-id>
<journal-title-group>
<journal-title>Journal of Vision</journal-title>
</journal-title-group>
<issn pub-type="epub">1534-7362</issn>
<publisher>
<publisher-name>The Association for Research in Vision and Ophthalmology</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">24113087</article-id>
<article-id pub-id-type="pmc">3795427</article-id>
<article-id pub-id-type="doi">10.1167/13.12.5</article-id>
<article-id pub-id-type="sici">jovi-13-11-10</article-id>
<article-id pub-id-type="other">MS#: JOV-03427-2012</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Methods</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Real-time recording and classification of eye movements in an immersive virtual environment</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Diaz</surname>
<given-names>Gabriel</given-names>
</name>
<xref ref-type="aff" rid="aff1">1</xref>
<email>gdiaz@mail.cps.utexas.edu</email>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Cooper</surname>
<given-names>Joseph</given-names>
</name>
<xref ref-type="aff" rid="aff2">2</xref>
<email>jcooper@cs.utexas.edu</email>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Kit</surname>
<given-names>Dmitry</given-names>
</name>
<xref ref-type="aff" rid="aff2">2</xref>
<email>dkit@cs.utexas.edu</email>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Hayhoe</surname>
<given-names>Mary</given-names>
</name>
<xref ref-type="aff" rid="aff1">1</xref>
<email>mary@mail.cps.utexas.edu</email>
</contrib>
<aff id="aff1">
<label>1</label>
Center for Perceptual Systems, University of Texas Austin, Austin, TX, USA</aff>
<aff id="aff2">
<label>2</label>
Department of Computer Science, University of Texas Austin, Austin, TX, USA</aff>
</contrib-group>
<pub-date pub-type="collection">
<year>2013</year>
</pub-date>
<pub-date pub-type="epub">
<day>10</day>
<month>10</month>
<year>2013</year>
</pub-date>
<volume>13</volume>
<issue>12</issue>
<elocation-id>5</elocation-id>
<history>
<date date-type="received">
<day>31</day>
<month>10</month>
<year>2012</year>
</date>
<date date-type="accepted">
<day>24</day>
<month>6</month>
<year>2013</year>
</date>
</history>
<permissions>
<copyright-statement>© 2013 ARVO</copyright-statement>
<copyright-year>2013</copyright-year>
</permissions>
<self-uri xlink:title="pdf" xlink:type="simple" xlink:href="i1534-7362-13-12-5.pdf"></self-uri>
<self-uri xlink:role="icon" xlink:type="simple" xlink:href="13.12.5.gif"></self-uri>
<abstract>
<p>Despite the growing popularity of virtual reality environments, few laboratories are equipped to investigate eye movements within these environments. This primer is intended to reduce the time and effort required to incorporate eye-tracking equipment into a virtual reality environment. We discuss issues related to the initial startup and provide algorithms necessary for basic analysis. Algorithms are provided for the calculation of gaze angle within a virtual world using a monocular eye-tracker in a three-dimensional environment. In addition, we provide algorithms for the calculation of the angular distance between the gaze and a relevant virtual object and for the identification of fixations, saccades, and pursuit eye movements. Finally, we provide tools that temporally synchronize gaze data and the visual stimulus and enable real-time assembly of a video-based record of the experiment using the Quicktime MOV format, available at
<uri xlink:type="simple" xlink:href="http://sourceforge.net/p/utdvrlibraries/">http://sourceforge.net/p/utdvrlibraries/</uri>
. This record contains the visual stimulus, the gaze cursor, and associated numerical data and can be used for data exportation, visual inspection, and validation of calculated gaze movements.</p>
</abstract>
<kwd-group>
<title>Keywords</title>
<kwd>
<italic>virtual reality</italic>
</kwd>
<kwd>
<italic>eye movements</italic>
</kwd>
<kwd>
<italic>gaze</italic>
</kwd>
<kwd>
<italic>methods</italic>
</kwd>
</kwd-group>
</article-meta>
</front>
</pmc>
</record>

To work with this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Pmc/Curation
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 001489 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd -nk 001489 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Pmc
   |étape=   Curation
   |type=    RBID
   |clé=     PMC:3795427
   |texte=   Real-time recording and classification of eye movements in an immersive virtual environment
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Curation/RBID.i   -Sk "pubmed:24113087" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024