Exploration server on haptic devices

Warning: this site is under development!
Warning: this site was generated by automated means from raw corpora.
The information has therefore not been validated.

Understanding How Adolescents with Autism Respond to Facial Expressions in Virtual Reality Environments

Internal identifier: 002532 (Ncbi/Merge); previous: 002531; next: 002533


Auteurs : Esubalew Bekele ; Zhi Zheng ; Amy Swanson ; Julie Crittendon ; Zachary Warren ; Nilanjan Sarkar

Source:

RBID : PMC:3867269

Abstract

Autism Spectrum Disorders (ASD) are characterized by atypical patterns of behaviors and impairments in social communication. Among the fundamental social impairments in the ASD population are challenges in appropriately recognizing and responding to facial expressions. Traditional intervention approaches often require intensive support and well-trained therapists to address core deficits, with many with ASD having tremendous difficulty accessing such care due to lack of available trained therapists as well as intervention costs. As a result, emerging technology such as virtual reality (VR) has the potential to offer useful technology-enabled intervention systems. In this paper, an innovative VR-based facial emotional expression presentation system was developed that allows monitoring of eye gaze and physiological signals related to emotion identification to explore new efficient therapeutic paradigms. A usability study of this new system involving ten adolescents with ASD and ten typically developing adolescents as a control group was performed. The eye tracking and physiological data were analyzed to determine intragroup and intergroup variations of gaze and physiological patterns. Performance data, eye tracking indices and physiological features indicated that there were differences in the way adolescents with ASD process and recognize emotional faces compared to their typically developing peers. These results will be used in the future for an online adaptive VR-based multimodal social interaction system to improve emotion recognition abilities of individuals with ASD.


URL:
DOI: 10.1109/TVCG.2013.42
PubMed: 23428456
PubMed Central: 3867269

Links to previous steps (curation, corpus...)


Links to Exploration step

PMC:3867269

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Understanding How Adolescents with Autism Respond to Facial Expressions in Virtual Reality Environments</title>
<author>
<name sortKey="Bekele, Esubalew" sort="Bekele, Esubalew" uniqKey="Bekele E" first="Esubalew" last="Bekele">Esubalew Bekele</name>
</author>
<author>
<name sortKey="Zheng, Zhi" sort="Zheng, Zhi" uniqKey="Zheng Z" first="Zhi" last="Zheng">Zhi Zheng</name>
</author>
<author>
<name sortKey="Swanson, Amy" sort="Swanson, Amy" uniqKey="Swanson A" first="Amy" last="Swanson">Amy Swanson</name>
</author>
<author>
<name sortKey="Crittendon, Julie" sort="Crittendon, Julie" uniqKey="Crittendon J" first="Julie" last="Crittendon">Julie Crittendon</name>
</author>
<author>
<name sortKey="Warren, Zachary" sort="Warren, Zachary" uniqKey="Warren Z" first="Zachary" last="Warren">Zachary Warren</name>
</author>
<author>
<name sortKey="Sarkar, Nilanjan" sort="Sarkar, Nilanjan" uniqKey="Sarkar N" first="Nilanjan" last="Sarkar">Nilanjan Sarkar</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">23428456</idno>
<idno type="pmc">3867269</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3867269</idno>
<idno type="RBID">PMC:3867269</idno>
<idno type="doi">10.1109/TVCG.2013.42</idno>
<date when="2013">2013</date>
<idno type="wicri:Area/Pmc/Corpus">001B53</idno>
<idno type="wicri:Area/Pmc/Curation">001B53</idno>
<idno type="wicri:Area/Pmc/Checkpoint">000F19</idno>
<idno type="wicri:Area/Ncbi/Merge">002532</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Understanding How Adolescents with Autism Respond to Facial Expressions in Virtual Reality Environments</title>
<author>
<name sortKey="Bekele, Esubalew" sort="Bekele, Esubalew" uniqKey="Bekele E" first="Esubalew" last="Bekele">Esubalew Bekele</name>
</author>
<author>
<name sortKey="Zheng, Zhi" sort="Zheng, Zhi" uniqKey="Zheng Z" first="Zhi" last="Zheng">Zhi Zheng</name>
</author>
<author>
<name sortKey="Swanson, Amy" sort="Swanson, Amy" uniqKey="Swanson A" first="Amy" last="Swanson">Amy Swanson</name>
</author>
<author>
<name sortKey="Crittendon, Julie" sort="Crittendon, Julie" uniqKey="Crittendon J" first="Julie" last="Crittendon">Julie Crittendon</name>
</author>
<author>
<name sortKey="Warren, Zachary" sort="Warren, Zachary" uniqKey="Warren Z" first="Zachary" last="Warren">Zachary Warren</name>
</author>
<author>
<name sortKey="Sarkar, Nilanjan" sort="Sarkar, Nilanjan" uniqKey="Sarkar N" first="Nilanjan" last="Sarkar">Nilanjan Sarkar</name>
</author>
</analytic>
<series>
<title level="j">IEEE transactions on visualization and computer graphics</title>
<idno type="ISSN">1077-2626</idno>
<idno type="eISSN">1941-0506</idno>
<imprint>
<date when="2013">2013</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p id="P1">Autism Spectrum Disorders (ASD) are characterized by atypical patterns of behaviors and impairments in social communication. Among the fundamental social impairments in the ASD population are challenges in appropriately recognizing and responding to facial expressions. Traditional intervention approaches often require intensive support and well-trained therapists to address core deficits, with many with ASD having tremendous difficulty accessing such care due to lack of available trained therapists as well as intervention costs. As a result, emerging technology such as virtual reality (VR) has the potential to offer useful technology-enabled intervention systems. In this paper, an innovative VR-based facial emotional expression presentation system was developed that allows monitoring of eye gaze and physiological signals related to emotion identification to explore new efficient therapeutic paradigms. A usability study of this new system involving ten adolescents with ASD and ten typically developing adolescents as a control group was performed. The eye tracking and physiological data were analyzed to determine intragroup and intergroup variations of gaze and physiological patterns. Performance data, eye tracking indices and physiological features indicated that there were differences in the way adolescents with ASD process and recognize emotional faces compared to their typically developing peers. These results will be used in the future for an online adaptive VR-based multimodal social interaction system to improve emotion recognition abilities of individuals with ASD.</p>
</div>
</front>
</TEI>
<pmc article-type="research-article">
<pmc-comment>The publisher of this article does not allow downloading of the full text in XML form.</pmc-comment>
<pmc-dir>properties manuscript</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-journal-id">9891704</journal-id>
<journal-id journal-id-type="pubmed-jr-id">31980</journal-id>
<journal-id journal-id-type="nlm-ta">IEEE Trans Vis Comput Graph</journal-id>
<journal-id journal-id-type="iso-abbrev">IEEE Trans Vis Comput Graph</journal-id>
<journal-title-group>
<journal-title>IEEE transactions on visualization and computer graphics</journal-title>
</journal-title-group>
<issn pub-type="ppub">1077-2626</issn>
<issn pub-type="epub">1941-0506</issn>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">23428456</article-id>
<article-id pub-id-type="pmc">3867269</article-id>
<article-id pub-id-type="doi">10.1109/TVCG.2013.42</article-id>
<article-id pub-id-type="manuscript">NIHMS535580</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Understanding How Adolescents with Autism Respond to Facial Expressions in Virtual Reality Environments</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Bekele</surname>
<given-names>Esubalew</given-names>
</name>
<role>Student Member, IEEE</role>
<aff id="A1">EECS Department, Vanderbilt University</aff>
<email>esubalew.bekele@vanderbilt.edu</email>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Zheng</surname>
<given-names>Zhi</given-names>
</name>
<aff id="A2">Electrical Engineering and Computer Science Department, Vanderbilt University</aff>
<email>zhi.zheng@vanderbilt.edu</email>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Swanson</surname>
<given-names>Amy</given-names>
</name>
<aff id="A3">Treatment and Research in Autism Disorders (TRIAD), Vanderbilt University</aff>
<email>amy.r.swanson@vanderbilt.edu</email>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Crittendon</surname>
<given-names>Julie</given-names>
</name>
<aff id="A4">Department of Pediatrics and Psychiatry, Vanderbilt University</aff>
<email>julie.a.crittendon@vanderbilt.edu</email>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Warren</surname>
<given-names>Zachary</given-names>
</name>
<aff id="A5">Department of Pediatrics and Psychiatry, Vanderbilt University</aff>
<email>zachary.e.warren@vanderbilt.edu</email>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Sarkar</surname>
<given-names>Nilanjan</given-names>
</name>
<role>Senior Member, IEEE</role>
<aff id="A6">Mechanical Engineering Department, Vanderbilt University</aff>
<email>nilanjan.sarkar@vanderbilt.edu</email>
</contrib>
</contrib-group>
<pub-date pub-type="nihms-submitted">
<day>4</day>
<month>12</month>
<year>2013</year>
</pub-date>
<pub-date pub-type="ppub">
<month>4</month>
<year>2013</year>
</pub-date>
<pub-date pub-type="pmc-release">
<day>18</day>
<month>12</month>
<year>2013</year>
</pub-date>
<volume>19</volume>
<issue>4</issue>
<elocation-id>10.1109/TVCG.2013.42</elocation-id>
<permissions>
<copyright-statement>© 2013 IEEE</copyright-statement>
<copyright-year>2013</copyright-year>
</permissions>
<abstract>
<p id="P1">Autism Spectrum Disorders (ASD) are characterized by atypical patterns of behaviors and impairments in social communication. Among the fundamental social impairments in the ASD population are challenges in appropriately recognizing and responding to facial expressions. Traditional intervention approaches often require intensive support and well-trained therapists to address core deficits, with many with ASD having tremendous difficulty accessing such care due to lack of available trained therapists as well as intervention costs. As a result, emerging technology such as virtual reality (VR) has the potential to offer useful technology-enabled intervention systems. In this paper, an innovative VR-based facial emotional expression presentation system was developed that allows monitoring of eye gaze and physiological signals related to emotion identification to explore new efficient therapeutic paradigms. A usability study of this new system involving ten adolescents with ASD and ten typically developing adolescents as a control group was performed. The eye tracking and physiological data were analyzed to determine intragroup and intergroup variations of gaze and physiological patterns. Performance data, eye tracking indices and physiological features indicated that there were differences in the way adolescents with ASD process and recognize emotional faces compared to their typically developing peers. These results will be used in the future for an online adaptive VR-based multimodal social interaction system to improve emotion recognition abilities of individuals with ASD.</p>
</abstract>
<kwd-group>
<title>Index Terms</title>
<kwd>3D Interaction</kwd>
<kwd>multimodal interaction</kwd>
<kwd>psychology</kwd>
<kwd>usability</kwd>
<kwd>vr-based response systems</kwd>
</kwd-group>
<funding-group>
<award-group>
<funding-source country="United States">National Institute of Mental Health : NIMH</funding-source>
<award-id>R01 MH091102 || MH</award-id>
</award-group>
</funding-group>
</article-meta>
</front>
</pmc>
<affiliations>
<list></list>
<tree>
<noCountry>
<name sortKey="Bekele, Esubalew" sort="Bekele, Esubalew" uniqKey="Bekele E" first="Esubalew" last="Bekele">Esubalew Bekele</name>
<name sortKey="Crittendon, Julie" sort="Crittendon, Julie" uniqKey="Crittendon J" first="Julie" last="Crittendon">Julie Crittendon</name>
<name sortKey="Sarkar, Nilanjan" sort="Sarkar, Nilanjan" uniqKey="Sarkar N" first="Nilanjan" last="Sarkar">Nilanjan Sarkar</name>
<name sortKey="Swanson, Amy" sort="Swanson, Amy" uniqKey="Swanson A" first="Amy" last="Swanson">Amy Swanson</name>
<name sortKey="Warren, Zachary" sort="Warren, Zachary" uniqKey="Warren Z" first="Zachary" last="Warren">Zachary Warren</name>
<name sortKey="Zheng, Zhi" sort="Zheng, Zhi" uniqKey="Zheng Z" first="Zhi" last="Zheng">Zhi Zheng</name>
</noCountry>
</tree>
</affiliations>
</record>
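For readers who want to work with the record programmatically outside of Dilib, the identifiers in the TEI header can be pulled out with standard XML tooling. The sketch below is a minimal illustration, not part of the Wicri toolchain; the inlined string reproduces only the identifier portion of the `<publicationStmt>` from the record above, and the helper name `extract_idnos` is ours.

```python
# Minimal sketch: extract the <idno> identifiers from the TEI record above.
# The excerpt below is copied from the record's <publicationStmt>; in practice
# you would load the full XML document from a file instead.
import xml.etree.ElementTree as ET

RECORD_EXCERPT = """<publicationStmt>
<idno type="pmid">23428456</idno>
<idno type="pmc">3867269</idno>
<idno type="RBID">PMC:3867269</idno>
<idno type="doi">10.1109/TVCG.2013.42</idno>
<date when="2013">2013</date>
</publicationStmt>"""

def extract_idnos(xml_text: str) -> dict:
    """Map each <idno> @type attribute to its text content."""
    root = ET.fromstring(xml_text)
    return {idno.get("type"): idno.text for idno in root.iter("idno")}

ids = extract_idnos(RECORD_EXCERPT)
print(ids["doi"])   # → 10.1109/TVCG.2013.42
print(ids["pmid"])  # → 23428456
```

The same pattern works on the full record: parse the file once, then iterate over `idno` elements to collect the PMC, PubMed, and Wicri-area identifiers in one pass.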

To work with this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Ncbi/Merge
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 002532 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd -nk 002532 | SxmlIndent | more

To link to this page within the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Ncbi
   |étape=   Merge
   |type=    RBID
   |clé=     PMC:3867269
   |texte=   Understanding How Adolescents with Autism Respond to Facial Expressions in Virtual Reality Environments
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/RBID.i   -Sk "pubmed:23428456" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024