Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information is therefore not validated.

Self-adapting user interfaces as assistive technology for handheld mobile devices

Internal identifier: 000932 (PascalFrancis/Checkpoint); previous: 000931; next: 000933

Authors: Robert Dodd [United Kingdom]

Source:

RBID: Pascal:07-0392587

French descriptors

English descriptors

Abstract

The accessibility of handheld mobile devices is a unique problem domain. They present with a small form factor, constraining display size, and making serious demands on user mobility. Existing assistive technology tackles these problems with bespoke solutions and text-to-speech augmentation, bulking out the device, and forcing visual metaphors upon blind users. Stepping away from such "bolt-on" accessibility, this research revisits the processes by which user interfaces are designed, constructing a model of user interface development that allows for dynamic adaptation of the interface to match individual user capability profiles. In doing so, it abstracts content meaning from presentation, mapping interaction metaphors to categorized user capabilities within individual design spaces (visual, sonic, and haptic) and interaction metaphors to relevant content meaning.


Affiliations:


Links toward previous steps (curation, corpus...)


Links to Exploration step

Pascal:07-0392587

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">Self-adapting user interfaces as assistive technology for handheld mobile devices</title>
<author>
<name sortKey="Dodd, Robert" sort="Dodd, Robert" uniqKey="Dodd R" first="Robert" last="Dodd">Robert Dodd</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>University of Teesside School of Computing</s1>
<s2>Tees Valley TS1 3BA</s2>
<s3>GBR</s3>
<sZ>1 aut.</sZ>
</inist:fA14>
<country>Royaume-Uni</country>
<wicri:noRegion>University of Teesside School of Computing</wicri:noRegion>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">07-0392587</idno>
<date when="2006">2006</date>
<idno type="stanalyst">PASCAL 07-0392587 INIST</idno>
<idno type="RBID">Pascal:07-0392587</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000B22</idno>
<idno type="wicri:Area/PascalFrancis/Curation">000973</idno>
<idno type="wicri:Area/PascalFrancis/Checkpoint">000932</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">Self-adapting user interfaces as assistive technology for handheld mobile devices</title>
<author>
<name sortKey="Dodd, Robert" sort="Dodd, Robert" uniqKey="Dodd R" first="Robert" last="Dodd">Robert Dodd</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>University of Teesside School of Computing</s1>
<s2>Tees Valley TS1 3BA</s2>
<s3>GBR</s3>
<sZ>1 aut.</sZ>
</inist:fA14>
<country>Royaume-Uni</country>
<wicri:noRegion>University of Teesside School of Computing</wicri:noRegion>
</affiliation>
</author>
</analytic>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Accessibility</term>
<term>Blind</term>
<term>Bolted joint</term>
<term>Display</term>
<term>Form factor</term>
<term>Metaphor</term>
<term>Mobile computing</term>
<term>Mobility</term>
<term>Speech synthesis</term>
<term>Tactile sensitivity</term>
<term>Text to speech</term>
<term>User behavior</term>
<term>User interface</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Interface utilisateur</term>
<term>Mobilité</term>
<term>Synthèse parole</term>
<term>Texte à parole</term>
<term>Informatique mobile</term>
<term>Accessibilité</term>
<term>Facteur forme</term>
<term>Affichage</term>
<term>Métaphore</term>
<term>Aveugle</term>
<term>Assemblage boulonné</term>
<term>Comportement utilisateur</term>
<term>Sensibilité tactile</term>
</keywords>
<keywords scheme="Wicri" type="topic" xml:lang="fr">
<term>Affichage</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">The accessibility of handheld mobile devices is a unique problem domain. They present with a small form factor, constraining display size, and making serious demands on user mobility. Existing assistive technology tackles these problems with bespoke solutions and text-to-speech augmentation, bulking out the device, and forcing visual metaphors upon blind users. Stepping away from such "bolt-on" accessibility, this research revisits the processes by which user interfaces are designed, constructing a model of user interface development that allows for dynamic adaptation of the interface to match individual user capability profiles. In doing so, it abstracts content meaning from presentation, mapping interaction metaphors to categorized user capabilities within individual design spaces (visual, sonic, and haptic) and interaction metaphors to relevant content meaning.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA08 i1="01" i2="1" l="ENG">
<s1>Self-adapting user interfaces as assistive technology for handheld mobile devices</s1>
</fA08>
<fA09 i1="01" i2="1" l="ENG">
<s1>ASSETS 2006 : Eighth international ACM SIGACCESS Conference on computers and accessibility : October 23-25, 2006, Portland OR, USA</s1>
</fA09>
<fA11 i1="01" i2="1">
<s1>DODD (Robert)</s1>
</fA11>
<fA14 i1="01">
<s1>University of Teesside School of Computing</s1>
<s2>Tees Valley TS1 3BA</s2>
<s3>GBR</s3>
<sZ>1 aut.</sZ>
</fA14>
<fA18 i1="01" i2="1">
<s1>Association for computing machinery</s1>
<s3>USA</s3>
<s9>org-cong.</s9>
</fA18>
<fA20>
<s1>297-298</s1>
</fA20>
<fA21>
<s1>2006</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA25 i1="01">
<s1>ACM Press</s1>
<s2>New York NY</s2>
</fA25>
<fA26 i1="01">
<s0>1-59593-290-9</s0>
</fA26>
<fA30 i1="01" i2="1" l="ENG">
<s1>International ACM SIGACCESS conference on computers and accessibility</s1>
<s2>8</s2>
<s3>Portland OR USA</s3>
<s4>2006</s4>
</fA30>
<fA43 i1="01">
<s1>INIST</s1>
<s2>Y 39071</s2>
<s5>354000153605150670</s5>
</fA43>
<fA44>
<s0>0000</s0>
<s1>© 2007 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45>
<s0>7 ref.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>07-0392587</s0>
</fA47>
<fA60>
<s1>C</s1>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA66 i1="01">
<s0>USA</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>The accessibility of handheld mobile devices is a unique problem domain. They present with a small form factor, constraining display size, and making serious demands on user mobility. Existing assistive technology tackles these problems with bespoke solutions and text-to-speech augmentation, bulking out the device, and forcing visual metaphors upon blind users. Stepping away from such "bolt-on" accessibility, this research revisits the processes by which user interfaces are designed, constructing a model of user interface development that allows for dynamic adaptation of the interface to match individual user capability profiles. In doing so, it abstracts content meaning from presentation, mapping interaction metaphors to categorized user capabilities within individual design spaces (visual, sonic, and haptic) and interaction metaphors to relevant content meaning.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>001B40C38</s0>
</fC02>
<fC02 i1="02" i2="X">
<s0>001D02B04</s0>
</fC02>
<fC03 i1="01" i2="X" l="FRE">
<s0>Interface utilisateur</s0>
<s5>06</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG">
<s0>User interface</s0>
<s5>06</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA">
<s0>Interfase usuario</s0>
<s5>06</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE">
<s0>Mobilité</s0>
<s5>07</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG">
<s0>Mobility</s0>
<s5>07</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA">
<s0>Movilidad</s0>
<s5>07</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE">
<s0>Synthèse parole</s0>
<s5>08</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG">
<s0>Speech synthesis</s0>
<s5>08</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA">
<s0>Síntesis palabra</s0>
<s5>08</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE">
<s0>Texte à parole</s0>
<s5>09</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG">
<s0>Text to speech</s0>
<s5>09</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA">
<s0>Texto hacia palabra</s0>
<s5>09</s5>
</fC03>
<fC03 i1="05" i2="3" l="FRE">
<s0>Informatique mobile</s0>
<s5>18</s5>
</fC03>
<fC03 i1="05" i2="3" l="ENG">
<s0>Mobile computing</s0>
<s5>18</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE">
<s0>Accessibilité</s0>
<s5>19</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG">
<s0>Accessibility</s0>
<s5>19</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA">
<s0>Accesibilidad</s0>
<s5>19</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE">
<s0>Facteur forme</s0>
<s5>20</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG">
<s0>Form factor</s0>
<s5>20</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA">
<s0>Factor forma</s0>
<s5>20</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE">
<s0>Affichage</s0>
<s5>21</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG">
<s0>Display</s0>
<s5>21</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA">
<s0>Visualización</s0>
<s5>21</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE">
<s0>Métaphore</s0>
<s5>22</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG">
<s0>Metaphor</s0>
<s5>22</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA">
<s0>Metáfora</s0>
<s5>22</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE">
<s0>Aveugle</s0>
<s5>23</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG">
<s0>Blind</s0>
<s5>23</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA">
<s0>Ciego</s0>
<s5>23</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE">
<s0>Assemblage boulonné</s0>
<s5>24</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG">
<s0>Bolted joint</s0>
<s5>24</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA">
<s0>Ensamblaje empernado</s0>
<s5>24</s5>
</fC03>
<fC03 i1="12" i2="X" l="FRE">
<s0>Comportement utilisateur</s0>
<s5>25</s5>
</fC03>
<fC03 i1="12" i2="X" l="ENG">
<s0>User behavior</s0>
<s5>25</s5>
</fC03>
<fC03 i1="12" i2="X" l="SPA">
<s0>Comportamiento usuario</s0>
<s5>25</s5>
</fC03>
<fC03 i1="13" i2="X" l="FRE">
<s0>Sensibilité tactile</s0>
<s5>26</s5>
</fC03>
<fC03 i1="13" i2="X" l="ENG">
<s0>Tactile sensitivity</s0>
<s5>26</s5>
</fC03>
<fC03 i1="13" i2="X" l="SPA">
<s0>Sensibilidad tactil</s0>
<s5>26</s5>
</fC03>
<fN21>
<s1>253</s1>
</fN21>
<fN44 i1="01">
<s1>OTO</s1>
</fN44>
<fN82>
<s1>OTO</s1>
</fN82>
</pA>
</standard>
</inist>
<affiliations>
<list>
<country>
<li>Royaume-Uni</li>
</country>
</list>
<tree>
<country name="Royaume-Uni">
<noRegion>
<name sortKey="Dodd, Robert" sort="Dodd, Robert" uniqKey="Dodd R" first="Robert" last="Dodd">Robert Dodd</name>
</noRegion>
</country>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Checkpoint
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000932 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Checkpoint/biblio.hfd -nk 000932 | SxmlIndent | more
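The HfdSelect commands above print the record as indented XML on stdout. As a minimal, hypothetical follow-up (the file name `record.xml` and the trimmed keyword fragment are illustrative, not part of the Dilib toolchain), the English keyword terms of such a record can be listed with standard sed:

```shell
# Illustrative sketch: save a fragment of the record, then list each
# <term> value on its own line. Here we inline a trimmed sample of the
# KwdEn keyword block shown in the record above.
cat > record.xml <<'EOF'
<keywords scheme="KwdEn" xml:lang="en">
<term>Accessibility</term>
<term>Blind</term>
</keywords>
EOF

# Print only the text content of each <term> element.
sed -n 's:.*<term>\(.*\)</term>.*:\1:p' record.xml
```

In practice one would pipe `HfdSelect ... | SxmlIndent` directly into the sed command instead of going through a temporary file.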

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PascalFrancis
   |étape=   Checkpoint
   |type=    RBID
   |clé=     Pascal:07-0392587
   |texte=   Self-adapting user interfaces as assistive technology for handheld mobile devices
}}

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024