Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated by automatic means from raw corpora.
The information is therefore not validated.

Self-adapting user interfaces as assistive technology for handheld mobile devices

Internal identifier: 000973 (PascalFrancis/Curation); previous: 000972; next: 000974

Self-adapting user interfaces as assistive technology for handheld mobile devices

Authors: Robert Dodd [United Kingdom]

Source:

RBID : Pascal:07-0392587

French descriptors

English descriptors

Abstract

The accessibility of handheld mobile devices is a unique problem domain. They present with a small form factor, constraining display size, and making serious demands on user mobility. Existing assistive technology tackles these problems with bespoke solutions and text-to-speech augmentation, bulking out the device, and forcing visual metaphors upon blind users. Stepping away from such "bolt-on" accessibility, this research revisits the processes by which user interfaces are designed, constructing a model of user interface development that allows for dynamic adaptation of the interface to match individual user capability profiles. In doing so, it abstracts content meaning from presentation, mapping interaction metaphors to categorized user capabilities within individual design spaces (visual, sonic, and haptic) and interaction metaphors to relevant content meaning.
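
As a purely illustrative sketch (not taken from the paper), the following Python fragment shows one way the abstract's idea could be read: a user capability profile scores the visual, sonic, and haptic channels, and the interface selects the design space the user can best exploit. All names used here (CapabilityProfile, DesignSpace, select_design_space) are hypothetical.

# Purely illustrative sketch, not from the paper: one reading of how a user
# capability profile could steer the choice of design space (visual, sonic,
# haptic). All names below are hypothetical.
from dataclasses import dataclass
from enum import Enum

class DesignSpace(Enum):
    VISUAL = "visual"
    SONIC = "sonic"
    HAPTIC = "haptic"

@dataclass
class CapabilityProfile:
    vision: float   # scores in [0, 1]; 0 means no usable capability in that channel
    hearing: float
    touch: float

def select_design_space(profile: CapabilityProfile) -> DesignSpace:
    """Pick the design space the user is best able to exploit."""
    scores = {
        DesignSpace.VISUAL: profile.vision,
        DesignSpace.SONIC: profile.hearing,
        DesignSpace.HAPTIC: profile.touch,
    }
    return max(scores, key=scores.get)

# A blind user with good hearing and touch is routed away from visual metaphors.
print(select_design_space(CapabilityProfile(vision=0.0, hearing=0.9, touch=0.7)))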
pA  
A08 01  1  ENG  @1 Self-adapting user interfaces as assistive technology for handheld mobile devices
A09 01  1  ENG  @1 ASSETS 2006 : Eighth international ACM SIGACCESS Conference on computers and accessibility : October 23-25, 2006, Portland OR, USA
A11 01  1    @1 DODD (Robert)
A14 01      @1 University of Teesside School of Computing @2 Tees Valley TS1 3BA @3 GBR @Z 1 aut.
A18 01  1    @1 Association for computing machinery @3 USA @9 org-cong.
A20       @1 297-298
A21       @1 2006
A23 01      @0 ENG
A25 01      @1 ACM Press @2 New York NY
A26 01      @0 1-59593-290-9
A30 01  1  ENG  @1 International ACM SIGACCESS conference on computers and accessibility @2 8 @3 Portland OR USA @4 2006
A43 01      @1 INIST @2 Y 39071 @5 354000153605150670
A44       @0 0000 @1 © 2007 INIST-CNRS. All rights reserved.
A45       @0 7 ref.
A47 01  1    @0 07-0392587
A60       @1 C
A61       @0 A
A66 01      @0 USA
C01 01    ENG  @0 The accessibility of handheld mobile devices is a unique problem domain. They present with a small form factor, constraining display size, and making serious demands on user mobility. Existing assistive technology tackles these problems with bespoke solutions and text-to-speech augmentation, bulking out the device, and forcing visual metaphors upon blind users. Stepping away from such "bolt-on" accessibility, this research revisits the processes by which user interfaces are designed, constructing a model of user interface development that allows for dynamic adaptation of the interface to match individual user capability profiles. In doing so, it abstracts content meaning from presentation, mapping interaction metaphors to categorized user capabilities within individual design spaces (visual, sonic, and haptic) and interaction metaphors to relevant content meaning.
C02 01  X    @0 001B40C38
C02 02  X    @0 001D02B04
C03 01  X  FRE  @0 Interface utilisateur @5 06
C03 01  X  ENG  @0 User interface @5 06
C03 01  X  SPA  @0 Interfase usuario @5 06
C03 02  X  FRE  @0 Mobilité @5 07
C03 02  X  ENG  @0 Mobility @5 07
C03 02  X  SPA  @0 Movilidad @5 07
C03 03  X  FRE  @0 Synthèse parole @5 08
C03 03  X  ENG  @0 Speech synthesis @5 08
C03 03  X  SPA  @0 Síntesis palabra @5 08
C03 04  X  FRE  @0 Texte à parole @5 09
C03 04  X  ENG  @0 Text to speech @5 09
C03 04  X  SPA  @0 Texto hacia palabra @5 09
C03 05  3  FRE  @0 Informatique mobile @5 18
C03 05  3  ENG  @0 Mobile computing @5 18
C03 06  X  FRE  @0 Accessibilité @5 19
C03 06  X  ENG  @0 Accessibility @5 19
C03 06  X  SPA  @0 Accesibilidad @5 19
C03 07  X  FRE  @0 Facteur forme @5 20
C03 07  X  ENG  @0 Form factor @5 20
C03 07  X  SPA  @0 Factor forma @5 20
C03 08  X  FRE  @0 Affichage @5 21
C03 08  X  ENG  @0 Display @5 21
C03 08  X  SPA  @0 Visualización @5 21
C03 09  X  FRE  @0 Métaphore @5 22
C03 09  X  ENG  @0 Metaphor @5 22
C03 09  X  SPA  @0 Metáfora @5 22
C03 10  X  FRE  @0 Aveugle @5 23
C03 10  X  ENG  @0 Blind @5 23
C03 10  X  SPA  @0 Ciego @5 23
C03 11  X  FRE  @0 Assemblage boulonné @5 24
C03 11  X  ENG  @0 Bolted joint @5 24
C03 11  X  SPA  @0 Ensamblaje empernado @5 24
C03 12  X  FRE  @0 Comportement utilisateur @5 25
C03 12  X  ENG  @0 User behavior @5 25
C03 12  X  SPA  @0 Comportamiento usuario @5 25
C03 13  X  FRE  @0 Sensibilité tactile @5 26
C03 13  X  ENG  @0 Tactile sensitivity @5 26
C03 13  X  SPA  @0 Sensibilidad tactil @5 26
N21       @1 253
N44 01      @1 OTO
N82       @1 OTO

Links toward previous steps (curation, corpus...)


Links to Exploration step

Pascal:07-0392587

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">Self-adapting user interfaces as assistive technology for handheld mobile devices</title>
<author>
<name sortKey="Dodd, Robert" sort="Dodd, Robert" uniqKey="Dodd R" first="Robert" last="Dodd">Robert Dodd</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>University of Teesside School of Computing</s1>
<s2>Tees Valley TS1 3BA</s2>
<s3>GBR</s3>
<sZ>1 aut.</sZ>
</inist:fA14>
<country>Royaume-Uni</country>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">07-0392587</idno>
<date when="2006">2006</date>
<idno type="stanalyst">PASCAL 07-0392587 INIST</idno>
<idno type="RBID">Pascal:07-0392587</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000B22</idno>
<idno type="wicri:Area/PascalFrancis/Curation">000973</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">Self-adapting user interfaces as assistive technology for handheld mobile devices</title>
<author>
<name sortKey="Dodd, Robert" sort="Dodd, Robert" uniqKey="Dodd R" first="Robert" last="Dodd">Robert Dodd</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>University of Teesside School of Computing</s1>
<s2>Tees Valley TS1 3BA</s2>
<s3>GBR</s3>
<sZ>1 aut.</sZ>
</inist:fA14>
<country>Royaume-Uni</country>
</affiliation>
</author>
</analytic>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Accessibility</term>
<term>Blind</term>
<term>Bolted joint</term>
<term>Display</term>
<term>Form factor</term>
<term>Metaphor</term>
<term>Mobile computing</term>
<term>Mobility</term>
<term>Speech synthesis</term>
<term>Tactile sensitivity</term>
<term>Text to speech</term>
<term>User behavior</term>
<term>User interface</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Interface utilisateur</term>
<term>Mobilité</term>
<term>Synthèse parole</term>
<term>Texte à parole</term>
<term>Informatique mobile</term>
<term>Accessibilité</term>
<term>Facteur forme</term>
<term>Affichage</term>
<term>Métaphore</term>
<term>Aveugle</term>
<term>Assemblage boulonné</term>
<term>Comportement utilisateur</term>
<term>Sensibilité tactile</term>
</keywords>
<keywords scheme="Wicri" type="topic" xml:lang="fr">
<term>Affichage</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">The accessibility of handheld mobile devices is a unique problem domain. They present with a small form factor, constraining display size, and making serious demands on user mobility. Existing assistive technology tackles these problems with bespoke solutions and text-to-speech augmentation, bulking out the device, and forcing visual metaphors upon blind users. Stepping away from such "bolt-on" accessibility, this research revisits the processes by which user interfaces are designed, constructing a model of user interface development that allows for dynamic adaptation of the interface to match individual user capability profiles. In doing so, it abstracts content meaning from presentation, mapping interaction metaphors to categorized user capabilities within individual design spaces (visual, sonic, and haptic) and interaction metaphors to relevant content meaning.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA08 i1="01" i2="1" l="ENG">
<s1>Self-adapting user interfaces as assistive technology for handheld mobile devices</s1>
</fA08>
<fA09 i1="01" i2="1" l="ENG">
<s1>ASSETS 2006 : Eighth international ACM SIGACCESS Conference on computers and accessibility : October 23-25, 2006, Portland OR, USA</s1>
</fA09>
<fA11 i1="01" i2="1">
<s1>DODD (Robert)</s1>
</fA11>
<fA14 i1="01">
<s1>University of Teesside School of Computing</s1>
<s2>Tees Valley TS1 3BA</s2>
<s3>GBR</s3>
<sZ>1 aut.</sZ>
</fA14>
<fA18 i1="01" i2="1">
<s1>Association for computing machinery</s1>
<s3>USA</s3>
<s9>org-cong.</s9>
</fA18>
<fA20>
<s1>297-298</s1>
</fA20>
<fA21>
<s1>2006</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA25 i1="01">
<s1>ACM Press</s1>
<s2>New York NY</s2>
</fA25>
<fA26 i1="01">
<s0>1-59593-290-9</s0>
</fA26>
<fA30 i1="01" i2="1" l="ENG">
<s1>International ACM SIGACCESS conference on computers and accessibility</s1>
<s2>8</s2>
<s3>Portland OR USA</s3>
<s4>2006</s4>
</fA30>
<fA43 i1="01">
<s1>INIST</s1>
<s2>Y 39071</s2>
<s5>354000153605150670</s5>
</fA43>
<fA44>
<s0>0000</s0>
<s1>© 2007 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45>
<s0>7 ref.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>07-0392587</s0>
</fA47>
<fA60>
<s1>C</s1>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA66 i1="01">
<s0>USA</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>The accessibility of handheld mobile devices is a unique problem domain. They present with a small form factor, constraining display size, and making serious demands on user mobility. Existing assistive technology tackles these problems with bespoke solutions and text-to-speech augmentation, bulking out the device, and forcing visual metaphors upon blind users. Stepping away from such "bolt-on" accessibility, this research revisits the processes by which user interfaces are designed, constructing a model of user interface development that allows for dynamic adaptation of the interface to match individual user capability profiles. In doing so, it abstracts content meaning from presentation, mapping interaction metaphors to categorized user capabilities within individual design spaces (visual, sonic, and haptic) and interaction metaphors to relevant content meaning.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>001B40C38</s0>
</fC02>
<fC02 i1="02" i2="X">
<s0>001D02B04</s0>
</fC02>
<fC03 i1="01" i2="X" l="FRE">
<s0>Interface utilisateur</s0>
<s5>06</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG">
<s0>User interface</s0>
<s5>06</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA">
<s0>Interfase usuario</s0>
<s5>06</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE">
<s0>Mobilité</s0>
<s5>07</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG">
<s0>Mobility</s0>
<s5>07</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA">
<s0>Movilidad</s0>
<s5>07</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE">
<s0>Synthèse parole</s0>
<s5>08</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG">
<s0>Speech synthesis</s0>
<s5>08</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA">
<s0>Síntesis palabra</s0>
<s5>08</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE">
<s0>Texte à parole</s0>
<s5>09</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG">
<s0>Text to speech</s0>
<s5>09</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA">
<s0>Texto hacia palabra</s0>
<s5>09</s5>
</fC03>
<fC03 i1="05" i2="3" l="FRE">
<s0>Informatique mobile</s0>
<s5>18</s5>
</fC03>
<fC03 i1="05" i2="3" l="ENG">
<s0>Mobile computing</s0>
<s5>18</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE">
<s0>Accessibilité</s0>
<s5>19</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG">
<s0>Accessibility</s0>
<s5>19</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA">
<s0>Accesibilidad</s0>
<s5>19</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE">
<s0>Facteur forme</s0>
<s5>20</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG">
<s0>Form factor</s0>
<s5>20</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA">
<s0>Factor forma</s0>
<s5>20</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE">
<s0>Affichage</s0>
<s5>21</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG">
<s0>Display</s0>
<s5>21</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA">
<s0>Visualización</s0>
<s5>21</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE">
<s0>Métaphore</s0>
<s5>22</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG">
<s0>Metaphor</s0>
<s5>22</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA">
<s0>Metáfora</s0>
<s5>22</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE">
<s0>Aveugle</s0>
<s5>23</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG">
<s0>Blind</s0>
<s5>23</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA">
<s0>Ciego</s0>
<s5>23</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE">
<s0>Assemblage boulonné</s0>
<s5>24</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG">
<s0>Bolted joint</s0>
<s5>24</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA">
<s0>Ensamblaje empernado</s0>
<s5>24</s5>
</fC03>
<fC03 i1="12" i2="X" l="FRE">
<s0>Comportement utilisateur</s0>
<s5>25</s5>
</fC03>
<fC03 i1="12" i2="X" l="ENG">
<s0>User behavior</s0>
<s5>25</s5>
</fC03>
<fC03 i1="12" i2="X" l="SPA">
<s0>Comportamiento usuario</s0>
<s5>25</s5>
</fC03>
<fC03 i1="13" i2="X" l="FRE">
<s0>Sensibilité tactile</s0>
<s5>26</s5>
</fC03>
<fC03 i1="13" i2="X" l="ENG">
<s0>Tactile sensitivity</s0>
<s5>26</s5>
</fC03>
<fC03 i1="13" i2="X" l="SPA">
<s0>Sensibilidad tactil</s0>
<s5>26</s5>
</fC03>
<fN21>
<s1>253</s1>
</fN21>
<fN44 i1="01">
<s1>OTO</s1>
</fN44>
<fN82>
<s1>OTO</s1>
</fN82>
</pA>
</standard>
</inist>
</record>

To work with this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Curation
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000973 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Curation/biblio.hfd -nk 000973 | SxmlIndent | more
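
For scripted use, a minimal Python variant of the same lookup is sketched below. It only re-invokes the HfdSelect and SxmlIndent commands shown above, and assumes the Dilib tools are on the PATH and that EXPLOR_AREA is set as in the previous example.

# Minimal sketch: driving the same HfdSelect | SxmlIndent pipeline from Python.
# Assumes the Dilib tools are on the PATH and EXPLOR_AREA is set as above.
import os
import subprocess

biblio = os.path.join(os.environ["EXPLOR_AREA"],
                      "Data", "PascalFrancis", "Curation", "biblio.hfd")

# HfdSelect -h <base> -nk 000973, then pretty-printed by SxmlIndent.
record = subprocess.run(["HfdSelect", "-h", biblio, "-nk", "000973"],
                        capture_output=True, text=True, check=True)
pretty = subprocess.run(["SxmlIndent"], input=record.stdout,
                        capture_output=True, text=True, check=True)
print(pretty.stdout)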

To put a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PascalFrancis
   |étape=   Curation
   |type=    RBID
   |clé=     Pascal:07-0392587
   |texte=   Self-adapting user interfaces as assistive technology for handheld mobile devices
}}

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024