Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

CyARM : Interactive device for environment recognition and joint haptic attention using non-visual modality

Internal identifier: 000A78 (PascalFrancis/Corpus); previous: 000A77; next: 000A79

Authors: Tetsuo Ono; Takanori Komatsu; Jun-Ichi Akita; Kiyohide Ito; Makoto Okamoto

Source:

RBID: Pascal:07-0525257

French descriptors

English descriptors

Abstract

We have developed CyARM, a new kind of sensing device especially for visually impaired persons, to assist with mobility and detection of nearby objects. This user interface has unique characteristics of giving visually impaired persons the impression of an "imaginary arm" that extends to existing obstacles. CyARM is also a communication device for constructing "joint haptic attention" between impaired and unimpaired persons watching or feeling the same objects. In other words, this device offers a new methodology for judging the others' attentions or intentions without using their eye gaze direction. We verified the efficiency and ability of CyARM for environment recognition through experiments and discuss its possibility as a communication device for realizing joint haptic attention.

Record in standard format (ISO 2709)

See the documentation on the Inist Standard format.

pA  
A01 01  1    @0 0302-9743
A05       @2 4061
A08 01  1  ENG  @1 CyARM : Interactive device for environment recognition and joint haptic attention using non-visual modality
A09 01  1  ENG  @1 Computers helping people with special needs : 10th International Conference, ICCHP 2006, Linz, Austria, July 11-13, 2006 : proceedings
A11 01  1    @1 ONO (Tetsuo)
A11 02  1    @1 KOMATSU (Takanori)
A11 03  1    @1 AKITA (Jun-Ichi)
A11 04  1    @1 ITO (Kiyohide)
A11 05  1    @1 OKAMOTO (Makoto)
A14 01      @1 Department of Media Architecture, Future University-Hakodate @2 116-2 Kamedanakano, Hakodate 041-8655 @3 JPN @Z 1 aut. @Z 2 aut. @Z 4 aut. @Z 5 aut.
A14 02      @1 Department of Information and Systems Engineering, Kanazawa University @2 Kakuma, Kanazawa 920-1192 @3 JPN @Z 3 aut.
A20       @1 1251-1258
A21       @1 2006
A23 01      @0 ENG
A26 01      @0 3-540-36020-4
A43 01      @1 INIST @2 16343 @5 354000153622481800
A44       @0 0000 @1 © 2007 INIST-CNRS. All rights reserved.
A45       @0 5 ref.
A47 01  1    @0 07-0525257
A60       @1 P @2 C
A61       @0 A
A64 01  1    @0 Lecture notes in computer science
A66 01      @0 DEU
A66 02      @0 USA
C01 01    ENG  @0 We have developed CyARM, a new kind of sensing device especially for visually impaired persons, to assist with mobility and detection of nearby objects. This user interface has unique characteristics of giving visually impaired persons the impression of an "imaginary arm" that extends to existing obstacles. CyARM is also a communication device for constructing "joint haptic attention" between impaired and unimpaired persons watching or feeling the same objects. In other words, this device offers a new methodology for judging the others' attentions or intentions without using their eye gaze direction. We verified the efficiency and ability of CyARM for environment recognition through experiments and discuss its possibility as a communication device for realizing joint haptic attention.
C02 01  X    @0 001D02B04
C03 01  X  FRE  @0 Assistance utilisateur @5 01
C03 01  X  ENG  @0 User assistance @5 01
C03 01  X  SPA  @0 Asistencia usuario @5 01
C03 02  X  FRE  @0 Aide handicapé @5 02
C03 02  X  ENG  @0 Handicapped aid @5 02
C03 02  X  SPA  @0 Ayuda minusválido @5 02
C03 03  X  FRE  @0 Mobilité @5 06
C03 03  X  ENG  @0 Mobility @5 06
C03 03  X  SPA  @0 Movilidad @5 06
C03 04  3  FRE  @0 Détection objet @5 07
C03 04  3  ENG  @0 Object detection @5 07
C03 05  X  FRE  @0 Interface utilisateur @5 08
C03 05  X  ENG  @0 User interface @5 08
C03 05  X  SPA  @0 Interfase usuario @5 08
C03 06  X  FRE  @0 Regard @5 09
C03 06  X  ENG  @0 Gaze @5 09
C03 06  X  SPA  @0 Mirada @5 09
C03 07  X  FRE  @0 Sensibilité tactile @5 18
C03 07  X  ENG  @0 Tactile sensitivity @5 18
C03 07  X  SPA  @0 Sensibilidad tactil @5 18
C03 08  X  FRE  @0 Attention visuelle @5 19
C03 08  X  ENG  @0 Visual attention @5 19
C03 08  X  SPA  @0 Atención visual @5 19
C03 09  X  FRE  @0 Trouble vision @5 20
C03 09  X  ENG  @0 Vision disorder @5 20
C03 09  X  SPA  @0 Trastorno visión @5 20
C03 10  X  FRE  @0 Détecteur proximité @5 21
C03 10  X  ENG  @0 Proximity detector @5 21
C03 10  X  SPA  @0 Detector proximidad @5 21
C03 11  X  FRE  @0 Intention @5 22
C03 11  X  ENG  @0 Intention @5 22
C03 11  X  SPA  @0 Intencíon @5 22
N21       @1 344
N44 01      @1 OTO
N82       @1 OTO
pR  
A30 01  1  ENG  @1 International Conference on Computers Helping people with Special Needs @2 10 @3 Linz AUT @4 2006

Inist format (server)

NO : PASCAL 07-0525257 INIST
ET : CyARM : Interactive device for environment recognition and joint haptic attention using non-visual modality
AU : ONO (Tetsuo); KOMATSU (Takanori); AKITA (Jun-Ichi); ITO (Kiyohide); OKAMOTO (Makoto)
AF : Department of Media Architecture, Future University-Hakodate/116-2 Kamedanakano, Hakodate 041-8655/Japon (1 aut., 2 aut., 4 aut., 5 aut.); Department of Information and Systems Engineering, Kanazawa University/Kakuma, Kanazawa 920-1192/Japon (3 aut.)
DT : Publication en série; Congrès; Niveau analytique
SO : Lecture notes in computer science; ISSN 0302-9743; Allemagne; Da. 2006; Vol. 4061; Pp. 1251-1258; Bibl. 5 ref.
LA : Anglais
EA : We have developed CyARM, a new kind of sensing device especially for visually impaired persons, to assist with mobility and detection of nearby objects. This user interface has unique characteristics of giving visually impaired persons the impression of an "imaginary arm" that extends to existing obstacles. CyARM is also a communication device for constructing "joint haptic attention" between impaired and unimpaired persons watching or feeling the same objects. In other words, this device offers a new methodology for judging the others' attentions or intentions without using their eye gaze direction. We verified the efficiency and ability of CyARM for environment recognition through experiments and discuss its possibility as a communication device for realizing joint haptic attention.
CC : 001D02B04
FD : Assistance utilisateur; Aide handicapé; Mobilité; Détection objet; Interface utilisateur; Regard; Sensibilité tactile; Attention visuelle; Trouble vision; Détecteur proximité; Intention
ED : User assistance; Handicapped aid; Mobility; Object detection; User interface; Gaze; Tactile sensitivity; Visual attention; Vision disorder; Proximity detector; Intention
SD : Asistencia usuario; Ayuda minusválido; Movilidad; Interfase usuario; Mirada; Sensibilidad tactil; Atención visual; Trastorno visión; Detector proximidad; Intencíon
LO : INIST-16343.354000153622481800
ID : 07-0525257

Links to the exploration step

Pascal:07-0525257

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">CyARM : Interactive device for environment recognition and joint haptic attention using non-visual modality</title>
<author>
<name sortKey="Ono, Tetsuo" sort="Ono, Tetsuo" uniqKey="Ono T" first="Tetsuo" last="Ono">Tetsuo Ono</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Media Architecture, Future University-Hakodate</s1>
<s2>116-2 Kamedanakano, Hakodate 041-8655</s2>
<s3>JPN</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Komatsu, Takanori" sort="Komatsu, Takanori" uniqKey="Komatsu T" first="Takanori" last="Komatsu">Takanori Komatsu</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Media Architecture, Future University-Hakodate</s1>
<s2>116-2 Kamedanakano, Hakodate 041-8655</s2>
<s3>JPN</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Akita, Jun Ichi" sort="Akita, Jun Ichi" uniqKey="Akita J" first="Jun-Ichi" last="Akita">Jun-Ichi Akita</name>
<affiliation>
<inist:fA14 i1="02">
<s1>Department of Information and Systems Engineering, Kanazawa University</s1>
<s2>Kakuma, Kanazawa 920-1192</s2>
<s3>JPN</s3>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Ito, Kiyohide" sort="Ito, Kiyohide" uniqKey="Ito K" first="Kiyohide" last="Ito">Kiyohide Ito</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Media Architecture, Future University-Hakodate</s1>
<s2>116-2 Kamedanakano, Hakodate 041-8655</s2>
<s3>JPN</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Okamoto, Makoto" sort="Okamoto, Makoto" uniqKey="Okamoto M" first="Makoto" last="Okamoto">Makoto Okamoto</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Media Architecture, Future University-Hakodate</s1>
<s2>116-2 Kamedanakano, Hakodate 041-8655</s2>
<s3>JPN</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">07-0525257</idno>
<date when="2006">2006</date>
<idno type="stanalyst">PASCAL 07-0525257 INIST</idno>
<idno type="RBID">Pascal:07-0525257</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000A78</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">CyARM : Interactive device for environment recognition and joint haptic attention using non-visual modality</title>
<author>
<name sortKey="Ono, Tetsuo" sort="Ono, Tetsuo" uniqKey="Ono T" first="Tetsuo" last="Ono">Tetsuo Ono</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Media Architecture, Future University-Hakodate</s1>
<s2>116-2 Kamedanakano, Hakodate 041-8655</s2>
<s3>JPN</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Komatsu, Takanori" sort="Komatsu, Takanori" uniqKey="Komatsu T" first="Takanori" last="Komatsu">Takanori Komatsu</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Media Architecture, Future University-Hakodate</s1>
<s2>116-2 Kamedanakano, Hakodate 041-8655</s2>
<s3>JPN</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Akita, Jun Ichi" sort="Akita, Jun Ichi" uniqKey="Akita J" first="Jun-Ichi" last="Akita">Jun-Ichi Akita</name>
<affiliation>
<inist:fA14 i1="02">
<s1>Department of Information and Systems Engineering, Kanazawa University</s1>
<s2>Kakuma, Kanazawa 920-1192</s2>
<s3>JPN</s3>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Ito, Kiyohide" sort="Ito, Kiyohide" uniqKey="Ito K" first="Kiyohide" last="Ito">Kiyohide Ito</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Media Architecture, Future University-Hakodate</s1>
<s2>116-2 Kamedanakano, Hakodate 041-8655</s2>
<s3>JPN</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Okamoto, Makoto" sort="Okamoto, Makoto" uniqKey="Okamoto M" first="Makoto" last="Okamoto">Makoto Okamoto</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Department of Media Architecture, Future University-Hakodate</s1>
<s2>116-2 Kamedanakano, Hakodate 041-8655</s2>
<s3>JPN</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</analytic>
<series>
<title level="j" type="main">Lecture notes in computer science</title>
<idno type="ISSN">0302-9743</idno>
<imprint>
<date when="2006">2006</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt>
<title level="j" type="main">Lecture notes in computer science</title>
<idno type="ISSN">0302-9743</idno>
</seriesStmt>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Gaze</term>
<term>Handicapped aid</term>
<term>Intention</term>
<term>Mobility</term>
<term>Object detection</term>
<term>Proximity detector</term>
<term>Tactile sensitivity</term>
<term>User assistance</term>
<term>User interface</term>
<term>Vision disorder</term>
<term>Visual attention</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Assistance utilisateur</term>
<term>Aide handicapé</term>
<term>Mobilité</term>
<term>Détection objet</term>
<term>Interface utilisateur</term>
<term>Regard</term>
<term>Sensibilité tactile</term>
<term>Attention visuelle</term>
<term>Trouble vision</term>
<term>Détecteur proximité</term>
<term>Intention</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">We have developed CyARM, a new kind of sensing device especially for visually impaired persons, to assist with mobility and detection of nearby objects. This user interface has unique characteristics of giving visually impaired persons the impression of an "imaginary arm" that extends to existing obstacles. CyARM is also a communication device for constructing "joint haptic attention" between impaired and unimpaired persons watching or feeling the same objects. In other words, this device offers a new methodology for judging the others' attentions or intentions without using their eye gaze direction. We verified the efficiency and ability of CyARM for environment recognition through experiments and discuss its possibility as a communication device for realizing joint haptic attention.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA01 i1="01" i2="1">
<s0>0302-9743</s0>
</fA01>
<fA05>
<s2>4061</s2>
</fA05>
<fA08 i1="01" i2="1" l="ENG">
<s1>CyARM : Interactive device for environment recognition and joint haptic attention using non-visual modality</s1>
</fA08>
<fA09 i1="01" i2="1" l="ENG">
<s1>Computers helping people with special needs : 10th International Conference, ICCHP 2006, Linz, Austria, July 11-13, 2006 : proceedings</s1>
</fA09>
<fA11 i1="01" i2="1">
<s1>ONO (Tetsuo)</s1>
</fA11>
<fA11 i1="02" i2="1">
<s1>KOMATSU (Takanori)</s1>
</fA11>
<fA11 i1="03" i2="1">
<s1>AKITA (Jun-Ichi)</s1>
</fA11>
<fA11 i1="04" i2="1">
<s1>ITO (Kiyohide)</s1>
</fA11>
<fA11 i1="05" i2="1">
<s1>OKAMOTO (Makoto)</s1>
</fA11>
<fA14 i1="01">
<s1>Department of Media Architecture, Future University-Hakodate</s1>
<s2>116-2 Kamedanakano, Hakodate 041-8655</s2>
<s3>JPN</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</fA14>
<fA14 i1="02">
<s1>Department of Information and Systems Engineering, Kanazawa University</s1>
<s2>Kakuma, Kanazawa 920-1192</s2>
<s3>JPN</s3>
<sZ>3 aut.</sZ>
</fA14>
<fA20>
<s1>1251-1258</s1>
</fA20>
<fA21>
<s1>2006</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA26 i1="01">
<s0>3-540-36020-4</s0>
</fA26>
<fA43 i1="01">
<s1>INIST</s1>
<s2>16343</s2>
<s5>354000153622481800</s5>
</fA43>
<fA44>
<s0>0000</s0>
<s1>© 2007 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45>
<s0>5 ref.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>07-0525257</s0>
</fA47>
<fA60>
<s1>P</s1>
<s2>C</s2>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA64 i1="01" i2="1">
<s0>Lecture notes in computer science</s0>
</fA64>
<fA66 i1="01">
<s0>DEU</s0>
</fA66>
<fA66 i1="02">
<s0>USA</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>We have developed CyARM, a new kind of sensing device especially for visually impaired persons, to assist with mobility and detection of nearby objects. This user interface has unique characteristics of giving visually impaired persons the impression of an "imaginary arm" that extends to existing obstacles. CyARM is also a communication device for constructing "joint haptic attention" between impaired and unimpaired persons watching or feeling the same objects. In other words, this device offers a new methodology for judging the others' attentions or intentions without using their eye gaze direction. We verified the efficiency and ability of CyARM for environment recognition through experiments and discuss its possibility as a communication device for realizing joint haptic attention.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>001D02B04</s0>
</fC02>
<fC03 i1="01" i2="X" l="FRE">
<s0>Assistance utilisateur</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG">
<s0>User assistance</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA">
<s0>Asistencia usuario</s0>
<s5>01</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE">
<s0>Aide handicapé</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG">
<s0>Handicapped aid</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA">
<s0>Ayuda minusválido</s0>
<s5>02</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE">
<s0>Mobilité</s0>
<s5>06</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG">
<s0>Mobility</s0>
<s5>06</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA">
<s0>Movilidad</s0>
<s5>06</s5>
</fC03>
<fC03 i1="04" i2="3" l="FRE">
<s0>Détection objet</s0>
<s5>07</s5>
</fC03>
<fC03 i1="04" i2="3" l="ENG">
<s0>Object detection</s0>
<s5>07</s5>
</fC03>
<fC03 i1="05" i2="X" l="FRE">
<s0>Interface utilisateur</s0>
<s5>08</s5>
</fC03>
<fC03 i1="05" i2="X" l="ENG">
<s0>User interface</s0>
<s5>08</s5>
</fC03>
<fC03 i1="05" i2="X" l="SPA">
<s0>Interfase usuario</s0>
<s5>08</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE">
<s0>Regard</s0>
<s5>09</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG">
<s0>Gaze</s0>
<s5>09</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA">
<s0>Mirada</s0>
<s5>09</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE">
<s0>Sensibilité tactile</s0>
<s5>18</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG">
<s0>Tactile sensitivity</s0>
<s5>18</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA">
<s0>Sensibilidad tactil</s0>
<s5>18</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE">
<s0>Attention visuelle</s0>
<s5>19</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG">
<s0>Visual attention</s0>
<s5>19</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA">
<s0>Atención visual</s0>
<s5>19</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE">
<s0>Trouble vision</s0>
<s5>20</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG">
<s0>Vision disorder</s0>
<s5>20</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA">
<s0>Trastorno visión</s0>
<s5>20</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE">
<s0>Détecteur proximité</s0>
<s5>21</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG">
<s0>Proximity detector</s0>
<s5>21</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA">
<s0>Detector proximidad</s0>
<s5>21</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE">
<s0>Intention</s0>
<s5>22</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG">
<s0>Intention</s0>
<s5>22</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA">
<s0>Intencíon</s0>
<s5>22</s5>
</fC03>
<fN21>
<s1>344</s1>
</fN21>
<fN44 i1="01">
<s1>OTO</s1>
</fN44>
<fN82>
<s1>OTO</s1>
</fN82>
</pA>
<pR>
<fA30 i1="01" i2="1" l="ENG">
<s1>International Conference on Computers Helping people with Special Needs</s1>
<s2>10</s2>
<s3>Linz AUT</s3>
<s4>2006</s4>
</fA30>
</pR>
</standard>
<server>
<NO>PASCAL 07-0525257 INIST</NO>
<ET>CyARM : Interactive device for environment recognition and joint haptic attention using non-visual modality</ET>
<AU>ONO (Tetsuo); KOMATSU (Takanori); AKITA (Jun-Ichi); ITO (Kiyohide); OKAMOTO (Makoto)</AU>
<AF>Department of Media Architecture, Future University-Hakodate/116-2 Kamedanakano, Hakodate 041-8655/Japon (1 aut., 2 aut., 4 aut., 5 aut.); Department of Information and Systems Engineering, Kanazawa University/Kakuma, Kanazawa 920-1192/Japon (3 aut.)</AF>
<DT>Publication en série; Congrès; Niveau analytique</DT>
<SO>Lecture notes in computer science; ISSN 0302-9743; Allemagne; Da. 2006; Vol. 4061; Pp. 1251-1258; Bibl. 5 ref.</SO>
<LA>Anglais</LA>
<EA>We have developed CyARM, a new kind of sensing device especially for visually impaired persons, to assist with mobility and detection of nearby objects. This user interface has unique characteristics of giving visually impaired persons the impression of an "imaginary arm" that extends to existing obstacles. CyARM is also a communication device for constructing "joint haptic attention" between impaired and unimpaired persons watching or feeling the same objects. In other words, this device offers a new methodology for judging the others' attentions or intentions without using their eye gaze direction. We verified the efficiency and ability of CyARM for environment recognition through experiments and discuss its possibility as a communication device for realizing joint haptic attention.</EA>
<CC>001D02B04</CC>
<FD>Assistance utilisateur; Aide handicapé; Mobilité; Détection objet; Interface utilisateur; Regard; Sensibilité tactile; Attention visuelle; Trouble vision; Détecteur proximité; Intention</FD>
<ED>User assistance; Handicapped aid; Mobility; Object detection; User interface; Gaze; Tactile sensitivity; Visual attention; Vision disorder; Proximity detector; Intention</ED>
<SD>Asistencia usuario; Ayuda minusválido; Movilidad; Interfase usuario; Mirada; Sensibilidad tactil; Atención visual; Trastorno visión; Detector proximidad; Intencíon</SD>
<LO>INIST-16343.354000153622481800</LO>
<ID>07-0525257</ID>
</server>
</inist>
</record>

To manipulate this document under Unix (Dilib)

# Path to this exploration step inside the Wicri tree (requires $WICRI_ROOT)
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Corpus
# Select record 000A78 from the HFD base, indent the XML, and page through it
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000A78 | SxmlIndent | more

Or

# Same selection, addressed from the exploration area root (requires $EXPLOR_AREA)
HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Corpus/biblio.hfd -nk 000A78 | SxmlIndent | more
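
The indented XML can then be mined with ordinary Unix text tools. As a minimal sketch (the file name record.xml is an assumption, not part of the Dilib toolchain), the English keyword terms of this record can be pulled out like this:

# Save the indented record to a working file (hypothetical name)
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000A78 | SxmlIndent > record.xml
# Keep only the English (KwdEn) keyword block, then strip the <term> markup
awk '/scheme="KwdEn"/,/<\/keywords>/' record.xml |
  sed -n 's/.*<term>\(.*\)<\/term>.*/\1/p'

Run against the record shown above, this would print the eleven English descriptors (Gaze, Handicapped aid, Intention, ...), one per line.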

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PascalFrancis
   |étape=   Corpus
   |type=    RBID
   |clé=     Pascal:07-0525257
   |texte=   CyARM : Interactive device for environment recognition and joint haptic attention using non-visual modality
}}

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024