Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated by automated means from raw corpora.
The information is therefore not validated.

Rhythmic interaction for song filtering on a mobile device

Internal identifier: 000A43 (PascalFrancis/Curation); previous: 000A42; next: 000A44

Authors: Andrew Crossan [United Kingdom]; Roderick Murray-Smith [United Kingdom, Ireland]

Source:

RBID: Pascal:08-0032199

French descriptors

English descriptors

Abstract

This paper describes a mobile implementation of song filtering using rhythmic interaction. A user taps the screen or shakes the device (sensed through an accelerometer) at the tempo of a particular song in order to listen to it. We use the variability in beat frequency to display ambiguity to allow users to adjust their actions based on the given feedback. The results of a pilot study for a simple object selection task showed that although the tapping interface provided a larger range of comfortable tempos, participants could use both tapping and shaking methods to select a given song. Finally, the effects of variability in a rhythmic interaction style of interface are discussed.
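The interaction described in the abstract (tapping at a song's tempo, with variability in beat frequency used as a measure of ambiguity) can be sketched in a few lines. This is not the authors' implementation; the song catalogue, function names, and tap data below are hypothetical, purely to illustrate the idea of matching a tapped tempo against candidate songs.

```python
import statistics

def estimate_tempo(tap_times):
    """Estimate tempo (beats per minute) and its variability from a
    list of tap timestamps in seconds."""
    intervals = [b - a for a, b in zip(tap_times, tap_times[1:])]
    mean_interval = statistics.mean(intervals)
    bpm = 60.0 / mean_interval
    # The spread of the inter-tap intervals reflects how steady the
    # rhythm is; the paper uses this kind of variability to display
    # ambiguity back to the user.
    spread = statistics.stdev(intervals) if len(intervals) > 1 else 0.0
    return bpm, spread

def closest_song(bpm, songs):
    """Pick the song whose tempo is nearest the tapped tempo.
    `songs` maps song name -> tempo in BPM (hypothetical catalogue)."""
    return min(songs, key=lambda name: abs(songs[name] - bpm))

# Taps roughly every 0.5 s, i.e. close to 120 BPM.
taps = [0.00, 0.51, 1.00, 1.52, 2.01]
bpm, spread = estimate_tempo(taps)
print(round(bpm))                                            # 119
print(closest_song(bpm, {"slow song": 80, "fast song": 120}))  # fast song
```

In the paper's setup the same estimate could come from accelerometer peaks (shaking) instead of screen taps; only the source of the timestamps changes.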
pA  
A01 01  1    @0 0302-9743
A05       @2 4129
A08 01  1  ENG  @1 Rhythmic interaction for song filtering on a mobile device
A09 01  1  ENG  @1 Haptic and audio interaction design : First international workshop, HAID 2006, Glasgow, UK, August 31-September 1, 2006 : proceedings
A11 01  1    @1 CROSSAN (Andrew)
A11 02  1    @1 MURRAY-SMITH (Roderick)
A12 01  1    @1 MCGOOKIN (David) @9 ed.
A12 02  1    @1 BREWSTER (Stephen) @9 ed.
A14 01      @1 Department of Computing Science, University of Glasgow @2 Scotland, G12 8QQ @3 GBR @Z 1 aut. @Z 2 aut.
A14 02      @1 Hamilton Institute, NUI Maynooth @2 Co. Kildare @3 IRL @Z 2 aut.
A20       @1 45-55
A21       @1 2006
A23 01      @0 ENG
A26 01      @0 3-540-37595-3
A43 01      @1 INIST @2 16343 @5 354000153642100050
A44       @0 0000 @1 © 2008 INIST-CNRS. All rights reserved.
A45       @0 9 ref.
A47 01  1    @0 08-0032199
A60       @1 P @2 C
A61       @0 A
A64 01  1    @0 Lecture notes in computer science
A66 01      @0 DEU
A66 02      @0 USA
C01 01    ENG  @0 This paper describes a mobile implementation of song filtering using rhythmic interaction. A user taps the screen or shakes the device (sensed through an accelerometer) at the tempo of a particular song in order to listen to it. We use the variability in beat frequency to display ambiguity to allow users to adjust their actions based on the given feedback. The results of a pilot study for a simple object selection task showed that although the tapping interface provided a larger range of comfortable tempos, participants could use both tapping and shaking methods to select a given song. Finally, the effects of variability in a rhythmic interaction style of interface are discussed.
C02 01  X    @0 001D02B04
C03 01  X  FRE  @0 Interface utilisateur @5 01
C03 01  X  ENG  @0 User interface @5 01
C03 01  X  SPA  @0 Interfase usuario @5 01
C03 02  X  FRE  @0 Aide handicapé @5 02
C03 02  X  ENG  @0 Handicapped aid @5 02
C03 02  X  SPA  @0 Ayuda minusválido @5 02
C03 03  X  FRE  @0 Assistance utilisateur @5 03
C03 03  X  ENG  @0 User assistance @5 03
C03 03  X  SPA  @0 Asistencia usuario @5 03
C03 04  X  FRE  @0 Perception @5 04
C03 04  X  ENG  @0 Perception @5 04
C03 04  X  SPA  @0 Percepción @5 04
C03 05  X  FRE  @0 Fréquence battement @5 06
C03 05  X  ENG  @0 Beat frequency @5 06
C03 05  X  SPA  @0 Frecuencia golpeo @5 06
C03 06  X  FRE  @0 Boucle réaction @5 07
C03 06  X  ENG  @0 Feedback @5 07
C03 06  X  SPA  @0 Retroalimentación @5 07
C03 07  X  FRE  @0 Chant @5 18
C03 07  X  ENG  @0 Song @5 18
C03 07  X  SPA  @0 Canto @5 18
C03 08  3  FRE  @0 Informatique mobile @5 19
C03 08  3  ENG  @0 Mobile computing @5 19
C03 09  X  FRE  @0 Variabilité @5 20
C03 09  X  ENG  @0 Variability @5 20
C03 09  X  SPA  @0 Variabilidad @5 20
C03 10  X  FRE  @0 Affichage @5 21
C03 10  X  ENG  @0 Display @5 21
C03 10  X  SPA  @0 Visualización @5 21
C03 11  X  FRE  @0 Ambiguité @5 22
C03 11  X  ENG  @0 Ambiguity @5 22
C03 11  X  SPA  @0 Ambiguedad @5 22
C03 12  X  FRE  @0 Filtrage @5 23
C03 12  X  ENG  @0 Filtering @5 23
C03 12  X  SPA  @0 Filtrado @5 23
C03 13  X  FRE  @0 Filtre @5 24
C03 13  X  ENG  @0 Filter @5 24
C03 13  X  SPA  @0 Filtro @5 24
C03 14  X  FRE  @0 Taraudage @5 25
C03 14  X  ENG  @0 Tapping @5 25
C03 14  X  SPA  @0 Aterrajado @5 25
C03 15  X  FRE  @0 Accéléromètre @5 33
C03 15  X  ENG  @0 Accelerometer @5 33
C03 15  X  SPA  @0 Acelerómetro @5 33
C03 16  X  FRE  @0 . @4 INC @5 82
N21       @1 052
N44 01      @1 OTO
N82       @1 OTO
pR  
A30 01  1  ENG  @1 International Workshop on Haptic and Audio Interaction Design @2 1 @3 Glasgow GBR @4 2006

Links to previous steps (curation, corpus...)


Links to Exploration step

Pascal:08-0032199

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">Rhythmic interaction for song filtering on a mobile device</title>
<author>
<name sortKey="Crossan, Andrew" sort="Crossan, Andrew" uniqKey="Crossan A" first="Andrew" last="Crossan">Andrew Crossan</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Department of Computing Science, University of Glasgow</s1>
<s2>Scotland, G12 8QQ</s2>
<s3>GBR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
</inist:fA14>
<country>Royaume-Uni</country>
</affiliation>
</author>
<author>
<name sortKey="Murray Smith, Roderick" sort="Murray Smith, Roderick" uniqKey="Murray Smith R" first="Roderick" last="Murray-Smith">Roderick Murray-Smith</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Department of Computing Science, University of Glasgow</s1>
<s2>Scotland, G12 8QQ</s2>
<s3>GBR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
</inist:fA14>
<country>Royaume-Uni</country>
</affiliation>
<affiliation wicri:level="1">
<inist:fA14 i1="02">
<s1>Hamilton Institute, NUI Maynooth</s1>
<s2>Co. Kildare</s2>
<s3>IRL</s3>
<sZ>2 aut.</sZ>
</inist:fA14>
<country>Irlande (pays)</country>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">08-0032199</idno>
<date when="2006">2006</date>
<idno type="stanalyst">PASCAL 08-0032199 INIST</idno>
<idno type="RBID">Pascal:08-0032199</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000A18</idno>
<idno type="wicri:Area/PascalFrancis/Curation">000A43</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">Rhythmic interaction for song filtering on a mobile device</title>
<author>
<name sortKey="Crossan, Andrew" sort="Crossan, Andrew" uniqKey="Crossan A" first="Andrew" last="Crossan">Andrew Crossan</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Department of Computing Science, University of Glasgow</s1>
<s2>Scotland, G12 8QQ</s2>
<s3>GBR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
</inist:fA14>
<country>Royaume-Uni</country>
</affiliation>
</author>
<author>
<name sortKey="Murray Smith, Roderick" sort="Murray Smith, Roderick" uniqKey="Murray Smith R" first="Roderick" last="Murray-Smith">Roderick Murray-Smith</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>Department of Computing Science, University of Glasgow</s1>
<s2>Scotland, G12 8QQ</s2>
<s3>GBR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
</inist:fA14>
<country>Royaume-Uni</country>
</affiliation>
<affiliation wicri:level="1">
<inist:fA14 i1="02">
<s1>Hamilton Institute, NUI Maynooth</s1>
<s2>Co. Kildare</s2>
<s3>IRL</s3>
<sZ>2 aut.</sZ>
</inist:fA14>
<country>Irlande (pays)</country>
</affiliation>
</author>
</analytic>
<series>
<title level="j" type="main">Lecture notes in computer science</title>
<idno type="ISSN">0302-9743</idno>
<imprint>
<date when="2006">2006</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt>
<title level="j" type="main">Lecture notes in computer science</title>
<idno type="ISSN">0302-9743</idno>
</seriesStmt>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Accelerometer</term>
<term>Ambiguity</term>
<term>Beat frequency</term>
<term>Display</term>
<term>Feedback</term>
<term>Filter</term>
<term>Filtering</term>
<term>Handicapped aid</term>
<term>Mobile computing</term>
<term>Perception</term>
<term>Song</term>
<term>Tapping</term>
<term>User assistance</term>
<term>User interface</term>
<term>Variability</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Interface utilisateur</term>
<term>Aide handicapé</term>
<term>Assistance utilisateur</term>
<term>Perception</term>
<term>Fréquence battement</term>
<term>Boucle réaction</term>
<term>Chant</term>
<term>Informatique mobile</term>
<term>Variabilité</term>
<term>Affichage</term>
<term>Ambiguité</term>
<term>Filtrage</term>
<term>Filtre</term>
<term>Taraudage</term>
<term>Accéléromètre</term>
<term>.</term>
</keywords>
<keywords scheme="Wicri" type="topic" xml:lang="fr">
<term>Affichage</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">This paper describes a mobile implementation of song filtering using rhythmic interaction. A user taps the screen or shakes the device (sensed through an accelerometer) at the tempo of a particular song in order to listen to it. We use the variability in beat frequency to display ambiguity to allow users to adjust their actions based on the given feedback. The results of a pilot study for a simple object selection task showed that although the tapping interface provided a larger range of comfortable tempos, participants could use both tapping and shaking methods to select a given song. Finally, the effects of variability in a rhythmic interaction style of interface are discussed.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA01 i1="01" i2="1">
<s0>0302-9743</s0>
</fA01>
<fA05>
<s2>4129</s2>
</fA05>
<fA08 i1="01" i2="1" l="ENG">
<s1>Rhythmic interaction for song filtering on a mobile device</s1>
</fA08>
<fA09 i1="01" i2="1" l="ENG">
<s1>Haptic and audio interaction design : First international workshop, HAID 2006, Glasgow, UK, August 31-September 1, 2006 : proceedings</s1>
</fA09>
<fA11 i1="01" i2="1">
<s1>CROSSAN (Andrew)</s1>
</fA11>
<fA11 i1="02" i2="1">
<s1>MURRAY-SMITH (Roderick)</s1>
</fA11>
<fA12 i1="01" i2="1">
<s1>MCGOOKIN (David)</s1>
<s9>ed.</s9>
</fA12>
<fA12 i1="02" i2="1">
<s1>BREWSTER (Stephen)</s1>
<s9>ed.</s9>
</fA12>
<fA14 i1="01">
<s1>Department of Computing Science, University of Glasgow</s1>
<s2>Scotland, G12 8QQ</s2>
<s3>GBR</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
</fA14>
<fA14 i1="02">
<s1>Hamilton Institute, NUI Maynooth</s1>
<s2>Co. Kildare</s2>
<s3>IRL</s3>
<sZ>2 aut.</sZ>
</fA14>
<fA20>
<s1>45-55</s1>
</fA20>
<fA21>
<s1>2006</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA26 i1="01">
<s0>3-540-37595-3</s0>
</fA26>
<fA43 i1="01">
<s1>INIST</s1>
<s2>16343</s2>
<s5>354000153642100050</s5>
</fA43>
<fA44>
<s0>0000</s0>
<s1>© 2008 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45>
<s0>9 ref.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>08-0032199</s0>
</fA47>
<fA60>
<s1>P</s1>
<s2>C</s2>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA64 i1="01" i2="1">
<s0>Lecture notes in computer science</s0>
</fA64>
<fA66 i1="01">
<s0>DEU</s0>
</fA66>
<fA66 i1="02">
<s0>USA</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>This paper describes a mobile implementation of song filtering using rhythmic interaction. A user taps the screen or shakes the device (sensed through an accelerometer) at the tempo of a particular song in order to listen to it. We use the variability in beat frequency to display ambiguity to allow users to adjust their actions based on the given feedback. The results of a pilot study for a simple object selection task showed that although the tapping interface provided a larger range of comfortable tempos, participants could use both tapping and shaking methods to select a given song. Finally, the effects of variability in a rhythmic interaction style of interface are discussed.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>001D02B04</s0>
</fC02>
<fC03 i1="01" i2="X" l="FRE">
<s0>Interface utilisateur</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG">
<s0>User interface</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA">
<s0>Interfase usuario</s0>
<s5>01</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE">
<s0>Aide handicapé</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG">
<s0>Handicapped aid</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA">
<s0>Ayuda minusválido</s0>
<s5>02</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE">
<s0>Assistance utilisateur</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG">
<s0>User assistance</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA">
<s0>Asistencia usuario</s0>
<s5>03</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE">
<s0>Perception</s0>
<s5>04</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG">
<s0>Perception</s0>
<s5>04</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA">
<s0>Percepción</s0>
<s5>04</s5>
</fC03>
<fC03 i1="05" i2="X" l="FRE">
<s0>Fréquence battement</s0>
<s5>06</s5>
</fC03>
<fC03 i1="05" i2="X" l="ENG">
<s0>Beat frequency</s0>
<s5>06</s5>
</fC03>
<fC03 i1="05" i2="X" l="SPA">
<s0>Frecuencia golpeo</s0>
<s5>06</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE">
<s0>Boucle réaction</s0>
<s5>07</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG">
<s0>Feedback</s0>
<s5>07</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA">
<s0>Retroalimentación</s0>
<s5>07</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE">
<s0>Chant</s0>
<s5>18</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG">
<s0>Song</s0>
<s5>18</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA">
<s0>Canto</s0>
<s5>18</s5>
</fC03>
<fC03 i1="08" i2="3" l="FRE">
<s0>Informatique mobile</s0>
<s5>19</s5>
</fC03>
<fC03 i1="08" i2="3" l="ENG">
<s0>Mobile computing</s0>
<s5>19</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE">
<s0>Variabilité</s0>
<s5>20</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG">
<s0>Variability</s0>
<s5>20</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA">
<s0>Variabilidad</s0>
<s5>20</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE">
<s0>Affichage</s0>
<s5>21</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG">
<s0>Display</s0>
<s5>21</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA">
<s0>Visualización</s0>
<s5>21</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE">
<s0>Ambiguité</s0>
<s5>22</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG">
<s0>Ambiguity</s0>
<s5>22</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA">
<s0>Ambiguedad</s0>
<s5>22</s5>
</fC03>
<fC03 i1="12" i2="X" l="FRE">
<s0>Filtrage</s0>
<s5>23</s5>
</fC03>
<fC03 i1="12" i2="X" l="ENG">
<s0>Filtering</s0>
<s5>23</s5>
</fC03>
<fC03 i1="12" i2="X" l="SPA">
<s0>Filtrado</s0>
<s5>23</s5>
</fC03>
<fC03 i1="13" i2="X" l="FRE">
<s0>Filtre</s0>
<s5>24</s5>
</fC03>
<fC03 i1="13" i2="X" l="ENG">
<s0>Filter</s0>
<s5>24</s5>
</fC03>
<fC03 i1="13" i2="X" l="SPA">
<s0>Filtro</s0>
<s5>24</s5>
</fC03>
<fC03 i1="14" i2="X" l="FRE">
<s0>Taraudage</s0>
<s5>25</s5>
</fC03>
<fC03 i1="14" i2="X" l="ENG">
<s0>Tapping</s0>
<s5>25</s5>
</fC03>
<fC03 i1="14" i2="X" l="SPA">
<s0>Aterrajado</s0>
<s5>25</s5>
</fC03>
<fC03 i1="15" i2="X" l="FRE">
<s0>Accéléromètre</s0>
<s5>33</s5>
</fC03>
<fC03 i1="15" i2="X" l="ENG">
<s0>Accelerometer</s0>
<s5>33</s5>
</fC03>
<fC03 i1="15" i2="X" l="SPA">
<s0>Acelerómetro</s0>
<s5>33</s5>
</fC03>
<fC03 i1="16" i2="X" l="FRE">
<s0>.</s0>
<s4>INC</s4>
<s5>82</s5>
</fC03>
<fN21>
<s1>052</s1>
</fN21>
<fN44 i1="01">
<s1>OTO</s1>
</fN44>
<fN82>
<s1>OTO</s1>
</fN82>
</pA>
<pR>
<fA30 i1="01" i2="1" l="ENG">
<s1>International Workshop on Haptic and Audio Interaction Design</s1>
<s2>1</s2>
<s3>Glasgow GBR</s3>
<s4>2006</s4>
</fA30>
</pR>
</standard>
</inist>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Curation
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000A43 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Curation/biblio.hfd -nk 000A43 | SxmlIndent | more
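If the Dilib tools are not available, the exported XML record shown above can be read with standard tools instead. A minimal Python sketch, using only the standard library and a short illustrative excerpt of the record (the real export contains the full `<record>` element, including namespaced `wicri:` and `inist:` attributes, omitted here for brevity):

```python
import xml.etree.ElementTree as ET

# Illustrative excerpt of the exported TEI record; the real file holds
# the complete <record> element shown in the section above.
RECORD = """<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title level="a">Rhythmic interaction for song filtering on a mobile device</title>
</titleStmt>
<publicationStmt>
<idno type="RBID">Pascal:08-0032199</idno>
<date when="2006">2006</date>
</publicationStmt>
</fileDesc>
</teiHeader>
</TEI>
</record>"""

root = ET.fromstring(RECORD)
title = root.findtext(".//title")                  # article title
rbid = root.find(".//idno[@type='RBID']").text     # record identifier
year = root.find(".//date").get("when")            # publication year
print(title)
print(rbid, year)
```

The same XPath-style queries (`.//idno[@type='...']`) extract any of the other `idno` fields, such as the Curation-step identifier 000A43.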

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PascalFrancis
   |étape=   Curation
   |type=    RBID
   |clé=     Pascal:08-0032199
   |texte=   Rhythmic interaction for song filtering on a mobile device
}}

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024