Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Lemma 4 : Haptic input + auditory display = musical instrument?

Internal identifier: 000A69 (PascalFrancis/Corpus); previous: 000A68; next: 000A70

Lemma 4 : Haptic input + auditory display = musical instrument?

Authors: Paul Vickers

Source:

RBID : Francis:08-0032200

French descriptors

English descriptors

Abstract

In this paper we look at some of the design issues that affect the success of multimodal displays that combine acoustic and haptic modalities. First, issues affecting successful sonification design are explored and suggestions are made about how the language of electroacoustic music can assist. Next, haptic interaction is introduced in the light of this discussion, particularly focusing on the roles of gesture and mimesis. Finally, some observations are made regarding some of the issues that arise when the haptic and acoustic modalities are combined in the interface. This paper looks at examples of where auditory and haptic interaction have been successfully combined beyond the strict confines of the human-computer application interface (musical instruments in particular) and discusses lessons that may be drawn from these domains and applied to the world of multimodal human-computer interaction. The argument is made that combined haptic-auditory interaction schemes can be thought of as musical instruments and some of the possible ramifications of this are raised.

Record in standard format (ISO 2709)

See the documentation on the Inist Standard format.

pA  
A01 01  1    @0 0302-9743
A05       @2 4129
A08 01  1  ENG  @1 Lemma 4 : Haptic input + auditory display = musical instrument?
A09 01  1  ENG  @1 Haptic and audio interaction design : First international workshop, HAID 2006, Glasgow, UK, August 31-September 1, 2006 : proceedings
A11 01  1    @1 VICKERS (Paul)
A12 01  1    @1 MCGOOKIN (David) @9 ed.
A12 02  1    @1 BREWSTER (Stephen) @9 ed.
A14 01      @1 Northumbria University, School of Computing, Engineering, and Information Sciences, Pandon Building, Camden St @2 Newcastle upon Tyne NE2 1XE @3 GBR @Z 1 aut.
A20       @1 56-67
A21       @1 2006
A23 01      @0 ENG
A26 01      @0 3-540-37595-3
A43 01      @1 INIST @2 16343 @5 354000153642100060
A44       @0 0000 @1 © 2008 INIST-CNRS. All rights reserved.
A45       @0 28 ref.
A47 01  1    @0 08-0032200
A60       @1 P @2 C
A61       @0 A
A64 01  1    @0 Lecture notes in computer science
A66 01      @0 DEU
A66 02      @0 USA
C01 01    ENG  @0 In this paper we look at some of the design issues that affect the success of multimodal displays that combine acoustic and haptic modalities. First, issues affecting successful sonification design are explored and suggestions are made about how the language of electroacoustic music can assist. Next, haptic interaction is introduced in the light of this discussion, particularly focusing on the roles of gesture and mimesis. Finally, some observations are made regarding some of the issues that arise when the haptic and acoustic modalities are combined in the interface. This paper looks at examples of where auditory and haptic interaction have been successfully combined beyond the strict confines of the human-computer application interface (musical instruments in particular) and discusses lessons that may be drawn from these domains and applied to the world of multimodal human-computer interaction. The argument is made that combined haptic-auditory interaction schemes can be thought of as musical instruments and some of the possible ramifications of this are raised.
C02 01  X    @0 770B05D @1 II
C03 01  X  FRE  @0 Aide handicapé @5 01
C03 01  X  ENG  @0 Handicapped aid @5 01
C03 01  X  SPA  @0 Ayuda minusválido @5 01
C03 02  X  FRE  @0 Assistance utilisateur @5 02
C03 02  X  ENG  @0 User assistance @5 02
C03 02  X  SPA  @0 Asistencia usuario @5 02
C03 03  X  FRE  @0 Perception @5 03
C03 03  X  ENG  @0 Perception @5 03
C03 03  X  SPA  @0 Percepción @5 03
C03 04  3  FRE  @0 Acoustique audio @5 06
C03 04  3  ENG  @0 Audio acoustics @5 06
C03 05  X  FRE  @0 Instrument musique @5 07
C03 05  X  ENG  @0 Musical instrument @5 07
C03 05  X  SPA  @0 Instrumento musical @5 07
C03 06  X  FRE  @0 Acoustique musicale @5 08
C03 06  X  ENG  @0 Musical acoustics @5 08
C03 06  X  SPA  @0 Acústica musical @5 08
C03 07  X  FRE  @0 Guide onde multimode @5 09
C03 07  X  ENG  @0 Multimode waveguide @5 09
C03 07  X  SPA  @0 Guía onda multimodo @5 09
C03 08  X  FRE  @0 Système homme machine @5 10
C03 08  X  ENG  @0 Man machine system @5 10
C03 08  X  SPA  @0 Sistema hombre máquina @5 10
C03 09  X  FRE  @0 Interface utilisateur @5 15
C03 09  X  ENG  @0 User interface @5 15
C03 09  X  SPA  @0 Interfase usuario @5 15
C03 10  X  FRE  @0 Sensibilité tactile @5 18
C03 10  X  ENG  @0 Tactile sensitivity @5 18
C03 10  X  SPA  @0 Sensibilidad tactil @5 18
C03 11  X  FRE  @0 Affichage @5 19
C03 11  X  ENG  @0 Display @5 19
C03 11  X  SPA  @0 Visualización @5 19
C03 12  X  FRE  @0 Musique @5 20
C03 12  X  ENG  @0 Music @5 20
C03 12  X  SPA  @0 Música @5 20
C03 13  X  FRE  @0 Geste @5 21
C03 13  X  ENG  @0 Gesture @5 21
C03 13  X  SPA  @0 Gesto @5 21
C03 14  X  FRE  @0 Audition @5 22
C03 14  X  ENG  @0 Hearing @5 22
C03 14  X  SPA  @0 Audición @5 22
C03 15  X  FRE  @0 . @4 INC @5 82
N21       @1 052
N44 01      @1 OTO
N82       @1 OTO
pR  
A30 01  1  ENG  @1 International Workshop on Haptic and Audio Interaction Design @2 1 @3 Glasgow GBR @4 2006

Inist format (server)

NO : FRANCIS 08-0032200 INIST
ET : Lemma 4 : Haptic input + auditory display = musical instrument?
AU : VICKERS (Paul); MCGOOKIN (David); BREWSTER (Stephen)
AF : Northumbria University, School of Computing, Engineering, and Information Sciences, Pandon Building, Camden St/Newcastle upon Tyne NE2 1XE/Royaume-Uni (1 aut.)
DT : Publication en série; Congrès; Niveau analytique
SO : Lecture notes in computer science; ISSN 0302-9743; Allemagne; Da. 2006; Vol. 4129; Pp. 56-67; Bibl. 28 ref.
LA : Anglais
EA : In this paper we look at some of the design issues that affect the success of multimodal displays that combine acoustic and haptic modalities. First, issues affecting successful sonification design are explored and suggestions are made about how the language of electroacoustic music can assist. Next, haptic interaction is introduced in the light of this discussion, particularly focusing on the roles of gesture and mimesis. Finally, some observations are made regarding some of the issues that arise when the haptic and acoustic modalities are combined in the interface. This paper looks at examples of where auditory and haptic interaction have been successfully combined beyond the strict confines of the human-computer application interface (musical instruments in particular) and discusses lessons that may be drawn from these domains and applied to the world of multimodal human-computer interaction. The argument is made that combined haptic-auditory interaction schemes can be thought of as musical instruments and some of the possible ramifications of this are raised.
CC : 770B05D
FD : Aide handicapé; Assistance utilisateur; Perception; Acoustique audio; Instrument musique; Acoustique musicale; Guide onde multimode; Système homme machine; Interface utilisateur; Sensibilité tactile; Affichage; Musique; Geste; Audition; .
ED : Handicapped aid; User assistance; Perception; Audio acoustics; Musical instrument; Musical acoustics; Multimode waveguide; Man machine system; User interface; Tactile sensitivity; Display; Music; Gesture; Hearing
SD : Ayuda minusválido; Asistencia usuario; Percepción; Instrumento musical; Acústica musical; Guía onda multimodo; Sistema hombre máquina; Interfase usuario; Sensibilidad tactil; Visualización; Música; Gesto; Audición
LO : INIST-16343.354000153642100060
ID : 08-0032200

Links to Exploration step

Francis:08-0032200

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">Lemma 4 : Haptic input + auditory display = musical instrument?</title>
<author>
<name sortKey="Vickers, Paul" sort="Vickers, Paul" uniqKey="Vickers P" first="Paul" last="Vickers">Paul Vickers</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Northumbria University, School of Computing, Engineering, and Information Sciences, Pandon Building, Camden St</s1>
<s2>Newcastle upon Tyne NE2 1XE</s2>
<s3>GBR</s3>
<sZ>1 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">08-0032200</idno>
<date when="2006">2006</date>
<idno type="stanalyst">FRANCIS 08-0032200 INIST</idno>
<idno type="RBID">Francis:08-0032200</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000A69</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">Lemma 4 : Haptic input + auditory display = musical instrument?</title>
<author>
<name sortKey="Vickers, Paul" sort="Vickers, Paul" uniqKey="Vickers P" first="Paul" last="Vickers">Paul Vickers</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Northumbria University, School of Computing, Engineering, and Information Sciences, Pandon Building, Camden St</s1>
<s2>Newcastle upon Tyne NE2 1XE</s2>
<s3>GBR</s3>
<sZ>1 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</analytic>
<series>
<title level="j" type="main">Lecture notes in computer science</title>
<idno type="ISSN">0302-9743</idno>
<imprint>
<date when="2006">2006</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt>
<title level="j" type="main">Lecture notes in computer science</title>
<idno type="ISSN">0302-9743</idno>
</seriesStmt>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Audio acoustics</term>
<term>Display</term>
<term>Gesture</term>
<term>Handicapped aid</term>
<term>Hearing</term>
<term>Man machine system</term>
<term>Multimode waveguide</term>
<term>Music</term>
<term>Musical acoustics</term>
<term>Musical instrument</term>
<term>Perception</term>
<term>Tactile sensitivity</term>
<term>User assistance</term>
<term>User interface</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Aide handicapé</term>
<term>Assistance utilisateur</term>
<term>Perception</term>
<term>Acoustique audio</term>
<term>Instrument musique</term>
<term>Acoustique musicale</term>
<term>Guide onde multimode</term>
<term>Système homme machine</term>
<term>Interface utilisateur</term>
<term>Sensibilité tactile</term>
<term>Affichage</term>
<term>Musique</term>
<term>Geste</term>
<term>Audition</term>
<term>.</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">In this paper we look at some of the design issues that affect the success of multimodal displays that combine acoustic and haptic modalities. First, issues affecting successful sonification design are explored and suggestions are made about how the language of electroacoustic music can assist. Next, haptic interaction is introduced in the light of this discussion, particularly focusing on the roles of gesture and mimesis. Finally, some observations are made regarding some of the issues that arise when the haptic and acoustic modalities are combined in the interface. This paper looks at examples of where auditory and haptic interaction have been successfully combined beyond the strict confines of the human-computer application interface (musical instruments in particular) and discusses lessons that may be drawn from these domains and applied to the world of multimodal human-computer interaction. The argument is made that combined haptic-auditory interaction schemes can be thought of as musical instruments and some of the possible ramifications of this are raised.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA01 i1="01" i2="1">
<s0>0302-9743</s0>
</fA01>
<fA05>
<s2>4129</s2>
</fA05>
<fA08 i1="01" i2="1" l="ENG">
<s1>Lemma 4 : Haptic input + auditory display = musical instrument?</s1>
</fA08>
<fA09 i1="01" i2="1" l="ENG">
<s1>Haptic and audio interaction design : First international workshop, HAID 2006, Glasgow, UK, August 31-September 1, 2006 : proceedings</s1>
</fA09>
<fA11 i1="01" i2="1">
<s1>VICKERS (Paul)</s1>
</fA11>
<fA12 i1="01" i2="1">
<s1>MCGOOKIN (David)</s1>
<s9>ed.</s9>
</fA12>
<fA12 i1="02" i2="1">
<s1>BREWSTER (Stephen)</s1>
<s9>ed.</s9>
</fA12>
<fA14 i1="01">
<s1>Northumbria University, School of Computing, Engineering, and Information Sciences, Pandon Building, Camden St</s1>
<s2>Newcastle upon Tyne NE2 1XE</s2>
<s3>GBR</s3>
<sZ>1 aut.</sZ>
</fA14>
<fA20>
<s1>56-67</s1>
</fA20>
<fA21>
<s1>2006</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA26 i1="01">
<s0>3-540-37595-3</s0>
</fA26>
<fA43 i1="01">
<s1>INIST</s1>
<s2>16343</s2>
<s5>354000153642100060</s5>
</fA43>
<fA44>
<s0>0000</s0>
<s1>© 2008 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45>
<s0>28 ref.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>08-0032200</s0>
</fA47>
<fA60>
<s1>P</s1>
<s2>C</s2>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA64 i1="01" i2="1">
<s0>Lecture notes in computer science</s0>
</fA64>
<fA66 i1="01">
<s0>DEU</s0>
</fA66>
<fA66 i1="02">
<s0>USA</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>In this paper we look at some of the design issues that affect the success of multimodal displays that combine acoustic and haptic modalities. First, issues affecting successful sonification design are explored and suggestions are made about how the language of electroacoustic music can assist. Next, haptic interaction is introduced in the light of this discussion, particularly focusing on the roles of gesture and mimesis. Finally, some observations are made regarding some of the issues that arise when the haptic and acoustic modalities are combined in the interface. This paper looks at examples of where auditory and haptic interaction have been successfully combined beyond the strict confines of the human-computer application interface (musical instruments in particular) and discusses lessons that may be drawn from these domains and applied to the world of multimodal human-computer interaction. The argument is made that combined haptic-auditory interaction schemes can be thought of as musical instruments and some of the possible ramifications of this are raised.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>770B05D</s0>
<s1>II</s1>
</fC02>
<fC03 i1="01" i2="X" l="FRE">
<s0>Aide handicapé</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG">
<s0>Handicapped aid</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA">
<s0>Ayuda minusválido</s0>
<s5>01</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE">
<s0>Assistance utilisateur</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG">
<s0>User assistance</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA">
<s0>Asistencia usuario</s0>
<s5>02</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE">
<s0>Perception</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG">
<s0>Perception</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA">
<s0>Percepción</s0>
<s5>03</s5>
</fC03>
<fC03 i1="04" i2="3" l="FRE">
<s0>Acoustique audio</s0>
<s5>06</s5>
</fC03>
<fC03 i1="04" i2="3" l="ENG">
<s0>Audio acoustics</s0>
<s5>06</s5>
</fC03>
<fC03 i1="05" i2="X" l="FRE">
<s0>Instrument musique</s0>
<s5>07</s5>
</fC03>
<fC03 i1="05" i2="X" l="ENG">
<s0>Musical instrument</s0>
<s5>07</s5>
</fC03>
<fC03 i1="05" i2="X" l="SPA">
<s0>Instrumento musical</s0>
<s5>07</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE">
<s0>Acoustique musicale</s0>
<s5>08</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG">
<s0>Musical acoustics</s0>
<s5>08</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA">
<s0>Acústica musical</s0>
<s5>08</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE">
<s0>Guide onde multimode</s0>
<s5>09</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG">
<s0>Multimode waveguide</s0>
<s5>09</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA">
<s0>Guía onda multimodo</s0>
<s5>09</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE">
<s0>Système homme machine</s0>
<s5>10</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG">
<s0>Man machine system</s0>
<s5>10</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA">
<s0>Sistema hombre máquina</s0>
<s5>10</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE">
<s0>Interface utilisateur</s0>
<s5>15</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG">
<s0>User interface</s0>
<s5>15</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA">
<s0>Interfase usuario</s0>
<s5>15</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE">
<s0>Sensibilité tactile</s0>
<s5>18</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG">
<s0>Tactile sensitivity</s0>
<s5>18</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA">
<s0>Sensibilidad tactil</s0>
<s5>18</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE">
<s0>Affichage</s0>
<s5>19</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG">
<s0>Display</s0>
<s5>19</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA">
<s0>Visualización</s0>
<s5>19</s5>
</fC03>
<fC03 i1="12" i2="X" l="FRE">
<s0>Musique</s0>
<s5>20</s5>
</fC03>
<fC03 i1="12" i2="X" l="ENG">
<s0>Music</s0>
<s5>20</s5>
</fC03>
<fC03 i1="12" i2="X" l="SPA">
<s0>Música</s0>
<s5>20</s5>
</fC03>
<fC03 i1="13" i2="X" l="FRE">
<s0>Geste</s0>
<s5>21</s5>
</fC03>
<fC03 i1="13" i2="X" l="ENG">
<s0>Gesture</s0>
<s5>21</s5>
</fC03>
<fC03 i1="13" i2="X" l="SPA">
<s0>Gesto</s0>
<s5>21</s5>
</fC03>
<fC03 i1="14" i2="X" l="FRE">
<s0>Audition</s0>
<s5>22</s5>
</fC03>
<fC03 i1="14" i2="X" l="ENG">
<s0>Hearing</s0>
<s5>22</s5>
</fC03>
<fC03 i1="14" i2="X" l="SPA">
<s0>Audición</s0>
<s5>22</s5>
</fC03>
<fC03 i1="15" i2="X" l="FRE">
<s0>.</s0>
<s4>INC</s4>
<s5>82</s5>
</fC03>
<fN21>
<s1>052</s1>
</fN21>
<fN44 i1="01">
<s1>OTO</s1>
</fN44>
<fN82>
<s1>OTO</s1>
</fN82>
</pA>
<pR>
<fA30 i1="01" i2="1" l="ENG">
<s1>International Workshop on Haptic and Audio Interaction Design</s1>
<s2>1</s2>
<s3>Glasgow GBR</s3>
<s4>2006</s4>
</fA30>
</pR>
</standard>
<server>
<NO>FRANCIS 08-0032200 INIST</NO>
<ET>Lemma 4 : Haptic input + auditory display = musical instrument?</ET>
<AU>VICKERS (Paul); MCGOOKIN (David); BREWSTER (Stephen)</AU>
<AF>Northumbria University, School of Computing, Engineering, and Information Sciences, Pandon Building, Camden St/Newcastle upon Tyne NE2 1XE/Royaume-Uni (1 aut.)</AF>
<DT>Publication en série; Congrès; Niveau analytique</DT>
<SO>Lecture notes in computer science; ISSN 0302-9743; Allemagne; Da. 2006; Vol. 4129; Pp. 56-67; Bibl. 28 ref.</SO>
<LA>Anglais</LA>
<EA>In this paper we look at some of the design issues that affect the success of multimodal displays that combine acoustic and haptic modalities. First, issues affecting successful sonification design are explored and suggestions are made about how the language of electroacoustic music can assist. Next, haptic interaction is introduced in the light of this discussion, particularly focusing on the roles of gesture and mimesis. Finally, some observations are made regarding some of the issues that arise when the haptic and acoustic modalities are combined in the interface. This paper looks at examples of where auditory and haptic interaction have been successfully combined beyond the strict confines of the human-computer application interface (musical instruments in particular) and discusses lessons that may be drawn from these domains and applied to the world of multimodal human-computer interaction. The argument is made that combined haptic-auditory interaction schemes can be thought of as musical instruments and some of the possible ramifications of this are raised.</EA>
<CC>770B05D</CC>
<FD>Aide handicapé; Assistance utilisateur; Perception; Acoustique audio; Instrument musique; Acoustique musicale; Guide onde multimode; Système homme machine; Interface utilisateur; Sensibilité tactile; Affichage; Musique; Geste; Audition; .</FD>
<ED>Handicapped aid; User assistance; Perception; Audio acoustics; Musical instrument; Musical acoustics; Multimode waveguide; Man machine system; User interface; Tactile sensitivity; Display; Music; Gesture; Hearing</ED>
<SD>Ayuda minusválido; Asistencia usuario; Percepción; Instrumento musical; Acústica musical; Guía onda multimodo; Sistema hombre máquina; Interfase usuario; Sensibilidad tactil; Visualización; Música; Gesto; Audición</SD>
<LO>INIST-16343.354000153642100060</LO>
<ID>08-0032200</ID>
</server>
</inist>
</record>

To work with this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000A69 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Corpus/biblio.hfd -nk 000A69 | SxmlIndent | more
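For example, a minimal sketch of extracting the record's English title from the indented XML (assuming EXPLOR_STEP is set as above, and that the standard Unix tools sed and head are available; the pipeline itself is hypothetical, not part of the Dilib documentation):

# Pull the English analytic title out of the indented record.
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000A69 | SxmlIndent \
  | sed -n 's|.*<title xml:lang="en" level="a">\(.*\)</title>.*|\1|p' \
  | head -1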

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PascalFrancis
   |étape=   Corpus
   |type=    RBID
   |clé=     Francis:08-0032200
   |texte=   Lemma 4 : Haptic input + auditory display = musical instrument?
}}

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024