Human Perception of Haptic-to-Video and Haptic-to-Audio Skew in Multimedia Applications
Internal identifier: 000192 (PascalFrancis/Corpus); previous: 000191; next: 000193
Authors: Juan M. Silva; Mauricio Orozco; Jongeun Cha; Abdulmotaleb El Saddik; Emil M. Petriu
Source:
- ACM transactions on multimedia computing communications and applications [ 1551-6857 ] ; 2013.
Abstract
The purpose of this research is to assess the sensitivity of humans to perceive asynchrony among media signals coming from a computer application. Particularly we examine haptic-to-video and haptic-to-audio skew. For this purpose we have designed an experimental setup, where users are exposed to a basic multimedia presentation resembling a ping-pong game. For every collision between a ball and a racket, the user is able to perceive auditory, visual, and haptic cues about the collision event. We artificially introduce negative and positive delay to the auditory and visual cues with respect to the haptic stream. We subjectively evaluate the perception of inter-stream asynchrony perceived by the users using two types of haptic devices. The statistical results of our evaluation show perception rates of around 100 ms regardless of modality and type of device.
Record in standard format (ISO 2709)
See the documentation on the Inist Standard format.
Inist format (server)
NO: PASCAL 13-0216960 INIST
ET: Human Perception of Haptic-to-Video and Haptic-to-Audio Skew in Multimedia Applications
AU: SILVA (Juan M.); OROZCO (Mauricio); CHA (Jongeun); EL SADDIK (Abdulmotaleb); PETRIU (Emil M.)
AF: University of Ottawa/Etats-Unis (1 aut., 3 aut., 4 aut., 5 aut.); New York University/Etats-Unis (2 aut.)
DT: Publication en série; Niveau analytique
SO: ACM transactions on multimedia computing communications and applications; ISSN 1551-6857; Etats-Unis; Da. 2013; Vol. 9; No. 2; 9.1-9.16; Bibl. 3/4 p.
LA: Anglais
EA: The purpose of this research is to assess the sensitivity of humans to perceive asynchrony among media signals coming from a computer application. Particularly we examine haptic-to-video and haptic-to-audio skew. For this purpose we have designed an experimental setup, where users are exposed to a basic multimedia presentation resembling a ping-pong game. For every collision between a ball and a racket, the user is able to perceive auditory, visual, and haptic cues about the collision event. We artificially introduce negative and positive delay to the auditory and visual cues with respect to the haptic stream. We subjectively evaluate the perception of inter-stream asynchrony perceived by the users using two types of haptic devices. The statistical results of our evaluation show perception rates of around 100 ms regardless of modality and type of device.
CC: 001D02B04
FD: Multimédia; Interface utilisateur; Synchronisation; Perception sensorielle; Sensibilité tactile; Régime asynchrone; Stimulus acoustique; Repère visuel; Stimulus visuel; Retard; Comportement utilisateur; Analyse statistique; Facteur humain
ED: Multimedia; User interface; Synchronization; Sensorial perception; Tactile sensitivity; Asynchronous regime; Acoustic stimulus; Visual cue; Visual stimulus; Delay; User behavior; Statistical analysis; Human factor
SD: Multimedia; Interfase usuario; Sincronización; Percepción sensorial; Sensibilidad tactil; Régimen asincrono; Estímulo acústico; Marca visual; Estimulo visual; Retraso; Comportamiento usuario; Análisis estadístico; Factor humano
LO: INIST-28207.354000504148130010
ID: 13-0216960
Links to Exploration step
Pascal:13-0216960
The document in XML format
<record><TEI><teiHeader><fileDesc><titleStmt><title xml:lang="en" level="a">Human Perception of Haptic-to-Video and Haptic-to-Audio Skew in Multimedia Applications</title>
<author><name sortKey="Silva, Juan M" sort="Silva, Juan M" uniqKey="Silva J" first="Juan M." last="Silva">Juan M. Silva</name>
<affiliation><inist:fA14 i1="01"><s1>University of Ottawa</s1>
<s3>USA</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Orozco, Mauricio" sort="Orozco, Mauricio" uniqKey="Orozco M" first="Mauricio" last="Orozco">Mauricio Orozco</name>
<affiliation><inist:fA14 i1="02"><s1>New York University</s1>
<s3>USA</s3>
<sZ>2 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Cha, Jongeun" sort="Cha, Jongeun" uniqKey="Cha J" first="Jongeun" last="Cha">Jongeun Cha</name>
<affiliation><inist:fA14 i1="01"><s1>University of Ottawa</s1>
<s3>USA</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="El Saddik, Abdulmotaleb" sort="El Saddik, Abdulmotaleb" uniqKey="El Saddik A" first="Abdulmotaleb" last="El Saddik">Abdulmotaleb El Saddik</name>
<affiliation><inist:fA14 i1="01"><s1>University of Ottawa</s1>
<s3>USA</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Petriu, Emil M" sort="Petriu, Emil M" uniqKey="Petriu E" first="Emil M." last="Petriu">Emil M. Petriu</name>
<affiliation><inist:fA14 i1="01"><s1>University of Ottawa</s1>
<s3>USA</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</titleStmt>
<publicationStmt><idno type="wicri:source">INIST</idno>
<idno type="inist">13-0216960</idno>
<date when="2013">2013</date>
<idno type="stanalyst">PASCAL 13-0216960 INIST</idno>
<idno type="RBID">Pascal:13-0216960</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000192</idno>
</publicationStmt>
<sourceDesc><biblStruct><analytic><title xml:lang="en" level="a">Human Perception of Haptic-to-Video and Haptic-to-Audio Skew in Multimedia Applications</title>
<author><name sortKey="Silva, Juan M" sort="Silva, Juan M" uniqKey="Silva J" first="Juan M." last="Silva">Juan M. Silva</name>
<affiliation><inist:fA14 i1="01"><s1>University of Ottawa</s1>
<s3>USA</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Orozco, Mauricio" sort="Orozco, Mauricio" uniqKey="Orozco M" first="Mauricio" last="Orozco">Mauricio Orozco</name>
<affiliation><inist:fA14 i1="02"><s1>New York University</s1>
<s3>USA</s3>
<sZ>2 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Cha, Jongeun" sort="Cha, Jongeun" uniqKey="Cha J" first="Jongeun" last="Cha">Jongeun Cha</name>
<affiliation><inist:fA14 i1="01"><s1>University of Ottawa</s1>
<s3>USA</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="El Saddik, Abdulmotaleb" sort="El Saddik, Abdulmotaleb" uniqKey="El Saddik A" first="Abdulmotaleb" last="El Saddik">Abdulmotaleb El Saddik</name>
<affiliation><inist:fA14 i1="01"><s1>University of Ottawa</s1>
<s3>USA</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author><name sortKey="Petriu, Emil M" sort="Petriu, Emil M" uniqKey="Petriu E" first="Emil M." last="Petriu">Emil M. Petriu</name>
<affiliation><inist:fA14 i1="01"><s1>University of Ottawa</s1>
<s3>USA</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</analytic>
<series><title level="j" type="main">ACM transactions on multimedia computing communications and applications</title>
<title level="j" type="abbreviated">ACM trans. multimed comput. commun. appl.</title>
<idno type="ISSN">1551-6857</idno>
<imprint><date when="2013">2013</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt><title level="j" type="main">ACM transactions on multimedia computing communications and applications</title>
<title level="j" type="abbreviated">ACM trans. multimed comput. commun. appl.</title>
<idno type="ISSN">1551-6857</idno>
</seriesStmt>
</fileDesc>
<profileDesc><textClass><keywords scheme="KwdEn" xml:lang="en"><term>Acoustic stimulus</term>
<term>Asynchronous regime</term>
<term>Delay</term>
<term>Human factor</term>
<term>Multimedia</term>
<term>Sensorial perception</term>
<term>Statistical analysis</term>
<term>Synchronization</term>
<term>Tactile sensitivity</term>
<term>User behavior</term>
<term>User interface</term>
<term>Visual cue</term>
<term>Visual stimulus</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr"><term>Multimédia</term>
<term>Interface utilisateur</term>
<term>Synchronisation</term>
<term>Perception sensorielle</term>
<term>Sensibilité tactile</term>
<term>Régime asynchrone</term>
<term>Stimulus acoustique</term>
<term>Repère visuel</term>
<term>Stimulus visuel</term>
<term>Retard</term>
<term>Comportement utilisateur</term>
<term>Analyse statistique</term>
<term>Facteur humain</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front><div type="abstract" xml:lang="en">The purpose of this research is to assess the sensitivity of humans to perceive asynchrony among media signals coming from a computer application. Particularly we examine haptic-to-video and haptic-to-audio skew. For this purpose we have designed an experimental setup, where users are exposed to a basic multimedia presentation resembling a ping-pong game. For every collision between a ball and a racket, the user is able to perceive auditory, visual, and haptic cues about the collision event. We artificially introduce negative and positive delay to the auditory and visual cues with respect to the haptic stream. We subjectively evaluate the perception of inter-stream asynchrony perceived by the users using two types of haptic devices. The statistical results of our evaluation show perception rates of around 100 ms regardless of modality and type of device.</div>
</front>
</TEI>
<inist><standard h6="B"><pA><fA01 i1="01" i2="1"><s0>1551-6857</s0>
</fA01>
<fA03 i2="1"><s0>ACM trans. multimed comput. commun. appl.</s0>
</fA03>
<fA05><s2>9</s2>
</fA05>
<fA06><s2>2</s2>
</fA06>
<fA08 i1="01" i2="1" l="ENG"><s1>Human Perception of Haptic-to-Video and Haptic-to-Audio Skew in Multimedia Applications</s1>
</fA08>
<fA11 i1="01" i2="1"><s1>SILVA (Juan M.)</s1>
</fA11>
<fA11 i1="02" i2="1"><s1>OROZCO (Mauricio)</s1>
</fA11>
<fA11 i1="03" i2="1"><s1>CHA (Jongeun)</s1>
</fA11>
<fA11 i1="04" i2="1"><s1>EL SADDIK (Abdulmotaleb)</s1>
</fA11>
<fA11 i1="05" i2="1"><s1>PETRIU (Emil M.)</s1>
</fA11>
<fA14 i1="01"><s1>University of Ottawa</s1>
<s3>USA</s3>
<sZ>1 aut.</sZ>
<sZ>3 aut.</sZ>
<sZ>4 aut.</sZ>
<sZ>5 aut.</sZ>
</fA14>
<fA14 i1="02"><s1>New York University</s1>
<s3>USA</s3>
<sZ>2 aut.</sZ>
</fA14>
<fA20><s2>9.1-9.16</s2>
</fA20>
<fA21><s1>2013</s1>
</fA21>
<fA23 i1="01"><s0>ENG</s0>
</fA23>
<fA43 i1="01"><s1>INIST</s1>
<s2>28207</s2>
<s5>354000504148130010</s5>
</fA43>
<fA44><s0>0000</s0>
<s1>© 2013 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45><s0>3/4 p.</s0>
</fA45>
<fA47 i1="01" i2="1"><s0>13-0216960</s0>
</fA47>
<fA60><s1>P</s1>
</fA60>
<fA61><s0>A</s0>
</fA61>
<fA64 i1="01" i2="1"><s0>ACM transactions on multimedia computing communications and applications</s0>
</fA64>
<fA66 i1="01"><s0>USA</s0>
</fA66>
<fC01 i1="01" l="ENG"><s0>The purpose of this research is to assess the sensitivity of humans to perceive asynchrony among media signals coming from a computer application. Particularly we examine haptic-to-video and haptic-to-audio skew. For this purpose we have designed an experimental setup, where users are exposed to a basic multimedia presentation resembling a ping-pong game. For every collision between a ball and a racket, the user is able to perceive auditory, visual, and haptic cues about the collision event. We artificially introduce negative and positive delay to the auditory and visual cues with respect to the haptic stream. We subjectively evaluate the perception of inter-stream asynchrony perceived by the users using two types of haptic devices. The statistical results of our evaluation show perception rates of around 100 ms regardless of modality and type of device.</s0>
</fC01>
<fC02 i1="01" i2="X"><s0>001D02B04</s0>
</fC02>
<fC03 i1="01" i2="X" l="FRE"><s0>Multimédia</s0>
<s5>06</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG"><s0>Multimedia</s0>
<s5>06</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA"><s0>Multimedia</s0>
<s5>06</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE"><s0>Interface utilisateur</s0>
<s5>07</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG"><s0>User interface</s0>
<s5>07</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA"><s0>Interfase usuario</s0>
<s5>07</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE"><s0>Synchronisation</s0>
<s5>08</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG"><s0>Synchronization</s0>
<s5>08</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA"><s0>Sincronización</s0>
<s5>08</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE"><s0>Perception sensorielle</s0>
<s5>18</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG"><s0>Sensorial perception</s0>
<s5>18</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA"><s0>Percepción sensorial</s0>
<s5>18</s5>
</fC03>
<fC03 i1="05" i2="X" l="FRE"><s0>Sensibilité tactile</s0>
<s5>19</s5>
</fC03>
<fC03 i1="05" i2="X" l="ENG"><s0>Tactile sensitivity</s0>
<s5>19</s5>
</fC03>
<fC03 i1="05" i2="X" l="SPA"><s0>Sensibilidad tactil</s0>
<s5>19</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE"><s0>Régime asynchrone</s0>
<s5>20</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG"><s0>Asynchronous regime</s0>
<s5>20</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA"><s0>Régimen asincrono</s0>
<s5>20</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE"><s0>Stimulus acoustique</s0>
<s5>21</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG"><s0>Acoustic stimulus</s0>
<s5>21</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA"><s0>Estímulo acústico</s0>
<s5>21</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE"><s0>Repère visuel</s0>
<s5>22</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG"><s0>Visual cue</s0>
<s5>22</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA"><s0>Marca visual</s0>
<s5>22</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE"><s0>Stimulus visuel</s0>
<s5>23</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG"><s0>Visual stimulus</s0>
<s5>23</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA"><s0>Estimulo visual</s0>
<s5>23</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE"><s0>Retard</s0>
<s5>24</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG"><s0>Delay</s0>
<s5>24</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA"><s0>Retraso</s0>
<s5>24</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE"><s0>Comportement utilisateur</s0>
<s5>25</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG"><s0>User behavior</s0>
<s5>25</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA"><s0>Comportamiento usuario</s0>
<s5>25</s5>
</fC03>
<fC03 i1="12" i2="X" l="FRE"><s0>Analyse statistique</s0>
<s5>26</s5>
</fC03>
<fC03 i1="12" i2="X" l="ENG"><s0>Statistical analysis</s0>
<s5>26</s5>
</fC03>
<fC03 i1="12" i2="X" l="SPA"><s0>Análisis estadístico</s0>
<s5>26</s5>
</fC03>
<fC03 i1="13" i2="X" l="FRE"><s0>Facteur humain</s0>
<s5>27</s5>
</fC03>
<fC03 i1="13" i2="X" l="ENG"><s0>Human factor</s0>
<s5>27</s5>
</fC03>
<fC03 i1="13" i2="X" l="SPA"><s0>Factor humano</s0>
<s5>27</s5>
</fC03>
<fN21><s1>203</s1>
</fN21>
<fN44 i1="01"><s1>OTO</s1>
</fN44>
<fN82><s1>OTO</s1>
</fN82>
</pA>
</standard>
<server><NO>PASCAL 13-0216960 INIST</NO>
<ET>Human Perception of Haptic-to-Video and Haptic-to-Audio Skew in Multimedia Applications</ET>
<AU>SILVA (Juan M.); OROZCO (Mauricio); CHA (Jongeun); EL SADDIK (Abdulmotaleb); PETRIU (Emil M.)</AU>
<AF>University of Ottawa/Etats-Unis (1 aut., 3 aut., 4 aut., 5 aut.); New York University/Etats-Unis (2 aut.)</AF>
<DT>Publication en série; Niveau analytique</DT>
<SO>ACM transactions on multimedia computing communications and applications; ISSN 1551-6857; Etats-Unis; Da. 2013; Vol. 9; No. 2; 9.1-9.16; Bibl. 3/4 p.</SO>
<LA>Anglais</LA>
<EA>The purpose of this research is to assess the sensitivity of humans to perceive asynchrony among media signals coming from a computer application. Particularly we examine haptic-to-video and haptic-to-audio skew. For this purpose we have designed an experimental setup, where users are exposed to a basic multimedia presentation resembling a ping-pong game. For every collision between a ball and a racket, the user is able to perceive auditory, visual, and haptic cues about the collision event. We artificially introduce negative and positive delay to the auditory and visual cues with respect to the haptic stream. We subjectively evaluate the perception of inter-stream asynchrony perceived by the users using two types of haptic devices. The statistical results of our evaluation show perception rates of around 100 ms regardless of modality and type of device.</EA>
<CC>001D02B04</CC>
<FD>Multimédia; Interface utilisateur; Synchronisation; Perception sensorielle; Sensibilité tactile; Régime asynchrone; Stimulus acoustique; Repère visuel; Stimulus visuel; Retard; Comportement utilisateur; Analyse statistique; Facteur humain</FD>
<ED>Multimedia; User interface; Synchronization; Sensorial perception; Tactile sensitivity; Asynchronous regime; Acoustic stimulus; Visual cue; Visual stimulus; Delay; User behavior; Statistical analysis; Human factor</ED>
<SD>Multimedia; Interfase usuario; Sincronización; Percepción sensorial; Sensibilidad tactil; Régimen asincrono; Estímulo acústico; Marca visual; Estimulo visual; Retraso; Comportamiento usuario; Análisis estadístico; Factor humano</SD>
<LO>INIST-28207.354000504148130010</LO>
<ID>13-0216960</ID>
</server>
</inist>
</record>
To manipulate this document under Unix (Dilib):
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000192 | SxmlIndent | more
Or
HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Corpus/biblio.hfd -nk 000192 | SxmlIndent | more
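Outside the Dilib toolchain, the flat `<server>` fields of an exported record can also be pulled out with a short script. The sketch below is not part of Dilib or Wicri; it is a minimal, self-contained example that extracts fields such as `ET` (English title) and `ID` with regular expressions, using a small sample mirroring the record above (a real run would read the XML emitted by `HfdSelect`):

```python
import re

# Sample snippet mirroring the <server> block of the record above.
SAMPLE = """<server><NO>PASCAL 13-0216960 INIST</NO>
<ET>Human Perception of Haptic-to-Video and Haptic-to-Audio Skew in Multimedia Applications</ET>
<ID>13-0216960</ID>
</server>"""

def server_field(xml_text, tag):
    """Return the text content of the first <tag>...</tag> element, or None."""
    match = re.search(r"<{0}>(.*?)</{0}>".format(tag), xml_text, re.DOTALL)
    return match.group(1) if match else None

title = server_field(SAMPLE, "ET")
record_id = server_field(SAMPLE, "ID")
print(record_id)  # 13-0216960
```

A regex is used here rather than an XML parser because the exported record carries an undeclared `inist:` namespace prefix, which strict parsers may reject.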
To link to this page within the Wicri network:
{{Explor lien |wiki= Ticri/CIDE |area= HapticV1 |flux= PascalFrancis |étape= Corpus |type= RBID |clé= Pascal:13-0216960 |texte= Human Perception of Haptic-to-Video and Haptic-to-Audio Skew in Multimedia Applications }}
This area was generated with Dilib version V0.6.23.