Exploration server on haptic devices

Warning: this site is under development!
Warning: this site was generated automatically from raw corpora.
The information has therefore not been validated.

Object discrimination using optimized multi-frequency auditory cross-modal haptic feedback.

Internal identifier: 000F85 (Main/Merge); previous: 000F84; next: 000F86


Authors: Alison Gibson; Panagiotis Artemiadis

Source:

RBID : pubmed:25571486

English descriptors

Abstract

As the field of brain-machine interfaces and neuro-prosthetics continues to grow, there is a high need for sensor and actuation mechanisms that can provide haptic feedback to the user. Current technologies employ expensive, invasive and often inefficient force feedback methods, resulting in an unrealistic solution for individuals who rely on these devices. This paper responds through the development, integration and analysis of a novel feedback architecture where haptic information during the neural control of a prosthetic hand is perceived through multi-frequency auditory signals. Through representing force magnitude with volume and force location with frequency, the feedback architecture can translate the haptic experiences of a robotic end effector into the alternative sensory modality of sound. Previous research with the proposed cross-modal feedback method confirmed its learnability, so the current work aimed to investigate which frequency map (i.e. frequency-specific locations on the hand) is optimal in helping users distinguish between hand-held objects and tasks associated with them. After short use with the cross-modal feedback during the electromyographic (EMG) control of a prosthetic hand, testing results show that users are able to use audial feedback alone to discriminate between everyday objects. While users showed adaptation to three different frequency maps, the simplest map containing only two frequencies was found to be the most useful in discriminating between objects. This outcome provides support for the feasibility and practicality of the cross-modal feedback method during the neural control of prosthetics.
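The abstract describes the cross-modal mapping at the core of the paper: force magnitude is rendered as volume and force location as frequency, with the simplest two-frequency map proving most useful. As a rough illustration only, the sketch below shows one way such a mapping could look; the location names, frequencies, and force range are all hypothetical assumptions, not the authors' implementation.

```python
# Illustrative sketch of the cross-modal haptic-to-audio mapping described
# in the abstract: force magnitude -> volume, force location -> frequency.
# All names and values below are assumptions for illustration, not the
# authors' actual parameters.

# A two-frequency map (the simplest map, which the study found most useful);
# the split into "fingers" vs. "palm" and the tone frequencies are placeholders.
FREQUENCY_MAP_HZ = {
    "fingers": 440.0,
    "palm": 880.0,
}

MAX_FORCE_N = 20.0  # assumed sensor saturation force, in newtons


def haptic_to_audio(location: str, force_n: float) -> tuple[float, float]:
    """Map a contact location and force magnitude to (frequency_hz, volume).

    Volume is normalized to [0, 1] by clipping the force at the assumed
    maximum, so a stronger grip produces a louder tone at the same pitch.
    """
    frequency_hz = FREQUENCY_MAP_HZ[location]
    volume = max(0.0, min(force_n, MAX_FORCE_N)) / MAX_FORCE_N
    return frequency_hz, volume


if __name__ == "__main__":
    # Half of the assumed maximum force on the palm -> half-volume palm tone.
    print(haptic_to_audio("palm", 10.0))
```

Under this scheme a user hears *where* the prosthetic hand is contacting an object from the pitch, and *how hard* from the loudness, which is what lets audio alone discriminate between everyday objects.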

DOI: 10.1109/EMBC.2014.6945118
PubMed: 25571486

Links to previous steps (curation, corpus...)


Links to Exploration step

pubmed:25571486

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Object discrimination using optimized multi-frequency auditory cross-modal haptic feedback.</title>
<author>
<name sortKey="Gibson, Alison" sort="Gibson, Alison" uniqKey="Gibson A" first="Alison" last="Gibson">Alison Gibson</name>
</author>
<author>
<name sortKey="Artemiadis, Panagiotis" sort="Artemiadis, Panagiotis" uniqKey="Artemiadis P" first="Panagiotis" last="Artemiadis">Panagiotis Artemiadis</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="2014">2014</date>
<idno type="doi">10.1109/EMBC.2014.6945118</idno>
<idno type="RBID">pubmed:25571486</idno>
<idno type="pmid">25571486</idno>
<idno type="wicri:Area/PubMed/Corpus">000441</idno>
<idno type="wicri:Area/PubMed/Curation">000441</idno>
<idno type="wicri:Area/PubMed/Checkpoint">000576</idno>
<idno type="wicri:Area/Ncbi/Merge">003609</idno>
<idno type="wicri:Area/Ncbi/Curation">003609</idno>
<idno type="wicri:Area/Ncbi/Checkpoint">003609</idno>
<idno type="wicri:doubleKey">1557-170X:2014:Gibson A:object:discrimination:using</idno>
<idno type="wicri:Area/Main/Merge">000F85</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Object discrimination using optimized multi-frequency auditory cross-modal haptic feedback.</title>
<author>
<name sortKey="Gibson, Alison" sort="Gibson, Alison" uniqKey="Gibson A" first="Alison" last="Gibson">Alison Gibson</name>
</author>
<author>
<name sortKey="Artemiadis, Panagiotis" sort="Artemiadis, Panagiotis" uniqKey="Artemiadis P" first="Panagiotis" last="Artemiadis">Panagiotis Artemiadis</name>
</author>
</analytic>
<series>
<title level="j">Conference proceedings : ... Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society. Annual Conference</title>
<idno type="ISSN">1557-170X</idno>
<imprint>
<date when="2014" type="published">2014</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Acoustic Stimulation</term>
<term>Artificial Limbs</term>
<term>Brain-Computer Interfaces</term>
<term>Feedback, Sensory</term>
<term>Hand (physiology)</term>
<term>Humans</term>
<term>Learning</term>
<term>Robotics</term>
</keywords>
<keywords scheme="MESH" qualifier="physiology" xml:lang="en">
<term>Hand</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Acoustic Stimulation</term>
<term>Artificial Limbs</term>
<term>Brain-Computer Interfaces</term>
<term>Feedback, Sensory</term>
<term>Humans</term>
<term>Learning</term>
<term>Robotics</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">As the field of brain-machine interfaces and neuro-prosthetics continues to grow, there is a high need for sensor and actuation mechanisms that can provide haptic feedback to the user. Current technologies employ expensive, invasive and often inefficient force feedback methods, resulting in an unrealistic solution for individuals who rely on these devices. This paper responds through the development, integration and analysis of a novel feedback architecture where haptic information during the neural control of a prosthetic hand is perceived through multi-frequency auditory signals. Through representing force magnitude with volume and force location with frequency, the feedback architecture can translate the haptic experiences of a robotic end effector into the alternative sensory modality of sound. Previous research with the proposed cross-modal feedback method confirmed its learnability, so the current work aimed to investigate which frequency map (i.e. frequency-specific locations on the hand) is optimal in helping users distinguish between hand-held objects and tasks associated with them. After short use with the cross-modal feedback during the electromyographic (EMG) control of a prosthetic hand, testing results show that users are able to use audial feedback alone to discriminate between everyday objects. While users showed adaptation to three different frequency maps, the simplest map containing only two frequencies was found to be the most useful in discriminating between objects. This outcome provides support for the feasibility and practicality of the cross-modal feedback method during the neural control of prosthetics.</div>
</front>
</TEI>
</record>

To work with this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Main/Merge
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000F85 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Main/Merge/biblio.hfd -nk 000F85 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Main
   |étape=   Merge
   |type=    RBID
   |clé=     pubmed:25571486
   |texte=   Object discrimination using optimized multi-frequency auditory cross-modal haptic feedback.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Main/Merge/RBID.i   -Sk "pubmed:25571486" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Main/Merge/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024