Exploration server on haptic devices

Please note: this site is under development.
Please note: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Seeing by Touch: Evaluation of a Soft Biologically-Inspired Artificial Fingertip in Real-Time Active Touch

Internal identifier: 002C97 (Ncbi/Merge); previous: 002C96; next: 002C98

Seeing by Touch: Evaluation of a Soft Biologically-Inspired Artificial Fingertip in Real-Time Active Touch

Authors: Tareq Assaf; Calum Roke; Jonathan Rossiter; Tony Pipe; Chris Melhuish

Source:

RBID: PMC:3958268

Abstract

Effective tactile sensing for artificial platforms remains an open issue in robotics. This study investigates the performance of a soft biologically-inspired artificial fingertip in active exploration tasks. The fingertip sensor replicates the mechanisms within human skin and offers a robust solution that can be used both for tactile sensing and gripping/manipulating objects. The softness of the optical sensor's contact surface also allows safer interactions with objects. High-level tactile features such as edges are extrapolated from the sensor's output and the information is used to generate a tactile image. The work presented in this paper aims to investigate and evaluate this artificial fingertip for 2D shape reconstruction. The sensor was mounted on a robot arm to allow autonomous exploration of different objects. The sensor and a number of human participants were then tested for their abilities to track the raised perimeters of different planar objects and compared. By observing the technique and accuracy of the human subjects, simple but effective parameters were determined in order to evaluate the artificial system's performance. The results prove the capability of the sensor in such active exploration tasks, with a comparable performance to the human subjects despite it using tactile data alone whereas the human participants were also able to use proprioceptive cues.


Url:
DOI: 10.3390/s140202561
PubMed: 24514881
PubMed Central: 3958268


The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Seeing by Touch: Evaluation of a Soft Biologically-Inspired Artificial Fingertip in Real-Time Active Touch</title>
<author>
<name sortKey="Assaf, Tareq" sort="Assaf, Tareq" uniqKey="Assaf T" first="Tareq" last="Assaf">Tareq Assaf</name>
</author>
<author>
<name sortKey="Roke, Calum" sort="Roke, Calum" uniqKey="Roke C" first="Calum" last="Roke">Calum Roke</name>
</author>
<author>
<name sortKey="Rossiter, Jonathan" sort="Rossiter, Jonathan" uniqKey="Rossiter J" first="Jonathan" last="Rossiter">Jonathan Rossiter</name>
</author>
<author>
<name sortKey="Pipe, Tony" sort="Pipe, Tony" uniqKey="Pipe T" first="Tony" last="Pipe">Tony Pipe</name>
</author>
<author>
<name sortKey="Melhuish, Chris" sort="Melhuish, Chris" uniqKey="Melhuish C" first="Chris" last="Melhuish">Chris Melhuish</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">24514881</idno>
<idno type="pmc">3958268</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3958268</idno>
<idno type="RBID">PMC:3958268</idno>
<idno type="doi">10.3390/s140202561</idno>
<date when="2014">2014</date>
<idno type="wicri:Area/Pmc/Corpus">002502</idno>
<idno type="wicri:Area/Pmc/Curation">002502</idno>
<idno type="wicri:Area/Pmc/Checkpoint">000963</idno>
<idno type="wicri:Area/Ncbi/Merge">002C97</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Seeing by Touch: Evaluation of a Soft Biologically-Inspired Artificial Fingertip in Real-Time Active Touch</title>
<author>
<name sortKey="Assaf, Tareq" sort="Assaf, Tareq" uniqKey="Assaf T" first="Tareq" last="Assaf">Tareq Assaf</name>
</author>
<author>
<name sortKey="Roke, Calum" sort="Roke, Calum" uniqKey="Roke C" first="Calum" last="Roke">Calum Roke</name>
</author>
<author>
<name sortKey="Rossiter, Jonathan" sort="Rossiter, Jonathan" uniqKey="Rossiter J" first="Jonathan" last="Rossiter">Jonathan Rossiter</name>
</author>
<author>
<name sortKey="Pipe, Tony" sort="Pipe, Tony" uniqKey="Pipe T" first="Tony" last="Pipe">Tony Pipe</name>
</author>
<author>
<name sortKey="Melhuish, Chris" sort="Melhuish, Chris" uniqKey="Melhuish C" first="Chris" last="Melhuish">Chris Melhuish</name>
</author>
</analytic>
<series>
<title level="j">Sensors (Basel, Switzerland)</title>
<idno type="eISSN">1424-8220</idno>
<imprint>
<date when="2014">2014</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>Effective tactile sensing for artificial platforms remains an open issue in robotics. This study investigates the performance of a soft biologically-inspired artificial fingertip in active exploration tasks. The fingertip sensor replicates the mechanisms within human skin and offers a robust solution that can be used both for tactile sensing and gripping/manipulating objects. The softness of the optical sensor's contact surface also allows safer interactions with objects. High-level tactile features such as edges are extrapolated from the sensor's output and the information is used to generate a tactile image. The work presented in this paper aims to investigate and evaluate this artificial fingertip for 2D shape reconstruction. The sensor was mounted on a robot arm to allow autonomous exploration of different objects. The sensor and a number of human participants were then tested for their abilities to track the raised perimeters of different planar objects and compared. By observing the technique and accuracy of the human subjects, simple but effective parameters were determined in order to evaluate the artificial system's performance. The results prove the capability of the sensor in such active exploration tasks, with a comparable performance to the human subjects despite it using tactile data alone whereas the human participants were also able to use proprioceptive cues.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Tegin, J" uniqKey="Tegin J">J. Tegin</name>
</author>
<author>
<name sortKey="Wikander, J" uniqKey="Wikander J">J. Wikander</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lee, M" uniqKey="Lee M">M. Lee</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Lee, M H" uniqKey="Lee M">M.H. Lee</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Dario, P" uniqKey="Dario P">P. Dario</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Suwanratchatamanee, K" uniqKey="Suwanratchatamanee K">K. Suwanratchatamanee</name>
</author>
<author>
<name sortKey="Saegusa, R" uniqKey="Saegusa R">R. Saegusa</name>
</author>
<author>
<name sortKey="Matsumoto, M" uniqKey="Matsumoto M">M. Matsumoto</name>
</author>
<author>
<name sortKey="Hashimoto, S" uniqKey="Hashimoto S">S. Hashimoto</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Suwanratchatamanee, K" uniqKey="Suwanratchatamanee K">K. Suwanratchatamanee</name>
</author>
<author>
<name sortKey="Matsumoto, M" uniqKey="Matsumoto M">M. Matsumoto</name>
</author>
<author>
<name sortKey="Hashimoto, S" uniqKey="Hashimoto S">S. Hashimoto</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schneider, A" uniqKey="Schneider A">A. Schneider</name>
</author>
<author>
<name sortKey="Sturm, J" uniqKey="Sturm J">J. Sturm</name>
</author>
<author>
<name sortKey="Stachniss, C" uniqKey="Stachniss C">C. Stachniss</name>
</author>
<author>
<name sortKey="Reisert, M" uniqKey="Reisert M">M. Reisert</name>
</author>
<author>
<name sortKey="Burkhardt, H" uniqKey="Burkhardt H">H. Burkhardt</name>
</author>
<author>
<name sortKey="Burgard, W" uniqKey="Burgard W">W. Burgard</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Payeur, P" uniqKey="Payeur P">P. Payeur</name>
</author>
<author>
<name sortKey="Pasca, C" uniqKey="Pasca C">C. Pasca</name>
</author>
<author>
<name sortKey="Cretu, A" uniqKey="Cretu A">A. Cretu</name>
</author>
<author>
<name sortKey="Petriu, E" uniqKey="Petriu E">E. Petriu</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kamiyama, K" uniqKey="Kamiyama K">K. Kamiyama</name>
</author>
<author>
<name sortKey="Kajimoto, H" uniqKey="Kajimoto H">H. Kajimoto</name>
</author>
<author>
<name sortKey="Kawakami, N" uniqKey="Kawakami N">N. Kawakami</name>
</author>
<author>
<name sortKey="Tachi, S" uniqKey="Tachi S">S. Tachi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sato, K" uniqKey="Sato K">K. Sato</name>
</author>
<author>
<name sortKey="Kamiyama, K" uniqKey="Kamiyama K">K. Kamiyama</name>
</author>
<author>
<name sortKey="Nii, H" uniqKey="Nii H">H. Nii</name>
</author>
<author>
<name sortKey="Kawakami, N" uniqKey="Kawakami N">N. Kawakami</name>
</author>
<author>
<name sortKey="Tachi, S" uniqKey="Tachi S">S. Tachi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kamiyama, K" uniqKey="Kamiyama K">K. Kamiyama</name>
</author>
<author>
<name sortKey="Vlack, K" uniqKey="Vlack K">K. Vlack</name>
</author>
<author>
<name sortKey="Mizota, T" uniqKey="Mizota T">T. Mizota</name>
</author>
<author>
<name sortKey="Kajimoto, H" uniqKey="Kajimoto H">H. Kajimoto</name>
</author>
<author>
<name sortKey="Kawakami, K" uniqKey="Kawakami K">K. Kawakami</name>
</author>
<author>
<name sortKey="Tachi, S" uniqKey="Tachi S">S. Tachi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Watanabe, N" uniqKey="Watanabe N">N. Watanabe</name>
</author>
<author>
<name sortKey="Obinata, G" uniqKey="Obinata G">G. Obinata</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ito, Y" uniqKey="Ito Y">Y. Ito</name>
</author>
<author>
<name sortKey="Kim, Y" uniqKey="Kim Y">Y. Kim</name>
</author>
<author>
<name sortKey="Obinata, G" uniqKey="Obinata G">G. Obinata</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Chorley, C" uniqKey="Chorley C">C. Chorley</name>
</author>
<author>
<name sortKey="Melhuish, C" uniqKey="Melhuish C">C. Melhuish</name>
</author>
<author>
<name sortKey="Pipe, T" uniqKey="Pipe T">T. Pipe</name>
</author>
<author>
<name sortKey="Rossiter, J" uniqKey="Rossiter J">J. Rossiter</name>
</author>
<author>
<name sortKey="Whiteley, G" uniqKey="Whiteley G">G. Whiteley</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Roke, C" uniqKey="Roke C">C. Roke</name>
</author>
<author>
<name sortKey="Melhuish, C" uniqKey="Melhuish C">C. Melhuish</name>
</author>
<author>
<name sortKey="Pipe, T" uniqKey="Pipe T">T. Pipe</name>
</author>
<author>
<name sortKey="Drury, D" uniqKey="Drury D">D. Drury</name>
</author>
<author>
<name sortKey="Chorley, C" uniqKey="Chorley C">C. Chorley</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Roke, C" uniqKey="Roke C">C. Roke</name>
</author>
<author>
<name sortKey="Spiers, A" uniqKey="Spiers A">A. Spiers</name>
</author>
<author>
<name sortKey="Pipe, T" uniqKey="Pipe T">T. Pipe</name>
</author>
<author>
<name sortKey="Melhuish, C" uniqKey="Melhuish C">C. Melhuish</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Winstone, B" uniqKey="Winstone B">B. Winstone</name>
</author>
<author>
<name sortKey="Griffiths, G" uniqKey="Griffiths G">G. Griffiths</name>
</author>
<author>
<name sortKey="Pipe, T" uniqKey="Pipe T">T. Pipe</name>
</author>
<author>
<name sortKey="Melhuish, C" uniqKey="Melhuish C">C. Melhuish</name>
</author>
<author>
<name sortKey="Rossiter, J" uniqKey="Rossiter J">J. Rossiter</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Martinez Hernandez, U" uniqKey="Martinez Hernandez U">U. Martinez-Hernandez</name>
</author>
<author>
<name sortKey="Dodd, T" uniqKey="Dodd T">T. Dodd</name>
</author>
<author>
<name sortKey="Prescott, T" uniqKey="Prescott T">T. Prescott</name>
</author>
<author>
<name sortKey="Lepora, N" uniqKey="Lepora N">N. Lepora</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Martinez Hernandez, U" uniqKey="Martinez Hernandez U">U. Martinez-Hernandez</name>
</author>
<author>
<name sortKey="Dodd, T" uniqKey="Dodd T">T. Dodd</name>
</author>
<author>
<name sortKey="Natale, L" uniqKey="Natale L">L. Natale</name>
</author>
<author>
<name sortKey="Metta, G" uniqKey="Metta G">G. Metta</name>
</author>
<author>
<name sortKey="Prescott, T" uniqKey="Prescott T">T. Prescott</name>
</author>
<author>
<name sortKey="Lepora, N" uniqKey="Lepora N">N. Lepora</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Johansson, R" uniqKey="Johansson R">R. Johansson</name>
</author>
<author>
<name sortKey="Vallbo, A" uniqKey="Vallbo A">A. Vallbo</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kuroki, S" uniqKey="Kuroki S">S. Kuroki</name>
</author>
<author>
<name sortKey="Kajimoto, H" uniqKey="Kajimoto H">H. Kajimoto</name>
</author>
<author>
<name sortKey="Nii, H" uniqKey="Nii H">H. Nii</name>
</author>
<author>
<name sortKey="Kawakami, N" uniqKey="Kawakami N">N. Kawakami</name>
</author>
<author>
<name sortKey="Tachi, S" uniqKey="Tachi S">S. Tachi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Takahashi Iwanaga, H" uniqKey="Takahashi Iwanaga H">H. Takahashi-Iwanaga</name>
</author>
<author>
<name sortKey="Shimoda, H" uniqKey="Shimoda H">H. Shimoda</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gerling, G J" uniqKey="Gerling G">G.J. Gerling</name>
</author>
<author>
<name sortKey="Thomas, G W" uniqKey="Thomas G">G.W. Thomas</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gardner, E" uniqKey="Gardner E">E. Gardner</name>
</author>
<author>
<name sortKey="Martin, J" uniqKey="Martin J">J. Martin</name>
</author>
<author>
<name sortKey="Jessell, T" uniqKey="Jessell T">T. Jessell</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Toma, S" uniqKey="Toma S">S. Toma</name>
</author>
<author>
<name sortKey="Nakajima, Y" uniqKey="Nakajima Y">Y. Nakajima</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Johansson, R S" uniqKey="Johansson R">R.S. Johansson</name>
</author>
<author>
<name sortKey="Landstrm, U" uniqKey="Landstrm U">U. Landstrm</name>
</author>
<author>
<name sortKey="Lundstrm, R" uniqKey="Lundstrm R">R. Lundstrm</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Johnson, K O" uniqKey="Johnson K">K.O. Johnson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Vega Bermudez, F" uniqKey="Vega Bermudez F">F. Vega-Bermudez</name>
</author>
<author>
<name sortKey="Johnson, K O" uniqKey="Johnson K">K.O. Johnson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Bradski, G" uniqKey="Bradski G">G. Bradski</name>
</author>
<author>
<name sortKey="Kaehler, A" uniqKey="Kaehler A">A. Kaehler</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">Sensors (Basel)</journal-id>
<journal-id journal-id-type="iso-abbrev">Sensors (Basel)</journal-id>
<journal-title-group>
<journal-title>Sensors (Basel, Switzerland)</journal-title>
</journal-title-group>
<issn pub-type="epub">1424-8220</issn>
<publisher>
<publisher-name>Molecular Diversity Preservation International (MDPI)</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">24514881</article-id>
<article-id pub-id-type="pmc">3958268</article-id>
<article-id pub-id-type="doi">10.3390/s140202561</article-id>
<article-id pub-id-type="publisher-id">sensors-14-02561</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Seeing by Touch: Evaluation of a Soft Biologically-Inspired Artificial Fingertip in Real-Time Active Touch</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Assaf</surname>
<given-names>Tareq</given-names>
</name>
<xref rid="c1-sensors-14-02561" ref-type="corresp">
<sup>*</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Roke</surname>
<given-names>Calum</given-names>
</name>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Rossiter</surname>
<given-names>Jonathan</given-names>
</name>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Pipe</surname>
<given-names>Tony</given-names>
</name>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Melhuish</surname>
<given-names>Chris</given-names>
</name>
</contrib>
<aff id="af1-sensors-14-02561">Bristol Robotics Lab, T Block, Frenchay Campus, Coldharbour Lane, Bristol, BS16 1QY, UK; E-Mails:
<email>calum.roke@brl.ac.uk</email>
(C.R.);
<email>Jonathan.Rossiter@bristol.ac.uk</email>
(J.R.);
<email>Tony.Pipe@brl.ac.uk</email>
(T.P.);
<email>Chris.Melhuish@brl.ac.uk</email>
(C.M.)</aff>
</contrib-group>
<author-notes>
<corresp id="c1-sensors-14-02561">
<label>*</label>
Author to whom correspondence should be addressed; E-Mail:
<email>tareq.assaf@brl.ac.uk</email>
; Tel.: +44-0-117-328-6786; Fax: +44-0-117-328-3960.</corresp>
</author-notes>
<pub-date pub-type="collection">
<month>2</month>
<year>2014</year>
</pub-date>
<pub-date pub-type="epub">
<day>07</day>
<month>2</month>
<year>2014</year>
</pub-date>
<volume>14</volume>
<issue>2</issue>
<fpage>2561</fpage>
<lpage>2577</lpage>
<history>
<date date-type="received">
<day>25</day>
<month>12</month>
<year>2013</year>
</date>
<date date-type="rev-recd">
<day>23</day>
<month>1</month>
<year>2014</year>
</date>
<date date-type="accepted">
<day>27</day>
<month>1</month>
<year>2014</year>
</date>
</history>
<permissions>
<copyright-statement>© 2014 by the authors; licensee MDPI, Basel, Switzerland.</copyright-statement>
<copyright-year>2014</copyright-year>
<license>
<license-p>This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by/3.0/">http://creativecommons.org/licenses/by/3.0/</ext-link>
).</license-p>
</license>
</permissions>
<abstract>
<p>Effective tactile sensing for artificial platforms remains an open issue in robotics. This study investigates the performance of a soft biologically-inspired artificial fingertip in active exploration tasks. The fingertip sensor replicates the mechanisms within human skin and offers a robust solution that can be used both for tactile sensing and gripping/manipulating objects. The softness of the optical sensor's contact surface also allows safer interactions with objects. High-level tactile features such as edges are extrapolated from the sensor's output and the information is used to generate a tactile image. The work presented in this paper aims to investigate and evaluate this artificial fingertip for 2D shape reconstruction. The sensor was mounted on a robot arm to allow autonomous exploration of different objects. The sensor and a number of human participants were then tested for their abilities to track the raised perimeters of different planar objects and compared. By observing the technique and accuracy of the human subjects, simple but effective parameters were determined in order to evaluate the artificial system's performance. The results prove the capability of the sensor in such active exploration tasks, with a comparable performance to the human subjects despite it using tactile data alone whereas the human participants were also able to use proprioceptive cues.</p>
</abstract>
<kwd-group>
<kwd>shape recognition</kwd>
<kwd>object features</kwd>
<kwd>optical-based tactile sensor</kwd>
<kwd>real-time processing</kwd>
<kwd>touch sensor</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec>
<label>1.</label>
<title>Introduction</title>
<p>The sense of touch plays a particularly valuable role in safe physical interaction, allowing the direct perception of parameters such as shape, texture, stickiness, and friction. These parameters cannot easily be attained from any other sense. As a result, either alone or in conjunction with other senses, tactile information can be used to build a perception of objects and the environment that would otherwise be unobtainable. Tactile information therefore offers a number of benefits that arise from better environmental perception, including safer movements and increased dexterity. Its importance for artificial and robotic systems is clear, and accordingly, there has been rapid growth in the number of related publications since the 1980s [
<xref rid="b1-sensors-14-02561" ref-type="bibr">1</xref>
,
<xref rid="b2-sensors-14-02561" ref-type="bibr">2</xref>
].</p>
<p>However, one of the main open issues in robotics is the development of an effective sensory feedback system for robotic platforms. Such a system must serve diverse objectives. For example, humanoid, surgical and learning robots focus on using tactile sensory feedback to increase the dexterity of robotic arms and hands, whereas biologically-inspired approaches to humanoid robotics aim to develop an artificial platform able to interact with and understand the real world [
<xref rid="b1-sensors-14-02561" ref-type="bibr">1</xref>
,
<xref rid="b3-sensors-14-02561" ref-type="bibr">3</xref>
,
<xref rid="b4-sensors-14-02561" ref-type="bibr">4</xref>
]. Recent technological advances combined with a deeper understanding of biological systems now make it possible to develop more versatile and sensitive sensors than were previously possible. This should benefit all robotics applications where tactile feedback is useful.</p>
<p>Many kinds of tactile sensor design can be found in the literature, ranging from the simple [
<xref rid="b5-sensors-14-02561" ref-type="bibr">5</xref>
,
<xref rid="b6-sensors-14-02561" ref-type="bibr">6</xref>
] to the more complex in terms of mechanical and processing burden [
<xref rid="b7-sensors-14-02561" ref-type="bibr">7</xref>
,
<xref rid="b8-sensors-14-02561" ref-type="bibr">8</xref>
]. A wide range of tactile features are employed for exploring the environment, including contact detection, force measurements, force distribution vectors [
<xref rid="b9-sensors-14-02561" ref-type="bibr">9</xref>
,
<xref rid="b10-sensors-14-02561" ref-type="bibr">10</xref>
], strain extraction, surface traction field [
<xref rid="b11-sensors-14-02561" ref-type="bibr">11</xref>
], vibration monitoring, grip force [
<xref rid="b12-sensors-14-02561" ref-type="bibr">12</xref>
,
<xref rid="b13-sensors-14-02561" ref-type="bibr">13</xref>
], and object recognition [
<xref rid="b7-sensors-14-02561" ref-type="bibr">7</xref>
].</p>
<p>In this paper, the performance of an artificial fingertip sensor is investigated. This sensor has inherent safety features due to the softness of its sensing surface, and an extremely diverse sensing capability that could address some of the open issues in robotics. The tactile sensor uses efficient algorithms to identify higher-level features from its optical sensor output. These features allow shape reconstruction by generating an image that can then be processed with image processing algorithms.</p>
<p>Previous work with this sensor includes: an initial sensing performance evaluation for force and 2D shape detection [
<xref rid="b14-sensors-14-02561" ref-type="bibr">14</xref>
]; use in a tactile feedback system for soft object interaction, where it was employed to measure spatially-distributed skin deformation (3D shape) [
<xref rid="b15-sensors-14-02561" ref-type="bibr">15</xref>
] and lateral skin displacement due to shear forces [
<xref rid="b16-sensors-14-02561" ref-type="bibr">16</xref>
]; and investigation into texture discrimination, including the effect of adding a textured outer surface akin to fingerprints [
<xref rid="b17-sensors-14-02561" ref-type="bibr">17</xref>
].</p>
<p>This paper focuses on evaluating the fingertip tactile sensor for real-time contour-following tasks in a structured environment. Such tasks have previously been shown as important for evaluating the capabilities of sensors and their processing algorithms [
<xref rid="b18-sensors-14-02561" ref-type="bibr">18</xref>
,
<xref rid="b19-sensors-14-02561" ref-type="bibr">19</xref>
]. Such a test is suitable for multiple reasons: (i) it is a simple but effective task, and (ii) it relates to everyday actions, not only for object recognition but also for human interaction. In order to estimate the sensor's capabilities during these tasks, its performance has been compared to that of humans during similar tactile-based contour-following tasks. This is achieved by collecting data on the trajectories taken by the human subjects and by the artificial finger platform. These human tests do not aim to improve our understanding of human touch capability; instead, the work aims to define a robust methodology that exploits the sensor's broad real-time sensing capabilities for contour following, with a performance comparable to that of humans.</p>
<p>The major contributions of this work are the introduction of a suitable sensing and gripping solution, and the rapid extraction of high level features from the tactile sensor during environmental exploration and continuous active touch activities. Active touch is the act of physically exploring an object in order to learn more about it. The extracted features are highly suited to further machine learning tasks in higher level object abstraction and environment mapping applications.</p>
<p>This paper is structured as follows: The following two subsections describe the tactile sensor and the feature extraction algorithm; Section 2 illustrates the experiments, the setup and the methodology for the artificial exploration and the human tests; Sections 3 and 4 report and discuss the results, respectively; and finally, Section 5 concludes the paper.</p>
</sec>
<sec>
<label>1.1.</label>
<title>Tactile Sensor</title>
<p>In humans, a large proportion of the tactile information needed for object manipulation comes from the hands alone. The fingertips are consequently one of the most sensitive areas used for the recognition of object features, and have the highest density of mechanoreceptors [
<xref rid="b20-sensors-14-02561" ref-type="bibr">20</xref>
].</p>
<p>The tactile fingertip (TACTIP) sensor used in this study is biologically-inspired, taking inspiration from the mechanisms and multi-layered structure of human skin [
<xref rid="b14-sensors-14-02561" ref-type="bibr">14</xref>
,
<xref rid="b15-sensors-14-02561" ref-type="bibr">15</xref>
]. The TACTIP exploits recent theories about how the papillae structures (intermediate epidermal ridges) on the underside of the epidermis interact with the Meissner's corpuscle receptors to provide highly sensitive encoding of edge information [
<xref rid="b21-sensors-14-02561" ref-type="bibr">21</xref>
,
<xref rid="b22-sensors-14-02561" ref-type="bibr">22</xref>
]. It is suggested that changes in the surface gradient of the skin due to tactile interactions create deflection patterns of the papillae, which activate the Meissner's corpuscles that lie between them [
<xref rid="b14-sensors-14-02561" ref-type="bibr">14</xref>
]. The presence of the papillae may also lead to higher stresses near the Merkel cells, positioned at the tip of each papilla [
<xref rid="b23-sensors-14-02561" ref-type="bibr">23</xref>
].
<xref rid="f1-sensors-14-02561" ref-type="fig">Figure 1</xref>
shows a cross section of the human glabrous skin, which illustrates the papillae structures and placement of the mechanoreceptors. According to studies that focus on human and monkey skin [
<xref rid="b22-sensors-14-02561" ref-type="bibr">22</xref>
,
<xref rid="b24-sensors-14-02561" ref-type="bibr">24</xref>
], the frequency response of Meissner's mechanoreceptors is approximately 8–64 Hz [
<xref rid="b25-sensors-14-02561" ref-type="bibr">25</xref>
,
<xref rid="b26-sensors-14-02561" ref-type="bibr">26</xref>
], with a receptive field of 3–5 mm [
<xref rid="b27-sensors-14-02561" ref-type="bibr">27</xref>
] and a sensitivity to indentation that begins to saturate beyond around 100
<italic>μ</italic>
m [
<xref rid="b28-sensors-14-02561" ref-type="bibr">28</xref>
]. Merkel mechanoreceptors operate at lower frequencies of 2–32 Hz [
<xref rid="b26-sensors-14-02561" ref-type="bibr">26</xref>
], can resolve smaller spatial details, and are able to encode skin indentation beyond 1,500
<italic>μ</italic>
m [
<xref rid="b27-sensors-14-02561" ref-type="bibr">27</xref>
,
<xref rid="b28-sensors-14-02561" ref-type="bibr">28</xref>
].</p>
<p>The sensor replicates the papillae structures in the human skin using an array of short pin-like nodules on the underside of its skin-like membrane.
<xref rid="f2-sensors-14-02561" ref-type="fig">Figure 2</xref>
shows the sensor architecture and illustrates the sensor concept, where the papillae are deflected as the result of surface deformation. The opaque skin-like membrane consists of a 40 mm diameter hemisphere of 0.3 mm thick, black, Shore hardness A 50 urethane, which provides a flexible but strong and relatively inelastic layer. The array of papillae-like nodules is moulded onto the internal surface of this skin layer, with the tips colored white to aid localization on the black membrane background. This epidermal surface encloses a clear, highly compliant polymer that mimics the dermis and subcutaneous fat in the human finger whilst allowing the underside of the membrane to be viewed through a camera. The artificial skin layers have mechanical responses to indentation and shear similar to those of the human finger pad, but they do not exhibit as much hysteresis. A less elastic sensor filling could be attractive for providing greater skin curvature and therefore papilla deflection during interactions, especially with soft elastic objects, although that is not the focus of this performance evaluation. When an object interacts with the sensing surface, changes in the surface gradient of the sensor membrane cause displacement of the white papillae tips on the underside. A CCD camera is used to capture the positions of the white papillae tips. The camera is mounted at a distance of approximately 50 mm from the centre of the membrane in order to capture the whole marker array with almost uniform focus. Six infrared LEDs are positioned above the papillae array to illuminate it. The spatial resolution of the sensor for tactile information relies on the papillae density and the image capture and processing system.</p>
<p>The main advantage of this optical approach to tactile sensing is the removal of any sensing elements or electronics from the immediate locality of the sensing surface. This means that a high spatial resolution can be attained without affecting the softness of the surface. Furthermore, the sensing surface is also very durable, with significant protection between the environment and any delicate components. The resultant device is suitable both for manipulating and for feeling objects, just like the human finger.</p>
<sec>
<label>1.2.</label>
<title>Feature Extraction</title>
<p>
<xref rid="f3-sensors-14-02561" ref-type="fig">Figure 3A,B</xref>
show typical papillae distributions as captured by the embedded camera. In this study, tactile features are extracted by detecting the area and direction of surface gradient changes. Two morphological image processing operators are used to detect these gradient features directly from the papillae marker images, thereby avoiding more processing-intensive methods such as tracking each individual marker. The relative displacements of papilla groups are mapped, through real-time local aggregation operations, to higher-level features such as lines and points, and to dynamic responses such as force and shear. This method is very quick and forms the initial image to be passed to the control software. The image processing functions used in the algorithms presented (dilation and erosion) generate their output from the contributions of black and white neighbors (inhibition and excitation, respectively), mimicking the aggregation of local information in biological neurons.
<p>We first pre-process the camera image to reduce noise and perform contrast and light adjustments. This source image is then dilated (the bright regions are expanded)
<italic>n</italic>
times using kernel
<italic>K</italic>
<sub>1</sub>
. The higher the number of iterations, the greater the effect the function has on the image. In this work,
<italic>n</italic>
= 5. This value was determined experimentally by considering the papilla spot size in the image, the spacing between spots, and the total image size. This dilation enlarges the white spots to such an extent that close markers merge together. Following this action, the image is eroded (the bright regions are isolated and shrunk) 2
<italic>n</italic>
times using kernel
<italic>K</italic>
<sub>2</sub>
. Finally, the image is cleaned by applying a binary threshold function to obtain a black and white image. In these experiments the default kernels were implemented in the OpenCV framework as 3 × 3 matrices. In this study we select
<italic>K</italic>
<sub>1</sub>
=
<italic>K</italic>
<sub>2</sub>
for simplicity. Pseudo-code for this algorithm is shown below.
<array>
<tbody>
<tr>
<td valign="bottom" colspan="2" rowspan="1">
<hr></hr>
</td>
</tr>
<tr>
<td colspan="2" align="left" valign="top" rowspan="1">
<bold>Algorithm 1</bold>
</td>
</tr>
<tr>
<td valign="bottom" colspan="2" rowspan="1">
<hr></hr>
</td>
</tr>
<tr>
<td align="right" valign="top" rowspan="1" colspan="1">1:</td>
<td align="left" valign="top" rowspan="1" colspan="1"> Frame Process</td>
</tr>
<tr>
<td align="right" valign="top" rowspan="1" colspan="1">2:</td>
<td align="left" valign="top" rowspan="1" colspan="1"> Dilate Function (cvDilate(
<italic>I</italic>
,
<italic>n</italic>
))</td>
</tr>
<tr>
<td align="right" valign="top" rowspan="1" colspan="1"></td>
<td align="left" valign="top" rowspan="1" colspan="1">
<mml:math id="mm1">
<mml:mrow>
<mml:mtext mathvariant="italic">Dilate</mml:mtext>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>x</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>y</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mo mathvariant="italic">Max</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msup>
<mml:mi>x</mml:mi>
<mml:mo></mml:mo>
</mml:msup>
<mml:mo>,</mml:mo>
<mml:msup>
<mml:mi>y</mml:mi>
<mml:mo></mml:mo>
</mml:msup>
<mml:mo stretchy="false">)</mml:mo>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi>K</mml:mi>
<mml:mn>1</mml:mn>
</mml:msub>
</mml:mrow>
</mml:mfrac>
<mml:mi>I</mml:mi>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>x</mml:mi>
<mml:mo>+</mml:mo>
<mml:msup>
<mml:mi>x</mml:mi>
<mml:mo></mml:mo>
</mml:msup>
<mml:mo>,</mml:mo>
<mml:mi>y</mml:mi>
<mml:mo>+</mml:mo>
<mml:msup>
<mml:mi>y</mml:mi>
<mml:mo></mml:mo>
</mml:msup>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:math>
</td>
</tr>
<tr>
<td align="right" valign="top" rowspan="1" colspan="1">3:</td>
<td align="left" valign="top" rowspan="1" colspan="1"> Erode Function (cvErode(
<italic>I</italic>
,
<italic>n</italic>
))</td>
</tr>
<tr>
<td align="right" valign="top" rowspan="1" colspan="1"></td>
<td align="left" valign="top" rowspan="1" colspan="1">
<mml:math id="mm2">
<mml:mrow>
<mml:mtext mathvariant="italic">Erode</mml:mtext>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>x</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>y</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mo mathvariant="italic">Min</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msup>
<mml:mi>x</mml:mi>
<mml:mo></mml:mo>
</mml:msup>
<mml:mo>,</mml:mo>
<mml:msup>
<mml:mi>y</mml:mi>
<mml:mo></mml:mo>
</mml:msup>
<mml:mo stretchy="false">)</mml:mo>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi>K</mml:mi>
<mml:mn>2</mml:mn>
</mml:msub>
</mml:mrow>
</mml:mfrac>
<mml:mi>I</mml:mi>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>x</mml:mi>
<mml:mo>+</mml:mo>
<mml:msup>
<mml:mi>x</mml:mi>
<mml:mo></mml:mo>
</mml:msup>
<mml:mo>,</mml:mo>
<mml:mi>y</mml:mi>
<mml:mo>+</mml:mo>
<mml:msup>
<mml:mi>y</mml:mi>
<mml:mo></mml:mo>
</mml:msup>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:math>
</td>
</tr>
<tr>
<td align="right" valign="top" rowspan="1" colspan="1">4:</td>
<td align="left" valign="top" rowspan="1" colspan="1"> Clean</td>
</tr>
<tr>
<td align="right" valign="top" rowspan="1" colspan="1">5:</td>
<td align="left" valign="top" rowspan="1" colspan="1"> End</td>
</tr>
<tr>
<td valign="bottom" colspan="2" rowspan="1">
<hr></hr>
</td>
</tr>
</tbody>
</array>
</p>
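<p>As an illustration of the processing just described, the following minimal Python/OpenCV sketch (not taken from the original implementation, which used the OpenCV C interface) applies the same dilate–erode–threshold sequence to a grayscale camera frame; the function name, the blur step and the threshold value are illustrative assumptions.</p>
<preformat>
import cv2
import numpy as np

def extract_gradient_features(frame, n=5, thresh=128):
    """Approximate the dilate/erode feature extraction described above.

    frame: grayscale image of the white papillae tips on the black membrane.
    Returns a binary image in which the remaining white regions mark areas
    of changing surface gradient (e.g. an edge pressed into the sensor skin).
    """
    # Pre-processing: a mild blur to reduce camera noise (illustrative choice).
    img = cv2.GaussianBlur(frame, (3, 3), 0)

    # 3 x 3 structuring elements, with K1 = K2 as in the paper.
    k1 = np.ones((3, 3), np.uint8)
    k2 = k1

    # Dilate n times so that neighbouring papilla spots merge together...
    img = cv2.dilate(img, k1, iterations=n)
    # ...then erode 2n times to isolate and shrink the merged regions.
    img = cv2.erode(img, k2, iterations=2 * n)

    # Final binary threshold to obtain a clean black-and-white image.
    _, binary = cv2.threshold(img, thresh, 255, cv2.THRESH_BINARY)
    return binary
</preformat>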
<p>The result of this processing can be seen in
<xref rid="f3-sensors-14-02561" ref-type="fig">Figure 3</xref>
. This figure illustrates the extraction of features from the embedded camera sensor, where an edge is detected from the deflection of the internal papillae.
<xref rid="f3-sensors-14-02561" ref-type="fig">Figure 3A,B</xref>
show the camera view of the papillae ends when a horizontal edge and a corner are sensed by the fingertip. The lines show the position and orientation of the actual edges.
<xref rid="f3-sensors-14-02561" ref-type="fig">Figure 3C,D</xref>
show the corresponding features extracted using the method described above. Note the clear correlation between the extracted shapes on the right and the lines on the left.</p>
<p>Executing the tactile feature extraction algorithm on a single-core 3 GHz processor takes 16–30 ms per frame, depending on thread scheduling. This results in an upper bound of approximately 60 frames per second (fps), which covers almost the entire bandwidth of Meissner's corpuscles. However, whilst 60 Hz or even higher frequencies are possible in terms of sensor bandwidth [
<xref rid="b17-sensors-14-02561" ref-type="bibr">17</xref>
], the processing was restricted to 25 fps in this work due to the maximum capture rate of the camera.</p>
<p>After processing, the resulting frames, as illustrated in
<xref rid="f3-sensors-14-02561" ref-type="fig">Figure 3C,D</xref>
, contain all the information needed for subsequent feature detection. The white region represents the changing gradient of the target surface. The final step of the processing then involves extracting higher level information from this binary image.</p>
<p>For the contour-following tasks used in this study, the extracted features are subsequently processed in order to obtain information suitable for the control software. For instance, the orientation of a detected edge is computed from the principal and secondary eigen-components of the region (
<italic>i.e.</italic>
, the image moments [
<xref rid="b29-sensors-14-02561" ref-type="bibr">29</xref>
]).</p>
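<p>A possible realisation of this step, using the image moments provided by OpenCV, is sketched below. It is a hypothetical helper rather than the authors' code, and assumes the binary feature image produced by the extraction stage; the principal axis of the second-order central moments gives the edge angle.</p>
<preformat>
import math
import cv2

def edge_angle_from_moments(binary):
    """Estimate the orientation of the detected edge region via image moments.

    binary: black-and-white feature image produced by the extraction step.
    Returns the angle of the region's principal axis in degrees, or None
    if no feature is present in the frame.
    """
    m = cv2.moments(binary, binaryImage=True)
    if m["m00"] == 0:
        return None  # empty frame: no contact feature detected
    # Principal-axis orientation from the second-order central moments.
    angle = 0.5 * math.atan2(2.0 * m["mu11"], m["mu20"] - m["mu02"])
    return math.degrees(angle)
</preformat>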
</sec>
</sec>
<sec>
<label>2.</label>
<title>Experiments in Real-Time Active Touch</title>
<p>The aim of the experiments was to quantitatively define the capability of the sensor for human-like environmental exploration and shape recognition tasks, and to compare this capability to that of humans conducting a similar tactile exploration task.</p>
<p>To achieve this, the sensor was mounted on a robotic arm to allow it to be moved through the environment and explore object surfaces and edges. The sensor position was controlled autonomously according to real-time processing of the tactile sensor information.</p>
<p>A number of raised shapes (shown in
<xref rid="f4-sensors-14-02561" ref-type="fig">Figure 4</xref>
) were presented to the system and the paths taken by the sensor when following the perimeter edge of the shape were recorded. The shapes were constructed from 3 mm thick rigid plastic, even though the sensor can detect different thickness objects below 1 mm using the current hardware and software implementations.</p>
<p>Volunteer human test participants were also asked to feel around the edges of these same shapes. The paths taken by the sensor system and the human subjects were compared to each other, and to the actual shapes and sizes of the objects. This information was used to analyze the performance of the sensor system.</p>
<sec>
<label>2.1.</label>
<title>Robotic Contour Following Task</title>
<p>The main hardware components of the setup were:
<list list-type="order">
<list-item>
<p>Robotic platform to control the sensor position</p>
</list-item>
<list-item>
<p>Target shapes</p>
</list-item>
<list-item>
<p>Cradles to secure the sensor on to the platform</p>
</list-item>
</list>
</p>
<p>A Barrett Technology Inc. 7-DOF robotic arm was used as the robotic platform to move the sensor (shown in
<xref rid="f5-sensors-14-02561" ref-type="fig">Figure 5</xref>
). The platform operates within a defined working area of 400 mm × 400 mm, in roughly 1 mm increments.</p>
<p>No information from the robot arm positioning system was used (such as absolute position, force, or velocity). This is somewhat different from human tactile exploration, where a degree of proprioception and visual feedback normally accompanies the tactile information. However, this choice provides a more stringent evaluation of the fingertip sensor.</p>
<p>The exploration process for the sensor is described in
<xref rid="f6-sensors-14-02561" ref-type="fig">Figure 6</xref>
). By applying the low-level image-based feature extraction algorithm detailed earlier, the edge angle and edge orientation features were extracted from each image frame. By edge angle we mean the direction of the edge in the table plane, and by edge orientation we mean which side of the edge is highest (in order to define the inside and outside of the object).</p>
<p>This information was then used to determine the direction of movement for the sensor, with a consistent movement direction chosen according to the edge orientation. Changes in the direction of motion along the edge were quantized to 45 degrees to simplify the output evaluation. This quantization affects the accuracy of the output, but the effect is small thanks to the 1 mm step size of the positioning system.</p>
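<p>The following hypothetical sketch illustrates how such a quantized movement command could be derived from the extracted edge angle and orientation; the function, its parameters and the orientation convention are illustrative assumptions, not the authors' implementation.</p>
<preformat>
import math

def next_step(edge_angle_deg, raised_side_left, step_mm=1.0):
    """Choose the next movement increment, quantized to 45-degree headings.

    edge_angle_deg: edge direction in the table plane (from the moment analysis).
    raised_side_left: True if the raised (object) side of the edge lies to the
    left of the travel direction; used to keep a consistent sense of circulation.
    Both names are illustrative; the real controller may encode this differently.
    """
    # Travel parallel to the edge, flipping by 180 degrees when needed so the
    # object is always kept on the same side of the sensor.
    heading = edge_angle_deg if raised_side_left else edge_angle_deg + 180.0
    quantized = 45.0 * round(heading / 45.0)
    dx = step_mm * math.cos(math.radians(quantized))
    dy = step_mm * math.sin(math.radians(quantized))
    return dx, dy
</preformat>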
<p>
<xref rid="f7-sensors-14-02561" ref-type="fig">Figure 7</xref>
shows examples of the main processing steps of the proposed algorithm (small images) and the 400 × 400 pixel output reconstruction (large images). The larger images show the reconstructed shape formed from the sensor data and the calculated sensor position. When an object is encountered (when the sensor output detects an edge), the tactile features of the edge are assessed with respect to whether they constitute a ‘good feature’. A frame contains a ‘good feature’ if: (i) a feature is present, and (ii) the extracted feature's centre of mass lies within the central zone of the camera image, a zone of about one third of the image size. This check ensures that the sensor is on the edge of the shape before extracting the possible direction for the next iteration. Frames without a ‘good feature’ nevertheless contain information for the recovery process, which moves the sensor towards the centre of mass; this leads to a new ‘good feature’ and the process can start once again. This behaviour is similar to the way humans first explore the macro-scale properties of an object (its size and general shape) before refining this to determine the finer-scale properties. To this extent, the object exploration algorithm first seeks a ‘good feature’ upon which it may base further active sensing. The higher side and the falling side of the edge are determined by defining two rectangular regions (visible in
<xref rid="f7-sensors-14-02561" ref-type="fig">Figure 7D,I,P</xref>
) and, within these regions, measuring the amount of white (papillae) in the image: the greater the white value, the greater the number of papillae. Because papillae aggregate on the higher side of the edge and diverge on the falling side, which is an intrinsic feature of the sensor, the edge orientation can be detected.</p>
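<p>A minimal sketch of such a ‘good feature’ test is given below, again as an illustrative assumption rather than the original code: it checks that a white region exists and that its centre of mass lies within the central third of the image.</p>
<preformat>
import cv2

def is_good_feature(binary):
    """Return (good, centroid) for one processed frame.

    A frame holds a 'good feature' if a white region is present and its centre
    of mass falls within the central third of the image, so that the edge sits
    over the sensor's most sensitive area.
    """
    h, w = binary.shape[:2]
    m = cv2.moments(binary, binaryImage=True)
    if m["m00"] == 0:
        return False, None  # no feature at all in this frame
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    off_x = abs(cx - w / 2.0)
    off_y = abs(cy - h / 2.0)
    # Inside the central third means no more than one sixth of the image
    # width/height away from the image centre.
    good = not (off_x * 6.0 > w or off_y * 6.0 > h)
    return good, (cx, cy)
</preformat>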
<p>A detected edge is classified as a ‘good feature’ when the desired tactile features (angle and orientation) can be found and the centre of the edge is positioned in the middle of the sensor. Where a ‘good feature’ is not found, the sensor must be moved to try to recover the edge. This method gives two main advantages: first, the centre of the feature is always coincident with the sensor's most sensitive area, and second, any horizontal displacement of the soft sensing surface due to shear forces is corrected. Without this correction, friction between the object surface and the sensor membrane can displace the compliant sensing surface, resulting in a mismatch between the centre of the image and the centre of the sensor.</p>
<p>Theoretically, the algorithms used can run with an update rate of 60 fps and therefore the speed of the fingertip could potentially be increased. In this work, in which maximizing speed is not a primary objective, the frame and update rates are determined by the 30 fps maximum camera refresh rate and the speed of the arm, limited to 10 mm/s in order to minimize the risk of damaging the sensor skin whilst sliding along the objects' edges. This velocity was deemed to provide a reasonable trade-off between speed of active exploration and reconstruction fidelity.</p>
</sec>
<sec>
<label>2.2.</label>
<title>Human Contour Following Task</title>
<p>As a bio-inspired sensor, the TACTIP is not optimized for precision in the way a digital sensor such as a laser scan array is, but rather for a balance between compliance, robustness and accuracy. However, even human tactile perception is not 100% precise. Consequently, a set of experiments was designed to evaluate the sensor against the performance of the human sense of touch. By comparing the results of the artificial and the human tests we can determine whether the sensor can effectively mimic human touch. The human experiments were designed not from a neuroscientific point of view but as an engineering tool to obtain data for a quantitative comparison.</p>
<p>Twelve volunteer subjects were asked to perform object exploration and identification tasks. Each subject was blindfolded throughout the tests and was unfamiliar with the objects. The dominant hand was used, and the subject was instructed not to move the index finger independently of the hand during the experiment.</p>
<p>The experiments were divided into three tasks in which the rectangle, hexagon and circle were presented to the volunteers. The volunteers were free to explore the shapes by touch alone using one fingertip. In a short interview afterwards, they were asked to identify the shape and estimate its dimensions, providing feedback on their mental reconstruction.</p>
<p>No constraints were imposed on the exploration speed. To record the human fingertip trajectory, the tests were performed within a Vicon optical motion tracking environment. Through the use of small markers and infrared cameras, the Vicon system can reconstruct the movements of a rigid body. In these experiments, three markers were placed on a glove worn by the subjects; these formed a known rigid body within the Vicon workspace, with the
<italic>reference point</italic>
fixed on the fingertip. The glove, marker and fingertip are illustrated in
<xref rid="f8-sensors-14-02561" ref-type="fig">Figure 8</xref>
.</p>
<p>The experimental constraints imposed during these tests were:
<list list-type="order">
<list-item>
<p>Subjects were blindfolded</p>
</list-item>
<list-item>
<p>Only one fingertip was used, from the index finger</p>
</list-item>
<list-item>
<p>The hand was held in as close to a constant orientation as possible and the index finger was not moved independently of the hand</p>
</list-item>
<list-item>
<p>Movements were preferred in one direction (
<italic>i.e.</italic>
, following the circumference of an object)</p>
</list-item>
<list-item>
<p>After exploring the object, an estimate of the shape and its dimensions was made</p>
</list-item>
</list>
</p>
</sec>
</sec>
<sec sec-type="results">
<label>3.</label>
<title>Results</title>
<p>As shown in
<xref rid="f7-sensors-14-02561" ref-type="fig">Figure 7</xref>
, the artificial fingertip and autonomous control system are able to successfully map the perimeters of the different objects by finding and then following their raised edges. A comparison between the real perimeters of the shapes and those estimated by the sensor system can be seen in
<xref rid="f9-sensors-14-02561" ref-type="fig">Figure 9</xref>
). The trajectories are not always completely closed due to the stop condition, although this is not a problem for the evaluation because a post-process convex hull is built around each trajectory (
<xref rid="f9-sensors-14-02561" ref-type="fig">Figure 9a</xref>
). The length of the hull sides was used to generate the reconstructed dimensions.</p>
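<p>For illustration, this post-processing step could be sketched as follows in Python with OpenCV; the function name and the exact hull routine are assumptions rather than the authors' implementation.</p>
<preformat>
import numpy as np
import cv2

def hull_dimensions(trajectory_mm):
    """Post-process an exploration path into reconstructed shape dimensions.

    trajectory_mm: N x 2 array of (x, y) sensor positions in millimetres.
    Returns the convex hull vertices, the hull side lengths and the enclosed
    area, mirroring the evaluation used for the reconstructed dimensions.
    """
    pts = np.asarray(trajectory_mm, dtype=np.float32).reshape(-1, 1, 2)
    hull = cv2.convexHull(pts).reshape(-1, 2)
    # Side lengths between consecutive hull vertices (wrapping around).
    sides = np.linalg.norm(np.roll(hull, -1, axis=0) - hull, axis=1)
    area = cv2.contourArea(hull.reshape(-1, 1, 2))
    return hull, sides, area
</preformat>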
<p>
<xref rid="t1-sensors-14-02561" ref-type="table">Table 1</xref>
compares the dimensions of the reconstructed shapes found from the sensor's path to the actual dimensions of each shape. It is clear that the dimensions of the paths taken by the sensor are very similar to those of the actual objects.</p>
<p>The repeatability of the reconstructed rectangles is high and can be observed in
<xref rid="f10-sensors-14-02561" ref-type="fig">Figure 10</xref>
, which shows an example of three different exploration paths, overlapped, for the rectangular object. Comparison of the side lengths in each case shows that they are equal with an accuracy of 3%. The maximum estimation error for the area of these paths is calculated to be approximately 7.5%.</p>
<p>Comparison of the human trajectories derived from the Vicon tracking data with those of the sensor shows great similarity (
<xref rid="f11-sensors-14-02561" ref-type="fig">Figure 11</xref>
). Two human trajectories and one from the robotic platform are shown in this figure. It is not immediately clear which is the artificial one.</p>
<p>Analysis of the mean object areas for 12 human subjects and the comparison with the robotic system, shown in
<xref rid="t2-sensors-14-02561" ref-type="table">Table 2</xref>
, indicates that there is an approximately constant scaling factor (
<inline-formula>
<mml:math id="mm3">
<mml:mrow>
<mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mi>H</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo stretchy="false">/</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mi>P</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>
) between the two of approximately 1:1.1, with the human paths enclosing a slightly larger area.
<xref rid="f12-sensors-14-02561" ref-type="fig">Figure 12</xref>
compares these data to the actual object areas. The scaling factor is again evident, and the similarity in performance between the biological and artificial systems is highlighted once more.</p>
</sec>
<sec sec-type="discussion">
<label>4.</label>
<title>Discussion</title>
<p>The tactile processing algorithm was able to reliably detect the raised object edges through the artificial fingertip, with minimal adjustment for the object height, dimensions or other specific attributes. It is therefore expected that this system could be used for many different tactile applications, especially given the diverse tactile information offered by the novel TACTIP device.</p>
<p>The autonomous sensing system, created by mounting the TACTIP on a robot arm and moving it according to the tactile data, was also found to be very reliable given the naive positioning used. The simple positioning algorithm used the tactile information from a single location to calculate the subsequent movement direction. A future revision will use a selection of previous tactile data points to determine a better candidate trajectory. However, one limitation of the current algorithm is the recovery mechanism. In this work, if the fingertip moves away from an edge, recovery is simply addressed by slowly moving back towards the previous ‘good feature’ position until new ‘good features’ are detected. In the few cases where this strategy was not enough to achieve recovery, the edge was lost. To improve this, the recovery mode could be extended to include a circular palpation pattern, or the path could be retraced, until a ‘good feature’ is regained. Such a procedure was not implemented during these experiments, since the focus here is on tactile feedback alone, with no direct feedback of the sensor's position.</p>
<p>The object perimeters found by the artificial tactile system compare well with those found by the human subjects. The scaling factor between the two, where the human paths were approximately 10% larger than the sensor system's, is expected to be due to the human subjects using a kinesthetic element to aid navigation, applying a slight inward force against the objects' outside edges. In doing this, the centre of the finger would be positioned on the outside of the edge rather than directly above it. A similar mechanism could be applied to the artificial sensing system using force feedback as measured by the TACTIP fingertip sensor.</p>
</sec>
<sec sec-type="conclusions">
<label>5.</label>
<title>Conclusion</title>
<p>A novel biologically-inspired tactile system using the TACTIP compliant sensor was designed and shown to complete real-time object exploration tasks in a structured environment with a human-like performance. The sensor utilizes the movement of papilla-like structures on the underside of its artificial epidermal layer to detect changes of the sensing surface. This is a similar mechanism to that used in human skin. An optical sensing method is used to detect the papillae deflections, which avoids placing delicate sensing elements near the skin surface and leads to a very robust and compliant device. The proposed algorithm avoids computationally expensive tracking methods by applying fast image operations to the sensor output to extract tactile features.</p>
<p>The TACTIP device was mounted on a robotic arm in order for the system to feel and follow the edges of raised objects. The edge angle and orientation at each sensor position were used to determine a movement direction, following which the new tactile data were analyzed. The object perimeters found by the autonomous system were compared to those found by human subjects during similar tasks. The results of this comparison show high similarity, and that the sensor can identify and parameterize hard edges effectively using the proposed algorithms. This shows that the artificial fingertip sensor is capable of emulating human tactile sensing performance despite the human subjects exploiting additional information not available to the sensor processing, such as proprioception, force feedback, vibration, temperature, and texture.</p>
<p>The sensing solution is both sensitive and robust. These attributes make it suitable for tactile sensing as well as gripping and manipulating objects. It is therefore expected to provide a good solution for active sensing and environmental exploration, whilst offering a performance similar to that of humans.</p>
<p>Useful future work could investigate the sensor's ability to detect and follow the edges of 3D or compliant objects, again compared to that of humans. It would also be valuable to determine the benefit of extending the current algorithm to detect and process the depth of indentation as an additional dimension.</p>
</sec>
</body>
<back>
<ack>
<p>This work has been supported by the UK Engineering and Physical Sciences Research Council (grant EP/I032533/1).</p>
</ack>
<notes>
<title>Conflict of Interest</title>
<p>The authors declare no conflict of interest.</p>
</notes>
<ref-list>
<title>References</title>
<ref id="b1-sensors-14-02561">
<label>1.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Tegin</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Wikander</surname>
<given-names>J.</given-names>
</name>
</person-group>
<article-title>Tactile sensing in intelligent robotic manipulation—A review</article-title>
<source>Ind. Robot Int. J.</source>
<year>2005</year>
<volume>32</volume>
<fpage>64</fpage>
<lpage>70</lpage>
</element-citation>
</ref>
<ref id="b2-sensors-14-02561">
<label>2.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lee</surname>
<given-names>M.</given-names>
</name>
</person-group>
<article-title>Review article tactile sensing for mechatronics—A state of the art survey</article-title>
<source>Mechatronics</source>
<year>1999</year>
<volume>9</volume>
<fpage>1</fpage>
<lpage>31</lpage>
</element-citation>
</ref>
<ref id="b3-sensors-14-02561">
<label>3.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lee</surname>
<given-names>M.H.</given-names>
</name>
</person-group>
<article-title>Tactile sensing: New directions, new challenges</article-title>
<source>Int. J. Robot. Res.</source>
<year>2000</year>
<volume>19</volume>
<fpage>636</fpage>
<lpage>643</lpage>
</element-citation>
</ref>
<ref id="b4-sensors-14-02561">
<label>4.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Dario</surname>
<given-names>P.</given-names>
</name>
</person-group>
<article-title>Tactile sensing: Technology and applications</article-title>
<source>Sens. Actuators A Phys.</source>
<year>1991</year>
<volume>26</volume>
<fpage>251</fpage>
<lpage>256</lpage>
</element-citation>
</ref>
<ref id="b5-sensors-14-02561">
<label>5.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Suwanratchatamanee</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Saegusa</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Matsumoto</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Hashimoto</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>A Simple Tactile Sensor System for Robot Manipulator and Object Edge Shape Recognition</article-title>
<conf-name>Proceedings of the 33rd Annual Conference of the IEEE Industrial Electronics Society</conf-name>
<conf-loc>Taipei, Taiwan</conf-loc>
<conf-date>5–8 November 2007</conf-date>
<fpage>245</fpage>
<lpage>250</lpage>
</element-citation>
</ref>
<ref id="b6-sensors-14-02561">
<label>6.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Suwanratchatamanee</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Matsumoto</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Hashimoto</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>Robotic tactile sensor system and applications</article-title>
<source>IEEE Trans. Ind. Electron.</source>
<year>2010</year>
<volume>57</volume>
<fpage>1074</fpage>
<lpage>1087</lpage>
</element-citation>
</ref>
<ref id="b7-sensors-14-02561">
<label>7.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Schneider</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Sturm</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Stachniss</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Reisert</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Burkhardt</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Burgard</surname>
<given-names>W.</given-names>
</name>
</person-group>
<article-title>Object Identification with Tactile Sensors Using Bag-of-Features</article-title>
<conf-name>Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems</conf-name>
<conf-loc>St. Louis, MO, USA</conf-loc>
<conf-date>10–15 October 2009</conf-date>
<fpage>243</fpage>
<lpage>248</lpage>
</element-citation>
</ref>
<ref id="b8-sensors-14-02561">
<label>8.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Payeur</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Pasca</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Cretu</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Petriu</surname>
<given-names>E.</given-names>
</name>
</person-group>
<article-title>Intelligent haptic sensor system for robotic manipulation</article-title>
<source>IEEE Trans. Instrum. Meas.</source>
<year>2005</year>
<volume>54</volume>
<fpage>1583</fpage>
<lpage>1592</lpage>
</element-citation>
</ref>
<ref id="b9-sensors-14-02561">
<label>9.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Kamiyama</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Kajimoto</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Kawakami</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Tachi</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>Evaluation of a Vision-Based Tactile Sensor</article-title>
<conf-name>Proceedings of IEEE International Conference on Robotics and Automation (ICRA)</conf-name>
<conf-loc>New Orleans, LA, USA</conf-loc>
<conf-date>26 April–1 May 2004</conf-date>
<fpage>1542</fpage>
<lpage>1547</lpage>
</element-citation>
</ref>
<ref id="b10-sensors-14-02561">
<label>10.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Sato</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Kamiyama</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Nii</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Kawakami</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Tachi</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>Measurement of Force Vector Field of Robotic Finger Using Vision-Based Haptic Sensor</article-title>
<conf-name>Proceedings of 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems</conf-name>
<conf-loc>Nice, France</conf-loc>
<conf-date>22–26 September 2008</conf-date>
<fpage>488</fpage>
<lpage>493</lpage>
</element-citation>
</ref>
<ref id="b11-sensors-14-02561">
<label>11.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kamiyama</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Vlack</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Mizota</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Kajimoto</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Kawakami</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Tachi</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>Vision-based sensor for real-time measuring of surface traction fields</article-title>
<source>IEEE Comput. Gr. Appl.</source>
<year>2005</year>
<volume>25</volume>
<fpage>68</fpage>
<lpage>75</lpage>
</element-citation>
</ref>
<ref id="b12-sensors-14-02561">
<label>12.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Watanabe</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Obinata</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>Grip Force Control Based on the Degree of Slippage Using Optical Tactile Sensor</article-title>
<conf-name>Proceedings of 2007 International Symposium on Micro-NanoMechatronics and Human Science</conf-name>
<conf-loc>Nagoya, Japan</conf-loc>
<conf-date>11–14 November 2007</conf-date>
<fpage>466</fpage>
<lpage>471</lpage>
</element-citation>
</ref>
<ref id="b13-sensors-14-02561">
<label>13.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Ito</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Kim</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Obinata</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>Slippage Degree Estimation for Dexterous Handling of Vision-Based Tactile Sensor</article-title>
<conf-name>Proceedings of IEEE Sensors</conf-name>
<conf-loc>Christchurch, New Zealand</conf-loc>
<conf-date>25–28 October 2009</conf-date>
<fpage>449</fpage>
<lpage>452</lpage>
</element-citation>
</ref>
<ref id="b14-sensors-14-02561">
<label>14.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Chorley</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Melhuish</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Pipe</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Rossiter</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Whiteley</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>Development of a Tactile Sensor Based on Biologically Inspired Edge Encoding</article-title>
<conf-name>Proceedings of International Conference on Advanced Robotics (ICAR)</conf-name>
<conf-loc>Munich, Germany</conf-loc>
<conf-date>22–26 June 2009</conf-date>
<fpage>1</fpage>
<lpage>6</lpage>
</element-citation>
</ref>
<ref id="b15-sensors-14-02561">
<label>15.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Roke</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Melhuish</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Pipe</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Drury</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Chorley</surname>
<given-names>C.</given-names>
</name>
</person-group>
<article-title>Lump localisation through a deformation-based tactile feedback system using a biologically inspired finger sensor</article-title>
<source>Robot. Auton. Syst.</source>
<year>2012</year>
<volume>60</volume>
<fpage>1442</fpage>
<lpage>1448</lpage>
</element-citation>
</ref>
<ref id="b16-sensors-14-02561">
<label>16.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Roke</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Spiers</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Pipe</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Melhuish</surname>
<given-names>C.</given-names>
</name>
</person-group>
<article-title>The Effects of Laterotactile Information on Lump Localization through a Teletaction System</article-title>
<conf-name>Proceedings of World Haptics Conference (WHC)</conf-name>
<conf-loc>Daejeon, Korea</conf-loc>
<conf-date>14–17 April 2013</conf-date>
<fpage>365</fpage>
<lpage>370</lpage>
</element-citation>
</ref>
<ref id="b17-sensors-14-02561">
<label>17.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Winstone</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Griffiths</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Pipe</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Melhuish</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Rossiter</surname>
<given-names>J.</given-names>
</name>
</person-group>
<article-title>TACTIP—Tactile Fingertip Device Texture Analysis through Optical Tracking of Skin Features</article-title>
<source>Biomimetic and Biohybrid Systems</source>
<person-group person-group-type="editor">
<name>
<surname>Lepora</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Mura</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Krapp</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Verschure</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Prescott</surname>
<given-names>T.</given-names>
</name>
</person-group>
<publisher-name>Springer Berlin Heidelberg</publisher-name>
<publisher-loc>London, UK</publisher-loc>
<year>2013</year>
<fpage>323</fpage>
<lpage>334</lpage>
</element-citation>
</ref>
<ref id="b18-sensors-14-02561">
<label>18.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Martinez-Hernandez</surname>
<given-names>U.</given-names>
</name>
<name>
<surname>Dodd</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Prescott</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Lepora</surname>
<given-names>N.</given-names>
</name>
</person-group>
<article-title>Angle and Position Perception for Exploration with Active Touch</article-title>
<source>Biomimetic and Biohybrid Systems</source>
<person-group person-group-type="editor">
<name>
<surname>Lepora</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Mura</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Krapp</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Verschure</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Prescott</surname>
<given-names>T.</given-names>
</name>
</person-group>
<publisher-name>Springer Berlin Heidelberg</publisher-name>
<publisher-loc>London, UK</publisher-loc>
<year>2013</year>
<fpage>405</fpage>
<lpage>408</lpage>
</element-citation>
</ref>
<ref id="b19-sensors-14-02561">
<label>19.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Martinez-Hernandez</surname>
<given-names>U.</given-names>
</name>
<name>
<surname>Dodd</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Natale</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Metta</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Prescott</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Lepora</surname>
<given-names>N.</given-names>
</name>
</person-group>
<article-title>Active Contour Following to Explore Object Shape with Robot Touch</article-title>
<conf-name>Proceedings of World Haptics Conference (WHC)</conf-name>
<conf-loc>Daejeon, Korea</conf-loc>
<conf-date>14–17 April 2013</conf-date>
<fpage>341</fpage>
<lpage>346</lpage>
</element-citation>
</ref>
<ref id="b20-sensors-14-02561">
<label>20.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Johansson</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Vallbo</surname>
<given-names>A.</given-names>
</name>
</person-group>
<article-title>Tactile sensibility in the human hand: Receptive field characteristics of mechanoreceptive units in the glabrous skin area</article-title>
<source>J. Physiol.</source>
<year>1979</year>
<volume>286</volume>
<fpage>283</fpage>
<lpage>300</lpage>
<pub-id pub-id-type="pmid">439026</pub-id>
</element-citation>
</ref>
<ref id="b21-sensors-14-02561">
<label>21.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Kuroki</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Kajimoto</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Nii</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Kawakami</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Tachi</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>Proposal of the Stretch Detection Hypothesis of the Meissner Corpuscle</article-title>
<conf-name>Proceedings of the 6th International Conference on Haptics: Perception, Devices and Scenarios</conf-name>
<conf-loc>Madrid, Spain</conf-loc>
<conf-date>11–13 June 2008</conf-date>
<fpage>245</fpage>
<lpage>254</lpage>
</element-citation>
</ref>
<ref id="b22-sensors-14-02561">
<label>22.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Takahashi-Iwanaga</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Shimoda</surname>
<given-names>H.</given-names>
</name>
</person-group>
<article-title>The three-dimensional microanatomy of Meissner corpuscles in monkey palmar skin</article-title>
<source>J. Neurocytol.</source>
<year>2003</year>
<volume>32</volume>
<fpage>363</fpage>
<lpage>371</lpage>
<pub-id pub-id-type="pmid">14724379</pub-id>
</element-citation>
</ref>
<ref id="b23-sensors-14-02561">
<label>23.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gerling</surname>
<given-names>G.J.</given-names>
</name>
<name>
<surname>Thomas</surname>
<given-names>G.W.</given-names>
</name>
</person-group>
<article-title>Fingerprint lines may not directly affect SA-I mechanoreceptor response</article-title>
<source>Somatosens. Motor Res.</source>
<year>2008</year>
<volume>25</volume>
<fpage>61</fpage>
<lpage>76</lpage>
</element-citation>
</ref>
<ref id="b24-sensors-14-02561">
<label>24.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Gardner</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Martin</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Jessell</surname>
<given-names>T.</given-names>
</name>
</person-group>
<article-title>The Bodily Senses</article-title>
<source>Principles of Neural Science</source>
<edition>4th ed.</edition>
<person-group person-group-type="editor">
<name>
<surname>Kandel</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Schwartz</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Jessell</surname>
<given-names>T.</given-names>
</name>
</person-group>
<publisher-name>McGraw-Hill</publisher-name>
<publisher-loc>New York, NY, USA</publisher-loc>
<year>1991</year>
<fpage>430</fpage>
<lpage>450</lpage>
</element-citation>
</ref>
<ref id="b25-sensors-14-02561">
<label>25.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Toma</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Nakajima</surname>
<given-names>Y.</given-names>
</name>
</person-group>
<article-title>Response characteristics of cutaneous mechanoreceptors to vibratory stimuli in human glabrous skin</article-title>
<source>Neurosci. Lett.</source>
<year>1995</year>
<volume>195</volume>
<fpage>61</fpage>
<lpage>63</lpage>
<pub-id pub-id-type="pmid">7478256</pub-id>
</element-citation>
</ref>
<ref id="b26-sensors-14-02561">
<label>26.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Johansson</surname>
<given-names>R.S.</given-names>
</name>
<name>
<surname>Landstrm</surname>
<given-names>U.</given-names>
</name>
<name>
<surname>Lundstrm</surname>
<given-names>R.</given-names>
</name>
</person-group>
<article-title>Responses of mechanoreceptive afferent units in the glabrous skin of the human hand to sinusoidal skin displacements</article-title>
<source>Brain Res.</source>
<year>1982</year>
<volume>244</volume>
<fpage>17</fpage>
<lpage>25</lpage>
<pub-id pub-id-type="pmid">6288178</pub-id>
</element-citation>
</ref>
<ref id="b27-sensors-14-02561">
<label>27.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Johnson</surname>
<given-names>K.O.</given-names>
</name>
</person-group>
<article-title>The roles and functions of cutaneous mechanoreceptors</article-title>
<source>Curr. Opin. Neurobiol.</source>
<year>2001</year>
<volume>11</volume>
<fpage>455</fpage>
<lpage>461</lpage>
<pub-id pub-id-type="pmid">11502392</pub-id>
</element-citation>
</ref>
<ref id="b28-sensors-14-02561">
<label>28.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Vega-Bermudez</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Johnson</surname>
<given-names>K.O.</given-names>
</name>
</person-group>
<article-title>SA1 and RA receptive fields, response variability, and population responses mapped with a probe array</article-title>
<source>J. Neurophysiol.</source>
<year>1999</year>
<volume>81</volume>
<fpage>2701</fpage>
<lpage>2710</lpage>
<pub-id pub-id-type="pmid">10368390</pub-id>
</element-citation>
</ref>
<ref id="b29-sensors-14-02561">
<label>29.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Bradski</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Kaehler</surname>
<given-names>A.</given-names>
</name>
</person-group>
<source>Learning OpenCV: Computer Vision with the OpenCV Library</source>
<publisher-name>O'Reilly</publisher-name>
<publisher-loc>Sebastopol, CA, USA</publisher-loc>
<year>2008</year>
</element-citation>
</ref>
</ref-list>
</back>
<floats-group>
<fig id="f1-sensors-14-02561" position="float">
<label>Figure 1.</label>
<caption>
<p>Cross section of human skin, showing the approximate locations of the different mechanoreceptors.</p>
</caption>
<graphic xlink:href="sensors-14-02561f1"></graphic>
</fig>
<fig id="f2-sensors-14-02561" position="float">
<label>Figure 2.</label>
<caption>
<p>Structure of the biologically-inspired vision-based tactile sensor, tactile fingertip (TACTIP) (
<bold>left</bold>
). Interactions with a rigid object (
<bold>right</bold>
) cause localized papillae deflection at points A (divergence) and B (convergence).</p>
</caption>
<graphic xlink:href="sensors-14-02561f2"></graphic>
</fig>
<fig id="f3-sensors-14-02561" position="float">
<label>Figure 3.</label>
<caption>
<p>Two examples of surface feature identification: (
<bold>A,B</bold>
) embedded camera view, (
<bold>C,D</bold>
) extracted edge features.</p>
</caption>
<graphic xlink:href="sensors-14-02561f3"></graphic>
</fig>
<fig id="f4-sensors-14-02561" position="float">
<label>Figure 4.</label>
<caption>
<p>The target objects used in the experiments, constructed from 3 mm thick polycarbonate.</p>
</caption>
<graphic xlink:href="sensors-14-02561f4"></graphic>
</fig>
<fig id="f5-sensors-14-02561" position="float">
<label>Figure 5.</label>
<caption>
<p>(
<bold>A</bold>
) sensor mounted in its cradle, (
<bold>B</bold>
) the cradle mounted on the WAM (Whole Arm Manipulator) approaching the environment and (
<bold>C</bold>
) the fingertip on the target during the active touch task.</p>
</caption>
<graphic xlink:href="sensors-14-02561f5"></graphic>
</fig>
<fig id="f6-sensors-14-02561" position="float">
<label>Figure 6.</label>
<caption>
<p>The decision process used by the algorithm. At the start, the software searches for an edge by moving the robotic arm. When the sensor begins to detect a possible edge, the ‘shape found’ event is triggered to start the contour-following task. Once an edge is found, its gradient is determined and tested as a ‘good feature’. If the test is satisfied, the position and details are recorded and the next movement is calculated. When the starting point is reached again, a stop command is triggered (a simplified code sketch of this loop is given after the figure).</p>
</caption>
<graphic xlink:href="sensors-14-02561f6"></graphic>
</fig>
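<p>A simplified Python sketch of the decision loop in Figure 6 is given below. The helpers <monospace>move(dx, dy)</monospace> (which displaces the sensor and returns its new position) and <monospace>sense()</monospace> (which reports whether the current tactile frame contains a ‘good feature’ and the estimated edge angle) are hypothetical stand-ins for the arm and sensor interfaces; the step size and stopping tolerance are illustrative.</p>
<preformat><![CDATA[
import math

def contour_following(move, sense, step=2.0, start_tol=3.0):
    """Simplified sketch of the decision loop in Figure 6."""
    # Phase 1: move in a straight line until a possible edge is detected
    pos = move(0.0, 0.0)
    good, angle = sense()
    while not good:                              # 'shape found' not yet triggered
        pos = move(step, 0.0)
        good, angle = sense()
    start, path = pos, [pos]
    # Phase 2: contour following, recording each 'good feature' position
    while True:
        good, angle = sense()
        if good:                                 # step along the detected edge
            pos = move(step * math.cos(angle), step * math.sin(angle))
            path.append(pos)
        else:                                    # edge lost: creep back towards the
            prev = path[-1]                      # last recorded 'good feature'
            pos = move((prev[0] - pos[0]) * 0.5, (prev[1] - pos[1]) * 0.5)
        if len(path) > 10 and math.dist(pos, start) < start_tol:
            return path                          # starting point reached: stop
]]></preformat>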
<fig id="f7-sensors-14-02561" position="float">
<label>Figure 7.</label>
<caption>
<p>The various processing steps and outputs as the sensor system finds and explores each test object. The largest images show the early reconstruction stage, using the whole path.
<bold>C,H,O</bold>
show the live video from the camera.
<bold>A,F,M</bold>
show both the sensor image and the edge detection algorithm output.
<bold>B,G,N</bold>
show the extracted feature from the algorithm alone. Finally,
<bold>D,I,P</bold>
show two equal areas parallel to the detected edge. Within each of these areas, the number of papillae is counted so that the higher side of the edge can be identified (from the convergence or divergence of the papillae); a code sketch of this side test is given after the figure.</p>
</caption>
<graphic xlink:href="sensors-14-02561f7"></graphic>
</fig>
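<p>A minimal sketch of the side test described in the Figure 7 caption, assuming the papilla centres have already been extracted from the camera image and that the detected edge is supplied as a point and a unit direction vector (both hypothetical inputs here); the band width is an assumed value.</p>
<preformat><![CDATA[
import numpy as np

def higher_side(papillae, line_pt, line_dir, band=20.0):
    """Count papillae in two equal bands parallel to the detected edge and
    return which band is denser (+1 or -1, sign convention is arbitrary).
    `papillae` is an (N, 2) array of papilla centres in image coordinates."""
    normal = np.array([-line_dir[1], line_dir[0]])      # unit normal to the edge
    d = (papillae - np.asarray(line_pt)) @ normal       # signed distance to the edge
    side_a = np.count_nonzero((d > 0) & (d <= band))    # band on one side
    side_b = np.count_nonzero((d < 0) & (d >= -band))   # equal band on the other side
    # The imbalance between the bands (convergence vs. divergence of papillae)
    # indicates which side of the edge is raised; mapping the sign to a physical
    # side is left to the caller.
    return 1 if side_a > side_b else -1
]]></preformat>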
<fig id="f8-sensors-14-02561" position="float">
<label>Figure 8.</label>
<caption>
<p>The left image shows the sensorised glove with the Vicon markers; the glove fabric over the index finger was removed to allow full skin contact with the surface. The right image highlights the tracking points and the frame of reference for the fingertip location: 1, 2 and 3 are the real, tracked markers and ‘S’ is the virtual marker whose position is recorded.</p>
</caption>
<graphic xlink:href="sensors-14-02561f8"></graphic>
</fig>
<fig id="f9-sensors-14-02561" position="float">
<label>Figure 9.</label>
<caption>
<p>Visual comparison of the artificial sensor path (
<bold>a</bold>
) in white, reconstructed shape in blue and the target shapes (
<bold>b</bold>
).</p>
</caption>
<graphic xlink:href="sensors-14-02561f9"></graphic>
</fig>
<fig id="f10-sensors-14-02561" position="float">
<label>Figure 10.</label>
<caption>
<p>Three reconstructed rectangles using the sensor paths.</p>
</caption>
<graphic xlink:href="sensors-14-02561f10"></graphic>
</fig>
<fig id="f11-sensors-14-02561" position="float">
<label>Figure 11.</label>
<caption>
<p>Comparison of the shape reconstructions between two humans and the sensor system for the rectangle (
<bold>A</bold>
) and the hexagon (
<bold>B</bold>
). The thinner black lines are the sensor trajectories.</p>
</caption>
<graphic xlink:href="sensors-14-02561f11"></graphic>
</fig>
<fig id="f12-sensors-14-02561" position="float">
<label>Figure 12.</label>
<caption>
<p>Comparison of the mean estimated areas for the human subjects, the robotic platform and the real dimensions.</p>
</caption>
<graphic xlink:href="sensors-14-02561f12"></graphic>
</fig>
<table-wrap id="t1-sensors-14-02561" position="float">
<label>Table 1.</label>
<caption>
<p>Side lengths of the shapes reconstructed by the sensor system compared with the real ones. Dimensions in cm; real dimensions given in brackets.</p>
</caption>
<table frame="box" rules="all">
<thead>
<tr>
<th align="center" valign="middle" rowspan="1" colspan="1">Sides #</th>
<th align="center" valign="middle" rowspan="1" colspan="1">Rectangle (real)</th>
<th align="center" valign="middle" rowspan="1" colspan="1">Hexagon (real)</th>
<th align="center" valign="middle" rowspan="1" colspan="1">Circle (real)</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">1</td>
<td align="center" valign="middle" rowspan="1" colspan="1">10.4 (10.8)</td>
<td align="center" valign="middle" rowspan="1" colspan="1">4.8 (5.4)</td>
<td align="center" valign="middle" rowspan="1" colspan="1">-</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">2</td>
<td align="center" valign="middle" rowspan="1" colspan="1">7.0 (7.5)</td>
<td align="center" valign="middle" rowspan="1" colspan="1">5.0 (5.3)</td>
<td align="center" valign="middle" rowspan="1" colspan="1">-</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">3</td>
<td align="center" valign="middle" rowspan="1" colspan="1">10.4 (10.7)</td>
<td align="center" valign="middle" rowspan="1" colspan="1">4.9 (5.3)</td>
<td align="center" valign="middle" rowspan="1" colspan="1">-</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">4</td>
<td align="center" valign="middle" rowspan="1" colspan="1">6.6 (7)</td>
<td align="center" valign="middle" rowspan="1" colspan="1">5.1 (5.4)</td>
<td align="center" valign="middle" rowspan="1" colspan="1">-</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">5</td>
<td align="center" valign="middle" rowspan="1" colspan="1">-</td>
<td align="center" valign="middle" rowspan="1" colspan="1">4.8 (5.3)</td>
<td align="center" valign="middle" rowspan="1" colspan="1">-</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">6</td>
<td align="center" valign="middle" rowspan="1" colspan="1">-</td>
<td align="center" valign="middle" rowspan="1" colspan="1">4.6 (5.3)</td>
<td align="center" valign="middle" rowspan="1" colspan="1">-</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Diam.</td>
<td align="center" valign="middle" rowspan="1" colspan="1">-</td>
<td align="center" valign="middle" rowspan="1" colspan="1">-</td>
<td align="center" valign="middle" rowspan="1" colspan="1">6.1 (6)</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap id="t2-sensors-14-02561" position="float">
<label>Table 2.</label>
<caption>
<p>Mean areas and the ratio of the areas enclosed by the human subjects
<inline-formula>
<mml:math id="mm4">
<mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mi>H</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>
and the robotic platform
<inline-formula>
<mml:math id="mm5">
<mml:mrow>
<mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mi>P</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>
.</p>
</caption>
<table frame="box" rules="all">
<thead>
<tr>
<th align="center" valign="middle" rowspan="1" colspan="1">-</th>
<th align="center" valign="middle" rowspan="1" colspan="1">
<mml:math id="mm6">
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mi>H</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
</mml:mrow>
</mml:math>
[
<italic>cm</italic>
<sup>2</sup>
]</th>
<th align="center" valign="middle" rowspan="1" colspan="1">
<mml:math id="mm7">
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mi>P</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
</mml:mrow>
</mml:math>
[
<italic>cm</italic>
<sup>2</sup>
]</th>
<th align="center" valign="middle" rowspan="1" colspan="1">
<mml:math id="mm8">
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mi>H</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
</mml:mrow>
<mml:mo stretchy="false">/</mml:mo>
<mml:mrow>
<mml:mrow>
<mml:mover accent="true">
<mml:mrow>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mi>P</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>¯</mml:mo>
</mml:mover>
</mml:mrow>
</mml:mrow>
</mml:math>
</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Rectangle</td>
<td align="center" valign="middle" rowspan="1" colspan="1">94.540</td>
<td align="center" valign="middle" rowspan="1" colspan="1">83.570</td>
<td align="center" valign="middle" rowspan="1" colspan="1">1.13</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Hexagon</td>
<td align="center" valign="middle" rowspan="1" colspan="1">80.473</td>
<td align="center" valign="middle" rowspan="1" colspan="1">71.333</td>
<td align="center" valign="middle" rowspan="1" colspan="1">1.13</td>
</tr>
<tr>
<td align="center" valign="middle" rowspan="1" colspan="1">Circle</td>
<td align="center" valign="middle" rowspan="1" colspan="1">29.250</td>
<td align="center" valign="middle" rowspan="1" colspan="1">27</td>
<td align="center" valign="middle" rowspan="1" colspan="1">1.08</td>
</tr>
</tbody>
</table>
</table-wrap>
</floats-group>
</pmc>
<affiliations>
<list></list>
<tree>
<noCountry>
<name sortKey="Assaf, Tareq" sort="Assaf, Tareq" uniqKey="Assaf T" first="Tareq" last="Assaf">Tareq Assaf</name>
<name sortKey="Melhuish, Chris" sort="Melhuish, Chris" uniqKey="Melhuish C" first="Chris" last="Melhuish">Chris Melhuish</name>
<name sortKey="Pipe, Tony" sort="Pipe, Tony" uniqKey="Pipe T" first="Tony" last="Pipe">Tony Pipe</name>
<name sortKey="Roke, Calum" sort="Roke, Calum" uniqKey="Roke C" first="Calum" last="Roke">Calum Roke</name>
<name sortKey="Rossiter, Jonathan" sort="Rossiter, Jonathan" uniqKey="Rossiter J" first="Jonathan" last="Rossiter">Jonathan Rossiter</name>
</noCountry>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Ncbi/Merge
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 002C97 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd -nk 002C97 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Ncbi
   |étape=   Merge
   |type=    RBID
   |clé=     PMC:3958268
   |texte=   Seeing by Touch: Evaluation of a Soft Biologically-Inspired Artificial Fingertip in Real-Time Active Touch
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/RBID.i   -Sk "pubmed:24514881" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 
