Exploration server on haptic devices

Warning: this site is under development!
Warning: this site is generated by computational means from raw corpora.
The information has therefore not been validated.

Object recognition and localization: the role of tactile sensors.

Internal identifier: 000740 (PubMed/Corpus); previous: 000739; next: 000741

Object recognition and localization: the role of tactile sensors.

Authors: Achint Aggarwal; Frank Kirchner

Source: PubMed

RBID: pubmed:24553087

Abstract

Tactile sensors, because of their intrinsic insensitivity to lighting conditions and water turbidity, provide promising opportunities for augmenting the capabilities of vision sensors in applications involving object recognition and localization. This paper presents two approaches for haptic object recognition and localization for ground and underwater environments. The first approach called Batch Ransac and Iterative Closest Point augmented Particle Filter (BRICPPF) is based on an innovative combination of particle filters, Iterative-Closest-Point algorithm, and a feature-based Random Sampling and Consensus (RANSAC) algorithm for database matching. It can handle a large database of 3D-objects of complex shapes and performs a complete six-degree-of-freedom localization of static objects. The algorithms are validated by experimentation in ground and underwater environments using real hardware. To our knowledge this is the first instance of haptic object recognition and localization in underwater environments. The second approach is biologically inspired, and provides a close integration between exploration and recognition. An edge following exploration strategy is developed that receives feedback from the current state of recognition. A recognition by parts approach is developed which uses the BRICPPF for object sub-part recognition. Object exploration is either directed to explore a part until it is successfully recognized, or is directed towards new parts to endorse the current recognition belief. This approach is validated by simulation experiments.

DOI: 10.3390/s140203227
PubMed: 24553087
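
The abstract describes BRICPPF as a particle filter over candidate 6-DoF object poses, with tactile contacts scored against an object model in an ICP-like fashion and a feature-based RANSAC step for matching against the object database. As an illustration only, the minimal Python sketch below shows the core particle-weighting idea; every class, function and parameter name is hypothetical and does not come from the authors' code, and the RANSAC database-matching step is omitted.

# Minimal sketch (illustration only): a tactile particle filter for 6-DoF object
# localization in the spirit of BRICPPF. Names and parameters are hypothetical.
import numpy as np
from scipy.spatial import cKDTree
from scipy.spatial.transform import Rotation

class TactilePoseFilter:
    def __init__(self, model_points, n_particles=500, workspace=0.5):
        # model_points: (M, 3) surface points of one candidate object from the database.
        self.model_tree = cKDTree(model_points)
        # Each particle is a candidate object pose: a translation and a rotation.
        self.t = np.random.uniform(-workspace, workspace, size=(n_particles, 3))
        self.R = Rotation.random(n_particles)
        self.w = np.full(n_particles, 1.0 / n_particles)

    def update(self, contacts, sigma=0.01):
        # contacts: (K, 3) tactile contact points measured in the world frame.
        for i in range(len(self.w)):
            # Express the contacts in the model frame hypothesised by particle i.
            local = self.R[i].inv().apply(contacts - self.t[i])
            # ICP-like score: distance of every contact to the nearest model point.
            d, _ = self.model_tree.query(local)
            self.w[i] *= np.exp(-np.dot(d, d) / (2.0 * sigma ** 2))
        self.w /= self.w.sum()

    def resample(self):
        # Standard importance resampling of the particle set.
        idx = np.random.choice(len(self.w), size=len(self.w), p=self.w)
        self.t, self.R = self.t[idx], self.R[idx]
        self.w.fill(1.0 / len(self.w))

    def estimate(self):
        # Crude point estimate: the highest-weight particle.
        best = int(np.argmax(self.w))
        return self.t[best], self.R[best]

# Example use: one model, one batch of contacts, then a pose estimate.
model = np.random.rand(2000, 3)          # placeholder point cloud
pf = TactilePoseFilter(model)
pf.update(np.random.rand(5, 3))          # placeholder tactile contacts
position, orientation = pf.estimate()
pf.resample()

In the approach described by the abstract, such filtering is additionally combined with database matching and, in the second approach, with an edge-following exploration strategy and recognition by parts; none of that is reflected in the sketch above.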

Links to Exploration step

pubmed:24553087

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Object recognition and localization: the role of tactile sensors.</title>
<author>
<name sortKey="Aggarwal, Achint" sort="Aggarwal, Achint" uniqKey="Aggarwal A" first="Achint" last="Aggarwal">Achint Aggarwal</name>
<affiliation>
<nlm:affiliation>DFKI GmbH, Robotics Innovation Center (RIC), Robert-Hooke-Str. 1, Bremen D-28359, Germany. achint.aggarwal@dfki.de.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Kirchner, Frank" sort="Kirchner, Frank" uniqKey="Kirchner F" first="Frank" last="Kirchner">Frank Kirchner</name>
<affiliation>
<nlm:affiliation>DFKI GmbH, Robotics Innovation Center (RIC), Robert-Hooke-Str. 1, Bremen D-28359, Germany. frank.kirchner@dfki.de.</nlm:affiliation>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="2014">2014</date>
<idno type="doi">10.3390/s140203227</idno>
<idno type="RBID">pubmed:24553087</idno>
<idno type="pmid">24553087</idno>
<idno type="wicri:Area/PubMed/Corpus">000740</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Object recognition and localization: the role of tactile sensors.</title>
<author>
<name sortKey="Aggarwal, Achint" sort="Aggarwal, Achint" uniqKey="Aggarwal A" first="Achint" last="Aggarwal">Achint Aggarwal</name>
<affiliation>
<nlm:affiliation>DFKI GmbH, Robotics Innovation Center (RIC), Robert-Hooke-Str. 1, Bremen D-28359, Germany. achint.aggarwal@dfki.de.</nlm:affiliation>
</affiliation>
</author>
<author>
<name sortKey="Kirchner, Frank" sort="Kirchner, Frank" uniqKey="Kirchner F" first="Frank" last="Kirchner">Frank Kirchner</name>
<affiliation>
<nlm:affiliation>DFKI GmbH, Robotics Innovation Center (RIC), Robert-Hooke-Str. 1, Bremen D-28359, Germany. frank.kirchner@dfki.de.</nlm:affiliation>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Sensors (Basel, Switzerland)</title>
<idno type="eISSN">1424-8220</idno>
<imprint>
<date when="2014" type="published">2014</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">Tactile sensors, because of their intrinsic insensitivity to lighting conditions and water turbidity, provide promising opportunities for augmenting the capabilities of vision sensors in applications involving object recognition and localization. This paper presents two approaches for haptic object recognition and localization for ground and underwater environments. The first approach called Batch Ransac and Iterative Closest Point augmented Particle Filter (BRICPPF) is based on an innovative combination of particle filters, Iterative-Closest-Point algorithm, and a feature-based Random Sampling and Consensus (RANSAC) algorithm for database matching. It can handle a large database of 3D-objects of complex shapes and performs a complete six-degree-of-freedom localization of static objects. The algorithms are validated by experimentation in ground and underwater environments using real hardware. To our knowledge this is the first instance of haptic object recognition and localization in underwater environments. The second approach is biologically inspired, and provides a close integration between exploration and recognition. An edge following exploration strategy is developed that receives feedback from the current state of recognition. A recognition by parts approach is developed which uses the BRICPPF for object sub-part recognition. Object exploration is either directed to explore a part until it is successfully recognized, or is directed towards new parts to endorse the current recognition belief. This approach is validated by simulation experiments.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Owner="NLM" Status="PubMed-not-MEDLINE">
<PMID Version="1">24553087</PMID>
<DateCreated>
<Year>2014</Year>
<Month>02</Month>
<Day>20</Day>
</DateCreated>
<DateCompleted>
<Year>2014</Year>
<Month>07</Month>
<Day>01</Day>
</DateCompleted>
<Article PubModel="Electronic">
<Journal>
<ISSN IssnType="Electronic">1424-8220</ISSN>
<JournalIssue CitedMedium="Internet">
<Volume>14</Volume>
<Issue>2</Issue>
<PubDate>
<Year>2014</Year>
</PubDate>
</JournalIssue>
<Title>Sensors (Basel, Switzerland)</Title>
<ISOAbbreviation>Sensors (Basel)</ISOAbbreviation>
</Journal>
<ArticleTitle>Object recognition and localization: the role of tactile sensors.</ArticleTitle>
<Pagination>
<MedlinePgn>3227-66</MedlinePgn>
</Pagination>
<ELocationID EIdType="doi" ValidYN="Y">10.3390/s140203227</ELocationID>
<Abstract>
<AbstractText>Tactile sensors, because of their intrinsic insensitivity to lighting conditions and water turbidity, provide promising opportunities for augmenting the capabilities of vision sensors in applications involving object recognition and localization. This paper presents two approaches for haptic object recognition and localization for ground and underwater environments. The first approach called Batch Ransac and Iterative Closest Point augmented Particle Filter (BRICPPF) is based on an innovative combination of particle filters, Iterative-Closest-Point algorithm, and a feature-based Random Sampling and Consensus (RANSAC) algorithm for database matching. It can handle a large database of 3D-objects of complex shapes and performs a complete six-degree-of-freedom localization of static objects. The algorithms are validated by experimentation in ground and underwater environments using real hardware. To our knowledge this is the first instance of haptic object recognition and localization in underwater environments. The second approach is biologically inspired, and provides a close integration between exploration and recognition. An edge following exploration strategy is developed that receives feedback from the current state of recognition. A recognition by parts approach is developed which uses the BRICPPF for object sub-part recognition. Object exploration is either directed to explore a part until it is successfully recognized, or is directed towards new parts to endorse the current recognition belief. This approach is validated by simulation experiments.</AbstractText>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Aggarwal</LastName>
<ForeName>Achint</ForeName>
<Initials>A</Initials>
<AffiliationInfo>
<Affiliation>DFKI GmbH, Robotics Innovation Center (RIC), Robert-Hooke-Str. 1, Bremen D-28359, Germany. achint.aggarwal@dfki.de.</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Kirchner</LastName>
<ForeName>Frank</ForeName>
<Initials>F</Initials>
<AffiliationInfo>
<Affiliation>DFKI GmbH, Robotics Innovation Center (RIC), Robert-Hooke-Str. 1, Bremen D-28359, Germany. frank.kirchner@dfki.de.</Affiliation>
</AffiliationInfo>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D016428">Journal Article</PublicationType>
<PublicationType UI="D013485">Research Support, Non-U.S. Gov't</PublicationType>
</PublicationTypeList>
<ArticleDate DateType="Electronic">
<Year>2014</Year>
<Month>02</Month>
<Day>18</Day>
</ArticleDate>
</Article>
<MedlineJournalInfo>
<Country>Switzerland</Country>
<MedlineTA>Sensors (Basel)</MedlineTA>
<NlmUniqueID>101204366</NlmUniqueID>
<ISSNLinking>1424-8220</ISSNLinking>
</MedlineJournalInfo>
<CommentsCorrectionsList>
<CommentsCorrections RefType="Cites">
<RefSource>Psychol Rev. 1987 Apr;94(2):115-47</RefSource>
<PMID Version="1">3575582</PMID>
</CommentsCorrections>
<CommentsCorrections RefType="Cites">
<RefSource>Acta Psychol (Amst). 1993 Oct;84(1):29-40</RefSource>
<PMID Version="1">8237454</PMID>
</CommentsCorrections>
<CommentsCorrections RefType="Cites">
<RefSource>Cogn Psychol. 1990 Oct;22(4):421-59</RefSource>
<PMID Version="1">2253454</PMID>
</CommentsCorrections>
<CommentsCorrections RefType="Cites">
<RefSource>Cogn Psychol. 1987 Jul;19(3):342-68</RefSource>
<PMID Version="1">3608405</PMID>
</CommentsCorrections>
</CommentsCorrectionsList>
<OtherID Source="NLM">PMC3958302</OtherID>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="received">
<Year>2013</Year>
<Month>12</Month>
<Day>23</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="revised">
<Year>2014</Year>
<Month>1</Month>
<Day>31</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="accepted">
<Year>2014</Year>
<Month>2</Month>
<Day>08</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez">
<Year>2014</Year>
<Month>2</Month>
<Day>21</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed">
<Year>2014</Year>
<Month>2</Month>
<Day>21</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2014</Year>
<Month>2</Month>
<Day>21</Day>
<Hour>6</Hour>
<Minute>1</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>epublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="pii">s140203227</ArticleId>
<ArticleId IdType="doi">10.3390/s140203227</ArticleId>
<ArticleId IdType="pubmed">24553087</ArticleId>
<ArticleId IdType="pmc">PMC3958302</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
</record>

To manipulate this document under Unix (Dilib)

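# Point EXPLOR_STEP at the PubMed/Corpus data of the HapticV1 exploration area,
# then extract record 000740 from the bibliographic base, indent its XML and page it: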
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PubMed/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000740 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd -nk 000740 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PubMed
   |étape=   Corpus
   |type=    RBID
   |clé=     pubmed:24553087
   |texte=   Object recognition and localization: the role of tactile sensors.
}}

To generate wiki pages

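# Look up the record by its RBID in the corpus index, extract it from the
# bibliographic base and convert it into wiki pages for the HapticV1 area: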
HfdIndexSelect -h $EXPLOR_AREA/Data/PubMed/Corpus/RBID.i   -Sk "pubmed:24553087" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/PubMed/Corpus/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024