Exploration server on haptic devices

Warning: this site is under development!
Warning: this site was generated automatically from raw corpora.
The information has therefore not been validated.

Asynchronous Event Based Vision: Algorithms and Applications to Microrobotics

Internal identifier: 000080 (Hal/Curation); previous: 000079; next: 000081


Author: Zhenjiang Ni [France]

Source:

RBID : Hal:tel-00916995

French descriptors

English descriptors

Abstract

The dynamic vision sensor (DVS) is a silicon retina prototype that records only scene contrast changes, in the form of a stream of events, thus naturally excluding the redundant absolute gray levels of the background. In this context, numerous high-speed asynchronous event-based vision algorithms have been developed and their advantages over frame-based processing methods have been compared. In haptic-feedback teleoperated micromanipulation, vision is a sound candidate for force estimation, provided the position-force model is well established. The sampling frequency, however, must reach 1 kHz to allow a transparent and reliable tactile sensation and to ensure system stability. Event-based vision has therefore been applied to provide the needed force feedback in two micromanipulation applications: haptic-feedback teleoperation of optical tweezers, and haptic virtual assistance in microgripper-based micromanipulation. The results show that the haptic frequency requirement of 1 kHz has been successfully met. For the first application, high-speed particle position detection algorithms have been developed and validated, and a three-dimensional haptic-feedback system capable of manipulating multiple-trap optical tweezers has been realized. In the second application, a novel event-based shape registration algorithm capable of tracking objects of arbitrary shape has been developed to track a piezoelectric microgripper. The stability of the system has been significantly enhanced, assisting operators in performing complex micromanipulation tasks.
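To illustrate the kind of processing the abstract describes: a DVS event carries pixel coordinates, a timestamp, and a contrast polarity, and high-rate particle position detection can be approximated by a running centroid over a short sliding window of events. The sketch below is illustrative only — the `Event` type, the `track_centroid` function, and the 1 ms window are assumptions for exposition, not the algorithms developed in the thesis:

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Event:
    x: int        # pixel column where a contrast change occurred
    y: int        # pixel row
    t: float      # timestamp in seconds (DVS events are asynchronous)
    polarity: int # +1 for contrast increase, -1 for decrease

def track_centroid(events, window=0.001):
    """Estimate a particle position per incoming event as the centroid
    of all events within the last `window` seconds (here 1 ms, matching
    the 1 kHz haptic update requirement)."""
    buf = deque()
    positions = []
    for ev in events:
        buf.append(ev)
        # Drop events that fell out of the sliding time window.
        while buf and ev.t - buf[0].t > window:
            buf.popleft()
        cx = sum(e.x for e in buf) / len(buf)
        cy = sum(e.y for e in buf) / len(buf)
        positions.append((ev.t, cx, cy))
    return positions
```

Because a position estimate is refreshed on every event rather than on every frame, the update rate is bounded by the event rate, not a camera frame rate — which is what makes kilohertz force feedback feasible in this approach.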


Links toward previous steps (curation, corpus...)


Links to Exploration step

Hal:tel-00916995

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Asynchronous Event Based Vision: Algorithms and Applications to Microrobotics</title>
<title xml:lang="fr">Vision Asynchrone Événementielle: Algorithmes et Applications à la Microrobotique</title>
<author>
<name sortKey="Ni, Zhenjiang" sort="Ni, Zhenjiang" uniqKey="Ni Z" first="Zhenjiang" last="Ni">Zhenjiang Ni</name>
<affiliation wicri:level="1">
<hal:affiliation type="laboratory" xml:id="struct-96164" status="VALID">
<idno type="RNSR">200918463J</idno>
<orgName>Institut des Systèmes Intelligents et de Robotique</orgName>
<orgName type="acronym">ISIR</orgName>
<desc>
<address>
<addrLine>Université Pierre et Marie Curie - Paris VI Boite courrier 173 4 Place JUSSIEU 75252 Paris cedex 05</addrLine>
<country key="FR"></country>
</address>
<ref type="url">http://www.isir.upmc.fr</ref>
</desc>
<listRelation>
<relation active="#struct-93591" type="direct"></relation>
<relation name="UMR7222" active="#struct-441569" type="direct"></relation>
</listRelation>
<tutelles>
<tutelle active="#struct-93591" type="direct">
<org type="institution" xml:id="struct-93591" status="VALID">
<orgName>Université Pierre et Marie Curie - Paris 6</orgName>
<orgName type="acronym">UPMC</orgName>
<desc>
<address>
<addrLine>4 place Jussieu - 75005 Paris</addrLine>
<country key="FR"></country>
</address>
<ref type="url">http://www.upmc.fr/</ref>
</desc>
</org>
</tutelle>
<tutelle name="UMR7222" active="#struct-441569" type="direct">
<org type="institution" xml:id="struct-441569" status="VALID">
<idno type="ISNI">0000000122597504</idno>
<idno type="IdRef">02636817X</idno>
<orgName>Centre National de la Recherche Scientifique</orgName>
<orgName type="acronym">CNRS</orgName>
<date type="start">1939-10-19</date>
<desc>
<address>
<country key="FR"></country>
</address>
<ref type="url">http://www.cnrs.fr/</ref>
</desc>
</org>
</tutelle>
</tutelles>
</hal:affiliation>
<country>France</country>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">HAL</idno>
<idno type="RBID">Hal:tel-00916995</idno>
<idno type="halId">tel-00916995</idno>
<idno type="halUri">https://tel.archives-ouvertes.fr/tel-00916995</idno>
<idno type="url">https://tel.archives-ouvertes.fr/tel-00916995</idno>
<date when="2013-11-26">2013-11-26</date>
<idno type="wicri:Area/Hal/Corpus">000080</idno>
<idno type="wicri:Area/Hal/Curation">000080</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Asynchronous Event Based Vision: Algorithms and Applications to Microrobotics</title>
<title xml:lang="fr">Vision Asynchrone Événementielle: Algorithmes et Applications à la Microrobotique</title>
<author>
<name sortKey="Ni, Zhenjiang" sort="Ni, Zhenjiang" uniqKey="Ni Z" first="Zhenjiang" last="Ni">Zhenjiang Ni</name>
<affiliation wicri:level="1">
<hal:affiliation type="laboratory" xml:id="struct-96164" status="VALID">
<idno type="RNSR">200918463J</idno>
<orgName>Institut des Systèmes Intelligents et de Robotique</orgName>
<orgName type="acronym">ISIR</orgName>
<desc>
<address>
<addrLine>Université Pierre et Marie Curie - Paris VI Boite courrier 173 4 Place JUSSIEU 75252 Paris cedex 05</addrLine>
<country key="FR"></country>
</address>
<ref type="url">http://www.isir.upmc.fr</ref>
</desc>
<listRelation>
<relation active="#struct-93591" type="direct"></relation>
<relation name="UMR7222" active="#struct-441569" type="direct"></relation>
</listRelation>
<tutelles>
<tutelle active="#struct-93591" type="direct">
<org type="institution" xml:id="struct-93591" status="VALID">
<orgName>Université Pierre et Marie Curie - Paris 6</orgName>
<orgName type="acronym">UPMC</orgName>
<desc>
<address>
<addrLine>4 place Jussieu - 75005 Paris</addrLine>
<country key="FR"></country>
</address>
<ref type="url">http://www.upmc.fr/</ref>
</desc>
</org>
</tutelle>
<tutelle name="UMR7222" active="#struct-441569" type="direct">
<org type="institution" xml:id="struct-441569" status="VALID">
<idno type="ISNI">0000000122597504</idno>
<idno type="IdRef">02636817X</idno>
<orgName>Centre National de la Recherche Scientifique</orgName>
<orgName type="acronym">CNRS</orgName>
<date type="start">1939-10-19</date>
<desc>
<address>
<country key="FR"></country>
</address>
<ref type="url">http://www.cnrs.fr/</ref>
</desc>
</org>
</tutelle>
</tutelles>
</hal:affiliation>
<country>France</country>
</affiliation>
</author>
</analytic>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="mix" xml:lang="en">
<term>Asynchronous Event Based Computation</term>
<term>Dynamic Vision Sensor</term>
<term>Haptic Feedback</term>
<term>Microrobotics</term>
<term>Optical Tweezers</term>
</keywords>
<keywords scheme="mix" xml:lang="fr">
<term>Calcul Asynchrone Basé sur Événements</term>
<term>Capteur Vision Dynamique</term>
<term>Micromanipulation</term>
<term>Microrobotique</term>
<term>Pince Optique</term>
<term>Retour Haptique</term>
<term>Téléopération</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">The dynamic vision sensor (DVS) is a silicon retina prototype that records only scene contrast changes, in the form of a stream of events, thus naturally excluding the redundant absolute gray levels of the background. In this context, numerous high-speed asynchronous event-based vision algorithms have been developed and their advantages over frame-based processing methods have been compared. In haptic-feedback teleoperated micromanipulation, vision is a sound candidate for force estimation, provided the position-force model is well established. The sampling frequency, however, must reach 1 kHz to allow a transparent and reliable tactile sensation and to ensure system stability. Event-based vision has therefore been applied to provide the needed force feedback in two micromanipulation applications: haptic-feedback teleoperation of optical tweezers, and haptic virtual assistance in microgripper-based micromanipulation. The results show that the haptic frequency requirement of 1 kHz has been successfully met. For the first application, high-speed particle position detection algorithms have been developed and validated, and a three-dimensional haptic-feedback system capable of manipulating multiple-trap optical tweezers has been realized. In the second application, a novel event-based shape registration algorithm capable of tracking objects of arbitrary shape has been developed to track a piezoelectric microgripper. The stability of the system has been significantly enhanced, assisting operators in performing complex micromanipulation tasks.</div>
</front>
</TEI>
<hal api="V3">
<titleStmt>
<title xml:lang="en">Asynchronous Event Based Vision: Algorithms and Applications to Microrobotics</title>
<title xml:lang="fr">Vision Asynchrone Événementielle: Algorithmes et Applications à la Microrobotique</title>
<author role="aut">
<persName>
<forename type="first">Zhenjiang</forename>
<surname>Ni</surname>
</persName>
<email>ni@isir.upmc.fr</email>
<idno type="halauthor">955429</idno>
<affiliation ref="#struct-96164"></affiliation>
</author>
<editor role="depositor">
<persName>
<forename>Zhenjiang</forename>
<surname>Ni</surname>
</persName>
<email>ni@isir.upmc.fr</email>
</editor>
</titleStmt>
<editionStmt>
<edition n="v1" type="current">
<date type="whenSubmitted">2014-04-28 10:11:43</date>
<date type="whenModified">2014-04-28 11:39:40</date>
<date type="whenReleased">2014-04-28 11:39:40</date>
<date type="whenProduced">2013-11-26</date>
<date type="whenEndEmbargoed">2014-04-28</date>
<ref type="file" target="https://tel.archives-ouvertes.fr/tel-00916995/document">
<date notBefore="2014-04-28"></date>
</ref>
<ref type="file" n="1" target="https://tel.archives-ouvertes.fr/tel-00916995/file/Thesis.pdf">
<date notBefore="2014-04-28"></date>
</ref>
</edition>
<respStmt>
<resp>contributor</resp>
<name key="189388">
<persName>
<forename>Zhenjiang</forename>
<surname>Ni</surname>
</persName>
<email>ni@isir.upmc.fr</email>
</name>
</respStmt>
</editionStmt>
<publicationStmt>
<distributor>CCSD</distributor>
<idno type="halId">tel-00916995</idno>
<idno type="halUri">https://tel.archives-ouvertes.fr/tel-00916995</idno>
<idno type="halBibtex">ni:tel-00916995</idno>
<idno type="halRefHtml">Robotics [cs.RO]. Université Pierre et Marie Curie - Paris VI, 2013. English</idno>
<idno type="halRef">Robotics [cs.RO]. Université Pierre et Marie Curie - Paris VI, 2013. English</idno>
</publicationStmt>
<seriesStmt>
<idno type="stamp" n="CNRS">CNRS - Centre national de la recherche scientifique</idno>
<idno type="stamp" n="THESES-UPMC">Thèses de l'Université Pierre et Marie Curie</idno>
<idno type="stamp" n="UPMC">Université Pierre et Marie Curie</idno>
<idno type="stamp" n="ISIR" p="UPMC">Institut des Systèmes Intelligents et de Robotique</idno>
</seriesStmt>
<notesStmt></notesStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Asynchronous Event Based Vision: Algorithms and Applications to Microrobotics</title>
<title xml:lang="fr">Vision Asynchrone Événementielle: Algorithmes et Applications à la Microrobotique</title>
<author role="aut">
<persName>
<forename type="first">Zhenjiang</forename>
<surname>Ni</surname>
</persName>
<email>ni@isir.upmc.fr</email>
<idno type="halAuthorId">955429</idno>
<affiliation ref="#struct-96164"></affiliation>
</author>
</analytic>
<monogr>
<imprint>
<date type="dateDefended">2013-11-26</date>
</imprint>
<authority type="institution">Université Pierre et Marie Curie - Paris VI</authority>
<authority type="supervisor">Stéphane Régnier</authority>
<authority type="jury">Yu Sun (rapporteur)</authority>
<authority type="jury">Sylvain Saïghi (rapporteur)</authority>
<authority type="jury">Nathalie Westbrook (examinatrice)</authority>
<authority type="jury">Michaël Gauthier (examinateur)</authority>
<authority type="jury">Stéphane Régnier (directeur de thèse)</authority>
<authority type="jury">Ryad Benosman (encadrant)</authority>
</monogr>
</biblStruct>
</sourceDesc>
<profileDesc>
<langUsage>
<language ident="en">English</language>
</langUsage>
<textClass>
<keywords scheme="author">
<term xml:lang="en">Dynamic Vision Sensor</term>
<term xml:lang="en">Asynchronous Event Based Computation</term>
<term xml:lang="en">Haptic Feedback</term>
<term xml:lang="en">Optical Tweezers</term>
<term xml:lang="en">Microrobotics</term>
<term xml:lang="fr">Capteur Vision Dynamique</term>
<term xml:lang="fr">Calcul Asynchrone Basé sur Événements</term>
<term xml:lang="fr">Retour Haptique</term>
<term xml:lang="fr">Téléopération</term>
<term xml:lang="fr">Pince Optique</term>
<term xml:lang="fr">Microrobotique</term>
<term xml:lang="fr">Micromanipulation</term>
</keywords>
<classCode scheme="halDomain" n="info.info-rb">Computer Science [cs]/Robotics [cs.RO]</classCode>
<classCode scheme="halTypology" n="THESE">Theses</classCode>
</textClass>
<abstract xml:lang="en">The dynamic vision sensor (DVS) is a silicon retina prototype that records only scene contrast changes, in the form of a stream of events, thus naturally excluding the redundant absolute gray levels of the background. In this context, numerous high-speed asynchronous event-based vision algorithms have been developed and their advantages over frame-based processing methods have been compared. In haptic-feedback teleoperated micromanipulation, vision is a sound candidate for force estimation, provided the position-force model is well established. The sampling frequency, however, must reach 1 kHz to allow a transparent and reliable tactile sensation and to ensure system stability. Event-based vision has therefore been applied to provide the needed force feedback in two micromanipulation applications: haptic-feedback teleoperation of optical tweezers, and haptic virtual assistance in microgripper-based micromanipulation. The results show that the haptic frequency requirement of 1 kHz has been successfully met. For the first application, high-speed particle position detection algorithms have been developed and validated, and a three-dimensional haptic-feedback system capable of manipulating multiple-trap optical tweezers has been realized. In the second application, a novel event-based shape registration algorithm capable of tracking objects of arbitrary shape has been developed to track a piezoelectric microgripper. The stability of the system has been significantly enhanced, assisting operators in performing complex micromanipulation tasks.</abstract>
<abstract xml:lang="fr">Le Dynamic Vision Sensor (DVS) est un prototype de la rétine silicium qui n'enregistre que des changements de contraste de la scène sous la forme de flux d'événements, excluant donc naturellement les niveaux de gris absolus redondants. Dans ce contexte, de nombreux algorithmes de vision asynchrones à grande vitesse basés sur l'événement ont été développés et leurs avantages par rapport aux méthodes de traitement traditionnel basé sur l'image ont été comparés. En retour haptique pour la micromanipulation, la vision est un candidat qualifié pour l'estimation de la force si le modèle position-force est bien établi. La fréquence d'échantillonnage doit toutefois atteindre 1 kHz pour permettre une sensation tactile transparente et fiable et assurer la stabilité du système. La vision basée sur l'événement a donc été appliquée pour fournir le retour d'effort nécessaire sur deux applications de micromanipulation : le retour haptique sur la pince optique ; l'assistance virtuelle haptique sur micro-outil mécanique. Les résultats montrent que l'exigence de fréquence haptique de 1 kHz a été réalisée avec succès. Pour la première application, les algorithmes de détection de la position d'une microsphère à haute vitesse ont été développés. Un système de retour haptique tridimensionnel capable de manipuler plusieurs pièges optiques a été réalisé. Dans la deuxième application, un nouvel algorithme d'enregistrement de forme basé sur l'événement capable de suivre un objet de forme arbitraire a été développé pour suivre une micropince piézoélectrique. La stabilité du système a été considérablement renforcée pour aider les opérateurs à effectuer des tâches de micromanipulation complexes.</abstract>
</profileDesc>
</hal>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Hal/Curation
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000080 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Hal/Curation/biblio.hfd -nk 000080 | SxmlIndent | more

To link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Hal
   |étape=   Curation
   |type=    RBID
   |clé=     Hal:tel-00916995
   |texte=   Asynchronous Event Based Vision: Algorithms and Applications to Microrobotics
}}

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024