Exploration server on haptic devices

Warning: this site is under development!
Warning: this site was generated automatically from raw corpora.
The information has therefore not been validated.

Investigation of motion guidance with scooter cobot and collaborative learning

Internal identifier: 000B39 (PascalFrancis/Corpus); previous: 000B38; next: 000B40

Authors: ENG SENG BOY; Etienne Burdet; CHEE LEONG TEO; James Edward Colgate

Source: IEEE transactions on robotics; ISSN 1552-3098; 2007; Vol. 23; No. 2; pp. 245-255

RBID: Pascal:07-0308181

French descriptors

English descriptors

Abstract

This paper investigates how collaborative robots (cobots) can assist a human by mechanically constraining motion to software-defined guide paths, and introduces simple and efficient tools to design ergonomic paths. Analysis of the movements of seven subjects with the Scooter cobot reveals significant differences between guided movements (GM) and free movements (FM). While FM requires learning for each novel task, movements in GM are satisfying from the first trial, require little effort, are faster, smoother, and with fewer back and forth corrections than in FM. Operators rely on path guidance to rotate the Scooter and direct it along curved trajectories. While these advantages demonstrate the strength of the cobot concept, they do not show how guide paths should be defined. We introduce tools to enable the cobot and its operator to collaboratively learn ergonomic guide paths and adapt to changes in the environment. By relying on the haptic sensing, vision, and planning capabilities of the human operator, we can avoid equipping the cobot with complex sensor processing. Experiments with human subjects demonstrate the efficiency and complementarity of these guide paths design tools.
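
As a rough illustration of the path-guidance idea described in this abstract, here is a minimal Python sketch (not the authors' implementation; it assumes a 2-D guide path sampled as a polyline, a point-mass model, and NumPy): the operator's force is projected onto the local path tangent, so the simulated tool can move only along the software-defined path, much as the cobot enforces mechanically.

import numpy as np

def guided_step(position, force, path, dt=0.01, mass=1.0):
    """One integration step of a point mass constrained to a guide path.
    position: 2-D position, assumed to lie on the path
    force:    2-D force applied by the operator
    path:     (N, 2) array of waypoints defining the guide path
    """
    # Locate the nearest path segment (naive nearest-waypoint search,
    # chosen here only for brevity).
    seg = int(np.argmin(np.linalg.norm(path[:-1] - position, axis=1)))
    tangent = path[seg + 1] - path[seg]
    tangent = tangent / np.linalg.norm(tangent)
    # The constraint absorbs the perpendicular force component, as the
    # cobot's steered wheels do; only the tangential component moves the tool.
    f_along = np.dot(force, tangent) * tangent
    velocity = (f_along / mass) * dt  # crude Euler integration
    return position + velocity * dt

# Example: a diagonal push on a horizontal segment yields purely
# horizontal motion, since the perpendicular component is absorbed.
path = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 1.0]])
print(guided_step(np.array([0.2, 0.0]), np.array([1.0, 1.0]), path))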

Record in standard format (ISO 2709)

See the documentation on the Inist Standard format.

pA  
A01 01  1    @0 1552-3098
A03   1    @0 IEEE trans. robot.
A05       @2 23
A06       @2 2
A08 01  1  ENG  @1 Investigation of motion guidance with scooter cobot and collaborative learning
A11 01  1    @1 ENG SENG BOY
A11 02  1    @1 BURDET (Etienne)
A11 03  1    @1 CHEE LEONG TEO
A11 04  1    @1 COLGATE (James Edward)
A14 01      @1 Victoria Junior College @2 Singapore 449035 @3 SGP @Z 1 aut.
A14 02      @1 Department of Bioengineering, Imperial College London @2 SW7 2AZ London @3 GBR @Z 2 aut.
A14 03      @1 Department of Mechanical Engineering, National University of Singapore @2 Singapore 119260 @3 SGP @Z 3 aut.
A14 04      @1 Department of Mechanical Engineering, Northwestern University @2 Evanston IL 60208-3111 @3 USA @Z 4 aut.
A20       @1 245-255
A21       @1 2007
A23 01      @0 ENG
A43 01      @1 INIST @2 21023A @5 354000149533850060
A44       @0 0000 @1 © 2007 INIST-CNRS. All rights reserved.
A45       @0 30 ref.
A47 01  1    @0 07-0308181
A60       @1 P
A61       @0 A
A64 01  1    @0 IEEE transactions on robotics
A66 01      @0 USA
C01 01    ENG  @0 This paper investigates how collaborative robots (cobots) can assist a human by mechanically constraining motion to software-defined guide paths, and introduces simple and efficient tools to design ergonomic paths. Analysis of the movements of seven subjects with the Scooter cobot reveals significant differences between guided movements (GM) and free movements (FM). While FM requires learning for each novel task, movements in GM are satisfying from the first trial, require little effort, are faster, smoother, and with fewer back and forth corrections than in FM. Operators rely on path guidance to rotate the Scooter and direct it along curved trajectories. While these advantages demonstrate the strength of the cobot concept, they do not show how guide paths should be defined. We introduce tools to enable the cobot and its operator to collaboratively learn ergonomic guide paths and adapt to changes in the environment. By relying on the haptic sensing, vision, and planning capabilities of the human operator, we can avoid equipping the cobot with complex sensor processing. Experiments with human subjects demonstrate the efficiency and complementarity of these guide paths design tools.
C02 01  X    @0 001D02D11
C03 01  X  FRE  @0 Guidage @5 06
C03 01  X  ENG  @0 Guidance @5 06
C03 01  X  SPA  @0 Guiado @5 06
C03 02  X  FRE  @0 Robotique @5 07
C03 02  X  ENG  @0 Robotics @5 07
C03 02  X  SPA  @0 Robótica @5 07
C03 03  X  FRE  @0 Planification @5 08
C03 03  X  ENG  @0 Planning @5 08
C03 03  X  SPA  @0 Planificación @5 08
C03 04  X  FRE  @0 Homme @5 09
C03 04  X  ENG  @0 Human @5 09
C03 04  X  SPA  @0 Hombre @5 09
C03 05  X  FRE  @0 Mouvement corporel @5 18
C03 05  X  ENG  @0 Body movement @5 18
C03 05  X  SPA  @0 Movimiento corporal @5 18
C03 06  X  FRE  @0 Outil coupe @5 19
C03 06  X  ENG  @0 Cutting tool @5 19
C03 06  X  SPA  @0 Herramienta corte @5 19
C03 07  X  FRE  @0 Trajectoire @5 20
C03 07  X  ENG  @0 Trajectory @5 20
C03 07  X  SPA  @0 Trayectoria @5 20
C03 08  X  FRE  @0 Ergonomie @5 21
C03 08  X  ENG  @0 Ergonomics @5 21
C03 08  X  SPA  @0 Ergonomía @5 21
C03 09  X  FRE  @0 Sensibilité tactile @5 22
C03 09  X  ENG  @0 Tactile sensitivity @5 22
C03 09  X  SPA  @0 Sensibilidad tactil @5 22
C03 10  X  FRE  @0 Opérateur humain @5 23
C03 10  X  ENG  @0 Human operator @5 23
C03 10  X  SPA  @0 Operador humano @5 23
C03 11  X  FRE  @0 Capteur mesure @5 24
C03 11  X  ENG  @0 Measurement sensor @5 24
C03 11  X  SPA  @0 Captador medida @5 24
N21       @1 197
N44 01      @1 OTO
N82       @1 OTO

Inist format (server)

NO : PASCAL 07-0308181 INIST
ET : Investigation of motion guidance with scooter cobot and collaborative learning
AU : ENG SENG BOY; BURDET (Etienne); CHEE LEONG TEO; COLGATE (James Edward)
AF : Victoria Junior College/Singapore 449035/Singapour (1 aut.); Department of Bioengineering, Imperial College London/SW7 2AZ London/Royaume-Uni (2 aut.); Department of Mechanical Engineering, National University of Singapore/Singapore 119260/Singapour (3 aut.); Department of Mechanical Engineering, Northwestern University/Evanston IL 60208-3111/Etats-Unis (4 aut.)
DT : Publication en série; Niveau analytique
SO : IEEE transactions on robotics ; ISSN 1552-3098; Etats-Unis; Da. 2007; Vol. 23; No. 2; Pp. 245-255; Bibl. 30 ref.
LA : Anglais
EA : This paper investigates how collaborative robots (cobots) can assist a human by mechanically constraining motion to software-defined guide paths, and introduces simple and efficient tools to design ergonomic paths. Analysis of the movements of seven subjects with the Scooter cobot reveals significant differences between guided movements (GM) and free movements (FM). While FM requires learning for each novel task, movements in GM are satisfying from the first trial, require little effort, are faster, smoother, and with fewer back and forth corrections than in FM. Operators rely on path guidance to rotate the Scooter and direct it along curved trajectories. While these advantages demonstrate the strength of the cobot concept, they do not show how guide paths should be defined. We introduce tools to enable the cobot and its operator to collaboratively learn ergonomic guide paths and adapt to changes in the environment. By relying on the haptic sensing, vision, and planning capabilities of the human operator, we can avoid equipping the cobot with complex sensor processing. Experiments with human subjects demonstrate the efficiency and complementarity of these guide paths design tools.
CC : 001D02D11
FD : Guidage; Robotique; Planification; Homme; Mouvement corporel; Outil coupe; Trajectoire; Ergonomie; Sensibilité tactile; Opérateur humain; Capteur mesure
ED : Guidance; Robotics; Planning; Human; Body movement; Cutting tool; Trajectory; Ergonomics; Tactile sensitivity; Human operator; Measurement sensor
SD : Guiado; Robótica; Planificación; Hombre; Movimiento corporal; Herramienta corte; Trayectoria; Ergonomía; Sensibilidad tactil; Operador humano; Captador medida
LO : INIST-21023A.354000149533850060
ID : 07-0308181

Links to the exploration step

Pascal:07-0308181

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">Investigation of motion guidance with scooter cobot and collaborative learning</title>
<author>
<name sortKey="Eng Seng Boy" sort="Eng Seng Boy" uniqKey="Eng Seng Boy" last="Eng Seng Boy">ENG SENG BOY</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Victoria Junior College</s1>
<s2>Singapore 449035</s2>
<s3>SGP</s3>
<sZ>1 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Burdet, Etienne" sort="Burdet, Etienne" uniqKey="Burdet E" first="Etienne" last="Burdet">Etienne Burdet</name>
<affiliation>
<inist:fA14 i1="02">
<s1>Department of Bioengineering, Imperial College London</s1>
<s2>SW7 2AZ London</s2>
<s3>GBR</s3>
<sZ>2 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Chee Leong Teo" sort="Chee Leong Teo" uniqKey="Chee Leong Teo" last="Chee Leong Teo">CHEE LEONG TEO</name>
<affiliation>
<inist:fA14 i1="03">
<s1>Department of Mechanical Engineering, National University of Singapore</s1>
<s2>Singapore 119260</s2>
<s3>SGP</s3>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Colgate, James Edward" sort="Colgate, James Edward" uniqKey="Colgate J" first="James Edward" last="Colgate">James Edward Colgate</name>
<affiliation>
<inist:fA14 i1="04">
<s1>Department of Mechanical Engineering, Northwestern University</s1>
<s2>Evanston IL 60208-3111</s2>
<s3>USA</s3>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">07-0308181</idno>
<date when="2007">2007</date>
<idno type="stanalyst">PASCAL 07-0308181 INIST</idno>
<idno type="RBID">Pascal:07-0308181</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000B39</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">Investigation of motion guidance with scooter cobot and collaborative learning</title>
<author>
<name sortKey="Eng Seng Boy" sort="Eng Seng Boy" uniqKey="Eng Seng Boy" last="Eng Seng Boy">ENG SENG BOY</name>
<affiliation>
<inist:fA14 i1="01">
<s1>Victoria Junior College</s1>
<s2>Singapore 449035</s2>
<s3>SGP</s3>
<sZ>1 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Burdet, Etienne" sort="Burdet, Etienne" uniqKey="Burdet E" first="Etienne" last="Burdet">Etienne Burdet</name>
<affiliation>
<inist:fA14 i1="02">
<s1>Department of Bioengineering, Imperial College London</s1>
<s2>SW7 2AZ London</s2>
<s3>GBR</s3>
<sZ>2 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Chee Leong Teo" sort="Chee Leong Teo" uniqKey="Chee Leong Teo" last="Chee Leong Teo">CHEE LEONG TEO</name>
<affiliation>
<inist:fA14 i1="03">
<s1>Department of Mechanical Engineering, National University of Singapore</s1>
<s2>Singapore 119260</s2>
<s3>SGP</s3>
<sZ>3 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
<author>
<name sortKey="Colgate, James Edward" sort="Colgate, James Edward" uniqKey="Colgate J" first="James Edward" last="Colgate">James Edward Colgate</name>
<affiliation>
<inist:fA14 i1="04">
<s1>Department of Mechanical Engineering, Northwestern University</s1>
<s2>Evanston IL 60208-3111</s2>
<s3>USA</s3>
<sZ>4 aut.</sZ>
</inist:fA14>
</affiliation>
</author>
</analytic>
<series>
<title level="j" type="main">IEEE transactions on robotics </title>
<title level="j" type="abbreviated">IEEE trans. robot. </title>
<idno type="ISSN">1552-3098</idno>
<imprint>
<date when="2007">2007</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt>
<title level="j" type="main">IEEE transactions on robotics </title>
<title level="j" type="abbreviated">IEEE trans. robot. </title>
<idno type="ISSN">1552-3098</idno>
</seriesStmt>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Body movement</term>
<term>Cutting tool</term>
<term>Ergonomics</term>
<term>Guidance</term>
<term>Human</term>
<term>Human operator</term>
<term>Measurement sensor</term>
<term>Planning</term>
<term>Robotics</term>
<term>Tactile sensitivity</term>
<term>Trajectory</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Guidage</term>
<term>Robotique</term>
<term>Planification</term>
<term>Homme</term>
<term>Mouvement corporel</term>
<term>Outil coupe</term>
<term>Trajectoire</term>
<term>Ergonomie</term>
<term>Sensibilité tactile</term>
<term>Opérateur humain</term>
<term>Capteur mesure</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">This paper investigates how collaborative robots (cobots) can assist a human by mechanically constraining motion to software-defined guide paths, and introduces simple and efficient tools to design ergonomic paths. Analysis of the movements of seven subjects with the Scooter cobot reveals significant differences between guided movements (GM) and free movements (FM). While FM requires learning for each novel task, movements in GM are satisfying from the first trial, require little effort, are faster, smoother, and with fewer back and forth corrections than in FM. Operators rely on path guidance to rotate the Scooter and direct it along curved trajectories. While these advantages demonstrate the strength of the cobot concept, they do not show how guide paths should be defined. We introduce tools to enable the cobot and its operator to collaboratively learn ergonomic guide paths and adapt to changes in the environment. By relying on the haptic sensing, vision, and planning capabilities of the human operator, we can avoid equipping the cobot with complex sensor processing. Experiments with human subjects demonstrate the efficiency and complementarity of these guide paths design tools.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA01 i1="01" i2="1">
<s0>1552-3098</s0>
</fA01>
<fA03 i2="1">
<s0>IEEE trans. robot. </s0>
</fA03>
<fA05>
<s2>23</s2>
</fA05>
<fA06>
<s2>2</s2>
</fA06>
<fA08 i1="01" i2="1" l="ENG">
<s1>Investigation of motion guidance with scooter cobot and collaborative learning</s1>
</fA08>
<fA11 i1="01" i2="1">
<s1>ENG SENG BOY</s1>
</fA11>
<fA11 i1="02" i2="1">
<s1>BURDET (Etienne)</s1>
</fA11>
<fA11 i1="03" i2="1">
<s1>CHEE LEONG TEO</s1>
</fA11>
<fA11 i1="04" i2="1">
<s1>COLGATE (James Edward)</s1>
</fA11>
<fA14 i1="01">
<s1>Victoria Junior College</s1>
<s2>Singapore 449035</s2>
<s3>SGP</s3>
<sZ>1 aut.</sZ>
</fA14>
<fA14 i1="02">
<s1>Department of Bioengineering, Imperial College London</s1>
<s2>SW7 2AZ London</s2>
<s3>GBR</s3>
<sZ>2 aut.</sZ>
</fA14>
<fA14 i1="03">
<s1>Department of Mechanical Engineering, National University of Singapore</s1>
<s2>Singapore 119260</s2>
<s3>SGP</s3>
<sZ>3 aut.</sZ>
</fA14>
<fA14 i1="04">
<s1>Department of Mechanical Engineering, Northwestern University</s1>
<s2>Evanston IL 60208-3111</s2>
<s3>USA</s3>
<sZ>4 aut.</sZ>
</fA14>
<fA20>
<s1>245-255</s1>
</fA20>
<fA21>
<s1>2007</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA43 i1="01">
<s1>INIST</s1>
<s2>21023A</s2>
<s5>354000149533850060</s5>
</fA43>
<fA44>
<s0>0000</s0>
<s1>© 2007 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45>
<s0>30 ref.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>07-0308181</s0>
</fA47>
<fA60>
<s1>P</s1>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA64 i1="01" i2="1">
<s0>IEEE transactions on robotics </s0>
</fA64>
<fA66 i1="01">
<s0>USA</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>This paper investigates how collaborative robots (cobots) can assist a human by mechanically constraining motion to software-defined guide paths, and introduces simple and efficient tools to design ergonomic paths. Analysis of the movements of seven subjects with the Scooter cobot reveals significant differences between guided movements (GM) and free movements (FM). While FM requires learning for each novel task, movements in GM are satisfying from the first trial, require little effort, are faster, smoother, and with fewer back and forth corrections than in FM. Operators rely on path guidance to rotate the Scooter and direct it along curved trajectories. While these advantages demonstrate the strength of the cobot concept, they do not show how guide paths should be defined. We introduce tools to enable the cobot and its operator to collaboratively learn ergonomic guide paths and adapt to changes in the environment. By relying on the haptic sensing, vision, and planning capabilities of the human operator, we can avoid equipping the cobot with complex sensor processing. Experiments with human subjects demonstrate the efficiency and complementarity of these guide paths design tools.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>001D02D11</s0>
</fC02>
<fC03 i1="01" i2="X" l="FRE">
<s0>Guidage</s0>
<s5>06</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG">
<s0>Guidance</s0>
<s5>06</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA">
<s0>Guiado</s0>
<s5>06</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE">
<s0>Robotique</s0>
<s5>07</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG">
<s0>Robotics</s0>
<s5>07</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA">
<s0>Robótica</s0>
<s5>07</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE">
<s0>Planification</s0>
<s5>08</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG">
<s0>Planning</s0>
<s5>08</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA">
<s0>Planificación</s0>
<s5>08</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE">
<s0>Homme</s0>
<s5>09</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG">
<s0>Human</s0>
<s5>09</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA">
<s0>Hombre</s0>
<s5>09</s5>
</fC03>
<fC03 i1="05" i2="X" l="FRE">
<s0>Mouvement corporel</s0>
<s5>18</s5>
</fC03>
<fC03 i1="05" i2="X" l="ENG">
<s0>Body movement</s0>
<s5>18</s5>
</fC03>
<fC03 i1="05" i2="X" l="SPA">
<s0>Movimiento corporal</s0>
<s5>18</s5>
</fC03>
<fC03 i1="06" i2="X" l="FRE">
<s0>Outil coupe</s0>
<s5>19</s5>
</fC03>
<fC03 i1="06" i2="X" l="ENG">
<s0>Cutting tool</s0>
<s5>19</s5>
</fC03>
<fC03 i1="06" i2="X" l="SPA">
<s0>Herramienta corte</s0>
<s5>19</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE">
<s0>Trajectoire</s0>
<s5>20</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG">
<s0>Trajectory</s0>
<s5>20</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA">
<s0>Trayectoria</s0>
<s5>20</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE">
<s0>Ergonomie</s0>
<s5>21</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG">
<s0>Ergonomics</s0>
<s5>21</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA">
<s0>Ergonomía</s0>
<s5>21</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE">
<s0>Sensibilité tactile</s0>
<s5>22</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG">
<s0>Tactile sensitivity</s0>
<s5>22</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA">
<s0>Sensibilidad tactil</s0>
<s5>22</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE">
<s0>Opérateur humain</s0>
<s5>23</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG">
<s0>Human operator</s0>
<s5>23</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA">
<s0>Operador humano</s0>
<s5>23</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE">
<s0>Capteur mesure</s0>
<s5>24</s5>
</fC03>
<fC03 i1="11" i2="X" l="ENG">
<s0>Measurement sensor</s0>
<s5>24</s5>
</fC03>
<fC03 i1="11" i2="X" l="SPA">
<s0>Captador medida</s0>
<s5>24</s5>
</fC03>
<fN21>
<s1>197</s1>
</fN21>
<fN44 i1="01">
<s1>OTO</s1>
</fN44>
<fN82>
<s1>OTO</s1>
</fN82>
</pA>
</standard>
<server>
<NO>PASCAL 07-0308181 INIST</NO>
<ET>Investigation of motion guidance with scooter cobot and collaborative learning</ET>
<AU>ENG SENG BOY; BURDET (Etienne); CHEE LEONG TEO; COLGATE (James Edward)</AU>
<AF>Victoria Junior College/Singapore 449035/Singapour (1 aut.); Department of Bioengineering, Imperial College London/SW7 2AZ London/Royaume-Uni (2 aut.); Department of Mechanical Engineering, National University of Singapore/Singapore 119260/Singapour (3 aut.); Department of Mechanical Engineering, Northwestern University/Evanston IL 60208-3111/Etats-Unis (4 aut.)</AF>
<DT>Publication en série; Niveau analytique</DT>
<SO>IEEE transactions on robotics ; ISSN 1552-3098; Etats-Unis; Da. 2007; Vol. 23; No. 2; Pp. 245-255; Bibl. 30 ref.</SO>
<LA>Anglais</LA>
<EA>This paper investigates how collaborative robots (cobots) can assist a human by mechanically constraining motion to software-defined guide paths, and introduces simple and efficient tools to design ergonomic paths. Analysis of the movements of seven subjects with the Scooter cobot reveals significant differences between guided movements (GM) and free movements (FM). While FM requires learning for each novel task, movements in GM are satisfying from the first trial, require little effort, are faster, smoother, and with fewer back and forth corrections than in FM. Operators rely on path guidance to rotate the Scooter and direct it along curved trajectories. While these advantages demonstrate the strength of the cobot concept, they do not show how guide paths should be defined. We introduce tools to enable the cobot and its operator to collaboratively learn ergonomic guide paths and adapt to changes in the environment. By relying on the haptic sensing, vision, and planning capabilities of the human operator, we can avoid equipping the cobot with complex sensor processing. Experiments with human subjects demonstrate the efficiency and complementarity of these guide paths design tools.</EA>
<CC>001D02D11</CC>
<FD>Guidage; Robotique; Planification; Homme; Mouvement corporel; Outil coupe; Trajectoire; Ergonomie; Sensibilité tactile; Opérateur humain; Capteur mesure</FD>
<ED>Guidance; Robotics; Planning; Human; Body movement; Cutting tool; Trajectory; Ergonomics; Tactile sensitivity; Human operator; Measurement sensor</ED>
<SD>Guiado; Robótica; Planificación; Hombre; Movimiento corporal; Herramienta corte; Trayectoria; Ergonomía; Sensibilidad tactil; Operador humano; Captador medida</SD>
<LO>INIST-21023A.354000149533850060</LO>
<ID>07-0308181</ID>
</server>
</inist>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/PascalFrancis/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000B39 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Corpus/biblio.hfd -nk 000B39 | SxmlIndent | more
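
For post-processing outside Dilib, here is a minimal Python sketch (assuming the HfdSelect output above has been redirected to a file, hypothetically named record.xml) that extracts the title, authors, and English keywords using only the standard library. Note that the record uses an inist: prefix without declaring its namespace, which a strict XML parser rejects, so a dummy declaration is injected before parsing.

import xml.etree.ElementTree as ET

with open("record.xml", encoding="utf-8") as f:
    xml_text = f.read()

# Declare the otherwise-unbound "inist:" prefix on the root element so
# that ElementTree accepts the document (the URI itself is arbitrary).
xml_text = xml_text.replace("<record>", '<record xmlns:inist="urn:inist">', 1)
root = ET.fromstring(xml_text)

title = root.findtext(".//titleStmt/title")
authors = [n.text for n in root.findall(".//titleStmt/author/name")]
keywords = [t.text for t in root.findall(".//keywords[@scheme='KwdEn']/term")]

print(title)
print("; ".join(authors))
print(", ".join(keywords))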

To link to this page within the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    PascalFrancis
   |étape=   Corpus
   |type=    RBID
   |clé=     Pascal:07-0308181
   |texte=   Investigation of motion guidance with scooter cobot and collaborative learning
}}

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024