Exploration server on haptic devices


A Fully Sensorized Cooperative Robotic System for Surgical Interventions

Internal identifier: 002492 (Pmc/Curation); previous: 002491; next: 002493


Authors: Saúl Tovar-Arriaga; José Emilio Vargas; Juan M. Ramos; Marco A. Aceves; Efren Gorrostieta; Willi A. Kalender

Source:

RBID: PMC:3444109

Abstract

In this research, a fully sensorized cooperative robot system for the manipulation of needles is presented. The setup consists of a DLR/KUKA Light Weight Robot III especially designed for safe human/robot interaction, an FD-CT robot-driven angiographic C-arm system, and a navigation camera. In addition, new control strategies for robot manipulation in the clinical environment are introduced. A method for fast calibration of the involved components and preliminary accuracy tests of the whole possible error chain are presented. Calibration of the robot with the navigation system has a residual error of 0.81 mm (rms) with a standard deviation of ±0.41 mm. The accuracy of the robotic system while targeting fixed points at different positions within the workspace is 1.2 mm (rms) with a standard deviation of ±0.4 mm. After calibration, and due to closed-loop control, the absolute positioning accuracy was reduced to the navigation camera accuracy, which is 0.35 mm (rms). The implemented control allows the robot to compensate for small patient movements.


URL:
DOI: 10.3390/s120709423
PubMed: 23012551
PubMed Central: 3444109


The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">A Fully Sensorized Cooperative Robotic System for Surgical Interventions</title>
<author>
<name sortKey="Tovar Arriaga, Saul" sort="Tovar Arriaga, Saul" uniqKey="Tovar Arriaga S" first="Saúl" last="Tovar-Arriaga">Saúl Tovar-Arriaga</name>
<affiliation>
<nlm:aff id="af1-sensors-12-09423"> Institute of Medical Physics, Friedrich-Alexander-University Erlangen-Nuremberg, Henkestr. 91, 91052 Erlangen, Germany; E-Mail:
<email>willi.kalender@imp.uni-erlangen.de</email>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="af2-sensors-12-09423"> Informatics Faculty, Autonomous University of Querétaro, Avenida de las ciencias s/n, Juriquilla, Querétaro, Qro. C.P. 76230, Mexico; E-Mails:
<email>emilio@mecatronica.net</email>
(J.E.V.);
<email>jramos@mecamex.net</email>
(J.M.R.);
<email>marco.aceves@uaq.mx</email>
(M.A.A.);
<email>efrengorrostieta@gmail.com</email>
(E.G.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Vargas, Jose Emilio" sort="Vargas, Jose Emilio" uniqKey="Vargas J" first="José Emilio" last="Vargas">José Emilio Vargas</name>
<affiliation>
<nlm:aff id="af2-sensors-12-09423"> Informatics Faculty, Autonomous University of Querétaro, Avenida de las ciencias s/n, Juriquilla, Querétaro, Qro. C.P. 76230, Mexico; E-Mails:
<email>emilio@mecatronica.net</email>
(J.E.V.);
<email>jramos@mecamex.net</email>
(J.M.R.);
<email>marco.aceves@uaq.mx</email>
(M.A.A.);
<email>efrengorrostieta@gmail.com</email>
(E.G.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Ramos, Juan M" sort="Ramos, Juan M" uniqKey="Ramos J" first="Juan M." last="Ramos">Juan M. Ramos</name>
<affiliation>
<nlm:aff id="af2-sensors-12-09423"> Informatics Faculty, Autonomous University of Querétaro, Avenida de las ciencias s/n, Juriquilla, Querétaro, Qro. C.P. 76230, Mexico; E-Mails:
<email>emilio@mecatronica.net</email>
(J.E.V.);
<email>jramos@mecamex.net</email>
(J.M.R.);
<email>marco.aceves@uaq.mx</email>
(M.A.A.);
<email>efrengorrostieta@gmail.com</email>
(E.G.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Aceves, Marco A" sort="Aceves, Marco A" uniqKey="Aceves M" first="Marco A." last="Aceves">Marco A. Aceves</name>
<affiliation>
<nlm:aff id="af2-sensors-12-09423"> Informatics Faculty, Autonomous University of Querétaro, Avenida de las ciencias s/n, Juriquilla, Querétaro, Qro. C.P. 76230, Mexico; E-Mails:
<email>emilio@mecatronica.net</email>
(J.E.V.);
<email>jramos@mecamex.net</email>
(J.M.R.);
<email>marco.aceves@uaq.mx</email>
(M.A.A.);
<email>efrengorrostieta@gmail.com</email>
(E.G.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Gorrostieta, Efren" sort="Gorrostieta, Efren" uniqKey="Gorrostieta E" first="Efren" last="Gorrostieta">Efren Gorrostieta</name>
<affiliation>
<nlm:aff id="af2-sensors-12-09423"> Informatics Faculty, Autonomous University of Querétaro, Avenida de las ciencias s/n, Juriquilla, Querétaro, Qro. C.P. 76230, Mexico; E-Mails:
<email>emilio@mecatronica.net</email>
(J.E.V.);
<email>jramos@mecamex.net</email>
(J.M.R.);
<email>marco.aceves@uaq.mx</email>
(M.A.A.);
<email>efrengorrostieta@gmail.com</email>
(E.G.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Kalender, Willi A" sort="Kalender, Willi A" uniqKey="Kalender W" first="Willi A." last="Kalender">Willi A. Kalender</name>
<affiliation>
<nlm:aff id="af1-sensors-12-09423"> Institute of Medical Physics, Friedrich-Alexander-University Erlangen-Nuremberg, Henkestr. 91, 91052 Erlangen, Germany; E-Mail:
<email>willi.kalender@imp.uni-erlangen.de</email>
</nlm:aff>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">23012551</idno>
<idno type="pmc">3444109</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3444109</idno>
<idno type="RBID">PMC:3444109</idno>
<idno type="doi">10.3390/s120709423</idno>
<date when="2012">2012</date>
<idno type="wicri:Area/Pmc/Corpus">002492</idno>
<idno type="wicri:Area/Pmc/Curation">002492</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">A Fully Sensorized Cooperative Robotic System for Surgical Interventions</title>
<author>
<name sortKey="Tovar Arriaga, Saul" sort="Tovar Arriaga, Saul" uniqKey="Tovar Arriaga S" first="Saúl" last="Tovar-Arriaga">Saúl Tovar-Arriaga</name>
<affiliation>
<nlm:aff id="af1-sensors-12-09423"> Institute of Medical Physics, Friedrich-Alexander-University Erlangen-Nuremberg, Henkestr. 91, 91052 Erlangen, Germany; E-Mail:
<email>willi.kalender@imp.uni-erlangen.de</email>
</nlm:aff>
</affiliation>
<affiliation>
<nlm:aff id="af2-sensors-12-09423"> Informatics Faculty, Autonomous University of Querétaro, Avenida de las ciencias s/n, Juriquilla, Querétaro, Qro. C.P. 76230, Mexico; E-Mails:
<email>emilio@mecatronica.net</email>
(J.E.V.);
<email>jramos@mecamex.net</email>
(J.M.R.);
<email>marco.aceves@uaq.mx</email>
(M.A.A.);
<email>efrengorrostieta@gmail.com</email>
(E.G.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Vargas, Jose Emilio" sort="Vargas, Jose Emilio" uniqKey="Vargas J" first="José Emilio" last="Vargas">José Emilio Vargas</name>
<affiliation>
<nlm:aff id="af2-sensors-12-09423"> Informatics Faculty, Autonomous University of Querétaro, Avenida de las ciencias s/n, Juriquilla, Querétaro, Qro. C.P. 76230, Mexico; E-Mails:
<email>emilio@mecatronica.net</email>
(J.E.V.);
<email>jramos@mecamex.net</email>
(J.M.R.);
<email>marco.aceves@uaq.mx</email>
(M.A.A.);
<email>efrengorrostieta@gmail.com</email>
(E.G.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Ramos, Juan M" sort="Ramos, Juan M" uniqKey="Ramos J" first="Juan M." last="Ramos">Juan M. Ramos</name>
<affiliation>
<nlm:aff id="af2-sensors-12-09423"> Informatics Faculty, Autonomous University of Querétaro, Avenida de las ciencias s/n, Juriquilla, Querétaro, Qro. C.P. 76230, Mexico; E-Mails:
<email>emilio@mecatronica.net</email>
(J.E.V.);
<email>jramos@mecamex.net</email>
(J.M.R.);
<email>marco.aceves@uaq.mx</email>
(M.A.A.);
<email>efrengorrostieta@gmail.com</email>
(E.G.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Aceves, Marco A" sort="Aceves, Marco A" uniqKey="Aceves M" first="Marco A." last="Aceves">Marco A. Aceves</name>
<affiliation>
<nlm:aff id="af2-sensors-12-09423"> Informatics Faculty, Autonomous University of Querétaro, Avenida de las ciencias s/n, Juriquilla, Querétaro, Qro. C.P. 76230, Mexico; E-Mails:
<email>emilio@mecatronica.net</email>
(J.E.V.);
<email>jramos@mecamex.net</email>
(J.M.R.);
<email>marco.aceves@uaq.mx</email>
(M.A.A.);
<email>efrengorrostieta@gmail.com</email>
(E.G.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Gorrostieta, Efren" sort="Gorrostieta, Efren" uniqKey="Gorrostieta E" first="Efren" last="Gorrostieta">Efren Gorrostieta</name>
<affiliation>
<nlm:aff id="af2-sensors-12-09423"> Informatics Faculty, Autonomous University of Querétaro, Avenida de las ciencias s/n, Juriquilla, Querétaro, Qro. C.P. 76230, Mexico; E-Mails:
<email>emilio@mecatronica.net</email>
(J.E.V.);
<email>jramos@mecamex.net</email>
(J.M.R.);
<email>marco.aceves@uaq.mx</email>
(M.A.A.);
<email>efrengorrostieta@gmail.com</email>
(E.G.)</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Kalender, Willi A" sort="Kalender, Willi A" uniqKey="Kalender W" first="Willi A." last="Kalender">Willi A. Kalender</name>
<affiliation>
<nlm:aff id="af1-sensors-12-09423"> Institute of Medical Physics, Friedrich-Alexander-University Erlangen-Nuremberg, Henkestr. 91, 91052 Erlangen, Germany; E-Mail:
<email>willi.kalender@imp.uni-erlangen.de</email>
</nlm:aff>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Sensors (Basel, Switzerland)</title>
<idno type="eISSN">1424-8220</idno>
<imprint>
<date when="2012">2012</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>In this research, a fully sensorized cooperative robot system for the manipulation of needles is presented. The setup consists of a DLR/KUKA Light Weight Robot III especially designed for safe human/robot interaction, an FD-CT robot-driven angiographic C-arm system, and a navigation camera. In addition, new control strategies for robot manipulation in the clinical environment are introduced. A method for fast calibration of the involved components and preliminary accuracy tests of the whole possible error chain are presented. Calibration of the robot with the navigation system has a residual error of 0.81 mm (rms) with a standard deviation of ±0.41 mm. The accuracy of the robotic system while targeting fixed points at different positions within the workspace is 1.2 mm (rms) with a standard deviation of ±0.4 mm. After calibration, and due to closed-loop control, the absolute positioning accuracy was reduced to the navigation camera accuracy, which is 0.35 mm (rms). The implemented control allows the robot to compensate for small patient movements.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Kwoh, Y S" uniqKey="Kwoh Y">Y.S. Kwoh</name>
</author>
<author>
<name sortKey="Hou, J" uniqKey="Hou J">J. Hou</name>
</author>
<author>
<name sortKey="Jonckheere, E A" uniqKey="Jonckheere E">E.A. Jonckheere</name>
</author>
<author>
<name sortKey="Hayati, S" uniqKey="Hayati S">S. Hayati</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cleary, K" uniqKey="Cleary K">K. Cleary</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kazanzides, P" uniqKey="Kazanzides P">P. Kazanzides</name>
</author>
<author>
<name sortKey="Fichtinger, G" uniqKey="Fichtinger G">G. Fichtinger</name>
</author>
<author>
<name sortKey="Hager, G D" uniqKey="Hager G">G.D. Hager</name>
</author>
<author>
<name sortKey="Okamura, A M" uniqKey="Okamura A">A.M. Okamura</name>
</author>
<author>
<name sortKey="Whitcomb, L L" uniqKey="Whitcomb L">L.L. Whitcomb</name>
</author>
<author>
<name sortKey="Taylor, R H" uniqKey="Taylor R">R.H. Taylor</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Guthart, G S" uniqKey="Guthart G">G.S. Guthart</name>
</author>
<author>
<name sortKey="Salisbury, J J" uniqKey="Salisbury J">J.J. Salisbury</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Adler, J R" uniqKey="Adler J">J.R. Adler</name>
</author>
<author>
<name sortKey="Murphy, M J" uniqKey="Murphy M">M.J. Murphy</name>
</author>
<author>
<name sortKey="Chang, S D" uniqKey="Chang S">S.D. Chang</name>
</author>
<author>
<name sortKey="Hankock, S L" uniqKey="Hankock S">S.L. Hankock</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schweikard, A" uniqKey="Schweikard A">A. Schweikard</name>
</author>
<author>
<name sortKey="Glosser, G" uniqKey="Glosser G">G. Glosser</name>
</author>
<author>
<name sortKey="Bodduluri, M" uniqKey="Bodduluri M">M. Bodduluri</name>
</author>
<author>
<name sortKey="Murphy, M J" uniqKey="Murphy M">M.J. Murphy</name>
</author>
<author>
<name sortKey="Adler, J R" uniqKey="Adler J">J.R. Adler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hirzinger, G" uniqKey="Hirzinger G">G. Hirzinger</name>
</author>
<author>
<name sortKey="Sporer, N" uniqKey="Sporer N">N. Sporer</name>
</author>
<author>
<name sortKey="Albu Sch Ffer, A" uniqKey="Albu Sch Ffer A">A. Albu-Schäffer</name>
</author>
<author>
<name sortKey="H Hnle, M" uniqKey="H Hnle M">M. Hähnle</name>
</author>
<author>
<name sortKey="Krenn, R" uniqKey="Krenn R">R. Krenn</name>
</author>
<author>
<name sortKey="Pascucci, A" uniqKey="Pascucci A">A. Pascucci</name>
</author>
<author>
<name sortKey="Schedl, M" uniqKey="Schedl M">M. Schedl</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hagn, U" uniqKey="Hagn U">U. Hagn</name>
</author>
<author>
<name sortKey="Konietschke, R" uniqKey="Konietschke R">R. Konietschke</name>
</author>
<author>
<name sortKey="Tobergte, A" uniqKey="Tobergte A">A. Tobergte</name>
</author>
<author>
<name sortKey="Nickl, M" uniqKey="Nickl M">M. Nickl</name>
</author>
<author>
<name sortKey="Jorg, S" uniqKey="Jorg S">S. Jörg</name>
</author>
<author>
<name sortKey="Kuebler, B" uniqKey="Kuebler B">B. Kuebler</name>
</author>
<author>
<name sortKey="Passig, G" uniqKey="Passig G">G. Passig</name>
</author>
<author>
<name sortKey="Groger, M" uniqKey="Groger M">M. Gröger</name>
</author>
<author>
<name sortKey="Frohlich, F" uniqKey="Frohlich F">F. Fröhlich</name>
</author>
<author>
<name sortKey="Seibold, U" uniqKey="Seibold U">U. Seibold</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hannaford, B" uniqKey="Hannaford B">B. Hannaford</name>
</author>
<author>
<name sortKey="Okamura, A M" uniqKey="Okamura A">A.M. Okamura</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Tovar Arriaga, S" uniqKey="Tovar Arriaga S">S. Tovar-Arriaga</name>
</author>
<author>
<name sortKey="Tita, R" uniqKey="Tita R">R. Tita</name>
</author>
<author>
<name sortKey="Pedraza Ortega, J C" uniqKey="Pedraza Ortega J">J.C. Pedraza-Ortega</name>
</author>
<author>
<name sortKey="Gorrostieta, E" uniqKey="Gorrostieta E">E. Gorrostieta</name>
</author>
<author>
<name sortKey="Kalender, W A" uniqKey="Kalender W">W.A. Kalender</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hager, G D" uniqKey="Hager G">G.D. Hager</name>
</author>
<author>
<name sortKey="Okamura, A M" uniqKey="Okamura A">A.M. Okamura</name>
</author>
<author>
<name sortKey="Kazanzides, P" uniqKey="Kazanzides P">P. Kazanzides</name>
</author>
<author>
<name sortKey="Whitcomb, G F" uniqKey="Whitcomb G">G.F. Whitcomb</name>
</author>
<author>
<name sortKey="Taylor, R H" uniqKey="Taylor R">R.H. Taylor</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hagn, U" uniqKey="Hagn U">U. Hagn</name>
</author>
<author>
<name sortKey="Nickl, M" uniqKey="Nickl M">M. Nickl</name>
</author>
<author>
<name sortKey="Jorg, S" uniqKey="Jorg S">S. Jörg</name>
</author>
<author>
<name sortKey="Passig, G" uniqKey="Passig G">G. Passig</name>
</author>
<author>
<name sortKey="Bahls, T" uniqKey="Bahls T">T. Bahls</name>
</author>
<author>
<name sortKey="Nothhelfer, A" uniqKey="Nothhelfer A">A. Nothhelfer</name>
</author>
<author>
<name sortKey="Hacker, F" uniqKey="Hacker F">F. Hacker</name>
</author>
<author>
<name sortKey="Le Tien, L" uniqKey="Le Tien L">L. Le-Tien</name>
</author>
<author>
<name sortKey="Albu Sch Ffer, A" uniqKey="Albu Sch Ffer A">A. Albu-Schäffer</name>
</author>
<author>
<name sortKey="Konietschke, R" uniqKey="Konietschke R">R. Konietschke</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Peters, T" uniqKey="Peters T">T. Peters</name>
</author>
<author>
<name sortKey="Cleary, K" uniqKey="Cleary K">K. Cleary</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Taylor, R H" uniqKey="Taylor R">R.H. Taylor</name>
</author>
<author>
<name sortKey="Paul, H A" uniqKey="Paul H">H.A. Paul</name>
</author>
<author>
<name sortKey="Kazandzides, P" uniqKey="Kazandzides P">P. Kazandzides</name>
</author>
<author>
<name sortKey="Mittelstadt, B D" uniqKey="Mittelstadt B">B.D. Mittelstadt</name>
</author>
<author>
<name sortKey="Hanson, W" uniqKey="Hanson W">W. Hanson</name>
</author>
<author>
<name sortKey="Zuhars, J F" uniqKey="Zuhars J">J.F. Zuhars</name>
</author>
<author>
<name sortKey="Williamson, B" uniqKey="Williamson B">B. Williamson</name>
</author>
<author>
<name sortKey="Musits, B L" uniqKey="Musits B">B.L. Musits</name>
</author>
<author>
<name sortKey="Glassman, E" uniqKey="Glassman E">E. Glassman</name>
</author>
<author>
<name sortKey="Bargar, W L" uniqKey="Bargar W">W.L. Bargar</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Petermann, J" uniqKey="Petermann J">J. Petermann</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Stoianovici, D" uniqKey="Stoianovici D">D. Stoianovici</name>
</author>
<author>
<name sortKey="Song, D" uniqKey="Song D">D. Song</name>
</author>
<author>
<name sortKey="Petrisor, D" uniqKey="Petrisor D">D. Petrisor</name>
</author>
<author>
<name sortKey="Ursu, D" uniqKey="Ursu D">D. Ursu</name>
</author>
<author>
<name sortKey="Mazilu, D" uniqKey="Mazilu D">D. Mazilu</name>
</author>
<author>
<name sortKey="Mutener, M" uniqKey="Mutener M">M. Mutener</name>
</author>
<author>
<name sortKey="Schar, M" uniqKey="Schar M">M. Schar</name>
</author>
<author>
<name sortKey="Patriciu, A" uniqKey="Patriciu A">A. Patriciu</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kalender, W A" uniqKey="Kalender W">W.A. Kalender</name>
</author>
<author>
<name sortKey="Kyriakou, Y" uniqKey="Kyriakou Y">Y. Kyriakou</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ning, R" uniqKey="Ning R">R. Ning</name>
</author>
<author>
<name sortKey="Chen, B" uniqKey="Chen B">B. Chen</name>
</author>
<author>
<name sortKey="Yu, R" uniqKey="Yu R">R. Yu</name>
</author>
<author>
<name sortKey="Conover, D" uniqKey="Conover D">D. Conover</name>
</author>
<author>
<name sortKey="Tang, X" uniqKey="Tang X">X. Tang</name>
</author>
<author>
<name sortKey="Ning, Y" uniqKey="Ning Y">Y. Ning</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jaffray, D A" uniqKey="Jaffray D">D.A. Jaffray</name>
</author>
<author>
<name sortKey="Siewerdsen, J H" uniqKey="Siewerdsen J">J.H. Siewerdsen</name>
</author>
<author>
<name sortKey="Wong, J" uniqKey="Wong J">J. Wong</name>
</author>
<author>
<name sortKey="Martinez, A" uniqKey="Martinez A">A. Martinez</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Siewerdsen, J H" uniqKey="Siewerdsen J">J.H. Siewerdsen</name>
</author>
<author>
<name sortKey="Moseley, D" uniqKey="Moseley D">D. Moseley</name>
</author>
<author>
<name sortKey="Burch, S" uniqKey="Burch S">S. Burch</name>
</author>
<author>
<name sortKey="Bisland, S" uniqKey="Bisland S">S. Bisland</name>
</author>
<author>
<name sortKey="Bogaards, A" uniqKey="Bogaards A">A. Bogaards</name>
</author>
<author>
<name sortKey="Wilson, B" uniqKey="Wilson B">B. Wilson</name>
</author>
<author>
<name sortKey="Jaffray, D A" uniqKey="Jaffray D">D.A. Jaffray</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Cleary, K" uniqKey="Cleary K">K. Cleary</name>
</author>
<author>
<name sortKey="Melzer, A" uniqKey="Melzer A">A. Melzer</name>
</author>
<author>
<name sortKey="Watson, V" uniqKey="Watson V">V. Watson</name>
</author>
<author>
<name sortKey="Kronreif, G" uniqKey="Kronreif G">G. Kronreif</name>
</author>
<author>
<name sortKey="Stoianovici, D" uniqKey="Stoianovici D">D. Stoianovici</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Strobel, N" uniqKey="Strobel N">N. Strobel</name>
</author>
<author>
<name sortKey="Meissner, O" uniqKey="Meissner O">O. Meissner</name>
</author>
<author>
<name sortKey="Boese, J" uniqKey="Boese J">J. Boese</name>
</author>
<author>
<name sortKey="Brunner, T" uniqKey="Brunner T">T. Brunner</name>
</author>
<author>
<name sortKey="Heigl, B" uniqKey="Heigl B">B. Heigl</name>
</author>
<author>
<name sortKey="Hoheisel, M" uniqKey="Hoheisel M">M. Hoheisel</name>
</author>
<author>
<name sortKey="Lauritsch, G" uniqKey="Lauritsch G">G. Lauritsch</name>
</author>
<author>
<name sortKey="Nagel, M" uniqKey="Nagel M">M. Nagel</name>
</author>
<author>
<name sortKey="Pfister, M" uniqKey="Pfister M">M. Pfister</name>
</author>
<author>
<name sortKey="Ruhrnschopf, E P" uniqKey="Ruhrnschopf E">E.P. Rührnschopf</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Castillo Cruces, R A" uniqKey="Castillo Cruces R">R.A. Castillo-Cruces</name>
</author>
<author>
<name sortKey="Schneider, H C" uniqKey="Schneider H">H.C. Schneider</name>
</author>
<author>
<name sortKey="Wahrburg, J" uniqKey="Wahrburg J">J. Wahrburg</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Baron, S" uniqKey="Baron S">S. Baron</name>
</author>
<author>
<name sortKey="Eilers, H" uniqKey="Eilers H">H. Eilers</name>
</author>
<author>
<name sortKey="Munske, B" uniqKey="Munske B">B. Munske</name>
</author>
<author>
<name sortKey="Toennies, J L" uniqKey="Toennies J">J.L. Toennies</name>
</author>
<author>
<name sortKey="Balachandran, R" uniqKey="Balachandran R">R. Balachandran</name>
</author>
<author>
<name sortKey="Labadie, R F" uniqKey="Labadie R">R.F. Labadie</name>
</author>
<author>
<name sortKey="Ortmaier, T" uniqKey="Ortmaier T">T. Ortmaier</name>
</author>
<author>
<name sortKey="Webster, R J" uniqKey="Webster R">R.J. Webster</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Masamune, K" uniqKey="Masamune K">K. Masamune</name>
</author>
<author>
<name sortKey="Fichtinger, G" uniqKey="Fichtinger G">G. Fichtinger</name>
</author>
<author>
<name sortKey="Patriciu, A" uniqKey="Patriciu A">A. Patriciu</name>
</author>
<author>
<name sortKey="Susil, R C" uniqKey="Susil R">R.C. Susil</name>
</author>
<author>
<name sortKey="Taylor, R H" uniqKey="Taylor R">R.H. Taylor</name>
</author>
<author>
<name sortKey="Kavoussi, L R" uniqKey="Kavoussi L">L.R. Kavoussi</name>
</author>
<author>
<name sortKey="Anderson, J H" uniqKey="Anderson J">J.H. Anderson</name>
</author>
<author>
<name sortKey="Sakuma, I" uniqKey="Sakuma I">I. Sakuma</name>
</author>
<author>
<name sortKey="Dohi, T" uniqKey="Dohi T">T. Dohi</name>
</author>
<author>
<name sortKey="Stoianovici, D" uniqKey="Stoianovici D">D. Stoianovici</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Nagel, M" uniqKey="Nagel M">M. Nagel</name>
</author>
<author>
<name sortKey="Schmidt, G" uniqKey="Schmidt G">G. Schmidt</name>
</author>
<author>
<name sortKey="Petzold, R" uniqKey="Petzold R">R. Petzold</name>
</author>
<author>
<name sortKey="Kalender, W A" uniqKey="Kalender W">W.A. Kalender</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Grunwald, G" uniqKey="Grunwald G">G. Grunwald</name>
</author>
<author>
<name sortKey="Schreiber, G" uniqKey="Schreiber G">G. Schreiber</name>
</author>
<author>
<name sortKey="Albu Sch Ffer, A" uniqKey="Albu Sch Ffer A">A. Albu-Schäffer</name>
</author>
<author>
<name sortKey="Hirzinger, G" uniqKey="Hirzinger G">G. Hirzinger</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hirzinger, G" uniqKey="Hirzinger G">G. Hirzinger</name>
</author>
<author>
<name sortKey="Bals, J" uniqKey="Bals J">J. Bals</name>
</author>
<author>
<name sortKey="Otter, M" uniqKey="Otter M">M. Otter</name>
</author>
<author>
<name sortKey="Stelter, J" uniqKey="Stelter J">J. Stelter</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Majdani, O" uniqKey="Majdani O">O. Majdani</name>
</author>
<author>
<name sortKey="Rau, T" uniqKey="Rau T">T. Rau</name>
</author>
<author>
<name sortKey="Baron, S" uniqKey="Baron S">S. Baron</name>
</author>
<author>
<name sortKey="Eilers, H" uniqKey="Eilers H">H. Eilers</name>
</author>
<author>
<name sortKey="Baier, C" uniqKey="Baier C">C. Baier</name>
</author>
<author>
<name sortKey="Heimann, B" uniqKey="Heimann B">B. Heimann</name>
</author>
<author>
<name sortKey="Ortmaier, T" uniqKey="Ortmaier T">T. Ortmaier</name>
</author>
<author>
<name sortKey="Bartling, S" uniqKey="Bartling S">S. Bartling</name>
</author>
<author>
<name sortKey="Lenarz, T" uniqKey="Lenarz T">T. Lenarz</name>
</author>
<author>
<name sortKey="Leinung, M" uniqKey="Leinung M">M. Leinung</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Matinfar, M" uniqKey="Matinfar M">M. Matinfar</name>
</author>
<author>
<name sortKey="Baird, C" uniqKey="Baird C">C. Baird</name>
</author>
<author>
<name sortKey="Bautouli, A" uniqKey="Bautouli A">A. Bautouli</name>
</author>
<author>
<name sortKey="Clatterbuck, R" uniqKey="Clatterbuck R">R. Clatterbuck</name>
</author>
<author>
<name sortKey="Kazanzides, P" uniqKey="Kazanzides P">P. Kazanzides</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Maurer, C R" uniqKey="Maurer C">C.R. Maurer</name>
</author>
<author>
<name sortKey="Fitzpatrick, J M" uniqKey="Fitzpatrick J">J.M. Fitzpatrick</name>
</author>
<author>
<name sortKey="Wang, M Y" uniqKey="Wang M">M.Y. Wang</name>
</author>
<author>
<name sortKey="Galloway, R L" uniqKey="Galloway R">R.L. Galloway</name>
</author>
<author>
<name sortKey="Maciunas, R J" uniqKey="Maciunas R">R.J. Maciunas</name>
</author>
<author>
<name sortKey="Allen, G S" uniqKey="Allen G">G.S. Allen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Umeyama, S" uniqKey="Umeyama S">S. Umeyama</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ortmaier, T" uniqKey="Ortmaier T">T. Ortmaier</name>
</author>
<author>
<name sortKey="Weiss, H" uniqKey="Weiss H">H. Weiss</name>
</author>
<author>
<name sortKey="Ott, Ch" uniqKey="Ott C">Ch. Ott</name>
</author>
<author>
<name sortKey="Hirzinger, G" uniqKey="Hirzinger G">G. Hirzinger</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">Sensors (Basel)</journal-id>
<journal-id journal-id-type="iso-abbrev">Sensors (Basel)</journal-id>
<journal-title-group>
<journal-title>Sensors (Basel, Switzerland)</journal-title>
</journal-title-group>
<issn pub-type="epub">1424-8220</issn>
<publisher>
<publisher-name>Molecular Diversity Preservation International (MDPI)</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">23012551</article-id>
<article-id pub-id-type="pmc">3444109</article-id>
<article-id pub-id-type="doi">10.3390/s120709423</article-id>
<article-id pub-id-type="publisher-id">sensors-12-09423</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>A Fully Sensorized Cooperative Robotic System for Surgical Interventions</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Tovar-Arriaga</surname>
<given-names>Saúl</given-names>
</name>
<xref ref-type="aff" rid="af1-sensors-12-09423">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="af2-sensors-12-09423">
<sup>2</sup>
</xref>
<xref ref-type="corresp" rid="c1-sensors-12-09423">
<sup>*</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Vargas</surname>
<given-names>José Emilio</given-names>
</name>
<xref ref-type="aff" rid="af2-sensors-12-09423">
<sup>2</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Ramos</surname>
<given-names>Juan M.</given-names>
</name>
<xref ref-type="aff" rid="af2-sensors-12-09423">
<sup>2</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Aceves</surname>
<given-names>Marco A.</given-names>
</name>
<xref ref-type="aff" rid="af2-sensors-12-09423">
<sup>2</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Gorrostieta</surname>
<given-names>Efren</given-names>
</name>
<xref ref-type="aff" rid="af2-sensors-12-09423">
<sup>2</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Kalender</surname>
<given-names>Willi A.</given-names>
</name>
<xref ref-type="aff" rid="af1-sensors-12-09423">
<sup>1</sup>
</xref>
</contrib>
</contrib-group>
<aff id="af1-sensors-12-09423">
<label>1</label>
Institute of Medical Physics, Friedrich-Alexander-University Erlangen-Nuremberg, Henkestr. 91, 91052 Erlangen, Germany; E-Mail:
<email>willi.kalender@imp.uni-erlangen.de</email>
</aff>
<aff id="af2-sensors-12-09423">
<label>2</label>
Informatics Faculty, Autonomous University of Querétaro, Avenida de las ciencias s/n, Juriquilla, Querétaro, Qro. C.P. 76230, Mexico; E-Mails:
<email>emilio@mecatronica.net</email>
(J.E.V.);
<email>jramos@mecamex.net</email>
(J.M.R.);
<email>marco.aceves@uaq.mx</email>
(M.A.A.);
<email>efrengorrostieta@gmail.com</email>
(E.G.)</aff>
<author-notes>
<corresp id="c1-sensors-12-09423">
<label>*</label>
Author to whom correspondence should be addressed; E-Mail:
<email>saul.tovar@uaq.mx</email>
; Tel.: +52-442-230-3685.</corresp>
</author-notes>
<pub-date pub-type="collection">
<year>2012</year>
</pub-date>
<pub-date pub-type="epub">
<day>09</day>
<month>7</month>
<year>2012</year>
</pub-date>
<volume>12</volume>
<issue>7</issue>
<fpage>9423</fpage>
<lpage>9447</lpage>
<history>
<date date-type="received">
<day>18</day>
<month>6</month>
<year>2012</year>
</date>
<date date-type="rev-recd">
<day>03</day>
<month>7</month>
<year>2012</year>
</date>
<date date-type="accepted">
<day>03</day>
<month>7</month>
<year>2012</year>
</date>
</history>
<permissions>
<copyright-statement>© 2012 by the authors; licensee MDPI, Basel, Switzerland.</copyright-statement>
<copyright-year>2012</copyright-year>
<license>
<license-p>
<pmc-comment>CREATIVE COMMONS</pmc-comment>
This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by/3.0/">http://creativecommons.org/licenses/by/3.0/</ext-link>
).</license-p>
</license>
</permissions>
<abstract>
<p>In this research, a fully sensorized cooperative robot system for the manipulation of needles is presented. The setup consists of a DLR/KUKA Light Weight Robot III especially designed for safe human/robot interaction, an FD-CT robot-driven angiographic C-arm system, and a navigation camera. In addition, new control strategies for robot manipulation in the clinical environment are introduced. A method for fast calibration of the involved components and preliminary accuracy tests of the whole possible error chain are presented. Calibration of the robot with the navigation system has a residual error of 0.81 mm (rms) with a standard deviation of ±0.41 mm. The accuracy of the robotic system while targeting fixed points at different positions within the workspace is 1.2 mm (rms) with a standard deviation of ±0.4 mm. After calibration, and due to closed-loop control, the absolute positioning accuracy was reduced to the navigation camera accuracy, which is 0.35 mm (rms). The implemented control allows the robot to compensate for small patient movements.</p>
</abstract>
<kwd-group>
<kwd>surgical robotics</kwd>
<kwd>robotic needle-placement</kwd>
<kwd>robot-driven C-arm</kwd>
<kwd>light-weight robot</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec>
<label>1.</label>
<title>Introduction</title>
<p>Surgical robotics is an evolving field with a relatively short history. The first recorded medical application occurred in 1985, when a brain biopsy was carried out [
<xref ref-type="bibr" rid="b1-sensors-12-09423">1</xref>
]. Surgical robotics is an interdisciplinary field in which many components interact with each other. These include electromechanical devices such as motors, gears, and a variety of sensors. Surgical robots have great potential to improve patient care [
<xref ref-type="bibr" rid="b2-sensors-12-09423">2</xref>
]. They have certain advantages over humans; for example, they do not exhibit the 20 Hz tremor inherent to humans and can follow smooth trajectories more accurately. In operations where physicians are close to radiation, the use of a robotic system can help avoid exposure.</p>
<p>By definition, a surgeon is a hand worker. He uses his hands to cut tissue with scalpels and scissors, employs handsaws to cut bones, introduces screws, sews with thread and wire,
<italic>etc.</italic>
In order to execute these activities with a robotic system, it has to be equipped with a vast range of internal and external sensors. Sensor technology is very important in modern operating rooms and will be essential in the operating rooms of the future [
<xref ref-type="bibr" rid="b2-sensors-12-09423">2</xref>
]. Surgical robotic systems are possible due to the use of a wide variety of sensors. In contrast to robotics deployed in the automation industry, where robot assemblies are isolated from humans, a surgical robot exerts forces directly on the patient's organs [
<xref ref-type="bibr" rid="b3-sensors-12-09423">3</xref>
]. Designing a robot that directly touches, presses, and cuts organs that are both fragile and vital presents a number of issues, and those issues have slowed progress in this very promising field.</p>
<p>In spite of all the challenges associated with surgical robotics, there are some successful systems that have FDA approval and are commercially available [
<xref ref-type="bibr" rid="b4-sensors-12-09423">4</xref>
,
<xref ref-type="bibr" rid="b5-sensors-12-09423">5</xref>
]. For instance, the Da-Vinci robot (Intuitive Surgical Inc., Sunnyvale, CA, USA) [
<xref ref-type="bibr" rid="b4-sensors-12-09423">4</xref>
] is the system with the greatest market penetration; more than 1,933 units have already been sold around the world. Its design gives surgeons increased dexterity while they work through small incisions in the body. The system includes a cockpit where the surgeon teleoperates the robot using haptic devices, a cart with four arms (depending on the task, three of them may carry tweezers or scissors or hold a scalpel, while the fourth holds a laparoscope), and image processing equipment. Another successful surgical robot is the CyberKnife
<sup>®</sup>
(Accuray Inc., Sunnyvale, CA, USA) [
<xref ref-type="bibr" rid="b5-sensors-12-09423">5</xref>
]. The system has the ability to irradiate tumors very precisely even while the patient anatomy is moving due to breathing [
<xref ref-type="bibr" rid="b6-sensors-12-09423">6</xref>
]. The main idea of this approach is to avoid damaging the healthy tissue around the tumor. The system comprises a linear accelerator mounted on the wrist of a six-degrees-of-freedom robot arm. The robot system's task is to precisely orient the linear accelerator towards the tumor. For vision, the system uses two orthogonal X-ray cameras equipped with flat-panel detectors. For better accuracy, the system includes a navigation system for real-time patient tracking. Both the Da Vinci and CyberKnife systems make use of a variety of sensors necessary to perform their tasks.</p>
<p>Analyzing some successful surgical robotic systems, it becomes apparent that they are isolated efforts using different technology levels. Experts recognize that, in spite of the many solutions offered to introduce automation into the clinical environment, the field is still far from the standardization level achieved by industry. Standardization is the key to surgical robotics development [
<xref ref-type="bibr" rid="b2-sensors-12-09423">2</xref>
].</p>
<p>The mechanical part of a robot uses special sensors to measure the position not only of the end effector but also along its kinematic chain. The most widely used technologies for measuring position and its derivatives are resolvers, optical encoders, and magnetic encoders with Hall sensors. This sensing technology has opened the door to the development of robotic systems that are able to interact with changing environments. Hirzinger
<italic>et al.</italic>
(Institute of Robotics and Mechatronics of the German Aerospace Center) developed a robot system for safe interaction with humans [
<xref ref-type="bibr" rid="b7-sensors-12-09423">7</xref>
]. The robust compliance control of these systems allows the user to pull or push the robot arm by hand, and the system will move as if it had no weight. This type of control is normally referred to as “soft robotics” or “hands-on robotics”. The MiroSurge system, which uses similar control schemes, was designed for minimally invasive surgical operations [
<xref ref-type="bibr" rid="b8-sensors-12-09423">8</xref>
].</p>
<p>The main advantage of robotic assistance is the possibility to enhance or extend the hands and eyes of the surgeon during surgery. Controlling a surgical robotic system must be easy and intuitive for users who are not familiar with robotics. Therefore, in the development of a surgical robot, a proper human-machine interface is essential. The word haptics is defined in robotics as the real and simulated touch interactions between robots, humans, and real, remote, or simulated environments [
<xref ref-type="bibr" rid="b9-sensors-12-09423">9</xref>
]. The Da-Vinci system, for example, utilizes a haptic device for the intuitive manipulation of the robot [
<xref ref-type="bibr" rid="b4-sensors-12-09423">4</xref>
]. Other surgical systems use joysticks [
<xref ref-type="bibr" rid="b10-sensors-12-09423">10</xref>
] and haptic devices like the Phantom Omni device (Sensable Technologies Inc., Wilmington, MA, USA).</p>
<p>One essential part of the development of surgical robotics is the capability to visualize the operating area. It is useful to classify visualization sensing modalities into real-time methods, which provide continuous visualization of the area of interest, and non-real-time methods, which are typically used for preoperative diagnosis and planning [
<xref ref-type="bibr" rid="b11-sensors-12-09423">11</xref>
].</p>
<p>Commonly used real-time sensing modalities are endoscopes, ultrasound, fluoroscopy, and optical coherence tomography (OCT). Endoscopes have been the most successful visualization method. They are usually used for minimally invasive operations [
<xref ref-type="bibr" rid="b5-sensors-12-09423">5</xref>
,
<xref ref-type="bibr" rid="b12-sensors-12-09423">12</xref>
]. Unfortunately, they cannot provide further information from inside the tissue. In contrast, ultrasound provides real-time 2D images from inside the tissue, but only a skilled clinician is able to use this technique properly. Fluoroscopy is a technique that offers high spatial resolution, so that submillimeter-sized objects can be resolved. Its 2D image clearly shows contrast between different materials (such as bone and liver) and different tissue densities (such as the heart and lungs) [
<xref ref-type="bibr" rid="b13-sensors-12-09423">13</xref>
]. The biggest limitation of fluoroscopy is that overlying structures are all reduced to a single image plane.</p>
<p>The most important non-real-time visualization systems deployed in surgical robotics are X-rays, Computed Tomography (CT), Magnetic Resonance (MR), and Positron Emission Tomography (PET). Previous work has been done using CT (e.g., the ROBODOC and CASPAR systems [
<xref ref-type="bibr" rid="b14-sensors-12-09423">14</xref>
,
<xref ref-type="bibr" rid="b15-sensors-12-09423">15</xref>
], used for total hip and knee replacement). X-ray technology is employed in the CyberKnife system for localizing the target position. MR is the imaging technique that offers the most accurate tissue differentiation. Recently, specially designed robots, built out of non-metallic parts, have been used together with MR [
<xref ref-type="bibr" rid="b16-sensors-12-09423">16</xref>
].</p>
<p>FD-CT is a technology that combines fluoroscopy (real-time) and CT (non-real-time) in a single device, consisting of a C-arm equipped with flat-panel detectors. Compared to X-ray film and image intensifiers, FD technology offers higher dynamic range, dose reduction, and fast digital readout, while keeping a compact design [
<xref ref-type="bibr" rid="b17-sensors-12-09423">17</xref>
]. Although FD-CT provides higher spatial resolution than conventional CT, it has a few disadvantages, such as a smaller field of view and lower temporal resolution [
<xref ref-type="bibr" rid="b17-sensors-12-09423">17</xref>
]. Nevertheless, FD-CT has already proved valuable for planning and intraoperative surgery [
<xref ref-type="bibr" rid="b18-sensors-12-09423">18</xref>
<xref ref-type="bibr" rid="b20-sensors-12-09423">20</xref>
]. C-arms are characterized by their flexibility and ease of use; in particular, by the possibility of choosing arbitrary angulations.</p>
<p>One of the impediments to using robot systems with imaging devices is that the gantry of the latter is not big enough to house both the patient and the kinematics of some robots [
<xref ref-type="bibr" rid="b21-sensors-12-09423">21</xref>
]. Many collision issues arise from these kinds of setups, which is one of the reasons many researchers have built small dedicated robot systems that fit into the remaining space. A new FD-CT system (Artis zeego, Siemens Healthcare, Forchheim, Germany) employs a robot arm (KUKA Robots, Gersthofen, Germany) for increased movement flexibility [
<xref ref-type="bibr" rid="b17-sensors-12-09423">17</xref>
,
<xref ref-type="bibr" rid="b22-sensors-12-09423">22</xref>
]. This system can be used for flexible intraoperative imaging and could be coordinated with other robotic systems to assist the surgeon.</p>
<p>Besides encoders and imaging devices as sensing modalities, localizers have been studied in order to analyze their benefits for surgical robotics applications [
<xref ref-type="bibr" rid="b23-sensors-12-09423">23</xref>
,
<xref ref-type="bibr" rid="b24-sensors-12-09423">24</xref>
]. These devices track the position of instruments relative to the patient's anatomy. The instrument could be a surgical tool held by a robotic arm. In the CyberKnife system, an optical localizer is deployed to track the position of the patient, which is prone to movement due to respiration [
<xref ref-type="bibr" rid="b5-sensors-12-09423">5</xref>
].</p>
<p>In this work, the concept and implementation of a fully sensorized robotic surgical system are presented. The proposed system utilizes a variety of concepts employed in surgical robotics, such as haptics, soft robotics, visualization, and external tracking. The surgical system comprises two coordinated robot arms: one carries out the surgical task and the other provides precise target visualization. The system was adapted for treating injuries where the insertion of a needle into the anatomy is commonly carried out in order to extract tissue samples for further analysis or to inject substances for therapy. The authors emphasize the description of the sensor technology employed in the system. In addition to conventional standards normally employed in the medical environment, the system utilizes standards from the automation industry.</p>
</sec>
<sec sec-type="materials|methods">
<label>2.</label>
<title>Materials and Methods</title>
<p>In
<xref ref-type="fig" rid="f1-sensors-12-09423">Figure 1</xref>
, a representation of the main system components is displayed. For needle insertion, a serial robotic system is utilized. It has a special needle holder attached to its wrist; this mechanism allows the clinician to insert the needle manually. The robotic system comprises a real-time controller (from the manufacturer) and an application controller. It was designed to be manually controlled by means of a touch screen and an industrial joystick. In order to position the robotic system easily along the CT table, all the components are mounted on a mobile trolley. In this way, the system can be easily positioned and removed from the patient table. For target visualization, a robot-driven angiography system, equipped with a flat-panel detector, is deployed. This special C-arm can be positioned along the operating table, providing full body coverage. Contrary to conventional C-arms, which only rotate around a fixed position, this imaging system can be adjusted to scan anatomical targets with different angles and convex-shaped trajectories. Once a scan is taken, it sends the reconstructed 3D images to the application controller for planning. In addition, 2D projections can be used to get real-time target visualization. Based on the images, the surgeon can choose a target and an appropriate entry point. The other important part of the system is an optical localizer, which precisely tracks the needle position by means of the reference frame attached to the needle holder. Additionally, it also tracks the patient position by means of a reference frame attached to the patient.</p>
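The tracking chain described above can be made concrete with a short sketch. It assumes the optical localizer reports each dynamic reference frame (DRF) as a 4×4 homogeneous transform in camera coordinates; the function name and toy numbers are illustrative, not part of the original system.

```python
import numpy as np

def needle_in_patient(T_cam_needle: np.ndarray, T_cam_patient: np.ndarray) -> np.ndarray:
    """Express the tracked needle-holder DRF in patient-DRF coordinates.

    Both arguments are 4x4 homogeneous transforms reported by the optical
    localizer in camera coordinates. Composing them this way makes the
    result invariant to small patient (table) movements.
    """
    return np.linalg.inv(T_cam_patient) @ T_cam_needle

# Toy example: patient frame translated 100 mm along x in camera coordinates.
T_cam_patient = np.eye(4); T_cam_patient[0, 3] = 100.0
T_cam_needle = np.eye(4); T_cam_needle[0, 3] = 130.0
print(needle_in_patient(T_cam_needle, T_cam_patient)[:3, 3])  # -> [30.  0.  0.]
```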
<p>The natural haptic feedback while inserting a needle gives important information about the characteristics of the surrounding tissue along the trajectory; among these characteristics, it is possible to feel non-uniform toughness and tissue elasticity. Therefore, it was decided to use the robot only to position and orient the needle. Once the desired needle direction is reached by the robot, the surgeon's task is to insert the needle carefully. In this way, the experience of the clinician, who remains in control of the surgery, is taken into account.
<xref ref-type="fig" rid="f2-sensors-12-09423">Figure 2</xref>
shows the main system components in an interventional suite. The angiography system (Artis zeego, Siemens Healthcare) comprises a serial robot (KUKA Robots) with a C-arm attached to its wrist.</p>
<sec>
<label>2.1.</label>
<title>Workflow</title>
<p>Most robotic needle placement setups utilize the workflow introduced by Masamune
<italic>et al.</italic>
[
<xref ref-type="bibr" rid="b25-sensors-12-09423">25</xref>
]. Although our system's workflow is similar, it introduces some innovations. One of these is so-called target pivoting, which gives the flexibility to change the insertion point while the target stays fixed. This method is not possible with RCM robots, which normally use the workflow introduced by Masamune. The proposed workflow is described in the following steps (a minimal sketch of the sequence follows the list):</p>
<list list-type="order">
<list-item>
<p>
<bold>Preparation:</bold>
The patient is stabilized on the CT-table and a patient-image registration device is fixed according to the procedure described in [
<xref ref-type="bibr" rid="b26-sensors-12-09423">26</xref>
].</p>
</list-item>
<list-item>
<p>
<bold>Imaging:</bold>
A 3D scan is acquired with the angiographic C-arm. The reconstructed CT images are transferred instantly to the navigation suite.</p>
</list-item>
<list-item>
<p>
<bold>Planning:</bold>
Once the images are displayed on a touch-screen monitor, the clinician defines entry and target points.</p>
</list-item>
<list-item>
<p>
<bold>Interactive positioning:</bold>
The robot trolley is placed beside the patient. Then, the clinician grasps the robot, activates the interactive positioning control (which will be described in Section 2.3), and moves the robot arm until the tool tip is above the entry point.</p>
</list-item>
<list-item>
<p>
<bold>Automatic positioning:</bold>
Once the clinician activates the dead-man switch, the robot orients the needle holder towards the planned target.</p>
</list-item>
<list-item>
<p>
<bold>Repositioning (teleoperation mode):</bold>
If required, the entry point can be changed with a joystick using the target pivoting option. During this procedure, the needle trajectory is continuously displayed in the 3D images.</p>
</list-item>
<list-item>
<p>
<bold>Needle insertion:</bold>
The needle is manually inserted by the clinician using the robot's needle holder as a guide. A confirmation scan can be performed with fluoroscopy or with a full 3D CT scan.</p>
</list-item>
<list-item>
<p>
<bold>Intervention or therapy:</bold>
Once the needle hits the target, the tissue sample can be taken or, in the case of an ablation, the therapy is performed.</p>
</list-item>
</list>
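As announced above, here is a minimal sketch of the workflow sequence, with the repositioning step optional; the enum names are ours and merely mirror the bold step labels in the list, not identifiers from the original system.

```python
from enum import Enum, auto

class Phase(Enum):
    PREPARATION = auto()
    IMAGING = auto()
    PLANNING = auto()
    INTERACTIVE_POSITIONING = auto()
    AUTOMATIC_POSITIONING = auto()
    REPOSITIONING = auto()   # optional teleoperation-mode step
    NEEDLE_INSERTION = auto()
    INTERVENTION = auto()

WORKFLOW = list(Phase)  # Enum preserves definition order

def next_phase(current: Phase, repositioning_needed: bool = False) -> Phase:
    """Advance through the workflow; REPOSITIONING may be skipped."""
    i = WORKFLOW.index(current) + 1
    if WORKFLOW[i] is Phase.REPOSITIONING and not repositioning_needed:
        i += 1
    return WORKFLOW[i]

print(next_phase(Phase.AUTOMATIC_POSITIONING))        # Phase.NEEDLE_INSERTION
print(next_phase(Phase.AUTOMATIC_POSITIONING, True))  # Phase.REPOSITIONING
```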
</sec>
<sec>
<label>2.2.</label>
<title>Robot Control Modes</title>
<p>As shown in the workflow, the robotic system has three different control strategies:</p>
<p>
<bold>
<italic>Interactive mode:</italic>
</bold>
The robot arm is freely maneuverable, as if it had no weight, in all directions of Cartesian space (gravity compensation control). The interactive mode is activated by grasping the robot handle and sequentially pressing two buttons attached to it. It is also possible to change the robot's elbow position by pushing it.</p>
<p>
<bold>
<italic>Image guided mode:</italic>
</bold>
The robot moves according to patient-specific planning data measured by the navigation system.</p>
<p>
<bold>
<italic>Teleoperation mode:</italic>
</bold>
The user controls the robot arm with a joystick in TCP (tool center point) coordinates. The entry angle can be readjusted while the needle holder keeps pointing at the target. This is quite useful, since the user can watch the new trajectory in the 3D images and choose the most convenient one.</p>
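Geometrically, the target pivoting used in this teleoperation mode amounts to re-aiming the needle axis at the fixed target from each new entry point. A minimal sketch follows; the function name is hypothetical, and positions are 3-vectors in any common frame.

```python
import numpy as np

def pivot_orientation(entry: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Unit direction the needle holder must point along so that the
    needle axis passes through the fixed target from the new entry point."""
    d = target - entry
    n = np.linalg.norm(d)
    if n < 1e-9:
        raise ValueError("entry and target coincide")
    return d / n

# Moving the entry point while the target stays fixed only re-orients the tool:
target = np.array([0.0, 0.0, 0.0])
print(pivot_orientation(np.array([50.0, 0.0, 80.0]), target))
print(pivot_orientation(np.array([30.0, 20.0, 80.0]), target))
```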
</sec>
<sec>
<label>2.3.</label>
<title>Robot Trolley</title>
<p>As pointed out, the needle insertion robot is mounted on a mobile trolley, together with its manufacturer controller and a dedicated controller for the application (
<xref ref-type="fig" rid="f3-sensors-12-09423">Figure 3</xref>
). The trolley can be placed near the operating table so that the robot arm can be positioned near the patient without interfering with the C-arm trajectory. To make this intuitive, the robot has special control modes. The needle insertion robot is a third-generation DLR/KUKA Light Weight Robot (LWR III) [
<xref ref-type="bibr" rid="b7-sensors-12-09423">7</xref>
] especially designed for safe human-robot interaction. Because this robot has carbon fiber covers and an aluminum skeleton, it weighs only 14 kg. All sensors (including encoders, bumpers, and others), motor controllers, and cables are integrated into the arm, which makes this robot well suited for manipulation in a crowded environment in which safety is of major concern. The robot has seven rotary joints; in contrast to six-d.o.f. robots, its additional joint allows the elbow position to be changed without affecting the pose of the robot's tool. In every joint of the robot, a torque sensor measures the forces exerted. One of the major advantages of this setup is that the robot can be used in a so-called gravity compensation mode. In this control mode, the robot arm can be moved by hand with almost no resistance [
<xref ref-type="bibr" rid="b27-sensors-12-09423">27</xref>
]. Once the user stops pulling or pushing it (on any part of its structure), it stays in its position waiting for the next movement. It behaves much like an object inside a spaceship, where gravity is not present. If parameters of this control mode such as virtual weight, friction, and spring force (which change the behavior of the compensation mode) need to be adjusted, the programming interface of the manufacturer controller provides the option to do so. The manufacturer real-time controller (KRC, KUKA Robot Controller) and the application controller are mounted at the bottom of the mobile platform. Attached to the trolley's inner frame are different sensors for safety purposes, which will be described in the control system section.</p>
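The gravity compensation behavior described above is realized inside the manufacturer controller using the joint torque sensors. Purely as an illustration of the idea, a simplified joint-space compliance law might look as follows; the gravity model, gains, and function name are assumptions for this sketch, not the LWR III implementation.

```python
import numpy as np

def compliance_torque(q, dq, q_rest, gravity, k=0.0, d=2.0):
    """Illustrative joint-space law: with stiffness k = 0 the arm holds no
    set-point and yields to the hand; the damping d dissipates motion so the
    arm stays where it is released; gravity(q) cancels the arm's own weight,
    making it feel weightless."""
    return gravity(q) - k * (np.asarray(q) - np.asarray(q_rest)) - d * np.asarray(dq)

# Toy 2-joint example with a made-up gravity model:
g = lambda q: np.array([9.81 * np.cos(q[0]), 0.0])
print(compliance_torque([0.5, 0.1], [0.0, 0.0], [0.5, 0.1], g))
```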
</sec>
<sec>
<label>2.4.</label>
<title>Grip and Needle Tool Holder</title>
<p>A handle with two grips is attached to the robot's wrist for user handling. The idea behind the use of two grips is that the user can grasp it from either side of the patient table. Each grip has two push buttons, one at the top and the other on the inner side (
<xref ref-type="fig" rid="f4-sensors-12-09423">Figure 4(a)</xref>
). By pressing the push buttons, the gravity compensation mode can be activated. For safety reasons, this interactive mode can only be enabled when both push buttons are pressed, the upper one with the thumb and the lower one with the forefinger. At the end of the handle, a passive tool changer is mounted (GRIP GmbH Handhabungstechnik, Dortmund, Germany) in case the medical application needs a different tool. It can be seen in
<xref ref-type="fig" rid="f4-sensors-12-09423">Figure 4(a)</xref>
that, starting from the passive tool changer, the robot can be covered by a sterile drape to protect it from patient blood and other fluids.</p>
<p>In order to track the needle with the optical localizer, a dynamic reference frame (DRF) is attached to the needle holder (
<xref ref-type="fig" rid="f4-sensors-12-09423">Figure 4(a)</xref>
). The device that carries the needle is the beige piece at the front of the tool. It is fabricated entirely from PEEK to ensure biocompatibility; in addition, this material's properties make it artifact-free in the CT images. The needle holder can carry different inserts to support varying needle or tool diameters.</p>
</sec>
<sec>
<label>2.5.</label>
<title>Control System</title>
<p>As mentioned before, the KUKA/DLR LWR III is a serial robot. These kinds of robots have excellent repeatability, but their absolute positioning accuracy is not outstanding due to small inaccuracies in their kinematics or calibration errors that increase over time. These inaccuracies have less impact when differential motion commands are given to the robot, meaning that the robot moves in relation to its last position instead of to an absolute position. Based on this assumption, the present approach consists of locating the TCP (Tool Center Point) position and performing small movements taking the current position as the origin.</p>
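A minimal sketch of such a differential command follows, assuming TCP positions in millimeters measured by the navigation system; the clamping bound and function name are hypothetical, not values from the paper.

```python
import numpy as np

MAX_STEP_MM = 0.5  # clamp on the per-cycle correction; illustrative safety bound

def correction_vector(tcp_measured: np.ndarray, tcp_desired: np.ndarray) -> np.ndarray:
    """Differential command: a small displacement relative to the *current*
    measured TCP position, so absolute kinematic inaccuracies cancel out
    over successive closed-loop cycles instead of accumulating."""
    delta = tcp_desired - tcp_measured
    n = np.linalg.norm(delta)
    return delta if n <= MAX_STEP_MM else delta * (MAX_STEP_MM / n)

print(correction_vector(np.array([10.0, 0.0, 0.0]), np.array([12.0, 0.0, 0.0])))
# -> [0.5 0.  0. ]  (the target is reached over several control cycles)
```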
<sec>
<label>2.5.1.</label>
<title>Application Controller</title>
<p>The application controller's main task is to centralize the data coming from all the system components, process this information, and send orders to be carried out by the actuators. The application controller gets sensor data from the real-time controller (robot pose and force measurements), the optical localizer (reference frame positions and orientations), the robot-driven angiographic system (2D projections and 3D image reconstructions), a touch screen (user planning instructions), and a joystick (user movement commands). The authors implemented the controller on a barebone PC with Windows XP as the operating system.
<xref ref-type="fig" rid="f5-sensors-12-09423">Figure 5</xref>
shows an overview of the system components and their communication protocols with the application controller.</p>
<p>The application controller has a state machine whose transitions are triggered depending on the current state and the data coming from the system components. Examples of these transitions are when the user enters commands on the touch screen, an image is ready to use, the interactive control is activated, safety-related data from the KRC have arrived,
<italic>etc.</italic>
Internal robot safety features such as velocity limitation and force monitoring are processed in real time by the KRC. External safety-related emergency buttons and activation buttons (on the handle) are connected to the KRC and to the application controller through a DeviceNet link.</p>
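A minimal sketch of such an event-triggered state machine follows; the states, event strings, and transition table are illustrative stand-ins for the actual ones, with safety data preempting all other events.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    PLANNING = auto()
    INTERACTIVE = auto()
    MOVING = auto()
    FAULT = auto()

# (state, event) -> next state; events mirror the transitions named above
TRANSITIONS = {
    (State.IDLE, "image_ready"): State.PLANNING,
    (State.PLANNING, "user_command"): State.INTERACTIVE,
    (State.INTERACTIVE, "interactive_done"): State.MOVING,
}

def step(state: State, event: str) -> State:
    if event == "safety_stop":  # safety-related data from the KRC wins
        return State.FAULT
    return TRANSITIONS.get((state, event), state)  # unknown events: stay put

s = step(State.IDLE, "image_ready")   # State.PLANNING
print(step(s, "safety_stop"))         # State.FAULT
```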
</sec>
<sec>
<label>2.5.2.</label>
<title>Robot Sensor Interface</title>
<p>Dynamic data on the robot pose and motion commands are cyclically exchanged through the KUKA Robot Sensor Interface (RSI) [
<xref ref-type="bibr" rid="b28-sensors-12-09423">28</xref>
,
<xref ref-type="bibr" rid="b29-sensors-12-09423">29</xref>
]. The RSI real-time interface is the solution offered by the robot manufacturer for coupling sensors to its controllers. Fundamental data-communication mechanisms are collected in the RSI, which is modularly structured and embedded into the KUKA programming environment. It supports synchronous and asynchronous data transport based on industrial communication standards (Fieldbus, Ethernet). In this research, the data exchange between the sensor (in this case the optical localizer) and the robot is done using XML messages. Sensor data is processed within the real-time kernel of the KUKA controller using predefined function modules (
<italic>i.e.</italic>
, digital filters, transformations, control algorithms) that are combined in a sensor function library of approximately 100 different modules. Processing tasks can be executed within one cycle of the Cartesian motion interpolation (12 ms), so that sensor signals can influence the robot position during motion.</p>
<p>A TCP/IP link from the KRC to the application controller was established in order to transfer XML data. In every interpolation cycle of the KRC, an XML package containing the actual robot position, the joint angles, the measured axis forces and the motor currents is sent to the application controller. Based on this data, the application controller calculates an XML package including a correction vector for the TCP. The KRC processes the received package only if it arrives within the same time slot.</p>
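<p>A minimal sketch of this cyclic exchange is shown below. The actual RSI XML schema is defined by the KUKA configuration and is not reproduced in the paper, so all tag names here are illustrative placeholders.</p>

```python
import socket
import xml.etree.ElementTree as ET

# Hypothetical tag names; the real RSI schema is configuration-defined.
def parse_robot_packet(data: bytes):
    root = ET.fromstring(data)
    pose = [float(root.findtext(f"Pose/{axis}", "0")) for axis in "XYZABC"]
    forces = [float(root.findtext(f"Force/A{i}", "0")) for i in range(1, 8)]
    return pose, forces

def build_correction_packet(correction) -> bytes:
    root = ET.Element("Correction")
    for name, value in zip("XYZABC", correction):
        ET.SubElement(root, name).text = f"{value:.4f}"
    return ET.tostring(root)

def exchange_once(sock: socket.socket, compute_correction):
    """One 12 ms exchange: receive the robot state, answer in the same slot."""
    data = sock.recv(4096)
    pose, forces = parse_robot_packet(data)
    correction = compute_correction(pose, forces)   # 6-vector (X, Y, Z, A, B, C)
    sock.sendall(build_correction_packet(correction))
```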
</sec>
<sec>
<label>2.5.3.</label>
<title>Flow Diagram</title>
<p>The system program sequence can be seen in
<xref ref-type="fig" rid="f6-sensors-12-09423">Figure 6</xref>
. After booting, the application controller initializes the DeviceNet protocol and opens a communication channel with the KRC. Internal variables of the KRC that have a direct influence on the robot's configuration and motion can be controlled externally by the application controller. Such variables can, for example, switch the brakes on and off, trigger an external stop, select a program, or signal that a configuration lock is active or that the robot kinematics are not calibrated,
<italic>etc.</italic>
Then the joystick and the camera are initialized: a USB channel is created for the joystick and a serial link is established for the camera. Afterwards, the robot initialization is started. This includes program selection (on the KRC) over the opened DeviceNet channel. The security locks are then acknowledged, the robot brakes are released, and the robot moves to the programmed initial position. A TCP/IP channel is opened in order to exchange data through the RSI. After a command from the application controller, the RSI data exchange starts. This exchange consists of two control loops: one in the application controller, called the central control loop, and a second one in the KRC. Using the RSI, the control loops trigger each other every 12 ms. In every cycle of the central control loop, the application controller receives data from the navigation system, the joystick and the bus terminal, and then sends the processed data to the KRC, which in turn carries out the instructions (adjusting the robot position). The KRC sends back a package containing the actual state of the robot. This data exchange is repeated until the user stops the program. If a package is delayed, the data transmission is considered broken and the robot stops. The control loop is explained in more detail in the next section.</p>
</sec>
<sec>
<label>2.5.4.</label>
<title>Robot Control Loop</title>
<p>The new position and orientation of the robot are continuously computed using the control loop shown in
<xref ref-type="fig" rid="f7-sensors-12-09423">Figure 7</xref>
. There are two ways to control the robot pose: either from the data obtained by the optical localizer or from the movement commands given by the user through the joystick.</p>
<p>The KRC transfers via RSI the actual pose of the robot (of the wrist) in robot base coordinates. The state machine checks which control mode is selected by the user. If the current mode is “Navigation mode”, the controller reads the positions and orientations of the tool and of the patient anatomy and uses this information to calculate the TCP set point. Given the robot pose and the set point, the controller estimates the offset using the transformations described in Section 2.6. If “Joystick mode” is activated, the measured value from this input device is taken as the offset. Finally, in either case, a PID controller computes the correction value that is sent to the KRC in TCP coordinates.</p>
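<p>The sketch below illustrates one cycle of this loop: the error is either the navigation-derived set-point offset or the joystick deflection, and a PID controller turns it into the correction sent to the KRC. The gains, names and translation-only simplification are assumptions for illustration, not the authors' implementation.</p>

```python
import numpy as np

class PID:
    """Per-axis PID acting on the TCP offset; gains are illustrative."""
    def __init__(self, kp, ki, kd, dt=0.012):          # 12 ms cycle time
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = np.zeros(3)
        self.prev_error = np.zeros(3)

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def control_step(mode, pid, tcp_pos, tcp_setpoint, joystick_offset):
    """One cycle of the central control loop (translation only, for brevity)."""
    if mode == "navigation":
        error = tcp_setpoint - tcp_pos     # offset from the Section 2.6 transforms
    else:                                  # "joystick" mode
        error = joystick_offset            # joystick deflection is the offset
    return pid.update(error)               # correction value sent to the KRC
```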
</sec>
<sec>
<label>2.5.5.</label>
<title>DeviceNet Link</title>
<p>A DeviceNet link is used to share additional inputs and outputs between the application controller and the KRC. These I/Os work independently from the RSI communication and are used to initialize the KUKA controller, to share additional information and to serve as interrupts. When the handle buttons on the robot are pressed, this input is converted into the DeviceNet protocol by a bus terminal (BK5250, BECKHOFF New Automation Technology GmbH, Verl, Germany) and is finally shared with both controllers.</p>
<p>DeviceNet is a communication protocol used in the automation industry to interconnect control devices for data exchange. It uses a controller area network as the backbone technology and defines an application layer to cover a range of device profiles. Typical applications include information exchange, safety devices, and large I/O control networks. DeviceNet is a widespread standard in the automation industry and is heavily used by KUKA thanks to its real-time capabilities. KUKA controllers are equipped with a DeviceNet card that can be used to share information with external PLCs (programmable logic controllers) or computers.</p>
<p>The KR C2 lr controller was already equipped with a DeviceNet master link. Therefore, a DeviceNet slave card was installed in the application controller and connected to the one of the KR C2 lr. The card used is an AnyBus-PCI DeviceNet slave with a baud rate of 500 kbit/s and 512 programmable I/O bytes (HMS Industrial Networks AB, Halmstad, Sweden). The DeviceNet data is shared between the application controller, the KUKA controller and the bus terminal (
<xref ref-type="fig" rid="f8-sensors-12-09423">Figure 8</xref>
).</p>
<p>The DeviceNet hardware is mounted in the rack attached to the robot platform. Some of the physical connections can be seen in
<xref ref-type="fig" rid="f9-sensors-12-09423">Figure 9</xref>
, including the DeviceNet bus terminal, the power supply and the cables that control the robot system. The bus terminal uses a 24 V DC power supply, which is distributed through a terminal block.</p>
</sec>
</sec>
<sec>
<label>2.6.</label>
<title>Navigation</title>
<p>An optical localizer (Polaris, NDI, Waterloo, ON, Canada) was selected for the development and evaluation of the robotic system. In a final clinical setup, a commercial navigation system with a planning station will be used. The optical localizer tracks the position and orientation of dynamic reference frames (DRF). These frames carry four retro-reflecting spheres, which the optical localizer detects with high precision (with three or more spheres it is possible to construct a coordinate system with its center in one of the spheres). The optical localizer has an update rate of 20 Hz and a localization accuracy of 0.35 mm (rms). Since the control loop runs at 83 Hz, an optical localizer with a higher acquisition rate would be a better option, but none was at hand. In this study, the optical localizer was not yet attached to the C-arm.</p>
<p>In
<xref ref-type="fig" rid="f10-sensors-12-09423">Figure 10</xref>
, all coordinate systems are presented. The optical localizer measures the transformations
<italic>
<sup>Cam</sup>
</italic>
<bold>T</bold>
<italic>
<sub>RobRef</sub>
</italic>
and the patient DRF
<italic>
<sup>Cam</sup>
</italic>
<bold>T</bold>
<italic>
<sub>PatRef</sub>
</italic>
in camera coordinates. Getting the transformation
<italic>
<sup>PatRef</sup>
</italic>
<bold>T</bold>
<italic>
<sub>Ima</sub>
</italic>
is the goal of the registration process.</p>
<p>The TCP is located at the lower end of the needle holder. During surgery, it has to be positioned above the skin surface pointing to the target in order to introduce the needle. The desired position of the TCP,
<italic>TCP
<sub>des</sub>
</italic>
, is selected in the CT data set (with
<italic>Ima</italic>
coordinates) in the planning step and can be, if necessary, changed with instructions coming from the joystick. The registration transformation
<italic>
<sup>PatRef</sup>
</italic>
<bold>T</bold>
<italic>
<sub>Ima</sub>
</italic>
is used to determine the transformation
<italic>TCP
<sub>des</sub>
</italic>
in relation to
<italic>PatRef</italic>
:
<disp-formula id="FD1">
<label>(1)</label>
<mml:math id="mm1">
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">PatRef</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
<mml:mrow>
<mml:mtext mathvariant="italic">des</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">PatRef</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">Ima</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">P</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Ima</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
<mml:mrow>
<mml:mtext mathvariant="italic">des</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
<p>The offset between
<italic>TCP</italic>
and
<italic>TCP
<sub>des</sub>
</italic>
can be calculated by:
<disp-formula id="FD2">
<label>(2)</label>
<mml:math id="mm2">
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">PatRef</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">PatRef</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
where
<disp-formula id="FD3">
<label>(3)</label>
<mml:math id="mm3">
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">PatRef</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">PatRef</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msup>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
and
<disp-formula id="FD4">
<label>(4)</label>
<mml:math id="mm4">
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
<mml:mrow>
<mml:mtext mathvariant="italic">des</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">PatRef</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msup>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">PatRef</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
<mml:mrow>
<mml:mtext mathvariant="italic">des</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
where
<italic>
<sup>TCP</sup>
</italic>
<bold>T</bold>
<italic>
<sub>TCPdes</sub>
</italic>
has to be minimized to achieve the desired position.</p>
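<p>Equations (1)–(4) are a chain of homogeneous transformations. The following numpy sketch shows how the offset could be composed; it treats every quantity as a 4 × 4 homogeneous pose (the paper writes some of them as positions <bold>P</bold>), and the function names are illustrative assumptions.</p>

```python
import numpy as np

def inv(T):
    """Inverse of a 4x4 homogeneous transformation."""
    Ti = np.eye(4)
    R, p = T[:3, :3], T[:3, 3]
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ p
    return Ti

def tcp_offset(T_cam_patref, T_cam_robref, T_patref_ima,
               T_ima_tcp_des, T_robref_tcp):
    """Offset ^TCP T_TCPdes following Equations (1)-(4)."""
    T_patref_tcp_des = T_patref_ima @ T_ima_tcp_des      # Equation (1)
    T_patref_robref = inv(T_cam_patref) @ T_cam_robref   # Equation (3)
    T_patref_tcp = T_patref_robref @ T_robref_tcp        # Equation (2)
    return inv(T_patref_tcp) @ T_patref_tcp_des          # Equation (4)
```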
<p>For robot control, the offset has to be transformed to
<italic>RobWrist</italic>
coordinates. To do this, the calibration transformation
<italic>
<sup>RobRef</sup>
</italic>
<bold>T</bold>
<italic>
<sub>RobWrist</sub>
</italic>
has to be estimated first (see Section 2.9). The corresponding transformation chain is shown in
<xref ref-type="fig" rid="f11-sensors-12-09423">Figure 11</xref>
.</p>
</sec>
<sec>
<label>2.7.</label>
<title>System Transformations</title>
<p>In order to guide a needle into an anatomic area, the robot and the target area must be correlated. To do this, it is necessary to find a transformation between both coordinate systems. In the present approach, the optical localizer is used as an intermediate coordinate system: it tracks both the robot's current position and the patient's current position. The correlation between the patient's position, as measured with the optical localizer, and the 3D image is established in the registration process. To correlate the robot with the optical localizer, a calibration process is required.</p>
</sec>
<sec>
<label>2.8.</label>
<title>Registration</title>
<p>In contrast to orthopedic procedures where the fixation of a DRF to bones is possible, needle placement procedures are performed on soft tissue. No rigid fixation is possible due to tissue deformation. Therefore, a special registration method introduced by Nagel
<italic>et al.</italic>
was deployed [
<xref ref-type="bibr" rid="b26-sensors-12-09423">26</xref>
]. The shape of the device utilized in this method significantly reduces errors introduced by tissue deformation. It consists of a DRF attached to a frame that has an empty space in the center (to insert the needle) and CT markers distributed in a known geometry. A vacuum bag is used to restrict patient movement. The transformation from the DRF to the CT-marker coordinate system is known in advance and is used to obtain the transformation
<italic>
<sup>PatRef</sup>
</italic>
<bold>T</bold>
<italic>
<sub>Ima</sub>
</italic>
that registers the patient image to the navigation system.</p>
</sec>
<sec>
<label>2.9.</label>
<title>Calibration</title>
<p>The calibration process consists of finding the transformation
<italic>
<sup>RobRef</sup>
</italic>
<bold>T</bold>
<italic>
<sub>RobWrist</sub>
</italic>
and the transformation
<italic>
<sup>RobRef</sup>
</italic>
<bold>T</bold>
<italic>
<sub>TCP</sub>
</italic>
. Once these transformations are estimated, the whole chain of system transformations is known. For the user, the calibration procedure consists of two simple steps:</p>
<list list-type="order">
<list-item>
<p>Robot pivoting. A small iron tip is inserted into the needle holder. The robot is guided by hand using the gravity compensation mode and the iron tip is inserted into a fixed divot. With the optical localizer pointing at the robot's tool, the user pivots the robot with smooth rotational movements for about 30 seconds.</p>
</list-item>
<list-item>
<p>Automatic sequence. A reference DRF is attached near the robot's base (always within the camera's measurement volume). After a user command, the robot follows a sequence of movements.</p>
</list-item>
</list>
<p>Internally, during the pivoting step,
<italic>
<sup>RobWrist</sup>
</italic>
<bold>P</bold>
<italic>
<sub>TCP</sub>
</italic>
and
<italic>
<sup>RobRef</sup>
</italic>
<bold>P</bold>
<italic>
<sub>TCP</sub>
</italic>
are estimated. Both are necessary to calculate
<italic>
<sup>RobRef</sup>
</italic>
<bold>T</bold>
<italic>
<sub>RobWrist</sub>
</italic>
as will be described later. In the automatic sequence an algorithm to get
<italic>
<sup>RobRef</sup>
</italic>
<bold>T</bold>
<italic>
<sub>RobWrist</sub>
</italic>
is executed.</p>
<p>The calibration method implemented in this research is similar to those used in medical robotics applications with similar setups [
<xref ref-type="bibr" rid="b24-sensors-12-09423">24</xref>
,
<xref ref-type="bibr" rid="b30-sensors-12-09423">30</xref>
]. This method consists of finding the transformation from a fixed DRF (
<italic>Ref</italic>
) to the robot base coordinate system
<italic>RobBase</italic>
,
<italic>
<sup>Ref</sup>
</italic>
<bold>T</bold>
<italic>
<sub>RobBase</sub>
</italic>
, which is only useful for calibration. With this matrix, the transformation
<italic>
<sup>RobRef</sup>
</italic>
<bold>T</bold>
<italic>
<sub>RobWrist</sub>
</italic>
is then estimated. Robot pivoting helps to find
<italic>
<sup>RobRef</sup>
</italic>
<bold>P</bold>
<italic>
<sub>TCP</sub>
</italic>
and
<italic>
<sup>RobWrist</sup>
</italic>
<bold>P</bold>
<italic>
<sub>TCP</sub>
</italic>
. In step two, the robot TCP is moved through a defined workspace in order to measure two sets of corresponding TCP points, one in relation to the camera and one in relation to the robot base. By matching these two datasets, the transformation
<italic>
<sup>Ref</sup>
</italic>
<bold>T</bold>
<italic>
<sub>RobBase</sub>
</italic>
is calculated with a pair point method.</p>
<sec>
<label>2.9.1.</label>
<title>Robot Pivot Calibration</title>
<p>The pivot calibration consists of repetitively tilting (pivoting) a rigid instrument tip, inserted into a small orifice called a divot, about two spherical axes [
<xref ref-type="bibr" rid="b31-sensors-12-09423">31</xref>
]. The gravity compensation mode described in Section 2.3 was used for this purpose. Once this mode was activated, the robot handle was grasped by hand and the tip inserted into the divot of a fixed aluminum plate. While pivoting, the transformation
<italic>
<sup>RobBase</sup>
</italic>
<bold>T</bold>
<italic>
<sub>RobWrist</sub>
</italic>
was continuously obtained from the KRC and saved to a text file. At the same time, the transformation
<italic>
<sup>Cam</sup>
</italic>
<bold>T</bold>
<italic>
<sub>RobRef</sub>
</italic>
was also obtained and saved in a text file. The
<italic>TCP</italic>
translations relative to coordinate systems
<italic>RobBase</italic>
and
<italic>Cam</italic>
are estimated by finding the most invariant point in these pivot motions as:
<disp-formula id="FD5">
<label>(5)</label>
<mml:math id="mm5">
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">R</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
<mml:mo>(</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mo>|</mml:mo>
</mml:mtd>
<mml:mtd>
<mml:mo></mml:mo>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mo></mml:mo>
</mml:mtd>
<mml:mtd>
<mml:mo>|</mml:mo>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold">I</mml:mi>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">R</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>n</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mo>|</mml:mo>
</mml:mtd>
<mml:mtd>
<mml:mo></mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mo></mml:mo>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mo>_</mml:mo>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
<mml:mo>(</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mo></mml:mo>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mo>_</mml:mo>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>n</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="FD6">
<label>(6)</label>
<mml:math id="mm6">
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">R</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
<mml:mo>(</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mo>|</mml:mo>
</mml:mtd>
<mml:mtd>
<mml:mo></mml:mo>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mo></mml:mo>
</mml:mtd>
<mml:mtd>
<mml:mo>|</mml:mo>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold">I</mml:mi>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">R</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>n</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mo>|</mml:mo>
</mml:mtd>
<mml:mtd>
<mml:mo></mml:mo>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mo></mml:mo>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mo>_</mml:mo>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
<mml:mo>(</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mo></mml:mo>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mo>_</mml:mo>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>n</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
where
<italic>n</italic>
is the number of measurements, ${}^{RobBase}\mathbf{R}(i)_{RobWrist}$ is the rotation matrix and ${}^{RobBase}\mathbf{P}(i)_{RobWrist}$ the position vector of a single measured pose of the robot wrist in the robot's base coordinate system; ${}^{Cam}\mathbf{R}(i)_{RobRef}$ and ${}^{Cam}\mathbf{P}(i)_{RobRef}$ build the pose of the attached DRF in camera coordinates, and $\mathbf{I}$ is the $3 \times 3$ identity matrix. To find the desired values,
<xref rid="FD5" ref-type="disp-formula">Equations (5</xref>
and
<xref rid="FD6" ref-type="disp-formula">6)</xref>
can be expressed as:
<disp-formula id="FD7">
<label>(7)</label>
<mml:math id="mm7">
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:msubsup>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi></mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
<mml:mrow>
<mml:mi>T</mml:mi>
</mml:mrow>
</mml:msubsup>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi></mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msup>
<mml:mo></mml:mo>
<mml:msubsup>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi></mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
<mml:mrow>
<mml:mi>T</mml:mi>
</mml:mrow>
</mml:msubsup>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi></mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="FD8">
<label>(8)</label>
<mml:math id="mm8">
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:msubsup>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi></mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
<mml:mrow>
<mml:mi>T</mml:mi>
</mml:mrow>
</mml:msubsup>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi></mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msup>
<mml:mo></mml:mo>
<mml:msubsup>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi></mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
<mml:mrow>
<mml:mi>T</mml:mi>
</mml:mrow>
</mml:msubsup>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi></mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
<p>Here ${}^{RobBase}\mathbf{R}_{RobWrist}$ and ${}^{RobBase}\mathbf{P}_{RobWrist}$ (and, analogously, ${}^{Cam}\mathbf{R}_{RobRef}$ and ${}^{Cam}\mathbf{P}_{RobRef}$) denote the stacked coefficient matrix and the stacked right-hand side built in Equations (5) and (6) from the complete set of poses measured during pivoting. Since these stacked matrices are not square, the unknowns in
<xref rid="FD7" ref-type="disp-formula">Equations (7</xref>
and
<xref rid="FD8" ref-type="disp-formula">8)</xref>
are solved in the least-squares sense. Note that in pivot calibration the final result contains only the translation vector.</p>
<p>The pivot process took about 1 minute, during which n = 1,000 different poses were obtained. The maximum pivot angulation was 120°. To estimate the quality of the procedure, the residual error was computed as:
<disp-formula id="FD9">
<label>(9)</label>
<mml:math id="mm9">
<mml:mrow>
<mml:msub>
<mml:mi>e</mml:mi>
<mml:mrow>
<mml:mi mathvariant="italic">rms</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:msqrt>
<mml:mrow>
<mml:mfrac>
<mml:mn>1</mml:mn>
<mml:mi>N</mml:mi>
</mml:mfrac>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi></mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi></mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:msqrt>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="FD10">
<label>(10)</label>
<mml:math id="mm10">
<mml:mrow>
<mml:msub>
<mml:mi>e</mml:mi>
<mml:mrow>
<mml:mi mathvariant="italic">rms</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mi mathvariant="italic">RobRef</mml:mi>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:msqrt>
<mml:mrow>
<mml:mfrac>
<mml:mn>1</mml:mn>
<mml:mi>N</mml:mi>
</mml:mfrac>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi></mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">RobRef</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mi mathvariant="italic">RobRef</mml:mi>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi></mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="italic">RobRef</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:msqrt>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
<p>The result for the navigation part is typical for pivot calibration (see
<xref ref-type="table" rid="t1-sensors-12-09423">Table 1</xref>
). The higher error of the robot calibration may be attributed to mechanical instabilities. However, this error has little influence on the final calibration result thanks to the closed-loop control strategy.</p>
</sec>
<sec>
<label>2.9.2.</label>
<title>Pair Point Method</title>
<p>With the translation vectors derived from the pivot procedure, the transformation
<italic>
<sup>Ref</sup>
</italic>
T
<italic>
<sub>RobBase</sub>
</italic>
can be calculated using the pair point method (
<xref ref-type="fig" rid="f12-sensors-12-09423">Figure 12</xref>
). This method estimates the transformation between two coordinate systems using the singular value decomposition of the covariance matrix of a set of corresponding points [
<xref ref-type="bibr" rid="b32-sensors-12-09423">32</xref>
]. For both sets of measured points $\{\mathbf{a}_i\}_{i=1}^{n}$ and $\{\mathbf{b}_i\}_{i=1}^{n}$, the relation can be described by:
<disp-formula id="FD11">
<label>(11)</label>
<mml:math id="mm13">
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold">b</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold">R</mml:mi>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold">a</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mi mathvariant="bold">P</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>,</mml:mo>
<mml:mn>2</mml:mn>
<mml:mo>,</mml:mo>
<mml:mo></mml:mo>
<mml:mo>,</mml:mo>
<mml:mi>n</mml:mi>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold">a</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold">b</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo></mml:mo>
<mml:msup>
<mml:mi></mml:mi>
<mml:mn>3</mml:mn>
</mml:msup>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
<p>The similarity transformation parameters ($\mathbf{R}$: rotation, $\mathbf{P}$: translation) are those that minimize the mean squared error $e^2(\mathbf{R},\mathbf{P})$ between these two point sets:
<disp-formula id="FD12">
<label>(12)</label>
<mml:math id="mm14">
<mml:mrow>
<mml:msup>
<mml:mi>e</mml:mi>
<mml:mn>2</mml:mn>
</mml:msup>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi mathvariant="bold">R</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi mathvariant="bold">P</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
<mml:mo>=</mml:mo>
<mml:mfrac>
<mml:mn>1</mml:mn>
<mml:mi>n</mml:mi>
</mml:mfrac>
<mml:munderover>
<mml:mo></mml:mo>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
<mml:mi>n</mml:mi>
</mml:munderover>
<mml:mrow>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold">b</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo></mml:mo>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi mathvariant="bold">R</mml:mi>
<mml:msub>
<mml:mi mathvariant="bold">a</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mi mathvariant="bold">P</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
<p>For a detailed description of the least-squares fitting algorithm used to minimize
<italic>e</italic>
, the reader may refer to the work by Umeyama [
<xref ref-type="bibr" rid="b32-sensors-12-09423">32</xref>
].</p>
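<p>A minimal sketch of this point-pair estimation is given below: $\mathbf{R}$ and $\mathbf{P}$ minimizing Equation (12) are recovered from the SVD of the covariance matrix, with a guard against reflections. Refinements in [32], such as scale handling, are omitted here.</p>

```python
import numpy as np

def point_pair_registration(a, b):
    """Rigid (R, P) minimizing Equation (12) for corresponding point
    sets a, b of shape (n, 3), via SVD of the covariance matrix [32]."""
    a_mean, b_mean = a.mean(axis=0), b.mean(axis=0)
    H = (a - a_mean).T @ (b - b_mean)       # 3x3 covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                      # guard against reflections
    P = b_mean - R @ a_mean
    return R, P
```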
<p>By selecting the TCP coordinate system as a common point between both measurement systems (the robot encoders on one side and the navigation camera on the other), it is possible to calculate the rigid transformation between them. The unknown transformation matrix
<italic>
<sup>Ref</sup>
</italic>
<bold>T</bold>
<italic>
<sub>RobBase</sub>
</italic>
can be determined by moving the robot through the workspace and calculating the TCP position using
<xref rid="FD7" ref-type="disp-formula">Equations (7</xref>
,
<xref rid="FD8" ref-type="disp-formula">8)</xref>
:
<disp-formula id="FD13">
<label>(13)</label>
<mml:math id="mm15">
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="FD14">
<label>(14)</label>
<mml:math id="mm16">
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mtext mathvariant="bold">P</mml:mtext>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Ref</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">Ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msup>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Cam</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">P</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
</sec>
<sec>
<label>2.9.3.</label>
<title>Determination of the Rigid Transformation</title>
<p>The robot was moved to 175 different programmed positions. At every position the robot was stopped for 5 seconds. To filter noise, 100 measurements were taken at every position; the calculated mean values were then used in the following procedure. Fifty positions, distributed evenly in the robot's workspace, were used to determine
<italic>
<sup>Ref</sup>
</italic>
<bold>T</bold>
<italic>
<sub>RobBase</sub>
</italic>
according to the method described before.</p>
<p>The remaining 125 measured points were used to estimate the residual error of the obtained
<italic>
<sup>Ref</sup>
</italic>
<bold>T</bold>
<italic>
<sub>RobBase</sub>
</italic>
matrix; these points were distributed evenly within a workspace of (200 mm)
<sup>3</sup>
. Two different transformation chains were used to calculate the resulting difference at each point, called the estimated error:
<disp-formula id="FD15">
<label>(15)</label>
<mml:math id="mm17">
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Ref</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi mathvariant="italic">short</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Ref</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
<disp-formula id="FD16">
<label>(16)</label>
<mml:math id="mm18">
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Ref</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">(</mml:mo>
<mml:mtext mathvariant="italic">long</mml:mtext>
<mml:mo stretchy="false">)</mml:mo>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Ref</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
<p>In the ideal case the two chains coincide, so that the product of one chain with the inverse of the other equals the identity:
<disp-formula id="FD17">
<label>(17)</label>
<mml:math id="mm19">
<mml:mrow>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Ref</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi mathvariant="italic">short</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msup>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Ref</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">TCP</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">(</mml:mo>
<mml:mtext mathvariant="italic">long</mml:mtext>
<mml:mo stretchy="false">)</mml:mo>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold">I</mml:mi>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
<p>By applying
<xref rid="FD17" ref-type="disp-formula">Equation (17)</xref>
to the 125 different measurements, the root mean square residual error of the translational part was 0.81 mm, with a standard deviation of 0.41 mm (
<xref ref-type="table" rid="t2-sensors-12-09423">Table 2</xref>
).</p>
<p>As described before, the system's control loop uses
<italic>
<sup>RobRef</sup>
</italic>
<bold>T</bold>
<italic>
<sub>RobWrist</sub>
</italic>
, which can now be determined by calculating:
<disp-formula id="FD18">
<label>(18)</label>
<mml:math id="mm20">
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>[</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Ref</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobRef</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>]</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msup>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">Ref</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mrow>
<mml:mmultiscripts>
<mml:mi mathvariant="bold">T</mml:mi>
<mml:mprescripts></mml:mprescripts>
<mml:none></mml:none>
<mml:mrow>
<mml:mtext mathvariant="italic">RobBase</mml:mtext>
</mml:mrow>
</mml:mmultiscripts>
<mml:mo stretchy="false">(</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mtext mathvariant="italic">RobWrist</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:mi>i</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo></mml:mo>
<mml:mi>n</mml:mi>
</mml:mrow>
</mml:math>
</disp-formula>
for all measured positions. Noise is filtered by taking the mean value of every entry of the resulting matrices; the averaged matrix is afterwards made homogeneous again.</p>
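<p>The paper does not detail how the averaged matrix is made homogeneous again. One common approach, sketched below as an assumption, is to project the averaged rotation part back onto the nearest rotation matrix via SVD and to restore the homogeneous bottom row:</p>

```python
import numpy as np

def average_homogeneous(T_list):
    """Element-wise mean of the transforms from Equation (18), then
    re-orthonormalization of the rotation part (nearest rotation via SVD)."""
    T_mean = np.mean(np.asarray(T_list, dtype=float), axis=0)
    U, _, Vt = np.linalg.svd(T_mean[:3, :3])
    R = U @ Vt
    if np.linalg.det(R) < 0:                 # keep a proper rotation
        R = U @ np.diag([1.0, 1.0, -1.0]) @ Vt
    T_mean[:3, :3] = R
    T_mean[3, :] = [0.0, 0.0, 0.0, 1.0]      # restore the homogeneous row
    return T_mean
```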
</sec>
</sec>
<sec>
<label>2.10.</label>
<title>User Control Modes</title>
<p>A graphical user interface (GUI) was designed for easy selection of the robot's functions (
<xref ref-type="fig" rid="f13-sensors-12-09423">Figure 13</xref>
). Target coordinates relative to the patient can also be read from a text file. The GUI also informs the user when the robot or the patient is not visible to the navigation camera, and the current distance from the TCP to the target is continuously displayed.</p>
<p>The user control modes are explained next:</p>
<p>
<bold>
<italic>Only joystick</italic>
</bold>
. In this control mode the TCP can be moved in Cartesian coordinates according to
<xref ref-type="fig" rid="f14-sensors-12-09423">Figure 14</xref>
. A joystick movement to the left corresponds to a robot movement to the left, and likewise for right, forward and backward movements. By pressing the side joystick buttons, the robot can be moved back and forth along the needle direction (the x direction in TCP coordinates) at constant velocity. The robot internally calculates all the necessary transformations. If the user wants to change the TCP orientation while keeping its position, he only has to press the upper joystick button to initiate rotation about α and β (see
<xref ref-type="fig" rid="f14-sensors-12-09423">Figure 14</xref>
).</p>
<p>
<bold>
<italic>Automatic orientation</italic>
</bold>
. Once this mode is selected, the robot orients the TCP automatically towards the target as shown in
<xref ref-type="fig" rid="f15-sensors-12-09423">Figure 15</xref>
. It is still possible to change the TCP position using the joystick as in “only joystick” mode, but once the joystick is released (i.e., a new desired position is reached) the robot reorients itself toward the target, now from the new perspective. This control mode is quite helpful for finding new entry points: during the operation, the radiologist can move to different entry points and decide which one is most adequate. It may therefore be more helpful to radiologists than the RCM method mentioned above, where only one entry point can be chosen (otherwise the whole Cartesian positioning has to be repeated, which involves moving the robot manually to a different location on the skin and then pivoting again).</p>
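<p>One way to compute such an orientation is sketched below (Python/NumPy), assuming, as stated above, that the needle axis is the TCP x direction; the up-vector used to complete the frame is a hypothetical choice:</p>
<preformat>
import numpy as np

def aim_tcp_at_target(tcp_pos, target, up=np.array([0.0, 0.0, 1.0])):
    """Build a rotation matrix whose x-axis points from the current TCP
    position to the target; the TCP position itself is left unchanged."""
    x = target - tcp_pos
    x = x / np.linalg.norm(x)            # needle direction (TCP x-axis)
    y = np.cross(up, x)
    if np.linalg.norm(y) &lt; 1e-6:         # degenerate case: axis parallel to 'up'
        y = np.cross(np.array([0.0, 1.0, 0.0]), x)
    y = y / np.linalg.norm(y)
    z = np.cross(x, y)                   # completes a right-handed frame
    return np.column_stack((x, y, z))    # columns are the TCP axes in base coordinates
</preformat>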
<p>
<bold>Automatic orientation with fixed distance:</bold>
This control mode works like the automatic orientation mode, the only difference being that the user can specify the trajectory distance. The desired distance is entered on the GUI.</p>
<p>
<bold>Automatic orientation in plane:</bold>
When this mode is selected, the TCP can be moved with the joystick only along an imaginary plane positioned over the patient's skin, while the TCP keeps pointing at the target at all times.</p>
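<p>A sketch of this constraint (Python/NumPy; the plane is described by a point and a normal, which are hypothetical inputs, and the orientation is afterwards recomputed as in the automatic orientation mode):</p>
<preformat>
import numpy as np

def move_in_plane(tcp_pos, delta, plane_point, plane_normal):
    """Constrain a commanded displacement to an imaginary plane over the
    patient's skin: remove the out-of-plane motion component and snap the
    resulting position back onto the plane."""
    n = plane_normal / np.linalg.norm(plane_normal)
    delta_in_plane = delta - np.dot(delta, n) * n      # project motion onto the plane
    new_pos = tcp_pos + delta_in_plane
    new_pos -= np.dot(new_pos - plane_point, n) * n    # correct residual drift off the plane
    return new_pos
</preformat>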
</sec>
</sec>
<sec>
<label>3.</label>
<title>Results and Discussion</title>
<p>Preliminary accuracy tests of the developed components and procedures were performed. The overall error chain includes errors introduced by the imaging system, planning, patient registration, and unrecognized movement of patient tissue. Additional errors are introduced by the robotic system and its connection to the navigation system, namely the robot kinematic error, the robot calibration error, the navigation system measurement error, and the instrument calibration error. These errors were evaluated in three experiments. The first two, namely the evaluation of the kinematics and of the imaging system error, are described in [
<xref ref-type="bibr" rid="b10-sensors-12-09423">10</xref>
], which shows that the robot reaches a positioning accuracy similar to that of the optical localizer, 0.35 mm (rms). In this article, only the overall error was measured, in the following experiment.</p>
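<p>If these error sources are assumed to be statistically independent (an assumption for illustration, not a claim made in the paper), their rms contributions add in quadrature:</p>
<preformat>
e_{\mathrm{chain}} \approx \sqrt{ e_{\mathrm{kin}}^{2} + e_{\mathrm{cal}}^{2} + e_{\mathrm{nav}}^{2} + e_{\mathrm{instr}}^{2} }
</preformat>
<p>With the reported localizer accuracy of about 0.35 mm (rms) and the overall error of 1.2 mm (rms) found below, the remaining sources would then account for roughly \sqrt{1.2^{2} - 0.35^{2}} ≈ 1.15 mm.</p>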
<sec>
<label>3.1.</label>
<title>Accuracy Tests for Targeting a Needle</title>
<p>These measurements were performed using a specially designed testing device and an Artis zeego imaging system for error visualization. The testing device consists of nine rods with tips distributed at different positions (
<xref ref-type="fig" rid="f16-sensors-12-09423">Figure 16</xref>
). The five taller rods were 40 mm high (from base to tip), while the four smaller rods were 25 mm high. A DRF was attached to one side of the testing device. The distances from the DRF's coordinate system to the rod tips were known in advance. The construction accuracy of the testing device is about 0.01 mm.</p>
<p>For the experiment, the trolley with the robotic system was placed at one side of the CT table. Using the gravity compensation mode, the robot's tool was positioned over the testing device, which lay on the CT table. Through the graphical interface, the robot system was programmed with the position of the selected tip, and the TCP was positioned over that tip using the joystick control mode. After an automatic orientation command, issued from the graphical interface, the robot oriented the TCP toward the target. The angle from the rod's vertical axis to the TCP did not exceed 45 degrees. The experiment was performed with different trajectory lengths at each rod (30–60 mm). The robot was then stopped by activating its brakes, and a 150 mm needle with a 2 mm diameter was inserted until its tip reached the rod's tip. A CT scan (20 s, 200° rotation range) was performed and reconstructed using a high-resolution kernel with a 512 × 512 matrix and 0.13 mm voxel size. The distance error was defined as the distance from the needle's tip to the rod's tip measured in the CT images (
<xref ref-type="fig" rid="f17-sensors-12-09423">Figure 17</xref>
).</p>
<p>The experiment was repeated for all nine rods, approaching each from five different directions, for a total of 45 measurements. The resulting root mean square positioning error e
<sub>rms</sub>
is shown in
<xref ref-type="table" rid="t3-sensors-12-09423">Table 3</xref>
together with its standard deviation σ and the minimum and maximum deviations e
<sub>min</sub>
and e
<sub>max</sub>
, respectively.</p>
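<p>For reference, the statistics in Table 3 can be reproduced from the raw tip-to-tip distances as follows (Python/NumPy sketch; the variable <italic>distances</italic> stands in for the 45 measured values, which are not listed individually in the paper):</p>
<preformat>
import numpy as np

def targeting_statistics(distances):
    """Summarize needle-targeting errors as reported in Table 3."""
    d = np.asarray(distances, dtype=float)      # 45 needle-tip-to-rod-tip distances in mm
    return {'e_rms': np.sqrt(np.mean(d ** 2)),  # root mean square error
            'sigma': np.std(d, ddof=1),         # sample standard deviation
            'e_min': d.min(),
            'e_max': d.max()}
</preformat>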
</sec>
</sec>
<sec>
<label>4.</label>
<title>Conclusions/Outlook</title>
<p>In this research a fully sensorized cooperative robot system for surgical interventions was introduced. Such systems depend heavily on the information coming from different sensors; therefore, the primary focus of this paper lies on the sensor technology employed. The robot system was adapted for the placement of needles into anatomic areas such as the liver, kidneys, and lungs. In these kinds of operations, an interventional radiology procedure is commonly required for target visualization. The system uses an optical localizer for robot control and patient tracking. For target visualization, a robot-driven FD-CT was introduced, which gives the system the flexibility to move along the patient table using joysticks and pedals. The developed mobile robot platform can easily be positioned in an interventional suite. The LWR III control strategies allow the robot to be manipulated by hand. For fine movements, the robot can be manipulated via joystick while the target stays fixed, helping the clinician to choose among different entry points. The autoclavable tool holder can support different kinds of tools for different robot operations. An application controller was developed specifically for surgical applications that require real-time response; real-time control was made possible by the RSI-Ethernet interface.</p>
<p>Because the system comprises several components, namely the robot arm, the robot-driven FD-CT, and the optical localizer, a calibration process was required. For the clinician, this calibration is easy to carry out without technical assistance. Once calibration was done, it was visually confirmed that the robot reacted promptly to newly programmed poses, and no meaningful oscillations were observed in the steady state. When the patient reference frame was moved manually with slow movements, the robot mirrored the movement smoothly. Nevertheless, for large movements the robot does not react fast enough to mirror them. Therefore, the proposed setup can compensate only for small patient movements. Using Kalman filters and an optical localizer with a higher sampling frequency (100 Hz) would improve this reaction.</p>
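<p>A sketch of the suggested filtering (a per-axis constant-velocity Kalman filter on the tracked patient DRF position; the sampling rate and all tuning values below are assumptions, not parameters from the paper):</p>
<preformat>
import numpy as np

class ConstantVelocityKF:
    """Per-axis constant-velocity Kalman filter for a tracked DRF coordinate,
    usable to smooth and predict patient motion between camera frames."""
    def __init__(self, dt=1.0 / 20.0, q=1e-3, r=0.35 ** 2):
        self.F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition (assumed 20 Hz camera)
        self.H = np.array([[1.0, 0.0]])             # only position is measured
        self.Q = q * np.eye(2)                      # process noise (assumed tuning)
        self.R = np.array([[r]])                    # camera noise, ~0.35 mm rms
        self.x = np.zeros((2, 1))                   # state: [position, velocity]
        self.P = np.eye(2)

    def step(self, z):
        """Advance one camera frame with the new position measurement z (mm)."""
        self.x = self.F @ self.x                    # predict
        self.P = self.F @ self.P @ self.F.T + self.Q
        S = self.H @ self.P @ self.H.T + self.R     # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)    # Kalman gain
        self.x += K @ (np.array([[z]]) - self.H @ self.x)
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return float(self.x[0, 0])                  # filtered position
</preformat>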
<p>While targeting points with the robot system, the whole error chain is present. The most significant error sources include the robot calibration error, the optical localizer error, the testing device construction error, and the image reconstruction error. The obtained error of 1.2 mm with a standard deviation of ±0.4 mm seems acceptable but is insufficient for some critical applications. Using a navigation camera with higher accuracy and a smaller robot such as the MIRO [
<xref ref-type="bibr" rid="b33-sensors-12-09423">33</xref>
] may improve the accuracy.</p>
<p>With the components used, a line-of-sight problem emerges, mainly because many components are present in the same workspace. In this research, we partially solved this problem by attaching the navigation camera to the C-arm, as shown in
<xref ref-type="fig" rid="f1-sensors-12-09423">Figure 1</xref>
. Moving the C-arm does not affect the camera measurements, as they are taken in relation to a DRF. Finally, the navigation data could also be used for real-time 3D reconstruction.</p>
</sec>
</body>
<back>
<ack>
<p>We are grateful for support by KUKA AG, Augsburg, Germany, who provided the LWR system, by the National Council of Science and Technology of Mexico and by the German Ministry of Education and Research (BMBF Grant ‘OrthoMIT’, FKZ 01EQ0425).</p>
</ack>
<ref-list>
<title>References</title>
<ref id="b1-sensors-12-09423">
<label>1.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kwoh</surname>
<given-names>Y.S.</given-names>
</name>
<name>
<surname>Hou</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Jonckheere</surname>
<given-names>E.A.</given-names>
</name>
<name>
<surname>Hayati</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>A robot with improved absolute positioning accuracy for CT-guided stereotactic brain surgery</article-title>
<source>IEEE Trans. Biomed. Eng.</source>
<year>1988</year>
<volume>35</volume>
<fpage>153</fpage>
<lpage>160</lpage>
<pub-id pub-id-type="pmid">3280462</pub-id>
</element-citation>
</ref>
<ref id="b2-sensors-12-09423">
<label>2.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Cleary</surname>
<given-names>K.</given-names>
</name>
</person-group>
<article-title>Medical Robotics and the Operating Room of the Future</article-title>
<conf-name>Proceedings of the 2005 IEEE, Engineering in Medicine and Biology 27th Annual Conference</conf-name>
<conf-loc>Shanghai, China</conf-loc>
<conf-date>1–4 September 2005</conf-date>
</element-citation>
</ref>
<ref id="b3-sensors-12-09423">
<label>3.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kazanzides</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Fichtinger</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Hager</surname>
<given-names>G.D.</given-names>
</name>
<name>
<surname>Okamura</surname>
<given-names>A.M.</given-names>
</name>
<name>
<surname>Whitcomb</surname>
<given-names>L.L.</given-names>
</name>
<name>
<surname>Taylor</surname>
<given-names>R.H.</given-names>
</name>
</person-group>
<article-title>Surgical and interventional robotics-core concepts, technology, and design [Tutorial]</article-title>
<source>IEEE Robot. Autom. Mag.</source>
<year>2008</year>
<volume>15</volume>
<fpage>122</fpage>
<lpage>130</lpage>
<pub-id pub-id-type="pmid">20428333</pub-id>
</element-citation>
</ref>
<ref id="b4-sensors-12-09423">
<label>4.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Guthart</surname>
<given-names>G.S.</given-names>
</name>
<name>
<surname>Salisbury</surname>
<given-names>J.J.</given-names>
</name>
</person-group>
<article-title>The Intuitive Telesurgery System: Overview and Application</article-title>
<conf-name>Proceedings of the IEEE International Conference on Robotics and Automation</conf-name>
<conf-loc>San Francisco, CA, USA</conf-loc>
<conf-date>24–28 April 2000</conf-date>
<fpage>618</fpage>
<lpage>621</lpage>
</element-citation>
</ref>
<ref id="b5-sensors-12-09423">
<label>5.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Adler</surname>
<given-names>J.R.</given-names>
</name>
<name>
<surname>Murphy</surname>
<given-names>M.J.</given-names>
</name>
<name>
<surname>Chang</surname>
<given-names>S.D.</given-names>
</name>
<name>
<surname>Hankock</surname>
<given-names>S.L.</given-names>
</name>
</person-group>
<article-title>Image guided robotic radiosurgery</article-title>
<source>Neurosurgery</source>
<year>1999</year>
<volume>44</volume>
<fpage>1299</fpage>
<lpage>1306</lpage>
<pub-id pub-id-type="pmid">10371630</pub-id>
</element-citation>
</ref>
<ref id="b6-sensors-12-09423">
<label>6.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Schweikard</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Glosser</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Bodduluri</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Murphy</surname>
<given-names>M.J.</given-names>
</name>
<name>
<surname>Adler</surname>
<given-names>J.R.</given-names>
</name>
</person-group>
<article-title>Robotic motion compensation for respiratory movement during radiosurgery</article-title>
<source>Comput. Aided Surg.</source>
<year>2000</year>
<volume>5</volume>
<fpage>263</fpage>
<lpage>277</lpage>
<pub-id pub-id-type="pmid">11029159</pub-id>
</element-citation>
</ref>
<ref id="b7-sensors-12-09423">
<label>7.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Hirzinger</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Sporer</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Albu-Schäffer</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Hähnle</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Krenn</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Pascucci</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Schedl</surname>
<given-names>M.</given-names>
</name>
</person-group>
<article-title>DLR's Torque-Controlled Light Weight Robot III—Are We Reaching the Technological Limits Now?</article-title>
<conf-name>Proceedings of the IEEE International Conference on Robotics and Automation (ICRA ‘02)</conf-name>
<conf-loc>Washington, DC, USA</conf-loc>
<conf-date>11–15 May 2002</conf-date>
<fpage>1710</fpage>
<lpage>1716</lpage>
</element-citation>
</ref>
<ref id="b8-sensors-12-09423">
<label>8.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hagn</surname>
<given-names>U.</given-names>
</name>
<name>
<surname>Konietschke</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Tobergte</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Nickl</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Jörg</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Kuebler</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Passig</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Gröger</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Fröhlich</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Seibold</surname>
<given-names>U.</given-names>
</name>
<etal></etal>
</person-group>
<article-title>DLR MiroSurge-a versatile system for research in endoscopic telesurgery</article-title>
<source>Int. J. Comput. Assist. Radiol. Surg.</source>
<year>2009</year>
<volume>5</volume>
<fpage>183</fpage>
<lpage>193</lpage>
<pub-id pub-id-type="pmid">20033517</pub-id>
</element-citation>
</ref>
<ref id="b9-sensors-12-09423">
<label>9.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Hannaford</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Okamura</surname>
<given-names>A.M.</given-names>
</name>
</person-group>
<article-title>Haptics</article-title>
<source>Handbook of Robotics</source>
<person-group person-group-type="editor">
<name>
<surname>Siciliano</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Khatib</surname>
<given-names>O.</given-names>
</name>
</person-group>
<publisher-name>Springer-Verlag</publisher-name>
<publisher-loc>Berlin/Heidelberg, Germany</publisher-loc>
<year>2008</year>
<fpage>719</fpage>
<lpage>739</lpage>
</element-citation>
</ref>
<ref id="b10-sensors-12-09423">
<label>10.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Tovar-Arriaga</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Tita</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Pedraza-Ortega</surname>
<given-names>J.C.</given-names>
</name>
<name>
<surname>Gorrostieta</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Kalender</surname>
<given-names>W.A.</given-names>
</name>
</person-group>
<article-title>Development of a robotic FD-CT-guided navigation system for needle placement-Preliminary accuracy tests</article-title>
<source>Int. J. Med. Robot. Comput. Assist. Surg.</source>
<year>2011</year>
<volume>7</volume>
<fpage>225</fpage>
<lpage>236</lpage>
</element-citation>
</ref>
<ref id="b11-sensors-12-09423">
<label>11.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hager</surname>
<given-names>G.D.</given-names>
</name>
<name>
<surname>Okamura</surname>
<given-names>A.M.</given-names>
</name>
<name>
<surname>Kazanzides</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Whitcomb</surname>
<given-names>G.F.</given-names>
</name>
<name>
<surname>Taylor</surname>
<given-names>R.H.</given-names>
</name>
</person-group>
<article-title>Surgical and interventional robotics: Part III, Surgical assistance systems</article-title>
<source>IEEE Robot. Autom. Mag.</source>
<year>2008</year>
<volume>15</volume>
<fpage>84</fpage>
<lpage>93</lpage>
<pub-id pub-id-type="pmid">20305740</pub-id>
</element-citation>
</ref>
<ref id="b12-sensors-12-09423">
<label>12.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hagn</surname>
<given-names>U.</given-names>
</name>
<name>
<surname>Nickl</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Jörg</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Passig</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Bahls</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Nothhelfer</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Hacker</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Le-Tien</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Albu-Schäffer</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Konietschke</surname>
<given-names>R.</given-names>
</name>
<etal></etal>
</person-group>
<article-title>The DLR MIRO: A versatile lightweight robot for surgical applications</article-title>
<source>Ind. Robot.</source>
<year>2008</year>
<volume>35</volume>
<fpage>324</fpage>
<lpage>336</lpage>
</element-citation>
</ref>
<ref id="b13-sensors-12-09423">
<label>13.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Peters</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Cleary</surname>
<given-names>K.</given-names>
</name>
</person-group>
<article-title>Imaging Modalities</article-title>
<source>Image-Guided Interventions—Technology and Applications</source>
<publisher-name>Springer Science+Business Media LLC</publisher-name>
<publisher-loc>New York, NY, USA</publisher-loc>
<year>2008</year>
<fpage>241</fpage>
<lpage>273</lpage>
</element-citation>
</ref>
<ref id="b14-sensors-12-09423">
<label>14.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Taylor</surname>
<given-names>R.H.</given-names>
</name>
<name>
<surname>Paul</surname>
<given-names>H.A.</given-names>
</name>
<name>
<surname>Kazandzides</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Mittelstadt</surname>
<given-names>B.D.</given-names>
</name>
<name>
<surname>Hanson</surname>
<given-names>W.</given-names>
</name>
<name>
<surname>Zuhars</surname>
<given-names>J.F.</given-names>
</name>
<name>
<surname>Williamson</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Musits</surname>
<given-names>B.L.</given-names>
</name>
<name>
<surname>Glassman</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Bargar</surname>
<given-names>W.L.</given-names>
</name>
</person-group>
<article-title>An image directed robotic system for precise orthopaedic surgery</article-title>
<source>IEEE Trans. Robot. Autom.</source>
<year>1994</year>
<volume>10</volume>
<fpage>261</fpage>
<lpage>275</lpage>
</element-citation>
</ref>
<ref id="b15-sensors-12-09423">
<label>15.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Petermann</surname>
<given-names>J.</given-names>
</name>
</person-group>
<article-title>Computer assisted planning and robot-assisted surgery in cruciate ligament reconstruction</article-title>
<source>Oper. Tech. Orthop.</source>
<year>2000</year>
<volume>10</volume>
<fpage>50</fpage>
<lpage>55</lpage>
</element-citation>
</ref>
<ref id="b16-sensors-12-09423">
<label>16.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Stoianovici</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Song</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Petrisor</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Ursu</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Mazilu</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Mutener</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Schar</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Patriciu</surname>
<given-names>A.</given-names>
</name>
</person-group>
<article-title>“MRI Stealth” robot for prostate interventions</article-title>
<source>Minim. Invasive Ther.</source>
<year>2007</year>
<volume>16</volume>
<fpage>241</fpage>
<lpage>248</lpage>
</element-citation>
</ref>
<ref id="b17-sensors-12-09423">
<label>17.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kalender</surname>
<given-names>W.A.</given-names>
</name>
<name>
<surname>Kyriakou</surname>
<given-names>Y.</given-names>
</name>
</person-group>
<article-title>Flat-detector computer tomography (FD-CT)</article-title>
<source>Eur. Radiol.</source>
<year>2007</year>
<volume>17</volume>
<fpage>2767</fpage>
<lpage>2779</lpage>
<pub-id pub-id-type="pmid">17587058</pub-id>
</element-citation>
</ref>
<ref id="b18-sensors-12-09423">
<label>18.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ning</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Yu</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Conover</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Tang</surname>
<given-names>X.</given-names>
</name>
<name>
<surname>Ning</surname>
<given-names>Y.</given-names>
</name>
</person-group>
<article-title>Flat-panel detector-based cone-beam volume CT angiography imaging: System evaluation</article-title>
<source>IEEE Trans. Med. Imag.</source>
<year>2000</year>
<volume>19</volume>
<fpage>949</fpage>
<lpage>963</lpage>
</element-citation>
</ref>
<ref id="b19-sensors-12-09423">
<label>19.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Jaffray</surname>
<given-names>D.A.</given-names>
</name>
<name>
<surname>Siewerdsen</surname>
<given-names>J.H.</given-names>
</name>
<name>
<surname>Wong</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Martinez</surname>
<given-names>A.</given-names>
</name>
</person-group>
<article-title>Flat-panel conebeam computed tomography for image-guided radiation therapy</article-title>
<source>Int. J. Radiat. Oncol. Biol. Phys.</source>
<year>2002</year>
<volume>53</volume>
<fpage>1337</fpage>
<lpage>1349</lpage>
<pub-id pub-id-type="pmid">12128137</pub-id>
</element-citation>
</ref>
<ref id="b20-sensors-12-09423">
<label>20.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Siewerdsen</surname>
<given-names>J.H.</given-names>
</name>
<name>
<surname>Moseley</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Burch</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Bisland</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Bogaards</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Wilson</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Jaffray</surname>
<given-names>D.A.</given-names>
</name>
</person-group>
<article-title>Volume CT with a flat-panel detector on a mobile, isocentric C-arm: Pre-clinical investigation in guidance of minimally invasive surgery</article-title>
<source>Med. Phys.</source>
<year>2005</year>
<volume>32</volume>
<fpage>241</fpage>
<lpage>254</lpage>
<pub-id pub-id-type="pmid">15719975</pub-id>
</element-citation>
</ref>
<ref id="b21-sensors-12-09423">
<label>21.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cleary</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Melzer</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Watson</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Kronreif</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Stoianovici</surname>
<given-names>D.</given-names>
</name>
</person-group>
<article-title>Interventional robotic systems: Applications and technology state-of-the-art</article-title>
<source>Minim. Invasive Ther.</source>
<year>2006</year>
<volume>15</volume>
<fpage>101</fpage>
<lpage>113</lpage>
</element-citation>
</ref>
<ref id="b22-sensors-12-09423">
<label>22.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Strobel</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Meissner</surname>
<given-names>O.</given-names>
</name>
<name>
<surname>Boese</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Brunner</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Heigl</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Hoheisel</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Lauritsch</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Nagel</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Pfister</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Rührnschopf</surname>
<given-names>E.P.</given-names>
</name>
<etal></etal>
</person-group>
<article-title>3D imaging with flat-detector C-arm systems</article-title>
<source>Med. Radiol.</source>
<year>2009</year>
<comment>Part 1</comment>
<fpage>33</fpage>
<lpage>51</lpage>
</element-citation>
</ref>
<ref id="b23-sensors-12-09423">
<label>23.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Castillo-Cruces</surname>
<given-names>R.A.</given-names>
</name>
<name>
<surname>Schneider</surname>
<given-names>H.C.</given-names>
</name>
<name>
<surname>Wahrburg</surname>
<given-names>J.</given-names>
</name>
</person-group>
<article-title>Cooperative Robotic System to Support Surgical Interventions</article-title>
<source>Medical Robotics</source>
<person-group person-group-type="editor">
<name>
<surname>Vanja</surname>
<given-names>B.</given-names>
</name>
</person-group>
<publisher-name>InTech</publisher-name>
<publisher-loc>Rijeka, Croatia</publisher-loc>
<year>2008</year>
</element-citation>
</ref>
<ref id="b24-sensors-12-09423">
<label>24.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Baron</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Eilers</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Munske</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Toennies</surname>
<given-names>J.L.</given-names>
</name>
<name>
<surname>Balachandran</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Labadie</surname>
<given-names>R.F.</given-names>
</name>
<name>
<surname>Ortmaier</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Webster</surname>
<given-names>R.J.</given-names>
<suffix>III</suffix>
</name>
</person-group>
<article-title>Percutaneous inner-ear access via an image-guided industrial robot system</article-title>
<source>Proc. Inst. Mech. Eng. Part H J. Eng. Med.</source>
<year>2010</year>
<volume>224</volume>
<fpage>633</fpage>
<lpage>649</lpage>
</element-citation>
</ref>
<ref id="b25-sensors-12-09423">
<label>25.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Masamune</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Fichtinger</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Patriciu</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Susil</surname>
<given-names>R.C.</given-names>
</name>
<name>
<surname>Taylor</surname>
<given-names>R.H.</given-names>
</name>
<name>
<surname>Kavoussi</surname>
<given-names>L.R.</given-names>
</name>
<name>
<surname>Anderson</surname>
<given-names>J.H.</given-names>
</name>
<name>
<surname>Sakuma</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Dohi</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Stoianovici</surname>
<given-names>D.</given-names>
</name>
</person-group>
<article-title>System for robotically assisted percutaneous procedures with computed tomography guidance</article-title>
<source>Comput. Aided Surg.</source>
<year>2001</year>
<volume>6</volume>
<fpage>370</fpage>
<lpage>383</lpage>
<pub-id pub-id-type="pmid">11954068</pub-id>
</element-citation>
</ref>
<ref id="b26-sensors-12-09423">
<label>26.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Nagel</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Schmidt</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Petzold</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Kalender</surname>
<given-names>W.A.</given-names>
</name>
</person-group>
<article-title>A navigation system for minimally invasive CT-guided interventions</article-title>
<source>Med. Image Comput. Comput. Assist. Interv.</source>
<year>2005</year>
<volume>8</volume>
<fpage>33</fpage>
<lpage>40</lpage>
<pub-id pub-id-type="pmid">16685940</pub-id>
</element-citation>
</ref>
<ref id="b27-sensors-12-09423">
<label>27.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Grunwald</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Schreiber</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Albu-Schäffer</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Hirzinger</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>Touch: The intuitive type of human and robot interaction</article-title>
<source>Springer Tracts Adv. Robot.</source>
<year>2005</year>
<volume>14</volume>
<fpage>9</fpage>
<lpage>21</lpage>
</element-citation>
</ref>
<ref id="b28-sensors-12-09423">
<label>28.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hirzinger</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Bals</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Otter</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Stelter</surname>
<given-names>J.</given-names>
</name>
</person-group>
<article-title>The DLR-KUKA success story</article-title>
<source>IEEE Robot. Autom. Mag.</source>
<year>2005</year>
<volume>12</volume>
<fpage>16</fpage>
<lpage>23</lpage>
</element-citation>
</ref>
<ref id="b29-sensors-12-09423">
<label>29.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Majdani</surname>
<given-names>O.</given-names>
</name>
<name>
<surname>Rau</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Baron</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Eilers</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Baier</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Heimann</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Ortmaier</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Bartling</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Lenarz</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Leinung</surname>
<given-names>M.</given-names>
</name>
</person-group>
<article-title>A robot-guided minimally invasive approach for cochlear implant surgery: Preliminary results of a temporal bone study</article-title>
<source>Int. J. Comput. Assist. Radiol. Surg.</source>
<year>2009</year>
<volume>4</volume>
<fpage>475</fpage>
<lpage>486</lpage>
<pub-id pub-id-type="pmid">20033531</pub-id>
</element-citation>
</ref>
<ref id="b30-sensors-12-09423">
<label>30.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Matinfar</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Baird</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Bautouli</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Clatterbuck</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Kazanzides</surname>
<given-names>P.</given-names>
</name>
</person-group>
<article-title>Robot-Assisted Skull Base Surgery</article-title>
<conf-name>Proceedings of the IEEE /RSJ International Conference on Intelligent Robots and Systems (IROS 2007)</conf-name>
<conf-loc>San Diego, CA, USA</conf-loc>
<conf-date>29 October–2 November 2007</conf-date>
<fpage>865</fpage>
<lpage>870</lpage>
</element-citation>
</ref>
<ref id="b31-sensors-12-09423">
<label>31.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Maurer</surname>
<given-names>C.R.</given-names>
</name>
<name>
<surname>Fitzpatrick</surname>
<given-names>J.M.</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>M.Y.</given-names>
</name>
<name>
<surname>Galloway</surname>
<given-names>R.L.</given-names>
<suffix>Jr.</suffix>
</name>
<name>
<surname>Maciunas</surname>
<given-names>R.J.</given-names>
</name>
<name>
<surname>Allen</surname>
<given-names>G.S.</given-names>
</name>
</person-group>
<article-title>Registration of head volume images using implantable fiducial markers</article-title>
<source>IEEE Trans. Med. Imag.</source>
<year>1997</year>
<volume>16</volume>
<fpage>447</fpage>
<lpage>462</lpage>
</element-citation>
</ref>
<ref id="b32-sensors-12-09423">
<label>32.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Umeyama</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>Least-squares estimation of transformation parameters between two point patterns</article-title>
<source>IEEE Trans. Patt. Anal. Mach. Intell.</source>
<year>1991</year>
<volume>13</volume>
<fpage>376</fpage>
<lpage>380</lpage>
</element-citation>
</ref>
<ref id="b33-sensors-12-09423">
<label>33.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Ortmaier</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Weiss</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Ott</surname>
<given-names>Ch.</given-names>
</name>
<name>
<surname>Hirzinger</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>A Soft Robotics Approach for Navigated Pedicle Screw Placement—First Experimental Results</article-title>
<conf-name>Proceedings of the International Conference on Computer Assisted Radiology and Surgery (CARS)</conf-name>
<conf-loc>Osaka, Japan</conf-loc>
<conf-date>28 June–1 July 2006</conf-date>
</element-citation>
</ref>
</ref-list>
</back>
<floats-group>
<fig id="f1-sensors-12-09423" position="float">
<label>Figure 1.</label>
<caption>
<p>Sketch of a fully integrated system for percutaneous procedures.</p>
</caption>
<graphic xlink:href="sensors-12-09423f1"></graphic>
</fig>
<fig id="f2-sensors-12-09423" position="float">
<label>Figure 2.</label>
<caption>
<p>System setup in the interventional suite.</p>
</caption>
<graphic xlink:href="sensors-12-09423f2"></graphic>
</fig>
<fig id="f3-sensors-12-09423" position="float">
<label>Figure 3.</label>
<caption>
<p>Mobile robot platform with mounted DLR/KUKA Light Weight Robot III. (
<bold>a</bold>
) The real-time robot controller, the application controller, a DeviceNET bus terminal and a touch screen are integrated; (
<bold>b</bold>
) Platform with covers attached to it.</p>
</caption>
<graphic xlink:href="sensors-12-09423f3"></graphic>
</fig>
<fig id="f4-sensors-12-09423" position="float">
<label>Figure 4.</label>
<caption>
<p>(
<bold>a</bold>
) Robot handle together with the needle holder. A DRF is attached to the tool so that it can be tracked by the navigation system; (
<bold>b</bold>
) The tool was designed to be autoclavable. The beige-colored parts are made of PEEK to ensure artifact-free imaging.</p>
</caption>
<graphic xlink:href="sensors-12-09423f4"></graphic>
</fig>
<fig id="f5-sensors-12-09423" position="float">
<label>Figure 5.</label>
<caption>
<p>Control architecture. The application controller receives information from the different components of the system and uses it to control the robot arm.</p>
</caption>
<graphic xlink:href="sensors-12-09423f5"></graphic>
</fig>
<fig id="f6-sensors-12-09423" position="float">
<label>Figure 6.</label>
<caption>
<p>During initialization, all the channels necessary to make the application controller communicate with the main components of the system are opened. Then, the application controller and the KRC interchange information via RSI in order to control the robot motion.</p>
</caption>
<graphic xlink:href="sensors-12-09423f6"></graphic>
</fig>
<fig id="f7-sensors-12-09423" position="float">
<label>Figure 7.</label>
<caption>
<p>Control loop to manipulate the robot pose in the application controller.</p>
</caption>
<graphic xlink:href="sensors-12-09423f7"></graphic>
</fig>
<fig id="f8-sensors-12-09423" position="float">
<label>Figure 8.</label>
<caption>
<p>DeviceNet link. The data in the bus can be written/read by any of the connected cards.</p>
</caption>
<graphic xlink:href="sensors-12-09423f8"></graphic>
</fig>
<fig id="f9-sensors-12-09423" position="float">
<label>Figure 9.</label>
<caption>
<p>Inner connections of the robotic system, including a DeviceNet bus terminal and power supply.</p>
</caption>
<graphic xlink:href="sensors-12-09423f9"></graphic>
</fig>
<fig id="f10-sensors-12-09423" position="float">
<label>Figure 10.</label>
<caption>
<p>Coordinates and transformations of the system. The dotted lines represent the transformations measured by the navigation system and the robot controller. The dashed lines represent rigid transformations.</p>
</caption>
<graphic xlink:href="sensors-12-09423f10"></graphic>
</fig>
<fig id="f11-sensors-12-09423" position="float">
<label>Figure 11.</label>
<caption>
<p>Offset transformation into RobWrist coordinates.</p>
</caption>
<graphic xlink:href="sensors-12-09423f11"></graphic>
</fig>
<fig id="f12-sensors-12-09423" position="float">
<label>Figure 12.</label>
<caption>
<p>To obtain the rigid transformation
<italic>
<sup>RobRef</sup>
</italic>
T
<italic>
<sub>RobWrist</sub>
</italic>
the position of the TCP is first measured using a pivot calibration, followed by a pair-point method with an additional DRF (
<italic>Ref</italic>
).</p>
</caption>
<graphic xlink:href="sensors-12-09423f12"></graphic>
</fig>
<fig id="f13-sensors-12-09423" position="float">
<label>Figure 13.</label>
<caption>
<p>Graphical user interface. The user may select the desired control mode and trajectory coordinates. The interface also shows whether the robot and the patient are visible to the navigation camera.</p>
</caption>
<graphic xlink:href="sensors-12-09423f13"></graphic>
</fig>
<fig id="f14-sensors-12-09423" position="float">
<label>Figure 14.</label>
<caption>
<p>An industrial joystick is used to move the TCP in Cartesian coordinates. By pressing the joystick's upper button, pivoting about the TCP is possible.</p>
</caption>
<graphic xlink:href="sensors-12-09423f14"></graphic>
</fig>
<fig id="f15-sensors-12-09423" position="float">
<label>Figure 15.</label>
<caption>
<p>In automatic orientation mode the robot reorients the TCP toward the target after every joystick movement. If the patient moves, the robot reacts, compensates for the patient movement, and points at the target again.</p>
</caption>
<graphic xlink:href="sensors-12-09423f15"></graphic>
</fig>
<fig id="f16-sensors-12-09423" position="float">
<label>Figure 16.</label>
<caption>
<p>Accuracy testing device. The rod tip positions in relation to the attached DRF were known in advance.</p>
</caption>
<graphic xlink:href="sensors-12-09423f16"></graphic>
</fig>
<fig id="f17-sensors-12-09423" position="float">
<label>Figure 17.</label>
<caption>
<p>The error was defined as the distance between the needle's tip and the rod's tip measured on the reconstructed CT-images.</p>
</caption>
<graphic xlink:href="sensors-12-09423f17"></graphic>
</fig>
<table-wrap id="t1-sensors-12-09423" position="float">
<label>Table 1.</label>
<caption>
<p>Pivot residual error (n = 1,000).</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="center" valign="bottom" rowspan="1" colspan="1">
<bold>Rigid transformation</bold>
</th>
<th align="center" valign="bottom" rowspan="1" colspan="1">
<bold>
<italic>e</italic>
</bold>
<italic>
<sub>rms</sub>
</italic>
</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center" valign="top" rowspan="1" colspan="1">
<italic>
<sup>RobWrist</sup>
</italic>
<bold>P</bold>
<italic>
<sub>TCP</sub>
</italic>
</td>
<td align="center" valign="top" rowspan="1" colspan="1">0.94 mm</td>
</tr>
<tr>
<td align="center" valign="top" rowspan="1" colspan="1">
<italic>
<sup>RobRef</sup>
</italic>
<bold>P</bold>
<italic>
<sub>TCP</sub>
</italic>
</td>
<td align="center" valign="top" rowspan="1" colspan="1">0.47 mm</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap id="t2-sensors-12-09423" position="float">
<label>Table 2.</label>
<caption>
<p>Residual error.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="center" valign="bottom" rowspan="1" colspan="1">
<bold>
<italic>e</italic>
</bold>
<italic>
<sub>rms</sub>
</italic>
</th>
<th align="center" valign="bottom" rowspan="1" colspan="1">
<bold>Standard Deviation</bold>
</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center" valign="top" rowspan="1" colspan="1">0.81 mm</td>
<td align="center" valign="top" rowspan="1" colspan="1">0.41 mm</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap id="t3-sensors-12-09423" position="float">
<label>Table 3.</label>
<caption>
<p>Technical accuracy results for targeting a needle on the accuracy testing device. The accuracy was determined as the distance from the needle's tip to the rod's tip from N = 45 different measurements.</p>
</caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="center" valign="bottom" rowspan="1" colspan="1">
<bold>e
<sub>rms</sub>
± σ</bold>
</th>
<th align="center" valign="bottom" rowspan="1" colspan="1">
<bold>e
<sub>min</sub>
</bold>
</th>
<th align="center" valign="bottom" rowspan="1" colspan="1">
<bold>e
<sub>max</sub>
</bold>
</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center" valign="top" rowspan="1" colspan="1">1.2 mm ± 0.4 mm</td>
<td align="center" valign="top" rowspan="1" colspan="1">0.33 mm</td>
<td align="center" valign="top" rowspan="1" colspan="1">1.98 mm</td>
</tr>
</tbody>
</table>
</table-wrap>
</floats-group>
</pmc>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Pmc/Curation
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 002492 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd -nk 002492 | SxmlIndent | more

To put a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Pmc
   |étape=   Curation
   |type=    RBID
   |clé=     PMC:3444109
   |texte=   A Fully Sensorized Cooperative Robotic System for Surgical Interventions
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Curation/RBID.i   -Sk "pubmed:23012551" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 
