Exploration server on haptic devices

Warning: this site is under development!
Warning: this site was generated automatically from raw corpora.
The information is therefore not validated.

Contact Region Estimation Based on a Vision-Based Tactile Sensor Using a Deformable Touchpad

Internal identifier: 000D37 (Pmc/Checkpoint); previous: 000D36; next: 000D38

Contact Region Estimation Based on a Vision-Based Tactile Sensor Using a Deformable Touchpad

Authors: Yuji Ito [Japan]; Youngwoo Kim; Goro Obinata

Source:

RBID : PMC:4029664

Abstract

A new method is proposed to estimate the contact region between a sensor and an object using a deformable tactile sensor. The sensor consists of a charge-coupled device (CCD) camera, light-emitting diode (LED) lights and a deformable touchpad. The sensor can obtain a variety of tactile information, such as the contact region, multi-axis contact force, slippage, shape, position and orientation of an object in contact with the touchpad. The proposed method is based on the movements of dots printed on the surface of the touchpad and classifies the contact state of dots into three types: a non-contacting dot, a sticking dot and a slipping dot. Considering the movements of the dots with noise and errors, equations are formulated to discriminate between the contacting dots and the non-contacting dots. A set of the contacting dots discriminated by the formulated equations can construct the contact region. Next, a method is developed to detect the dots in images of the surface of the touchpad captured by the CCD camera. A method to assign numbers to dots for calculating the displacements of the dots is also proposed. Finally, the proposed methods are validated by experimental results.


Url:
DOI: 10.3390/s140405805
PubMed: 24670719
PubMed Central: 4029664


Affiliations:


Links to previous steps (curation, corpus...)


Links to Exploration step

PMC:4029664

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Contact Region Estimation Based on a Vision-Based Tactile Sensor Using a Deformable Touchpad</title>
<author>
<name sortKey="Ito, Yuji" sort="Ito, Yuji" uniqKey="Ito Y" first="Yuji" last="Ito">Yuji Ito</name>
<affiliation wicri:level="1">
<nlm:aff id="af1-sensors-14-05805"> Graduate School of Engineering, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8603, Japan</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea> Graduate School of Engineering, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8603</wicri:regionArea>
<wicri:noRegion>Nagoya 464-8603</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Kim, Youngwoo" sort="Kim, Youngwoo" uniqKey="Kim Y" first="Youngwoo" last="Kim">Youngwoo Kim</name>
<affiliation>
<nlm:aff id="af2-sensors-14-05805"> Daegu Research Center for Medical Devices and Green Energy, Korea Institute of Machinery &amp; Materials (KIMM), Daegu Techno Park R&amp;D Center #1031, 711 Hosan-dong, Dalseo-gu 704-948, Korea; E-Mail:
<email>ywkim@kimm.re.kr</email>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Obinata, Goro" sort="Obinata, Goro" uniqKey="Obinata G" first="Goro" last="Obinata">Goro Obinata</name>
<affiliation>
<nlm:aff id="af3-sensors-14-05805"> EcoTopia Science Institute, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8603, Japan; E-Mail:
<email>obinata@mech.nagoya-u.ac.jp</email>
</nlm:aff>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">24670719</idno>
<idno type="pmc">4029664</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4029664</idno>
<idno type="RBID">PMC:4029664</idno>
<idno type="doi">10.3390/s140405805</idno>
<date when="2014">2014</date>
<idno type="wicri:Area/Pmc/Corpus">002477</idno>
<idno type="wicri:Area/Pmc/Curation">002477</idno>
<idno type="wicri:Area/Pmc/Checkpoint">000D37</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Contact Region Estimation Based on a Vision-Based Tactile Sensor Using a Deformable Touchpad</title>
<author>
<name sortKey="Ito, Yuji" sort="Ito, Yuji" uniqKey="Ito Y" first="Yuji" last="Ito">Yuji Ito</name>
<affiliation wicri:level="1">
<nlm:aff id="af1-sensors-14-05805"> Graduate School of Engineering, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8603, Japan</nlm:aff>
<country xml:lang="fr">Japon</country>
<wicri:regionArea> Graduate School of Engineering, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8603</wicri:regionArea>
<wicri:noRegion>Nagoya 464-8603</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Kim, Youngwoo" sort="Kim, Youngwoo" uniqKey="Kim Y" first="Youngwoo" last="Kim">Youngwoo Kim</name>
<affiliation>
<nlm:aff id="af2-sensors-14-05805"> Daegu Research Center for Medical Devices and Green Energy, Korea Institute of Machinery &amp; Materials (KIMM), Daegu Techno Park R&amp;D Center #1031, 711 Hosan-dong, Dalseo-gu 704-948, Korea; E-Mail:
<email>ywkim@kimm.re.kr</email>
</nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="Obinata, Goro" sort="Obinata, Goro" uniqKey="Obinata G" first="Goro" last="Obinata">Goro Obinata</name>
<affiliation>
<nlm:aff id="af3-sensors-14-05805"> EcoTopia Science Institute, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8603, Japan; E-Mail:
<email>obinata@mech.nagoya-u.ac.jp</email>
</nlm:aff>
</affiliation>
</author>
</analytic>
<series>
<title level="j">Sensors (Basel, Switzerland)</title>
<idno type="eISSN">1424-8220</idno>
<imprint>
<date when="2014">2014</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>A new method is proposed to estimate the contact region between a sensor and an object using a deformable tactile sensor. The sensor consists of a charge-coupled device (CCD) camera, light-emitting diode (LED) lights and a deformable touchpad. The sensor can obtain a variety of tactile information, such as the contact region, multi-axis contact force, slippage, shape, position and orientation of an object in contact with the touchpad. The proposed method is based on the movements of dots printed on the surface of the touchpad and classifies the contact state of dots into three types: a non-contacting dot, a sticking dot and a slipping dot. Considering the movements of the dots with noise and errors, equations are formulated to discriminate between the contacting dots and the non-contacting dots. A set of the contacting dots discriminated by the formulated equations can construct the contact region. Next, a method is developed to detect the dots in images of the surface of the touchpad captured by the CCD camera. A method to assign numbers to dots for calculating the displacements of the dots is also proposed. Finally, the proposed methods are validated by experimental results.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="Lee, M H" uniqKey="Lee M">M.H. Lee</name>
</author>
<author>
<name sortKey="Nicholls, H R" uniqKey="Nicholls H">H.R. Nicholls</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Dahiya, R S" uniqKey="Dahiya R">R.S. Dahiya</name>
</author>
<author>
<name sortKey="Metta, G" uniqKey="Metta G">G. Metta</name>
</author>
<author>
<name sortKey="Valle, M" uniqKey="Valle M">M. Valle</name>
</author>
<author>
<name sortKey="Sandini, G" uniqKey="Sandini G">G. Sandini</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Yamada, D" uniqKey="Yamada D">D. Yamada</name>
</author>
<author>
<name sortKey="Maeno, T" uniqKey="Maeno T">T. Maeno</name>
</author>
<author>
<name sortKey="Yamada, Y" uniqKey="Yamada Y">Y. Yamada</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Noda, K" uniqKey="Noda K">K. Noda</name>
</author>
<author>
<name sortKey="Hoshino, K" uniqKey="Hoshino K">K. Hoshino</name>
</author>
<author>
<name sortKey="Matsumoto, K" uniqKey="Matsumoto K">K. Matsumoto</name>
</author>
<author>
<name sortKey="Shimoyama, I" uniqKey="Shimoyama I">I. Shimoyama</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Schmitz, A" uniqKey="Schmitz A">A. Schmitz</name>
</author>
<author>
<name sortKey="Maggiali, M" uniqKey="Maggiali M">M. Maggiali</name>
</author>
<author>
<name sortKey="Natale, L" uniqKey="Natale L">L. Natale</name>
</author>
<author>
<name sortKey="Bonino, B" uniqKey="Bonino B">B. Bonino</name>
</author>
<author>
<name sortKey="Metta, G" uniqKey="Metta G">G. Metta</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hakozaki, M" uniqKey="Hakozaki M">M. Hakozaki</name>
</author>
<author>
<name sortKey="Shinoda, H" uniqKey="Shinoda H">H. Shinoda</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Yamada, K" uniqKey="Yamada K">K. Yamada</name>
</author>
<author>
<name sortKey="Goto, K" uniqKey="Goto K">K. Goto</name>
</author>
<author>
<name sortKey="Nakajima, Y" uniqKey="Nakajima Y">Y. Nakajima</name>
</author>
<author>
<name sortKey="Koshida, N" uniqKey="Koshida N">N. Koshida</name>
</author>
<author>
<name sortKey="Shinoda, H" uniqKey="Shinoda H">H. Shinoda</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Yang, S" uniqKey="Yang S">S. Yang</name>
</author>
<author>
<name sortKey="Chen, X" uniqKey="Chen X">X. Chen</name>
</author>
<author>
<name sortKey="Motojima, S" uniqKey="Motojima S">S. Motojima</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Takao, H" uniqKey="Takao H">H. Takao</name>
</author>
<author>
<name sortKey="Sawada, K" uniqKey="Sawada K">K. Sawada</name>
</author>
<author>
<name sortKey="Ishida, M" uniqKey="Ishida M">M. Ishida</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Engel, J" uniqKey="Engel J">J. Engel</name>
</author>
<author>
<name sortKey="Chen, J" uniqKey="Chen J">J. Chen</name>
</author>
<author>
<name sortKey="Liu, C" uniqKey="Liu C">C. Liu</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mei, T" uniqKey="Mei T">T. Mei</name>
</author>
<author>
<name sortKey="Li, W J" uniqKey="Li W">W.J. Li</name>
</author>
<author>
<name sortKey="Ge, Y" uniqKey="Ge Y">Y. Ge</name>
</author>
<author>
<name sortKey="Chen, Y" uniqKey="Chen Y">Y. Chen</name>
</author>
<author>
<name sortKey="Ni, L" uniqKey="Ni L">L. Ni</name>
</author>
<author>
<name sortKey="Chan, M H" uniqKey="Chan M">M.H. Chan</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Engel, J" uniqKey="Engel J">J. Engel</name>
</author>
<author>
<name sortKey="Chen, J" uniqKey="Chen J">J. Chen</name>
</author>
<author>
<name sortKey="Liu, C" uniqKey="Liu C">C. Liu</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ferrier, N J" uniqKey="Ferrier N">N.J. Ferrier</name>
</author>
<author>
<name sortKey="Brockett, R W" uniqKey="Brockett R">R.W. Brockett</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Saga, S" uniqKey="Saga S">S. Saga</name>
</author>
<author>
<name sortKey="Kajimoto, H" uniqKey="Kajimoto H">H. Kajimoto</name>
</author>
<author>
<name sortKey="Tachi, S" uniqKey="Tachi S">S. Tachi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Johnson, M K" uniqKey="Johnson M">M.K. Johnson</name>
</author>
<author>
<name sortKey="Adelson, E H" uniqKey="Adelson E">E.H. Adelson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Kamiyama, K" uniqKey="Kamiyama K">K. Kamiyama</name>
</author>
<author>
<name sortKey="Vlack, K" uniqKey="Vlack K">K. Vlack</name>
</author>
<author>
<name sortKey="Mizota, T" uniqKey="Mizota T">T. Mizota</name>
</author>
<author>
<name sortKey="Kajimoto, H" uniqKey="Kajimoto H">H. Kajimoto</name>
</author>
<author>
<name sortKey="Kawakami, N" uniqKey="Kawakami N">N. Kawakami</name>
</author>
<author>
<name sortKey="Tachi, S" uniqKey="Tachi S">S. Tachi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Sato, K" uniqKey="Sato K">K. Sato</name>
</author>
<author>
<name sortKey="Kamiyama, K" uniqKey="Kamiyama K">K. Kamiyama</name>
</author>
<author>
<name sortKey="Nii, H" uniqKey="Nii H">H. Nii</name>
</author>
<author>
<name sortKey="Kawakami, N" uniqKey="Kawakami N">N. Kawakami</name>
</author>
<author>
<name sortKey="Tachi, S" uniqKey="Tachi S">S. Tachi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ohka, M" uniqKey="Ohka M">M. Ohka</name>
</author>
<author>
<name sortKey="Mitsuya, Y" uniqKey="Mitsuya Y">Y. Mitsuya</name>
</author>
<author>
<name sortKey="Matsunaga, Y" uniqKey="Matsunaga Y">Y. Matsunaga</name>
</author>
<author>
<name sortKey="Takeuchi, S" uniqKey="Takeuchi S">S. Takeuchi</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ohka, M" uniqKey="Ohka M">M. Ohka</name>
</author>
<author>
<name sortKey="Takata, J" uniqKey="Takata J">J. Takata</name>
</author>
<author>
<name sortKey="Kobayashi, H" uniqKey="Kobayashi H">H. Kobayashi</name>
</author>
<author>
<name sortKey="Suzuki, H" uniqKey="Suzuki H">H. Suzuki</name>
</author>
<author>
<name sortKey="Morisawa, N" uniqKey="Morisawa N">N. Morisawa</name>
</author>
<author>
<name sortKey="Yussof, H B" uniqKey="Yussof H">H.B. Yussof</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Yamada, Y" uniqKey="Yamada Y">Y. Yamada</name>
</author>
<author>
<name sortKey="Iwanaga, Y" uniqKey="Iwanaga Y">Y. Iwanaga</name>
</author>
<author>
<name sortKey="Fukunaga, M" uniqKey="Fukunaga M">M. Fukunaga</name>
</author>
<author>
<name sortKey="Fujimoto, N" uniqKey="Fujimoto N">N. Fujimoto</name>
</author>
<author>
<name sortKey="Ohta, E" uniqKey="Ohta E">E. Ohta</name>
</author>
<author>
<name sortKey="Morizono, T" uniqKey="Morizono T">T. Morizono</name>
</author>
<author>
<name sortKey="Umetani, Y" uniqKey="Umetani Y">Y. Umetani</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Miyamoto, R" uniqKey="Miyamoto R">R. Miyamoto</name>
</author>
<author>
<name sortKey="Komatsu, S" uniqKey="Komatsu S">S. Komatsu</name>
</author>
<author>
<name sortKey="Iwase, E" uniqKey="Iwase E">E. Iwase</name>
</author>
<author>
<name sortKey="Matsumoto, K" uniqKey="Matsumoto K">K. Matsumoto</name>
</author>
<author>
<name sortKey="Shimoyama, I" uniqKey="Shimoyama I">I. Shimoyama</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Maekawa, H" uniqKey="Maekawa H">H. Maekawa</name>
</author>
<author>
<name sortKey="Tanie, K" uniqKey="Tanie K">K. Tanie</name>
</author>
<author>
<name sortKey="Kaneko, M" uniqKey="Kaneko M">M. Kaneko</name>
</author>
<author>
<name sortKey="Suzuki, N" uniqKey="Suzuki N">N. Suzuki</name>
</author>
<author>
<name sortKey="Horiguchi, C" uniqKey="Horiguchi C">C. Horiguchi</name>
</author>
<author>
<name sortKey="Sugawara, T" uniqKey="Sugawara T">T. Sugawara</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Obinata, G" uniqKey="Obinata G">G. Obinata</name>
</author>
<author>
<name sortKey="Ashis, D" uniqKey="Ashis D">D. Ashis</name>
</author>
<author>
<name sortKey="Watanabe, N" uniqKey="Watanabe N">N. Watanabe</name>
</author>
<author>
<name sortKey="Moriyama, N" uniqKey="Moriyama N">N. Moriyama</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ito, Y" uniqKey="Ito Y">Y. Ito</name>
</author>
<author>
<name sortKey="Kim, Y" uniqKey="Kim Y">Y. Kim</name>
</author>
<author>
<name sortKey="Obinata, G" uniqKey="Obinata G">G. Obinata</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ito, Y" uniqKey="Ito Y">Y. Ito</name>
</author>
<author>
<name sortKey="Kim, Y" uniqKey="Kim Y">Y. Kim</name>
</author>
<author>
<name sortKey="Nagai, C" uniqKey="Nagai C">C. Nagai</name>
</author>
<author>
<name sortKey="Obinata, G" uniqKey="Obinata G">G. Obinata</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ito, Y" uniqKey="Ito Y">Y. Ito</name>
</author>
<author>
<name sortKey="Kim, Y" uniqKey="Kim Y">Y. Kim</name>
</author>
<author>
<name sortKey="Nagai, C" uniqKey="Nagai C">C. Nagai</name>
</author>
<author>
<name sortKey="Obinata, G" uniqKey="Obinata G">G. Obinata</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ito, Y" uniqKey="Ito Y">Y. Ito</name>
</author>
<author>
<name sortKey="Kim, Y" uniqKey="Kim Y">Y. Kim</name>
</author>
<author>
<name sortKey="Obinata, G" uniqKey="Obinata G">G. Obinata</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Watanabe, N" uniqKey="Watanabe N">N. Watanabe</name>
</author>
<author>
<name sortKey="Obinata, G" uniqKey="Obinata G">G. Obinata</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">Sensors (Basel)</journal-id>
<journal-id journal-id-type="iso-abbrev">Sensors (Basel)</journal-id>
<journal-title-group>
<journal-title>Sensors (Basel, Switzerland)</journal-title>
</journal-title-group>
<issn pub-type="epub">1424-8220</issn>
<publisher>
<publisher-name>MDPI</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">24670719</article-id>
<article-id pub-id-type="pmc">4029664</article-id>
<article-id pub-id-type="doi">10.3390/s140405805</article-id>
<article-id pub-id-type="publisher-id">sensors-14-05805</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Contact Region Estimation Based on a Vision-Based Tactile Sensor Using a Deformable Touchpad</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Ito</surname>
<given-names>Yuji</given-names>
</name>
<xref ref-type="aff" rid="af1-sensors-14-05805">
<sup>1</sup>
</xref>
<xref rid="c1-sensors-14-05805" ref-type="corresp">
<sup>*</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Kim</surname>
<given-names>Youngwoo</given-names>
</name>
<xref ref-type="aff" rid="af2-sensors-14-05805">
<sup>2</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Obinata</surname>
<given-names>Goro</given-names>
</name>
<xref ref-type="aff" rid="af3-sensors-14-05805">
<sup>3</sup>
</xref>
</contrib>
</contrib-group>
<aff id="af1-sensors-14-05805">
<label>1</label>
Graduate School of Engineering, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8603, Japan</aff>
<aff id="af2-sensors-14-05805">
<label>2</label>
Daegu Research Center for Medical Devices and Green Energy, Korea Institute of Machinery &amp; Materials (KIMM), Daegu Techno Park R&amp;D Center #1031, 711 Hosan-dong, Dalseo-gu 704-948, Korea; E-Mail:
<email>ywkim@kimm.re.kr</email>
</aff>
<aff id="af3-sensors-14-05805">
<label>3</label>
EcoTopia Science Institute, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8603, Japan; E-Mail:
<email>obinata@mech.nagoya-u.ac.jp</email>
</aff>
<author-notes>
<corresp id="c1-sensors-14-05805">
<label>*</label>
Author to whom correspondence should be addressed; E-Mail:
<email>ito_yuji@nagoya-u.jp</email>
; Tel.: +81-52-789-5030; Fax: +81-52-789-5589.</corresp>
</author-notes>
<pub-date pub-type="collection">
<month>4</month>
<year>2014</year>
</pub-date>
<pub-date pub-type="epub">
<day>25</day>
<month>3</month>
<year>2014</year>
</pub-date>
<volume>14</volume>
<issue>4</issue>
<fpage>5805</fpage>
<lpage>5822</lpage>
<history>
<date date-type="received">
<day>30</day>
<month>1</month>
<year>2014</year>
</date>
<date date-type="rev-recd">
<day>10</day>
<month>3</month>
<year>2014</year>
</date>
<date date-type="accepted">
<day>20</day>
<month>3</month>
<year>2014</year>
</date>
</history>
<permissions>
<copyright-statement>© 2014 by the authors; licensee MDPI, Basel, Switzerland.</copyright-statement>
<copyright-year>2014</copyright-year>
<license>
<license-p>This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (
<ext-link ext-link-type="uri" xlink:href="http://creativecommons.org/licenses/by/3.0/">http://creativecommons.org/licenses/by/3.0/</ext-link>
).</license-p>
</license>
</permissions>
<abstract>
<p>A new method is proposed to estimate the contact region between a sensor and an object using a deformable tactile sensor. The sensor consists of a charge-coupled device (CCD) camera, light-emitting diode (LED) lights and a deformable touchpad. The sensor can obtain a variety of tactile information, such as the contact region, multi-axis contact force, slippage, shape, position and orientation of an object in contact with the touchpad. The proposed method is based on the movements of dots printed on the surface of the touchpad and classifies the contact state of dots into three types: a non-contacting dot, a sticking dot and a slipping dot. Considering the movements of the dots with noise and errors, equations are formulated to discriminate between the contacting dots and the non-contacting dots. A set of the contacting dots discriminated by the formulated equations can construct the contact region. Next, a method is developed to detect the dots in images of the surface of the touchpad captured by the CCD camera. A method to assign numbers to dots for calculating the displacements of the dots is also proposed. Finally, the proposed methods are validated by experimental results.</p>
</abstract>
<kwd-group>
<kwd>contact region estimation</kwd>
<kwd>flexible and conformable sensors and arrays</kwd>
<kwd>image processing</kwd>
<kwd>vision-based tactile sensors</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec sec-type="intro">
<label>1.</label>
<title>Introduction</title>
<p>Tactile receptors in the skin allow humans to sense multimodal tactile information such as the contact force, slippage, shape and temperature of a contacted object. By feeding back information from these receptors, humans can control their muscles dexterously. Tactile sensing is therefore crucial for robots that imitate skilled human behaviors. For practical applications, tactile sensors should meet three requirements. Firstly, a flexible sensor surface is desirable, because the sensor should fit the object geometrically to prevent the contacted object from collapsing and to enhance the stability of contact. Secondly, a simple structure is required for compact robots. Thirdly, to achieve dexterous and multifunctional robots, a sensor is needed that can obtain various types of tactile information simultaneously.</p>
<p>Many types of tactile sensors have been developed using various sensing elements such as resistive, capacitive, piezoelectric, ultrasonic or electromagnetic devices [
<xref rid="b1-sensors-14-05805" ref-type="bibr">1</xref>
,
<xref rid="b2-sensors-14-05805" ref-type="bibr">2</xref>
]. In order to estimate the slippage of a contacted object, a sensor with an array of strain gauges embedded in an elastic body has been proposed [
<xref rid="b3-sensors-14-05805" ref-type="bibr">3</xref>
]. Standing cantilevers and piezo resistors arrayed in an elastic body have been developed for detecting shear stress [
<xref rid="b4-sensors-14-05805" ref-type="bibr">4</xref>
]. Schmitz
<italic>et al.</italic>
have implemented twelve capacitance-to-digital converter (CDC) chips in a robot finger, providing twelve 16-bit measurements of capacitance [
<xref rid="b5-sensors-14-05805" ref-type="bibr">5</xref>
]. Sensing elements based on a capacitive method have been arrayed on conductive rubber at regular intervals for measuring three components of stress [
<xref rid="b6-sensors-14-05805" ref-type="bibr">6</xref>
].</p>
<p>However, the crucial practical issues remain unresolved. The structures of these sensors are complex and cannot satisfy the second requirement as described in the previous paragraph because theses sensors require many sensing elements and complicated wiring. Although a wire-free tactile sensor using transmitters/receivers [
<xref rid="b7-sensors-14-05805" ref-type="bibr">7</xref>
] and a sensor using micro coils whose impedance changes with contact force [
<xref rid="b8-sensors-14-05805" ref-type="bibr">8</xref>
] have been proposed, they are also packed in complex structures. Small sensors using microelectromechanical systems (MEMS) have been manufactured [
<xref rid="b9-sensors-14-05805" ref-type="bibr">9</xref>
<xref rid="b12-sensors-14-05805" ref-type="bibr">12</xref>
]. However, the surfaces of these sensors are minimally deformable and cannot satisfy the first requirement as described above.</p>
<p>Unlike these sensors, vision-based sensors are well suited to tactile sensing [
<xref rid="b13-sensors-14-05805" ref-type="bibr">13</xref>
<xref rid="b15-sensors-14-05805" ref-type="bibr">15</xref>
]. Typical vision-based sensors can satisfy the first and second requirements described above because they consist of two components: a deformable contact surface made of an elastic material that fits its shape to contacted objects, and a camera that observes the deformation of that surface. Since multiple sensing elements and complex wiring are not required, compact vision-based sensors can be fabricated easily. Analyzing the deformation of the surface yields multiple types of tactile information. Two layers of dot markers embedded in an elastic body have been used to visualize its three-dimensional deformation and thereby measure a three-axis contact force [
<xref rid="b16-sensors-14-05805" ref-type="bibr">16</xref>
,
<xref rid="b17-sensors-14-05805" ref-type="bibr">17</xref>
]. The markers are observed by a charge-coupled device (CCD) camera. Sensors consisting of rubber sheets with nubs, a transparent acrylic plate, a light source and a CCD camera have also been developed [
<xref rid="b18-sensors-14-05805" ref-type="bibr">18</xref>
,
<xref rid="b19-sensors-14-05805" ref-type="bibr">19</xref>
]. Light traveling through the transparent plate is diffusely reflected where the nubs contact the plate. The intensity of the reflected light captured by the CCD camera is transformed into the three-axis contact force. The sensor reported in [
<xref rid="b20-sensors-14-05805" ref-type="bibr">20</xref>
] has estimated the orientation of an object using the four corner positions of reflector chips embedded in the deformable surface of the sensor. However, these sensors cannot satisfy the third requirement because they detect only a single type of tactile information.</p>
<p>Moreover, although the sensors in the literature provide information such as the contact force, slippage and shape of an object, the contact region between the sensor and an object also gives crucial information. Combined with shape information from the sensor surface, the contact region allows the shapes of objects to be estimated accurately. Since geometric fit between the sensor and an object occurs only in the contact region, accurate estimation of an object's shape requires information about not only the sensor's shape but also the contact region. When the sensor is implemented in a robot hand and the contact region is small, the grasped object can be rotated easily because the feasible contact moment is also small. To avoid the risk that the sensor surface tears or a grasped object collapses, and to enhance the stability of a grasping task with a sufficiently large contact area, the contact pressure must be evaluated based on the contact region.</p>
<p>Among the many sensors that sense contact forces, those that measure the pressure distribution may detect the contact region from that distribution [
<xref rid="b16-sensors-14-05805" ref-type="bibr">16</xref>
<xref rid="b19-sensors-14-05805" ref-type="bibr">19</xref>
]. Ideally, the pressure is zero outside the contact region and nonzero inside it. However, this assumption is violated by the stiffness of the elastic body. Moreover, measuring the pressure distribution requires many arrayed sensing elements and wiring, as described above. Some sensors have been proposed to obtain the contact region directly. A sensor using regularly arrayed cantilevers has been developed to estimate the contact region from the deformation of the cantilevers in the elastic body [
<xref rid="b21-sensors-14-05805" ref-type="bibr">21</xref>
]. However, this sensor also requires many arrayed sensing elements, and the measurement error depends on the direction and position of the cantilevers. A large error can occur when the cantilevers are far from the contacted object. A finger-shaped tactile sensor based on optical phenomena has been developed for detecting the contact location [
<xref rid="b22-sensors-14-05805" ref-type="bibr">22</xref>
]. Light travels from optical fibers into a hemispherical optical waveguide inside the elastic sensor surface. When contact with an object presses the sensor surface against the internal optical waveguide, light is reflected in the contact region. A position-sensitive detector (PSD) receives the reflected light, and the contact region is detected from the PSD signals. However, the surfaces of these sensors cannot fit objects geometrically because, although the surface is elastic, the internal optical waveguide does not deform. Therefore, a large contact region cannot be generated, which leads to unstable contact.</p>
<p>We have proposed a vision-based tactile sensor that can sense multiple types of tactile information simultaneously including the slippage [
<xref rid="b23-sensors-14-05805" ref-type="bibr">23</xref>
,
<xref rid="b24-sensors-14-05805" ref-type="bibr">24</xref>
], contact region [
<xref rid="b25-sensors-14-05805" ref-type="bibr">25</xref>
], shape [
<xref rid="b26-sensors-14-05805" ref-type="bibr">26</xref>
], multi-axis contact force [
<xref rid="b27-sensors-14-05805" ref-type="bibr">27</xref>
], position [
<xref rid="b25-sensors-14-05805" ref-type="bibr">25</xref>
] and orientation [
<xref rid="b25-sensors-14-05805" ref-type="bibr">25</xref>
] of an object. We have applied this sensor to prevent the object from slipping [
<xref rid="b28-sensors-14-05805" ref-type="bibr">28</xref>
]. The sensor consists of a CCD camera, light-emitting diode (LED) lights and a hemispherical elastic touchpad for contacting the object. Because of its simple structure and deformable touchpad, our proposed sensor can satisfy the above three requirements: a deformable sensor surface, a simple structure, and simultaneous acquisition of various types of tactile information. However, the previous method for estimating the contact region was subject to the strict restriction that the contact surface of the object must be flat or convex [
<xref rid="b25-sensors-14-05805" ref-type="bibr">25</xref>
].</p>
<p>The purpose of this study is to estimate the contact region between the sensor and a contacted object without strict assumptions. The newly proposed method is based on the movements of dots printed on the surface of the sensor. The contact state of each dot is classified into three types: non-contacting, sticking and slipping. Considering the movements of the dots, equations are formulated to discriminate between contacting and non-contacting dots, and are refined by selecting an appropriate time interval and introducing threshold values. The set of contacting dots discriminated by these equations constructs the contact region. Next, an image-processing method is proposed to detect the dots in images of the sensor surface captured by the CCD camera. A method to assign numbers to the dots for calculating their displacements is also proposed. Finally, the proposed methods are validated by experimental results.</p>
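To make the dot-state idea above concrete, here is a minimal sketch of the discrimination step. It assumes, for illustration only, that the displacement magnitude of each tracked dot between two frames is the sole criterion; the function names, the pixel thresholds and the displacement-only rule are our assumptions, not the paper's actual formulation, which also accounts for noise and errors in the dot movements.

```python
import numpy as np

def classify_dots(prev_pos, curr_pos, stick_thresh=0.5, slip_thresh=2.0):
    """Classify each dot by its displacement between two frames.

    prev_pos, curr_pos: (N, 2) arrays of dot positions in pixels.
    Returns an integer label per dot:
      0 = non-contacting (displacement below the noise threshold),
      1 = sticking (moderate displacement, dot moves with the object),
      2 = slipping (large displacement relative to the object).
    Both thresholds are illustrative values, not from the paper.
    """
    disp = np.linalg.norm(curr_pos - prev_pos, axis=1)
    labels = np.zeros(len(disp), dtype=int)
    labels[disp >= stick_thresh] = 1   # contacting: sticking
    labels[disp >= slip_thresh] = 2    # contacting: slipping
    return labels

def contact_region(positions, labels):
    """The estimated contact region is the set of contacting dots."""
    return positions[labels > 0]
```

In this sketch the contact region is simply the point set of contacting dots; the paper constructs the region from that set.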
</sec>
<sec>
<label>2.</label>
<title>Vision-Based Tactile Sensor</title>
<p>
<xref rid="f1-sensors-14-05805" ref-type="fig">Figure 1</xref>
shows the configuration of the vision-based tactile sensor, which consists of a CCD camera, LED lights, and a deformable fluid-type touchpad. The dimensions of the LED lights and the CCD camera are 60 × 60 × 60 mm and 8 × 8 × 40 mm, respectively. The fluid-type touchpad is hemispherical, with a curvature radius of 20 mm and a height of 13 mm. The surface of the touchpad is an elastic membrane made of silicone rubber; the inside of the membrane is filled with translucent, red-colored water. A dot pattern is printed on the inside of the touchpad surface to observe its deformation. When the touchpad comes in contact with an object, analyzing the deformation in an image of the inside of the deformed touchpad captured by the CCD camera yields multimodal tactile information.
<xref rid="f2-sensors-14-05805" ref-type="fig">Figure 2</xref>
shows the captured images, with 640 × 480 effective pixels, when the touchpad is not in contact with an object and when it is, respectively. Our proposed sensor can obtain multiple types of tactile information, including the contact region, multi-axis contact force, slippage, and the shape, position and orientation of an object [
<xref rid="b23-sensors-14-05805" ref-type="bibr">23</xref>–<xref rid="b27-sensors-14-05805" ref-type="bibr">27</xref>
].</p>
</sec>
<sec>
<label>3.</label>
<title>Estimation of Contact Region</title>
<sec>
<label>3.1.</label>
<title>Theory for Estimating the Contact Region</title>
<p>In order to estimate the contact region between the touchpad and an object, we treat the dot patterns printed on the surface of the touchpad as sensing elements. If each dot contacting the object can be discriminated, the contact region can be constructed as the set of contacting dots, as shown in
<xref rid="f3-sensors-14-05805" ref-type="fig">Figure 3</xref>
. Although other sensors use many sensing devices and much wiring to obtain the contact region [
<xref rid="b16-sensors-14-05805" ref-type="bibr">16</xref>–<xref rid="b19-sensors-14-05805" ref-type="bibr">19</xref>
,
<xref rid="b21-sensors-14-05805" ref-type="bibr">21</xref>
], our approach is advantageous because it requires fewer sensing elements and less wiring, resulting in a more compact size and structure. Moreover, the method generalizes to other sensors with dots/markers on the sensor surface. Unlike discrete sensing devices, the dots can easily be printed at smaller sizes, so higher resolution is expected. The sensor for obtaining the contact region in [
<xref rid="b22-sensors-14-05805" ref-type="bibr">22</xref>
] cannot conform geometrically to objects because of its solid inner body. Our previous work in [
<xref rid="b25-sensors-14-05805" ref-type="bibr">25</xref>
] required the strict restriction that the contact surface of the object must be flat or convex. In contrast to these previous works, our sensor deforms because of its elastic touchpad, needs no large array of sensing devices, and imposes no strict assumptions on objects, which is also advantageous. In the next section, we discriminate the dots to construct the contact region.</p>
</sec>
<sec>
<label>3.2.</label>
<title>Discrimination of Dots</title>
<p>The contact state of a dot is classified into three types—the non-contacting dot, the sticking dot and the slipping dot. The non-contacting dot is a dot that lies outside of the contact region. The sticking dot is defined as a dot that is in the contact region but does not slip, while the slipping dot slips on an object in the contact region. The contacting dots include the sticking dots and the slipping dots. In order to construct the contact region, the multiple types of dots are discriminated, and the contacting dots are extracted.</p>
<p>To solve the problem of discriminating among the dots, we focus on dynamic information concerning the dots. Considering the movements of the dots with reference to an object, we formulate equations to discriminate between the contacting dots and the non-contacting dots. Here, calculation of the positions and movements of the dots is described in the next section.</p>
<p>Firstly, we address the discrimination of the sticking dots. When a certain dot is in contact with the object without slippage, the movement of the dot is geometrically determined by the movement of the object. A sticking dot must satisfy the following equation:
<disp-formula id="FD1">
<label>(1)</label>
<mml:math id="mm1">
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mo stretchy="false">(</mml:mo>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">)</mml:mo>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo stretchy="false">)</mml:mo>
<mml:mo>+</mml:mo>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo stretchy="false">)</mml:mo>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo stretchy="false">{</mml:mo>
<mml:mrow>
<mml:mi mathvariant="bold-italic">I</mml:mi>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mo stretchy="false">(</mml:mo>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
<mml:mo stretchy="false">}</mml:mo>
</mml:mrow>
<mml:mo stretchy="false">(</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi mathvariant="bold-italic">k</mml:mi>
</mml:msub>
<mml:mo stretchy="false">)</mml:mo>
</mml:mrow>
</mml:math>
</disp-formula>
where
<bold>
<italic>ω</italic>
</bold>
<italic>
<sub>obj</sub>
</italic>
is the rotation angle of the object, expressed as Euler angles,
<bold>
<italic>I</italic>
</bold>
is the identity matrix of size three;
<bold>
<italic>d</italic>
</bold>
<italic>
<sub>obj</sub>
</italic>
and
<bold>
<italic>d</italic>
</bold>
<italic>
<sub>k</sub>
</italic>
are the three-dimensional displacements of the object and a certain sticking dot
<italic>k</italic>
, respectively;
<bold>
<italic>p</italic>
</bold>
<italic>
<sub>obj</sub>
</italic>
and
<bold>
<italic>p</italic>
</bold>
<italic>
<sub>k</sub>
</italic>
are the three-dimensional positions of the object and the sticking dot
<italic>k</italic>
, respectively; and
<bold>
<italic>R</italic>
</bold>
(−
<bold>
<italic>ω</italic>
</bold>
<italic>
<sub>obj</sub>
</italic>
) is the 3 × 3 rotation matrix corresponding to the three-dimensional angle −
<bold>
<italic>ω</italic>
</bold>
<italic>
<sub>obj</sub>
</italic>
. Here, the rotation angle, the displacement and the position of the object are defined with respect to the weighted center of the object.
<xref rid="f4-sensors-14-05805" ref-type="fig">Figure 4</xref>
shows this geometric relationship between the object and the sticking dot
<italic>k</italic>
in
<xref rid="FD1" ref-type="disp-formula">Equation (1)</xref>
.</p>
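As a numerical illustration of Equation (1), the residual between its two sides can be evaluated for a candidate sticking dot; it is (approximately) zero when the dot moves rigidly with the object. This is a minimal sketch: the function names and the x-y-z Euler-angle composition order are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rot_matrix(omega):
    """Rotation matrix for Euler angles omega = (a, b, c) in radians,
    composed in x-y-z order (an assumed convention)."""
    a, b, c = omega
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(a), -np.sin(a)],
                   [0, np.sin(a),  np.cos(a)]])
    Ry = np.array([[ np.cos(b), 0, np.sin(b)],
                   [0, 1, 0],
                   [-np.sin(b), 0, np.cos(b)]])
    Rz = np.array([[np.cos(c), -np.sin(c), 0],
                   [np.sin(c),  np.cos(c), 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx

def sticking_dot_residual(d_obj, d_k, p_obj, p_k, omega_obj):
    """Residual of Equation (1):
    d_obj - [d_k + (I - R(-omega_obj)) (p_obj - p_k)]."""
    R = rot_matrix(-np.asarray(omega_obj, float))
    rhs = np.asarray(d_k, float) + (np.eye(3) - R) @ (
        np.asarray(p_obj, float) - np.asarray(p_k, float))
    return np.asarray(d_obj, float) - rhs
```

For a pure translation (omega_obj = 0) the rotation matrix reduces to the identity, and the residual is simply d_obj − d_k.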
<p>In order to apply
<xref rid="FD1" ref-type="disp-formula">Equation (1)</xref>
to all dots, we must obtain the rotation angle, the displacement and the position of the object. Although these values cannot be obtained directly, our previous method can estimate the rotation angle and the displacement of the object under the assumption that at least nine dots do not slip on the surface of the object [
<xref rid="b25-sensors-14-05805" ref-type="bibr">25</xref>
]. This assumption is achieved by preventing the object from slipping when we apply our previous method [
<xref rid="b28-sensors-14-05805" ref-type="bibr">28</xref>
].</p>
<p>We introduce the contact reference dot previously proposed in [
<xref rid="b25-sensors-14-05805" ref-type="bibr">25</xref>
] to calculate the rotation angle and the displacement of the object. From among the multiple dots printed on the surface of the touchpad, the contact reference dot
<italic>k
<sub>ref</sub>
</italic>
is defined in the following equation as the dot with the greatest displacement from the initial position:
<disp-formula id="FD2">
<label>(2)</label>
<mml:math id="mm2">
<mml:mrow>
<mml:msub>
<mml:mi>k</mml:mi>
<mml:mrow>
<mml:mi>r</mml:mi>
<mml:mi>e</mml:mi>
<mml:mi>f</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:munder>
<mml:mrow>
<mml:mo>arg</mml:mo>
<mml:mo>max</mml:mo>
</mml:mrow>
<mml:mi>k</mml:mi>
</mml:munder>
<mml:mrow>
<mml:mo>|</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mn>0</mml:mn>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>|</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
where
<bold>
<italic>p</italic>
</bold>
<italic>
<sub>k</sub>
</italic>
is the three-dimensional position of a dot
<italic>k</italic>
, and
<italic>t
<sub>c</sub>
</italic>
and
<italic>t</italic>
<sub>0</sub>
are the current time and the initial time at which the touchpad is not in contact with any objects, respectively.</p>
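Equation (2) is simply an argmax over the dot displacements from the initial, contact-free configuration. A minimal sketch (the array names are illustrative, not from the paper):

```python
import numpy as np

def contact_reference_dot(p_t0, p_tc):
    """Index of the contact reference dot k_ref per Equation (2).

    p_t0: (N, 3) dot positions at the initial time t0 (no contact).
    p_tc: (N, 3) dot positions at the current time tc.
    Returns the index of the dot with the greatest displacement."""
    displacement = np.linalg.norm(np.asarray(p_tc) - np.asarray(p_t0), axis=1)
    return int(np.argmax(displacement))
```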
<p>The contact reference dot is defined such that it is always in contact with the object. Moreover, the contact reference dot does not slip on the surface of a contacted object, because our previous method can be applied to prevent the object from slipping [
<xref rid="b28-sensors-14-05805" ref-type="bibr">28</xref>
]. As a result of this characteristic, the displacement and the rotation angle of the contact reference dot are equal to those of the object as demonstrated in an earlier manuscript [
<xref rid="b25-sensors-14-05805" ref-type="bibr">25</xref>
]. Therefore, the contact reference dot always satisfies the following equations based on
<xref rid="FD1" ref-type="disp-formula">Equation (1)</xref>
:</p>
<disp-formula id="FD3">
<label>(3)</label>
<mml:math id="mm3">
<mml:mrow>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mtable columnalign="left">
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mi mathvariant="italic">obj</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mi mathvariant="italic">ref</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:mi mathvariant="bold-italic">I</mml:mi>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mi mathvariant="italic">ref</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:mi mathvariant="bold-italic">I</mml:mi>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
<p>where
<bold>
<italic>ω</italic>
</bold>
<italic>
<sub>ref</sub>
</italic>
,
<bold>
<italic>d</italic>
</bold>
<italic>
<sub>ref</sub>
</italic>
and
<bold>
<italic>p</italic>
</bold>
<italic>
<sub>ref</sub>
</italic>
are the rotation angle, the displacement and the position of the contact reference dot
<italic>k
<sub>ref</sub>
</italic>
.</p>
<p>Next, we calculate the position of the object. When
<bold>
<italic>ω</italic>
</bold>
<italic>
<sub>obj</sub>
</italic>
=
<bold>
<italic>ω</italic>
</bold>
<italic>
<sub>ref</sub>
</italic>
=
<bold>
<italic>0</italic>
</bold>
without rotation,
<bold>
<italic>p</italic>
</bold>
<italic>
<sub>obj</sub>
</italic>
is eliminated in
<xref rid="FD1" ref-type="disp-formula">Equations (1)</xref>
and
<xref rid="FD3" ref-type="disp-formula">(3)</xref>
because
<bold>
<italic>R</italic>
</bold>
(–
<bold>
<italic>ω</italic>
</bold>
<italic>
<sub>obj</sub>
</italic>
) =
<bold>
<italic>R</italic>
</bold>
(–
<bold>
<italic>ω</italic>
</bold>
<italic>
<sub>ref</sub>
</italic>
) =
<bold>
<italic>I</italic>
</bold>
. When
<bold>
<italic>ω</italic>
</bold>
<italic>
<sub>ref</sub>
</italic>
is not equal to zero,
<bold>
<italic>p</italic>
</bold>
<italic>
<sub>obj</sub>
</italic>
is given as follows by transforming
<xref rid="FD3" ref-type="disp-formula">Equation (3)</xref>
:
<disp-formula id="FD4">
<label>(4)</label>
<mml:math id="mm4">
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:mi mathvariant="bold-italic">I</mml:mi>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mo>(</mml:mo>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mi mathvariant="italic">ref</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msup>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
Therefore, substituting
<xref rid="FD4" ref-type="disp-formula">Equation (4)</xref>
into
<xref rid="FD1" ref-type="disp-formula">Equation (1)</xref>
can eliminate the position of the object
<bold>
<italic>p</italic>
</bold>
<italic>
<sub>obj</sub>
</italic>
as follows:
<disp-formula id="FD5">
<label>(5)</label>
<mml:math id="mm5">
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:mtable columnalign="left">
<mml:mtr columnalign="left">
<mml:mtd columnalign="left">
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mi mathvariant="bold-italic">k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:mtd>
<mml:mtd columnalign="left">
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mi mathvariant="italic">ref</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mi mathvariant="bold-italic">I</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr columnalign="left">
<mml:mtd columnalign="left">
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:mi mathvariant="bold-italic">I</mml:mi>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:mi mathvariant="bold-italic">I</mml:mi>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msup>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd columnalign="left">
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold-italic">I</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
Moreover, substituting
<bold>
<italic>ω</italic>
</bold>
<italic>
<sub>obj</sub>
</italic>
=
<bold>
<italic>ω</italic>
</bold>
<italic>
<sub>ref</sub>
</italic>
from
<xref rid="FD3" ref-type="disp-formula">Equation (3)</xref>
into
<xref rid="FD5" ref-type="disp-formula">Equation (5)</xref>
eliminates the rotation angle of the object
<bold>
<italic>ω</italic>
</bold>
<italic>
<sub>obj</sub>
</italic>
as follows:
<disp-formula id="FD6">
<label>(6)</label>
<mml:math id="mm6">
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:mtable columnalign="left">
<mml:mtr columnalign="left">
<mml:mtd columnalign="left">
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:mtd>
<mml:mtd columnalign="left">
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mi mathvariant="bold-italic">I</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr columnalign="left">
<mml:mtd columnalign="left">
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">obj</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:mi mathvariant="bold-italic">I</mml:mi>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="bold-italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd columnalign="left">
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="bold-italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold-italic">I</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
Note that
<bold>
<italic>d</italic>
</bold>
<italic>
<sub>obj</sub>
</italic>
=
<bold>
<italic>d</italic>
</bold>
<italic>
<sub>ref</sub>
</italic>
when
<bold>
<italic>R</italic>
</bold>
(−
<bold>
<italic>ω</italic>
</bold>
<italic>
<sub>ref</sub>
</italic>
) =
<bold>
<italic>I</italic>
</bold>
without rotation from
<xref rid="FD3" ref-type="disp-formula">Equation (3)</xref>
. Therefore, we can simplify
<xref rid="FD6" ref-type="disp-formula">Equation (6)</xref>
into the following equation:
<disp-formula id="FD7">
<label>(7)</label>
<mml:math id="mm7">
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:mi mathvariant="bold-italic">I</mml:mi>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
Consequently, a sticking dot can be discriminated by considering the relative displacement between the dot and the contact reference dot based on
<xref rid="FD7" ref-type="disp-formula">Equation (7)</xref>
.</p>
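In practice Equation (7) holds only up to noise, so the sticking-dot test is naturally written with a tolerance. This is a minimal sketch; the tolerance value is illustrative (Section 3.3 introduces the paper's actual threshold treatment), and `R_neg_omega_ref` stands for the matrix R(−ω_ref):

```python
import numpy as np

def satisfies_eq7(d_ref, d_k, p_ref, p_k, R_neg_omega_ref, tol=1e-3):
    """True when d_ref ~= d_k + (I - R(-omega_ref)) (p_ref - p_k),
    i.e., when dot k is consistent with sticking per Equation (7)."""
    rhs = np.asarray(d_k, float) + (np.eye(3) - np.asarray(R_neg_omega_ref)) @ (
        np.asarray(p_ref, float) - np.asarray(p_k, float))
    return bool(np.linalg.norm(np.asarray(d_ref, float) - rhs) < tol)
```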
<p>Secondly, the discrimination of non-contacting dots is considered. When a dot is not in contact with the object, the dot does not satisfy
<xref rid="FD7" ref-type="disp-formula">Equation (7)</xref>
as follows:</p>
<disp-formula id="FD8">
<label>(8)</label>
<mml:math id="mm8">
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:mi mathvariant="bold-italic">I</mml:mi>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
<p>Thirdly, we consider the more difficult problem of discriminating the slipping dots: because of slippage, their displacements are not equal to the displacement of the object. The approach used in this paper is to use the normal component of the displacement, which is perpendicular to the surface of the object. Although a slipping dot slips on the surface of the object, it does not move in the direction normal to that surface, so the normal component of its displacement is independent of the slippage and equal to that of the object, as follows:
<disp-formula id="FD9">
<label>(9)</label>
<mml:math id="mm9">
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">n</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:mi mathvariant="bold-italic">I</mml:mi>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo></mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">n</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</disp-formula>
where
<bold>
<italic>n</italic>
</bold>
<italic>
<sub>k</sub>
</italic>
is the unit vector perpendicular to the surface of the object around the position of a slipping dot
<italic>k</italic>
. The value of
<bold>
<italic>n</italic>
</bold>
<italic>
<sub>k</sub>
</italic>
is calculated by using the three-dimensional shape of the touchpad based on the method published in a previous manuscript [
<xref rid="b26-sensors-14-05805" ref-type="bibr">26</xref>
].</p>
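Equation (9) compares only the components of the displacements along the surface normal n_k, so a dot that slips tangentially can still satisfy it. A minimal sketch under the same illustrative conventions as Equation (7) (names and tolerance are assumptions):

```python
import numpy as np

def satisfies_eq9(d_ref, d_k, p_ref, p_k, n_k, R_neg_omega_ref, tol=1e-3):
    """True when the normal components of both sides of Equation (9)
    agree; n_k is the unit normal to the object surface at dot k."""
    rhs = np.asarray(d_k, float) + (np.eye(3) - np.asarray(R_neg_omega_ref)) @ (
        np.asarray(p_ref, float) - np.asarray(p_k, float))
    lhs_n = float(np.dot(np.asarray(d_ref, float), n_k))
    rhs_n = float(np.dot(rhs, n_k))
    return bool(abs(lhs_n - rhs_n) < tol)
```

With R(−ω_ref) = I the right-hand side reduces to d_k, so two displacements that differ only tangentially still pass the test.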
<p>We have formulated the conditions
<xref ref-type="disp-formula" rid="FD7">Equations (7)</xref>–<xref ref-type="disp-formula" rid="FD9">(9)</xref>
for discriminating among sticking dots, non-contacting dots, and slipping dots. In fact, the estimation of the contact region only requires the discrimination between the non-contacting dots and the contacting dots. However, the non-contacting dots may satisfy
<xref ref-type="disp-formula" rid="FD8">Equations (8)</xref>
and
<xref ref-type="disp-formula" rid="FD9">(9)</xref>
simultaneously. In order to avoid this problem,
<xref ref-type="disp-formula" rid="FD7">Equations (7)</xref>
and
<xref ref-type="disp-formula" rid="FD9">(9)</xref>
are applied to discriminate among the dots depending on the previous (one sampling step earlier) contact state of the dots. When a dot was a non-contacting dot in the previous state, we apply
<xref rid="FD7" ref-type="disp-formula">Equation (7)</xref>
to determine the current contact state. The dot satisfying
<xref rid="FD7" ref-type="disp-formula">Equation (7)</xref>
is regarded as the contacting dot. Otherwise, it is regarded as a non-contacting dot. When a dot was the contacting dot in the previous state,
<xref rid="FD9" ref-type="disp-formula">Equation (9)</xref>
is applied. The dot satisfying
<xref rid="FD9" ref-type="disp-formula">Equation (9)</xref>
is regarded as a contacting dot. Otherwise, it is regarded as a non-contacting dot. In the following section, the three-dimensional positions and displacements of the dots are calculated, and
<xref ref-type="disp-formula" rid="FD7">Equations (7)</xref>
and
<xref ref-type="disp-formula" rid="FD9">(9)</xref>
are modified for accuracy.</p>
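The state-dependent rule described above amounts to a small per-dot state machine: previously non-contacting dots are tested with Equation (7), previously contacting dots with Equation (9). A compact sketch (function and flag names are illustrative):

```python
def update_contact_state(was_contacting, eq7_holds, eq9_holds):
    """One update step of the per-dot contact label.

    was_contacting: label from one sampling step earlier.
    eq7_holds / eq9_holds: boolean results of the Equation (7) / (9)
    checks for this dot.
    Returns True if the dot is regarded as contacting now."""
    # Non-contacting dots are tested for sticking (Equation (7));
    # contacting dots are tested via the normal component (Equation (9)).
    return eq9_holds if was_contacting else eq7_holds
```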
<sec>
<label>3.3.</label>
<title>Modification of Equations Discriminating among the Dots</title>
<p>In order to calculate the three-dimensional positions of dots, the three-dimensional shape of the surface of the touchpad is used, which is estimated in our previous research reported in [
<xref rid="b26-sensors-14-05805" ref-type="bibr">26</xref>
]. Images captured by the CCD camera contain the two-dimensional positions of the dots. By combining the three-dimensional shape of the surface of the touchpad with the two-dimensional positions of the dots in the images, the three-dimensional positions of the dots are calculated based on the geometric relationship described in [
<xref rid="b25-sensors-14-05805" ref-type="bibr">25</xref>
].</p>
<p>Next, the displacements of the dots are calculated as the changes in positions from the previous time
<italic>t
<sub>p</sub>
</italic>
to the current time
<italic>t
<sub>c</sub>
</italic>
. Here, the value for the previous time
<italic>t
<sub>p</sub>
</italic>
must be selected appropriately, because a long elapsed time decreases the responsiveness of the proposed method. Moreover, the contact state of a dot may change between the previous time
<italic>t
<sub>p</sub>
</italic>
and the current time
<italic>t
<sub>c</sub>
</italic>
when
<italic>t
<sub>c</sub>
</italic>
−
<italic>t
<sub>p</sub>
</italic>
and the displacement of the dot are significantly large. Conversely, if the displacement of the dot is too small because of small
<italic>t
<sub>c</sub>
</italic>
−
<italic>t
<sub>p</sub>
</italic>
, the value is also inappropriate, because it is strongly influenced by noise and by the error in estimating the positions of the dots. To resolve this trade-off, we introduce a threshold value
<italic>D</italic>
<sub>max</sub>
for determining
<italic>t
<sub>p</sub>
</italic>
as given in the following equation:
<disp-formula id="FD10">
<label>(10)</label>
<mml:math id="mm10">
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>p</mml:mi>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mo>max</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>t</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mi>s</mml:mi>
<mml:mo>.</mml:mo>
<mml:mi>t</mml:mi>
<mml:mo>.</mml:mo>
<mml:msub>
<mml:mi>D</mml:mi>
<mml:mrow>
<mml:mo>max</mml:mo>
</mml:mrow>
</mml:msub>
<mml:mo><</mml:mo>
<mml:mo>max</mml:mo>
<mml:mrow>
<mml:mo>|</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>p</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>|</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>k</mml:mi>
<mml:mo></mml:mo>
<mml:mi>K</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
where
<italic>K</italic>
is the set of indices of all dots, and
<xref ref-type="disp-formula" rid="FD10">Equation (10)</xref>
indicates that the previous time
<italic>t
<sub>p</sub>
</italic>
is determined such that the displacement of the dot is large enough, while the most recent such time is selected to make
<italic>t
<sub>c</sub>
</italic>
−
<italic>t
<sub>p</sub>
</italic>
smaller. Finally, the displacements of the dots in
<xref ref-type="disp-formula" rid="FD7">Equations (7)</xref>
and
<xref ref-type="disp-formula" rid="FD9">(9)</xref>
are defined as follows:
<disp-formula id="FD11">
<label>(11)</label>
<mml:math id="mm11">
<mml:mrow>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mtable columnalign="left">
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>p</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>p</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
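The selection rule of Equation (10) can be sketched as follows. This is a hedged Python sketch assuming that past dot positions are stored per sampling time; the names `history` and `select_previous_time` are illustrative, not from the paper.

```python
def select_previous_time(history, t_c, D_max):
    """Pick the previous time t_p per Equation (10): the most recent past
    time at which the largest dot displacement relative to t_c exceeds
    D_max, so displacements are large enough to be robust against noise
    while t_c - t_p stays small.

    history: dict mapping sampling time -> {dot id: (x, y, z) position}
    Returns t_p, or None if no past time qualifies yet.
    """
    def max_displacement(t_p):
        # max over all dots k of |p_k(t_c) - p_k(t_p)|
        return max(
            sum((a - b) ** 2 for a, b in zip(history[t_c][k], history[t_p][k])) ** 0.5
            for k in history[t_c]
        )

    candidates = [t for t in history if t < t_c and max_displacement(t) > D_max]
    return max(candidates) if candidates else None
```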
<p>Here,
<xref ref-type="disp-formula" rid="FD7">Equations (7)</xref>
and
<xref ref-type="disp-formula" rid="FD9">(9)</xref>
cannot be applied directly, because the calculated positions of the dots include estimation errors due to the shape-sensing error [
<xref rid="b26-sensors-14-05805" ref-type="bibr">26</xref>
] and noise in the captured image. Therefore,
<xref rid="FD7" ref-type="disp-formula">Equation (7)</xref>
is modified to decrease the effects of the estimation error of the positions as follows:
<disp-formula id="FD12">
<label>(12)</label>
<mml:math id="mm12">
<mml:mrow>
<mml:mrow>
<mml:mo>|</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>p</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>p</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:mi mathvariant="bold-italic">I</mml:mi>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>p</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>|</mml:mo>
</mml:mrow>
<mml:mo><</mml:mo>
<mml:msub>
<mml:mi>δ</mml:mi>
<mml:mi>d</mml:mi>
</mml:msub>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mi>d</mml:mi>
<mml:mrow>
<mml:mo>max</mml:mo>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mi>D</mml:mi>
<mml:mrow>
<mml:mo>max</mml:mo>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
</disp-formula>
where
<bold>
<italic>ω</italic>
</bold>
<italic>
<sub>ref</sub>
</italic>
(
<italic>t
<sub>c</sub>
</italic>
,
<italic>t
<sub>p</sub>
</italic>
) is the rotation angle of the object from the previous time
<italic>t
<sub>p</sub>
</italic>
to the current time
<italic>t
<sub>c</sub>
</italic>
.
<bold>
<italic>d</italic>
</bold>
<italic>
<sub>ref</sub>
</italic>
(
<italic>t
<sub>c</sub>
</italic>
,
<italic>t
<sub>p</sub>
</italic>
) and
<bold>
<italic>d</italic>
</bold>
<italic>
<sub>k</sub>
</italic>
(
<italic>t
<sub>c</sub>
</italic>
,
<italic>t
<sub>p</sub>
</italic>
) are the displacements of the contact reference dot and the dot
<italic>k</italic>
from the previous time
<italic>t
<sub>p</sub>
</italic>
to the current time
<italic>t
<sub>c</sub>
</italic>
.
<bold>
<italic>δ</italic>
</bold>
<italic>
<sub>d</sub>
</italic>
is a threshold value, and
<italic>d</italic>
<sub>max</sub>
is defined as follows:
<disp-formula id="FD13">
<label>(13)</label>
<mml:math id="mm13">
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi>d</mml:mi>
<mml:mrow>
<mml:mo>max</mml:mo>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mo>max</mml:mo>
<mml:mrow>
<mml:mo>|</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>p</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>|</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>k</mml:mi>
<mml:mo></mml:mo>
<mml:mi>K</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:math>
</disp-formula>
Here,
<italic>d</italic>
<sub>max</sub>
/
<italic>D</italic>
<sub>max</sub>
in
<xref rid="FD12" ref-type="disp-formula">Equation (12)</xref>
normalizes the sensitivity of the threshold, compensating for the small difference between max|
<bold>
<italic>p</italic>
</bold>
<italic>
<sub>k</sub>
</italic>
(
<italic>t
<sub>c</sub>
</italic>
)−
<bold>
<italic>p</italic>
</bold>
<italic>
<sub>k</sub>
</italic>
(
<italic>t
<sub>p</sub>
</italic>
)| (=
<italic>d</italic>
<sub>max</sub>
) and
<italic>D</italic>
<sub>max</sub>
in
<xref rid="FD10" ref-type="disp-formula">Equation (10)</xref>
, when the previous time
<italic>t
<sub>p</sub>
</italic>
is selected. We also modify
<xref rid="FD9" ref-type="disp-formula">Equation (9)</xref>
as follows:
<disp-formula id="FD14">
<label>(14)</label>
<mml:math id="mm14">
<mml:mrow>
<mml:mrow>
<mml:mo>|</mml:mo>
<mml:mtable columnalign="left">
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>p</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">n</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mo></mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">d</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>p</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:mi mathvariant="bold-italic">I</mml:mi>
<mml:mo></mml:mo>
<mml:mi mathvariant="bold-italic">R</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">ω</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>p</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mrow>
<mml:mtext mathvariant="italic">ref</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">p</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">n</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>t</mml:mi>
<mml:mi>c</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
<mml:mo>|</mml:mo>
</mml:mrow>
<mml:mo><</mml:mo>
<mml:msub>
<mml:mi>δ</mml:mi>
<mml:mi>d</mml:mi>
</mml:msub>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mi>d</mml:mi>
<mml:mrow>
<mml:mo>max</mml:mo>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mi>D</mml:mi>
<mml:mrow>
<mml:mo>max</mml:mo>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
</disp-formula>
Here,
<bold>
<italic>n</italic>
</bold>
<italic>
<sub>k</sub>
</italic>
(
<italic>t
<sub>c</sub>
</italic>
) is the unit vector perpendicular to the surface of the object around the position of a slipping dot
<italic>k</italic>
at
<italic>t
<sub>c</sub>
</italic>
. By applying
<xref ref-type="disp-formula" rid="FD12">Equations (12)</xref>
and
<xref ref-type="disp-formula" rid="FD14">(14)</xref>
to the dots, we discriminate between the non-contacting dots and the contacting dots.</p>
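The modified test of Equation (12) can be illustrated in two dimensions as follows. This is a hedged, simplified sketch (the paper works in three dimensions): `omega` is the scalar 2-D rotation angle standing in for ω<italic><sub>ref</sub></italic>, and the explicit rotation matrix stands in for <bold><italic>R</italic></bold>; all names are illustrative.

```python
import math

def is_contacting(d_ref, d_k, omega, p_ref_c, p_k_c, delta_d, d_max, D_max):
    """Two-dimensional sketch of the modified test in Equation (12).

    d_ref, d_k:     displacements of the contact reference dot and dot k
    omega:          rotation angle from t_p to t_c (scalar in 2-D)
    p_ref_c, p_k_c: current positions of the reference dot and dot k
    The displacement of dot k is corrected by (I - R(-omega)) applied to
    the relative position, then compared against the normalized threshold
    delta_d * d_max / D_max.
    """
    c, s = math.cos(-omega), math.sin(-omega)
    rx = p_ref_c[0] - p_k_c[0]
    ry = p_ref_c[1] - p_k_c[1]
    # (I - R(-omega)) (p_ref(t_c) - p_k(t_c))
    corr_x = rx - (c * rx - s * ry)
    corr_y = ry - (s * rx + c * ry)
    ex = d_ref[0] - (d_k[0] + corr_x)
    ey = d_ref[1] - (d_k[1] + corr_y)
    return math.hypot(ex, ey) < delta_d * d_max / D_max
```

With zero rotation the correction vanishes and the test reduces to comparing |d_ref − d_k| against the normalized threshold.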
<p>Note that although slipping dots are distinguished temporarily so that contacting dots can be discriminated appropriately, the estimation of the contact region ultimately requires only the discrimination between contacting and non-contacting dots. Slipping dots need not be treated separately, because
<xref ref-type="disp-formula" rid="FD12">Equations (12)</xref>
and
<xref ref-type="disp-formula" rid="FD14">(14)</xref>
alone complete the discrimination.</p>
</sec>
<sec>
<label>3.4.</label>
<title>Image Processing for Detecting the Dots in Captured Images</title>
<p>In the previous sections, we proposed a method that uses the two-dimensional positions of the dots in images captured by the CCD camera to calculate their three-dimensional positions. The accuracy of the proposed method therefore depends on how accurately the dots are detected in the images. An image processing method is proposed to detect the dots accurately. The following six steps yield the positions of the dots, as shown in
<xref rid="f5-sensors-14-05805" ref-type="fig">Figure 5</xref>
.</p>
<p>
<list list-type="simple">
<list-item>
<label>Step 1:</label>
<p>The captured color image is transformed into a gray scale image.</p>
</list-item>
<list-item>
<label>Step 2:</label>
<p>The contrast in the gray scale image is emphasized.</p>
</list-item>
<list-item>
<label>Step 3:</label>
<p>The regions of the dots are extracted by binarizing the emphasized image. The extracted regions are represented by the black color in
<xref rid="f5-sensors-14-05805" ref-type="fig">Figure 5</xref>
.</p>
</list-item>
<list-item>
<label>Step 4:</label>
<p>The brightness of the gray scale image is inverted.</p>
</list-item>
<list-item>
<label>Step 5:</label>
<p>The brightness of the inverted image is extracted in the regions of the dots.</p>
</list-item>
<list-item>
<label>Step 6:</label>
<p>The position of each dot is obtained by calculating the brightness center of the region based on the 0th- and 1st-order moments as follows:
<disp-formula id="FD15">
<label>(15)</label>
<mml:math id="mm15">
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>x</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>y</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mi>m</mml:mi>
<mml:mrow>
<mml:mi>k</mml:mi>
<mml:mo>,</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>,</mml:mo>
<mml:mn>0</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mi>m</mml:mi>
<mml:mrow>
<mml:mi>k</mml:mi>
<mml:mo>,</mml:mo>
<mml:mn>0</mml:mn>
<mml:mo>,</mml:mo>
<mml:mn>0</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
<mml:mo>,</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mi>m</mml:mi>
<mml:mrow>
<mml:mi>k</mml:mi>
<mml:mo>,</mml:mo>
<mml:mn>0</mml:mn>
<mml:mo>,</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mi>m</mml:mi>
<mml:mrow>
<mml:mi>k</mml:mi>
<mml:mo>,</mml:mo>
<mml:mn>0</mml:mn>
<mml:mo>,</mml:mo>
<mml:mn>0</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
where (
<italic>x
<sub>k</sub>
</italic>
,
<italic>y
<sub>k</sub>
</italic>
) are the
<italic>x</italic>
- and
<italic>y</italic>
-directional positions of the dot
<italic>k</italic>
in the image. The
<italic>v</italic>
- and
<italic>w</italic>
-order moments are defined as follows:
<disp-formula id="FD16">
<label>(16)</label>
<mml:math id="mm16">
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi>m</mml:mi>
<mml:mrow>
<mml:mi>k</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>v</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>w</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:munder>
<mml:mtext></mml:mtext>
<mml:mi>i</mml:mi>
</mml:munder>
<mml:mrow>
<mml:munder>
<mml:mtext></mml:mtext>
<mml:mi>j</mml:mi>
</mml:munder>
<mml:mrow>
<mml:mi>I</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:msup>
<mml:mi>i</mml:mi>
<mml:mi>v</mml:mi>
</mml:msup>
<mml:msup>
<mml:mi>j</mml:mi>
<mml:mi>w</mml:mi>
</mml:msup>
</mml:mrow>
</mml:mrow>
</mml:mrow>
</mml:mtd>
<mml:mtd>
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi>R</mml:mi>
<mml:mi>k</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:math>
</disp-formula>
where
<italic>I</italic>
(
<italic>i</italic>
,
<italic>j</italic>
) is the inverted brightness of the pixel (
<italic>i</italic>
,
<italic>j</italic>
), and
<italic>R
<sub>k</sub>
</italic>
is the region of the dot
<italic>k</italic>
extracted in Step 3. Here, inverting the brightness of the image in Step 4 increases the detection accuracy of the dots. Note that the border of the extracted region of each dot may contain errors caused by binarizing the image. If the brightness of the image were not inverted, the brightness at the border of the region would be larger than that at its center; the border would then dominate the moments and decrease accuracy.</p>
</list-item>
</list>
</p>
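Steps 4-6 can be sketched as follows. This is a minimal Python illustration of Equations (15) and (16), under the assumptions that the image is given as a list of rows of 0-255 gray values and that the region mask <italic>R<sub>k</sub></italic> has already been extracted in Step 3; the function name is illustrative.

```python
def brightness_center(gray, region):
    """Brightness-weighted center of one dot region (Equation (15)).

    gray:   gray-scale image as a list of rows of 0-255 values
    region: boolean mask R_k of the dot region extracted in Step 3
    The brightness is inverted (Step 4) so that the dark center of the
    dot, rather than its noisy binarized border, dominates the moments
    of Equation (16).
    """
    m00 = m10 = m01 = 0.0
    for i, row in enumerate(gray):
        for j, g in enumerate(row):
            if region[i][j]:
                inv = 255.0 - g          # Step 4: inverted brightness I(i, j)
                m00 += inv               # 0th-order moment m_{k,0,0}
                m10 += inv * i           # 1st-order moments m_{k,1,0}, m_{k,0,1}
                m01 += inv * j
    return m10 / m00, m01 / m00
```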
</sec>
<sec>
<label>3.5.</label>
<title>Numbering the Detected Dots</title>
<p>The method introduced in the previous section yields the positions of the dots. However, these data alone are insufficient for calculating the displacements of the dots: to obtain a displacement, the same dot must be identified in two different images. One approach is to track the position of each dot starting from its previous position. However, when the dots move quickly and their displacements are large, as shown by the difference between the images in
<xref rid="f2-sensors-14-05805" ref-type="fig">Figure 2</xref>
, tracking methods may fail, and a different dot may be misidentified as the tracked one.</p>
<p>As an alternative to the tracking methods, the approach applied in this paper is to assign an identification number to each dot. When each dot has a fixed number, it can be easily identified even if it moves quickly. The following five steps are applied to assign numbers to the dots as shown in
<xref rid="f6-sensors-14-05805" ref-type="fig">Figure 6</xref>
:
<list list-type="simple">
<list-item>
<label>Step 1:</label>
<p>The constant points,
<italic>Q
<sub>x</sub>
</italic>
and
<italic>Q
<sub>y</sub>
</italic>
, are defined.</p>
</list-item>
<list-item>
<label>Step 2:</label>
<p>The dots
<italic>D
<sub>x</sub>
</italic>
<sub>,1</sub>
and
<italic>D
<sub>y</sub>
</italic>
<sub>,1</sub>
, which are the dots nearest to
<italic>Q
<sub>x</sub>
</italic>
and
<italic>Q
<sub>y</sub>
</italic>
, respectively, are selected.</p>
</list-item>
<list-item>
<label>Step 3:</label>
<p>The dots
<italic>D
<sub>x</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
and
<italic>D
<sub>y</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
are located starting from
<italic>D
<sub>x</sub>
</italic>
<sub>,1</sub>
and
<italic>D
<sub>y</sub>
</italic>
<sub>,1</sub>
, respectively, in order of increasing
<italic>k</italic>
. Here,
<italic>D
<sub>x</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
and
<italic>D
<sub>y</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
are found near the positions
<bold>
<italic>P</italic>
</bold>
<italic>'
<sub>x</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
and
<bold>
<italic>P</italic>
</bold>
<italic>'
<sub>y</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
, respectively, which predict the positions of
<italic>D
<sub>x</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
and
<italic>D
<sub>y</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
. Because the distance between adjacent dots changes only gradually as the next dots are searched, the following approximations hold:
<disp-formula id="FD17">
<label>(17)</label>
<mml:math id="mm17">
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
<mml:mo></mml:mo>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
<mml:mo></mml:mo>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:math>
</disp-formula>
where
<bold>
<italic>P</italic>
</bold>
<italic>
<sub>x</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
and
<bold>
<italic>P</italic>
</bold>
<italic>
<sub>y</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
are the positions of
<italic>D
<sub>x</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
and
<italic>D
<sub>y</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
, respectively. Therefore, the positions
<bold>
<italic>P</italic>
</bold>
<italic>'
<sub>x</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
and
<bold>
<italic>P</italic>
</bold>
<italic>'
<sub>y</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
predicting
<bold>
<italic>P</italic>
</bold>
<italic>
<sub>x</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
and
<bold>
<italic>P</italic>
</bold>
<italic>
<sub>y</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
are defined as follows:
<disp-formula id="FD18">
<label>(18)</label>
<mml:math id="mm18">
<mml:mrow>
<mml:mtable>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:msub>
<mml:mo>'</mml:mo>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>x</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
<mml:mo></mml:mo>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:mrow>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:msub>
<mml:mo>'</mml:mo>
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
<mml:mo></mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>y</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>k</mml:mi>
<mml:mo></mml:mo>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:math>
</disp-formula>
<bold>
<italic>P</italic>
</bold>
<italic>'
<sub>x</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
and
<bold>
<italic>P</italic>
</bold>
<italic>'
<sub>y</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
can predict
<bold>
<italic>P</italic>
</bold>
<italic>
<sub>x</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
and
<bold>
<italic>P</italic>
</bold>
<italic>
<sub>y</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
accurately even if the surface of the touchpad is significantly deformed because of the relationships defined in
<xref rid="FD17" ref-type="disp-formula">Equation (17)</xref>
. When
<italic>k</italic>
is 2, a constant vector is applied instead of (
<bold>
<italic>P</italic>
</bold>
<italic>
<sub>x</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
<sub>−1</sub>
<bold>
<italic>P</italic>
</bold>
<italic>
<sub>x</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
<sub>−2</sub>
) and (
<bold>
<italic>P</italic>
</bold>
<italic>
<sub>y</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
<sub>−1</sub>
<bold>
<italic>P</italic>
</bold>
<italic>
<sub>y</sub>
</italic>
<sub>,</sub>
<italic>
<sub>k</sub>
</italic>
<sub>−2</sub>
) in
<xref rid="FD18" ref-type="disp-formula">Equation (18)</xref>
.</p>
</list-item>
<list-item>
<label>Step 4:</label>
<p>The central dot
<italic>D
<sub>c</sub>
</italic>
is identified, which corresponds to
<italic>D
<sub>x</sub>
</italic>
<sub>,</sub>
<italic>
<sub>v</sub>
</italic>
and
<italic>D
<sub>y,w</sub>
</italic>
detected in Step 3, where
<italic>v</italic>
and
<italic>w</italic>
are arbitrary. There is only one dot defined as
<italic>D
<sub>c</sub>
</italic>
in the image. The central dot
<italic>D
<sub>c</sub>
</italic>
is assigned the number (12, 12).</p>
</list-item>
<list-item>
<label>Step 5:</label>
<p>The dot with the number
<italic>i</italic>
,
<italic>j</italic>
, that is, the
<italic>i</italic>
-th dot from the left and the
<italic>j</italic>
-th dot from the top, is defined. Starting from
<italic>D
<sub>c</sub>
</italic>
(12,12), each dot
<italic>i</italic>
,
<italic>j</italic>
is searched for by using the predicted position
<bold>
<italic>P</italic>
</bold>
<italic>'
<sub>i</sub>
</italic>
<sub>,</sub>
<italic>
<sub>j</sub>
</italic>
, which is determined based on the relationship in
<xref rid="FD17" ref-type="disp-formula">Equation (17)</xref>
as follows:
<disp-formula id="FD19">
<mml:math id="mm19">
<mml:mrow>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:msub>
<mml:mo>'</mml:mo>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>=</mml:mo>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mtable columnalign="left">
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
<mml:mo>±</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
<mml:mo>±</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
<mml:mo>±</mml:mo>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>±</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>±</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>±</mml:mo>
<mml:mn>2</mml:mn>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mtd>
</mml:mtr>
<mml:mtr>
<mml:mtd>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>±</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
<mml:mo>±</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>±</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo></mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>±</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
<mml:mo>±</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>+</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
<mml:mo>±</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
<mml:mo>−</mml:mo>
<mml:msub>
<mml:mi mathvariant="bold-italic">P</mml:mi>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mo>±</mml:mo>
<mml:mn>1</mml:mn>
<mml:mo>,</mml:mo>
<mml:mi>j</mml:mi>
<mml:mo>±</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mtd>
</mml:mtr>
</mml:mtable>
</mml:mrow>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
</list-item>
</list>
</p>
<p>The equations are used to predict
<bold>
<italic>P</italic>
</bold>
<italic>
<sub>i</sub>
</italic>
<sub>,</sub>
<italic>
<sub>j</sub>
</italic>
. In these steps, numbers (
<italic>i</italic>
= 1,2,…,23,
<italic>j</italic>
= 1,2,…,23) can be assigned to all dots in each image.</p>
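<p>The neighbour-based extrapolation and numbering steps above can be sketched in code. The following is a minimal sketch, assuming two-dimensional pixel coordinates; the function names predict_position and assign_number and the search radius d_max are introduced here for illustration and are not from the original implementation:</p>
<p>
```python
import numpy as np

def predict_position(p_near, p_far):
    # Linear extrapolation from two already-numbered neighbour dots
    # along a row or column: p_near + (p_near - p_far).
    return 2.0 * np.asarray(p_near, dtype=float) - np.asarray(p_far, dtype=float)

def assign_number(prediction, detected, d_max=12.0):
    # Return the index of the detected dot closest to the predicted
    # position, or None if no detected dot lies within d_max pixels.
    detected = np.asarray(detected, dtype=float)
    dists = np.linalg.norm(detected - prediction, axis=1)
    k = int(np.argmin(dists))
    if dists[k] > d_max:
        return None
    return k
```
</p>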
</sec>
</sec>
<sec sec-type="results">
<label>4.</label>
<title>Experimental Results</title>
<p>In this section, the proposed method is validated by experimental results. The proposed sensor was fixed on a movable stage and brought into contact with variously shaped objects: a rectangular object, a circular-shaped object and a ring-shaped object, as shown in
<xref rid="f7-sensors-14-05805" ref-type="fig">Figure 7</xref>
. When each object was moved in the normal direction, the contact region was calculated using the proposed method. The actual shapes of the contact regions were observed from the image of the inside of the touchpad when the objects were pressed sufficiently deep. We set the parameters
<italic>D</italic>
<sub>max</sub>
and
<bold>
<italic>δ</italic>
</bold>
<italic>
<sub>d</sub>
</italic>
to 12.0 pixels and 2.5 pixels, respectively, by trial and error.</p>
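<p>To illustrate how a displacement threshold such as the one above enters the discrimination, the following minimal sketch labels a dot as contacting when its displacement from the non-contact reference exceeds the threshold. This is a simplification of the formulated equations, and the function name is introduced here for illustration only:</p>
<p>
```python
import numpy as np

def discriminate_contacting(ref_pos, cur_pos, delta_d=2.5):
    # ref_pos, cur_pos: (N, 2) arrays of dot positions (pixels) in the
    # non-contact reference image and in the current image.
    disp = np.linalg.norm(np.asarray(cur_pos, dtype=float) -
                          np.asarray(ref_pos, dtype=float), axis=1)
    # A dot is labelled as contacting when its displacement exceeds
    # delta_d; the contact region is the set of such dots.
    return disp > delta_d
```
</p>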
<p>We evaluate the proposed method by considering the results of the discriminated dots in the experiment. In the following figures, true positive dots are correctly discriminated contacting dots. False positive dots are actually non-contacting dots that are incorrectly discriminated as contacting dots, and false negative dots are actually contacting dots that are incorrectly discriminated as non-contacting dots. Therefore, when the proposed method is applied successfully, there are few false positive/negative dots.</p>
<p>
<xref rid="f8-sensors-14-05805" ref-type="fig">Figure 8</xref>
shows the results of the estimated contact region when the touchpad contacted a rectangular object. Although a worse case in
<xref rid="f8-sensors-14-05805" ref-type="fig">Figure 8a</xref>
includes some false positive/negative dots, there are few false positive/negative dots in a better case in
<xref rid="f8-sensors-14-05805" ref-type="fig">Figure 8b</xref>
.
<xref rid="f8-sensors-14-05805" ref-type="fig">Figure 8c</xref>
shows the variation of the numbers of true positive, false positive and false negative dots. Here, the touchpad was moved such that the contact with the object became increasingly deeper as the sampling index increased. The contact became deep enough for estimating the contact region after the sampling index reached approximately 10. In the experiment with the rectangular object, hardly any false positive dots occurred, whereas there were some false negative dots. It seems that the three-dimensional dot positions based on the touchpad shape contain positional errors, because the estimation accuracy of the touchpad shape by the previous work [
<xref rid="b26-sensors-14-05805" ref-type="bibr">26</xref>
] can degrade around the sharp edges of the rectangular object; the touchpad shape is used for calculating the dot positions. However, we can see that the set of contacting dots in
<xref rid="f8-sensors-14-05805" ref-type="fig">Figure 8b</xref>
approximately constructs a rectangular shape. We regard the set of contacting dots that constructs the contact region as the shape of that region.</p>
<p>Next,
<xref rid="f9-sensors-14-05805" ref-type="fig">Figure 9</xref>
illustrates the results of the estimated contact region when the touchpad contacted a cylinder-shaped object.
<xref rid="f9-sensors-14-05805" ref-type="fig">Figure 9a</xref>
,
<xref rid="f9-sensors-14-05805" ref-type="fig">b</xref>
shows a worse case and a better case, respectively.
<xref rid="f9-sensors-14-05805" ref-type="fig">Figure 9c</xref>
shows the variation of the numbers of true positive, false positive and false negative dots. The contact with the object became increasingly deeper as the sampling index increased, and the contact became deep enough for estimating the contact region after the sampling index reached approximately 10. We can see that the set of contacting dots in the better case constructs a circular shape, whereas there are some worse cases, as shown in
<xref rid="f9-sensors-14-05805" ref-type="fig">Figure 9c</xref>
. It seems that the estimation error of the dot positions also causes false positive/negative dots.</p>
<p>Finally,
<xref rid="f10-sensors-14-05805" ref-type="fig">Figure 10</xref>
illustrates the results of the estimated contact region when the touchpad contacted a ring-shaped object.
<xref rid="f10-sensors-14-05805" ref-type="fig">Figure 10a</xref>
,
<xref rid="f10-sensors-14-05805" ref-type="fig">b</xref>
shows a worse case and a better case, respectively.
<xref rid="f10-sensors-14-05805" ref-type="fig">Figure 10c</xref>
shows the variation of the numbers of true positive, false positive and false negative dots. The contact with the object became increasingly deeper as the sampling index increased, and the contact became deep enough for estimating the contact region after the sampling index reached approximately 15. In the case of the ring-shaped object, false positive dots occurred on the inside of the ring. This is because the membrane on the inside of the ring can move along with the ring-shaped object due to the stiffness of the membrane. However, many dots were appropriately discriminated except on the inside of the ring. If a softer/thinner membrane is used for the surface of the touchpad, the false positive dots on the inside of the ring will be reduced.</p>
<p>In these results of
<xref rid="f8-sensors-14-05805" ref-type="fig">Figures 8</xref>
,
<xref rid="f9-sensors-14-05805" ref-type="fig">9</xref>
and
<xref rid="f10-sensors-14-05805" ref-type="fig">10</xref>
, we have observed some false positive/negative dots. These errors are due to the estimation error of the three-dimensional dot positions, which is caused by the estimation error of the touchpad shape [
<xref rid="b26-sensors-14-05805" ref-type="bibr">26</xref>
] and the estimation error of the two-dimensional dot positions in images. However, it seems that these errors can be decreased by using a camera with higher resolution and a higher signal-to-noise ratio and by painting the dot pattern with higher accuracy; the dot pattern is currently painted by hand. Moreover, the sets of contacting dots can construct the shape of the actual contact region in some cases. Therefore, we consider that the proposed method can be applied to estimating the contact region, and that its estimation accuracy will increase with improvements to the hardware, such as the camera and the dot pattern.</p>
</sec>
<sec sec-type="conclusions|discussion">
<label>5.</label>
<title>Conclusions and Discussion</title>
<p>We have proposed a new method to estimate the contact region between a touchpad and a contacted object without strict assumptions. The proposed method is based on the movements of dots printed on the surface of the touchpad. The contact state of the dots has been classified into three types: the non-contacting dot, the sticking dot and the slipping dot. In consideration of the movements of the dots, equations have been formulated to discriminate between the contacting dots and the non-contacting dots. The equations have been modified to decrease the effects of noise and of the error in estimating the positions of the dots. A set of the contacting dots discriminated by the formulated equations can construct the contact region. Next, a six-step image processing procedure has also been proposed to detect the dots in captured images. In addition, a method has been developed to assign numbers to the dots for calculating their displacements. Finally, the proposed methods have been validated by experimental results.</p>
<p>Although some false positive/negative dots remained in the experimental results, more accurate discrimination of the dots can be expected by enhancing the calculation accuracy of the three-dimensional dot positions. It seems that better estimation of the positions can be achieved by using a camera with higher resolution and a higher signal-to-noise ratio and by painting the dot pattern with higher accuracy; the dot pattern is currently painted by hand.</p>
<p>The method can be generalized because it can be applied to other sensors that include dots/markers on the sensor surface. The size of each dot of the sensor developed in this research is relatively large because the dot pattern is painted by hand. However, unlike approaches that use many sensing elements, the size of the dots can easily be made smaller by printing techniques, and thus high resolution can be expected. The proposed sensor can be fabricated easily at minimal cost, because it has a simple structure and does not require complex sensing elements or wiring. Although the size of the sensor developed and used in this research is relatively large, it can easily be downsized by using a smaller CCD or complementary metal-oxide-semiconductor (CMOS) camera with high resolution.</p>
<p>Through the work presented in this paper, our vision-based sensor has been developed to a greater level of practicality. Combined with our previous research [
<xref rid="b23-sensors-14-05805" ref-type="bibr">23</xref>
<xref rid="b27-sensors-14-05805" ref-type="bibr">27</xref>
], we believe that the vision-based sensor can simultaneously obtain multiple types of tactile information, including the contact region, multi-axis contact force, slippage, shape, position and orientation of an object in contact with the touchpad. Future work will include the implementation of fluid-type tactile sensors in industrial and medical applications, and in various other practical applications, such as robot hands for dexterous handling.</p>
</sec>
</sec>
</body>
<back>
<ack>
<p>The authors would like to thank K. Yamamoto for his help in constructing the experimental setup.</p>
</ack>
<notes>
<title>Author Contributions</title>
<p>Yuji Ito conceived and designed the proposed method in this study. Yuji Ito and Goro Obinata conceived and designed the concept of the proposed sensor. Yuji Ito did the experiments and wrote the paper. All authors discussed the method, the experiment and the results to improve this study.</p>
</notes>
<notes>
<title>Conflicts of Interest</title>
<p>The authors declare no conflict of interest.</p>
</notes>
<ref-list>
<title>References</title>
<ref id="b1-sensors-14-05805">
<label>1.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lee</surname>
<given-names>M.H.</given-names>
</name>
<name>
<surname>Nicholls</surname>
<given-names>H.R.</given-names>
</name>
</person-group>
<article-title>Tactile sensing for mechatronics—A state of the art survey</article-title>
<source>Mechatronics</source>
<year>1999</year>
<volume>9</volume>
<fpage>1</fpage>
<lpage>31</lpage>
</element-citation>
</ref>
<ref id="b2-sensors-14-05805">
<label>2.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Dahiya</surname>
<given-names>R.S.</given-names>
</name>
<name>
<surname>Metta</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Valle</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Sandini</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>Tactile Sensing—From Humans to Humanoids</article-title>
<source>IEEE Trans. Robot.</source>
<year>2010</year>
<volume>26</volume>
<fpage>1</fpage>
<lpage>20</lpage>
</element-citation>
</ref>
<ref id="b3-sensors-14-05805">
<label>3.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Yamada</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Maeno</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Yamada</surname>
<given-names>Y.</given-names>
</name>
</person-group>
<article-title>Artificial Finger Skin having Ridges and Distributed Tactile Sensors used for Grasp Force Control</article-title>
<source>J. Robot. Mechatron.</source>
<year>2002</year>
<volume>14</volume>
<fpage>140</fpage>
<lpage>146</lpage>
</element-citation>
</ref>
<ref id="b4-sensors-14-05805">
<label>4.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Noda</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Hoshino</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Matsumoto</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Shimoyama</surname>
<given-names>I.</given-names>
</name>
</person-group>
<article-title>A shear stress sensor for tactile sensing with the piezoresistive cantilever standing in elastic material</article-title>
<source>Sens. Actuators A</source>
<year>2006</year>
<volume>127</volume>
<fpage>295</fpage>
<lpage>301</lpage>
</element-citation>
</ref>
<ref id="b5-sensors-14-05805">
<label>5.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Schmitz</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Maggiali</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Natale</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Bonino</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Metta</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>A Tactile Sensor for the Fingertips of the Humanoid Robot iCub</article-title>
<conf-name>Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems</conf-name>
<conf-loc>Taipei, Taiwan</conf-loc>
<conf-date>18–22 October 2010</conf-date>
<fpage>2212</fpage>
<lpage>2217</lpage>
</element-citation>
</ref>
<ref id="b6-sensors-14-05805">
<label>6.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Hakozaki</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Shinoda</surname>
<given-names>H.</given-names>
</name>
</person-group>
<article-title>Digital tactile sensing elements communicating through conductive skin layers</article-title>
<conf-name>Proceedings of the IEEE/RSJ International Conference on Intelligent Robotics and Automation</conf-name>
<conf-loc>Washington, DC, USA</conf-loc>
<conf-date>11–15 May 2002</conf-date>
<fpage>3813</fpage>
<lpage>3817</lpage>
</element-citation>
</ref>
<ref id="b7-sensors-14-05805">
<label>7.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Yamada</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Goto</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Nakajima</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Koshida</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Shinoda</surname>
<given-names>H.</given-names>
</name>
</person-group>
<article-title>Wire-Free Tactile Sensing Element Based on Optical Connection</article-title>
<conf-name>Proceedings of the 19th Sensor Symposium</conf-name>
<conf-loc>Tokyo, Japan</conf-loc>
<conf-date>5–7 August 2002</conf-date>
<fpage>433</fpage>
<lpage>436</lpage>
</element-citation>
</ref>
<ref id="b8-sensors-14-05805">
<label>8.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Yang</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>X.</given-names>
</name>
<name>
<surname>Motojima</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>Tactile sensing properties of protein-like single-helix carbon microcoils</article-title>
<source>Carbon</source>
<year>2006</year>
<volume>44</volume>
<fpage>3352</fpage>
<lpage>3355</lpage>
</element-citation>
</ref>
<ref id="b9-sensors-14-05805">
<label>9.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Takao</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Sawada</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Ishida</surname>
<given-names>M.</given-names>
</name>
</person-group>
<article-title>Monolithic Silicon Smart Tactile Image Sensor With Integrated Strain Sensor Array on Pneumatically Swollen Single-Diaphragm Structure</article-title>
<source>IEEE Trans. Electron. Devices</source>
<year>2006</year>
<volume>53</volume>
<fpage>1250</fpage>
<lpage>1259</lpage>
</element-citation>
</ref>
<ref id="b10-sensors-14-05805">
<label>10.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Engel</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>C.</given-names>
</name>
</person-group>
<article-title>Development of polyimide flexible tactile sensor skin</article-title>
<source>J. Micromech. Microeng.</source>
<year>2003</year>
<volume>13</volume>
<fpage>359</fpage>
<lpage>366</lpage>
</element-citation>
</ref>
<ref id="b11-sensors-14-05805">
<label>11.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Mei</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>W.J.</given-names>
</name>
<name>
<surname>Ge</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Ni</surname>
<given-names>L.</given-names>
</name>
<name>
<surname>Chan</surname>
<given-names>M.H.</given-names>
</name>
</person-group>
<article-title>An integrated MEMS three-dimensional tactile sensor with large force range</article-title>
<source>Sens. Actuators A</source>
<year>2000</year>
<volume>80</volume>
<fpage>155</fpage>
<lpage>162</lpage>
</element-citation>
</ref>
<ref id="b12-sensors-14-05805">
<label>12.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Engel</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>C.</given-names>
</name>
</person-group>
<article-title>Development of a multimodal, flexible tactile sensing skin using polymer micromachining</article-title>
<conf-name>Proceedings of the 12th International Conference on Transducers, Solid-State Sensors, Actuators and Microsystems</conf-name>
<conf-loc>Boston, MA, USA</conf-loc>
<conf-date>8–12 June 2003</conf-date>
<fpage>1027</fpage>
<lpage>1030</lpage>
</element-citation>
</ref>
<ref id="b13-sensors-14-05805">
<label>13.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ferrier</surname>
<given-names>N.J.</given-names>
</name>
<name>
<surname>Brockett</surname>
<given-names>R.W.</given-names>
</name>
</person-group>
<article-title>Reconstructing the Shape of a Deformable Membrane from Image Data</article-title>
<source>Int. J. Robot. Res.</source>
<year>2000</year>
<volume>19</volume>
<fpage>795</fpage>
<lpage>816</lpage>
</element-citation>
</ref>
<ref id="b14-sensors-14-05805">
<label>14.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Saga</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Kajimoto</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Tachi</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>High-resolution tactile sensor using the deformation of a reflection image</article-title>
<source>Sens. Rev.</source>
<year>2007</year>
<volume>27</volume>
<fpage>35</fpage>
<lpage>42</lpage>
</element-citation>
</ref>
<ref id="b15-sensors-14-05805">
<label>15.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Johnson</surname>
<given-names>M.K.</given-names>
</name>
<name>
<surname>Adelson</surname>
<given-names>E.H.</given-names>
</name>
</person-group>
<article-title>Retrographic sensing for the measurement of surface texture and shape</article-title>
<conf-name>Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition</conf-name>
<conf-loc>Miami, FL, USA</conf-loc>
<conf-date>20–25 June 2009</conf-date>
<fpage>1070</fpage>
<lpage>1077</lpage>
</element-citation>
</ref>
<ref id="b16-sensors-14-05805">
<label>16.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kamiyama</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Vlack</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Mizota</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Kajimoto</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Kawakami</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Tachi</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>Vision-Based Sensor for Real-Time Measuring of Surface Traction Fields</article-title>
<source>IEEE Comput. Graph. Appl.</source>
<year>2005</year>
<volume>25</volume>
<fpage>68</fpage>
<lpage>75</lpage>
</element-citation>
</ref>
<ref id="b17-sensors-14-05805">
<label>17.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Sato</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Kamiyama</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Nii</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Kawakami</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Tachi</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>Measurement of Force Vector Field of Robotic Finger using Vision-based Haptic Sensor</article-title>
<conf-name>Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems</conf-name>
<conf-loc>Nice, France</conf-loc>
<conf-date>22–26 September 2008</conf-date>
<fpage>488</fpage>
<lpage>493</lpage>
</element-citation>
</ref>
<ref id="b18-sensors-14-05805">
<label>18.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ohka</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Mitsuya</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Matsunaga</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Takeuchi</surname>
<given-names>S.</given-names>
</name>
</person-group>
<article-title>Sensing characteristics of an optical three-axis tactile sensor under combined loading</article-title>
<source>Robotica</source>
<year>2004</year>
<volume>22</volume>
<fpage>213</fpage>
<lpage>221</lpage>
</element-citation>
</ref>
<ref id="b19-sensors-14-05805">
<label>19.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ohka</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Takata</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Kobayashi</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Suzuki</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Morisawa</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Yussof</surname>
<given-names>H.B.</given-names>
</name>
</person-group>
<article-title>Object exploration and manipulation using a robotic finger equipped with an optical three-axis tactile sensor</article-title>
<source>Robotica</source>
<year>2009</year>
<volume>27</volume>
<fpage>763</fpage>
<lpage>770</lpage>
</element-citation>
</ref>
<ref id="b20-sensors-14-05805">
<label>20.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Yamada</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Iwanaga</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Fukunaga</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Fujimoto</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Ohta</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Morizono</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Umetani</surname>
<given-names>Y.</given-names>
</name>
</person-group>
<article-title>Soft Viscoelastic Robot Skin Capable of Accurately Sensing Contact Location of Objects</article-title>
<conf-name>Proceedings of the IEEE/RSJ/SICE International Conference on Multisensor Fusion and Integration for Intelligent Systems</conf-name>
<conf-loc>Taipei, Taiwan</conf-loc>
<conf-date>15–18 August 1999</conf-date>
<fpage>105</fpage>
<lpage>110</lpage>
</element-citation>
</ref>
<ref id="b21-sensors-14-05805">
<label>21.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Miyamoto</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Komatsu</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Iwase</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Matsumoto</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Shimoyama</surname>
<given-names>I.</given-names>
</name>
</person-group>
<article-title>The Estimation of Surface Shape Using Lined Strain Sensors (in Japanese)</article-title>
<conf-name>Proceedings of Robotics and Mechatronics Conference 2008 (ROBOMEC 2008)</conf-name>
<conf-loc>Nagano, Japan</conf-loc>
<conf-date>6–7 June 2008</conf-date>
<fpage>1P1</fpage>
<lpage>I05</lpage>
</element-citation>
</ref>
<ref id="b22-sensors-14-05805">
<label>22.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Maekawa</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Tanie</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Kaneko</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Suzuki</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Horiguchi</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Sugawara</surname>
<given-names>T.</given-names>
</name>
</person-group>
<article-title>Development of a Finger-Shaped Tactile Sensor Using Hemispherical Optical Waveguide</article-title>
<source>J. Soc. Instrum. Control Eng.</source>
<year>2001</year>
<volume>E-1</volume>
<fpage>205</fpage>
<lpage>213</lpage>
</element-citation>
</ref>
<ref id="b23-sensors-14-05805">
<label>23.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Obinata</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Ashis</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Watanabe</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Moriyama</surname>
<given-names>N.</given-names>
</name>
</person-group>
<article-title>Vision Based Tactile Sensor Using Transparent Elastic Fingertip for Dexterous Handling</article-title>
<source>Mobile Robots: Perception & Navigation</source>
<person-group person-group-type="editor">
<name>
<surname>Kolski</surname>
<given-names>S.</given-names>
</name>
</person-group>
<publisher-name>Pro Literatur Verlag</publisher-name>
<publisher-loc>Germany</publisher-loc>
<year>2007</year>
<fpage>137</fpage>
<lpage>148</lpage>
</element-citation>
</ref>
<ref id="b24-sensors-14-05805">
<label>24.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ito</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Kim</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Obinata</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>Robust Slippage Degree Estimation based on Reference Update of Vision-based Tactile Sensor</article-title>
<source>IEEE Sens. J.</source>
<year>2011</year>
<volume>11</volume>
<fpage>2037</fpage>
<lpage>2047</lpage>
</element-citation>
</ref>
<ref id="b25-sensors-14-05805">
<label>25.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ito</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Kim</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Nagai</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Obinata</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>Contact State Estimation by Vision-based Tactile Sensors for Dexterous Manipulation with Robot Hands Based on Shape-Sensing</article-title>
<source>Int. J. Adv. Robot. Syst.</source>
<year>2011</year>
<volume>8</volume>
<fpage>225</fpage>
<lpage>234</lpage>
</element-citation>
</ref>
<ref id="b26-sensors-14-05805">
<label>26.</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ito</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Kim</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Nagai</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Obinata</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>Vision-based Tactile Sensing and Shape Estimation Using a Fluid-type Touchpad</article-title>
<source>IEEE Trans. Autom. Sci. Eng.</source>
<year>2012</year>
<volume>9</volume>
<fpage>734</fpage>
<lpage>744</lpage>
</element-citation>
</ref>
<ref id="b27-sensors-14-05805">
<label>27.</label>
<element-citation publication-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Ito</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Kim</surname>
<given-names>Y.</given-names>
</name>
<name>
<surname>Obinata</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>Multi-axis Force Measurement based on Vision-based Fluid-type Hemispherical Tactile Sensor</article-title>
<conf-name>Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems</conf-name>
<conf-loc>Tokyo, Japan</conf-loc>
<conf-date>3–7 November 2011</conf-date>
<fpage>4729</fpage>
<lpage>4734</lpage>
</element-citation>
</ref>
<ref id="b28-sensors-14-05805">
<label>28.</label>
<element-citation publication-type="book">
<person-group person-group-type="author">
<name>
<surname>Watanabe</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Obinata</surname>
<given-names>G.</given-names>
</name>
</person-group>
<article-title>Grip Force Control Using Vision-Based Tactile Sensor for Dexterous Handling</article-title>
<source>European Robotics Symposium</source>
<publisher-name>Springer</publisher-name>
<publisher-loc>Berlin, Germany</publisher-loc>
<year>2008</year>
<fpage>113</fpage>
<lpage>122</lpage>
</element-citation>
</ref>
</ref-list>
</back>
<floats-group>
<fig id="f1-sensors-14-05805" position="float">
<label>Figure 1.</label>
<caption>
<p>Configuration of the vision-based tactile sensor and the deformable touchpad.</p>
</caption>
<graphic xlink:href="sensors-14-05805f1"></graphic>
</fig>
<fig id="f2-sensors-14-05805" position="float">
<label>Figure 2.</label>
<caption>
<p>The image captured by the CCD camera when (
<bold>a</bold>
) the touchpad is not in contact with an object; and (
<bold>b</bold>
) when the touchpad contacts an object moving in the tangential direction.</p>
</caption>
<graphic xlink:href="sensors-14-05805f2"></graphic>
</fig>
<fig id="f3-sensors-14-05805" position="float">
<label>Figure 3.</label>
<caption>
<p>The contact region between the touchpad and an object is constructed as a set of the contacting dots.</p>
</caption>
<graphic xlink:href="sensors-14-05805f3"></graphic>
</fig>
<fig id="f4-sensors-14-05805" position="float">
<label>Figure 4.</label>
<caption>
<p>Geometric relationship between the object and the sticking dot
<italic>k</italic>
.</p>
</caption>
<graphic xlink:href="sensors-14-05805f4"></graphic>
</fig>
<fig id="f5-sensors-14-05805" position="float">
<label>Figure 5.</label>
<caption>
<p>Image processing to detect the dots in captured images.</p>
</caption>
<graphic xlink:href="sensors-14-05805f5"></graphic>
</fig>
<fig id="f6-sensors-14-05805" position="float">
<label>Figure 6.</label>
<caption>
<p>The process of assigning numbers to all dots in the image.</p>
</caption>
<graphic xlink:href="sensors-14-05805f6"></graphic>
</fig>
<fig id="f7-sensors-14-05805" position="float">
<label>Figure 7.</label>
<caption>
<p>Side-view images when the touchpad was in contact with (
<bold>a</bold>
) a rectangular object; (
<bold>b</bold>
) a circular-shaped object; and (
<bold>c</bold>
) a ring-shaped object.</p>
</caption>
<graphic xlink:href="sensors-14-05805f7"></graphic>
</fig>
<fig id="f8-sensors-14-05805" position="float">
<label>Figure 8.</label>
<caption>
<p>Estimated contact region when the touchpad contacted a rectangular object: (
<bold>a</bold>
) a worse case; (
<bold>b</bold>
) a better case; (
<bold>c</bold>
) the variation of the estimate as the contact became deeper.</p>
</caption>
<graphic xlink:href="sensors-14-05805f8"></graphic>
</fig>
<fig id="f9-sensors-14-05805" position="float">
<label>Figure 9.</label>
<caption>
<p>Estimated contact region when the touchpad contacted a cylinder-shaped object: (
<bold>a</bold>
) a worse case; (
<bold>b</bold>
) a better case; (
<bold>c</bold>
) the variation of the estimate as the contact became deeper.</p>
</caption>
<graphic xlink:href="sensors-14-05805f9"></graphic>
</fig>
<fig id="f10-sensors-14-05805" position="float">
<label>Figure 10.</label>
<caption>
<p>Estimated contact region when the touchpad contacted a ring-shaped object: (
<bold>a</bold>
) a worse case; (
<bold>b</bold>
) a better case; (
<bold>c</bold>
) the variation of the estimate as the contact became deeper.</p>
</caption>
<graphic xlink:href="sensors-14-05805f10"></graphic>
</fig>
</floats-group>
</pmc>
<affiliations>
<list>
<country>
<li>Japon</li>
</country>
</list>
<tree>
<noCountry>
<name sortKey="Kim, Youngwoo" sort="Kim, Youngwoo" uniqKey="Kim Y" first="Youngwoo" last="Kim">Youngwoo Kim</name>
<name sortKey="Obinata, Goro" sort="Obinata, Goro" uniqKey="Obinata G" first="Goro" last="Obinata">Goro Obinata</name>
</noCountry>
<country name="Japon">
<noRegion>
<name sortKey="Ito, Yuji" sort="Ito, Yuji" uniqKey="Ito Y" first="Yuji" last="Ito">Yuji Ito</name>
</noRegion>
</country>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Pmc/Checkpoint
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000D37 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Checkpoint/biblio.hfd -nk 000D37 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Pmc
   |étape=   Checkpoint
   |type=    RBID
   |clé=     PMC:4029664
   |texte=   Contact Region Estimation Based on a Vision-Based Tactile Sensor Using a Deformable Touchpad
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Checkpoint/RBID.i   -Sk "pubmed:24670719" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Checkpoint/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024