Exploration server on haptic devices

Note: this site is under development.
Note: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Automated Tracking of Whiskers in Videos of Head Fixed Rodents

Internal identifier: 002159 (Pmc/Curation); previous: 002158; next: 002160

Authors: Nathan G. Clack ; Daniel H. O'Connor ; Daniel Huber ; Leopoldo Petreanu ; Andrew Hires ; Simon Peron ; Karel Svoboda ; Eugene W. Myers

Source:

RBID : PMC:3390361

Abstract

We have developed software for fully automated tracking of vibrissae (whiskers) in high-speed videos (>500 Hz) of head-fixed, behaving rodents trimmed to a single row of whiskers. Performance was assessed against a manually curated dataset consisting of 1.32 million video frames comprising 4.5 million whisker traces. The current implementation detects whiskers with a recall of 99.998% and identifies individual whiskers with 99.997% accuracy. The average processing rate for these images was 8 Mpx/s/cpu (2.6 GHz Intel Core2, 2 GB RAM). This translates to 35 processed frames per second for a 640 px×352 px video of 4 whiskers. The speed and accuracy achieved enables quantitative behavioral studies where the analysis of millions of video frames is required. We used the software to analyze the evolving whisking strategies as mice learned a whisker-based detection task over the course of 6 days (8148 trials, 25 million frames) and measure the forces at the sensory follicle that most underlie haptic perception.


Url:
DOI: 10.1371/journal.pcbi.1002591
PubMed: 22792058
PubMed Central: 3390361

Links to previous steps (curation, corpus...)


Links to Exploration step

PMC:3390361

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Automated Tracking of Whiskers in Videos of Head Fixed Rodents</title>
<author>
<name sortKey="Clack, Nathan G" sort="Clack, Nathan G" uniqKey="Clack N" first="Nathan G." last="Clack">Nathan G. Clack</name>
<affiliation>
<nlm:aff id="aff1"></nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="O Connor, Daniel H" sort="O Connor, Daniel H" uniqKey="O Connor D" first="Daniel H." last="O'Connor">Daniel H. O'Connor</name>
</author>
<author>
<name sortKey="Huber, Daniel" sort="Huber, Daniel" uniqKey="Huber D" first="Daniel" last="Huber">Daniel Huber</name>
</author>
<author>
<name sortKey="Petreanu, Leopoldo" sort="Petreanu, Leopoldo" uniqKey="Petreanu L" first="Leopoldo" last="Petreanu">Leopoldo Petreanu</name>
</author>
<author>
<name sortKey="Hires, Andrew" sort="Hires, Andrew" uniqKey="Hires A" first="Andrew" last="Hires">Andrew Hires</name>
</author>
<author>
<name sortKey="Peron, Simon" sort="Peron, Simon" uniqKey="Peron S" first="Simon" last="Peron">Simon Peron</name>
</author>
<author>
<name sortKey="Svoboda, Karel" sort="Svoboda, Karel" uniqKey="Svoboda K" first="Karel" last="Svoboda">Karel Svoboda</name>
</author>
<author>
<name sortKey="Myers, Eugene W" sort="Myers, Eugene W" uniqKey="Myers E" first="Eugene W." last="Myers">Eugene W. Myers</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PMC</idno>
<idno type="pmid">22792058</idno>
<idno type="pmc">3390361</idno>
<idno type="url">http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3390361</idno>
<idno type="RBID">PMC:3390361</idno>
<idno type="doi">10.1371/journal.pcbi.1002591</idno>
<date when="2012">2012</date>
<idno type="wicri:Area/Pmc/Corpus">002159</idno>
<idno type="wicri:Area/Pmc/Curation">002159</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a" type="main">Automated Tracking of Whiskers in Videos of Head Fixed Rodents</title>
<author>
<name sortKey="Clack, Nathan G" sort="Clack, Nathan G" uniqKey="Clack N" first="Nathan G." last="Clack">Nathan G. Clack</name>
<affiliation>
<nlm:aff id="aff1"></nlm:aff>
</affiliation>
</author>
<author>
<name sortKey="O Connor, Daniel H" sort="O Connor, Daniel H" uniqKey="O Connor D" first="Daniel H." last="O'Connor">Daniel H. O'Connor</name>
</author>
<author>
<name sortKey="Huber, Daniel" sort="Huber, Daniel" uniqKey="Huber D" first="Daniel" last="Huber">Daniel Huber</name>
</author>
<author>
<name sortKey="Petreanu, Leopoldo" sort="Petreanu, Leopoldo" uniqKey="Petreanu L" first="Leopoldo" last="Petreanu">Leopoldo Petreanu</name>
</author>
<author>
<name sortKey="Hires, Andrew" sort="Hires, Andrew" uniqKey="Hires A" first="Andrew" last="Hires">Andrew Hires</name>
</author>
<author>
<name sortKey="Peron, Simon" sort="Peron, Simon" uniqKey="Peron S" first="Simon" last="Peron">Simon Peron</name>
</author>
<author>
<name sortKey="Svoboda, Karel" sort="Svoboda, Karel" uniqKey="Svoboda K" first="Karel" last="Svoboda">Karel Svoboda</name>
</author>
<author>
<name sortKey="Myers, Eugene W" sort="Myers, Eugene W" uniqKey="Myers E" first="Eugene W." last="Myers">Eugene W. Myers</name>
</author>
</analytic>
<series>
<title level="j">PLoS Computational Biology</title>
<idno type="ISSN">1553-734X</idno>
<idno type="eISSN">1553-7358</idno>
<imprint>
<date when="2012">2012</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass></textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">
<p>We have developed software for fully automated tracking of vibrissae (whiskers) in high-speed videos (>500 Hz) of head-fixed, behaving rodents trimmed to a single row of whiskers. Performance was assessed against a manually curated dataset consisting of 1.32 million video frames comprising 4.5 million whisker traces. The current implementation detects whiskers with a recall of 99.998% and identifies individual whiskers with 99.997% accuracy. The average processing rate for these images was 8 Mpx/s/cpu (2.6 GHz Intel Core2, 2 GB RAM). This translates to 35 processed frames per second for a 640 px×352 px video of 4 whiskers. The speed and accuracy achieved enables quantitative behavioral studies where the analysis of millions of video frames is required. We used the software to analyze the evolving whisking strategies as mice learned a whisker-based detection task over the course of 6 days (8148 trials, 25 million frames) and measure the forces at the sensory follicle that most underlie haptic perception.</p>
</div>
</front>
<back>
<div1 type="bibliography">
<listBibl>
<biblStruct>
<analytic>
<author>
<name sortKey="O Connor, Dh" uniqKey="O Connor D">DH O'Connor</name>
</author>
<author>
<name sortKey="Clack, Ng" uniqKey="Clack N">NG Clack</name>
</author>
<author>
<name sortKey="Huber, D" uniqKey="Huber D">D Huber</name>
</author>
<author>
<name sortKey="Komiyama, T" uniqKey="Komiyama T">T Komiyama</name>
</author>
<author>
<name sortKey="Myers, Ew" uniqKey="Myers E">EW Myers</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Knutsen, Pm" uniqKey="Knutsen P">PM Knutsen</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Carvell, Ge" uniqKey="Carvell G">GE Carvell</name>
</author>
<author>
<name sortKey="Simons, Dj" uniqKey="Simons D">DJ Simons</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Heimendahl Von, M" uniqKey="Heimendahl Von M">M Heimendahl von</name>
</author>
<author>
<name sortKey="Itskov, Pm" uniqKey="Itskov P">PM Itskov</name>
</author>
<author>
<name sortKey="Arabzadeh, E" uniqKey="Arabzadeh E">E Arabzadeh</name>
</author>
<author>
<name sortKey="Diamond, Me" uniqKey="Diamond M">ME Diamond</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mitchinson, B" uniqKey="Mitchinson B">B Mitchinson</name>
</author>
<author>
<name sortKey="Martin, Cj" uniqKey="Martin C">CJ Martin</name>
</author>
<author>
<name sortKey="Grant, Ra" uniqKey="Grant R">RA Grant</name>
</author>
<author>
<name sortKey="Prescott, Tj" uniqKey="Prescott T">TJ Prescott</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Diamond, Me" uniqKey="Diamond M">ME Diamond</name>
</author>
<author>
<name sortKey="Heimendahl Von, M" uniqKey="Heimendahl Von M">M Heimendahl von</name>
</author>
<author>
<name sortKey="Knutsen, Pm" uniqKey="Knutsen P">PM Knutsen</name>
</author>
<author>
<name sortKey="Kleinfeld, D" uniqKey="Kleinfeld D">D Kleinfeld</name>
</author>
<author>
<name sortKey="Ahissar, E" uniqKey="Ahissar E">E Ahissar</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Hill, Dn" uniqKey="Hill D">DN Hill</name>
</author>
<author>
<name sortKey="Bermejo, R" uniqKey="Bermejo R">R Bermejo</name>
</author>
<author>
<name sortKey="Zeigler, Hp" uniqKey="Zeigler H">HP Zeigler</name>
</author>
<author>
<name sortKey="Kleinfeld, D" uniqKey="Kleinfeld D">D Kleinfeld</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Venkatraman, S" uniqKey="Venkatraman S">S Venkatraman</name>
</author>
<author>
<name sortKey="Elkabany, K" uniqKey="Elkabany K">K Elkabany</name>
</author>
<author>
<name sortKey="Long, Jd" uniqKey="Long J">JD Long</name>
</author>
<author>
<name sortKey="Yao, Y" uniqKey="Yao Y">Y Yao</name>
</author>
<author>
<name sortKey="Carmena, Jm" uniqKey="Carmena J">JM Carmena</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Jadhav, Sp" uniqKey="Jadhav S">SP Jadhav</name>
</author>
<author>
<name sortKey="Wolfe, J" uniqKey="Wolfe J">J Wolfe</name>
</author>
<author>
<name sortKey="Feldman, De" uniqKey="Feldman D">DE Feldman</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Harvey, Ma" uniqKey="Harvey M">MA Harvey</name>
</author>
<author>
<name sortKey="Bermejo, R" uniqKey="Bermejo R">R Bermejo</name>
</author>
<author>
<name sortKey="Zeigler, Hp" uniqKey="Zeigler H">HP Zeigler</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Voigts, J" uniqKey="Voigts J">J Voigts</name>
</author>
<author>
<name sortKey="Sakmann, B" uniqKey="Sakmann B">B Sakmann</name>
</author>
<author>
<name sortKey="Celikel, T" uniqKey="Celikel T">T Celikel</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Ritt, Jt" uniqKey="Ritt J">JT Ritt</name>
</author>
<author>
<name sortKey="Andermann, Ml" uniqKey="Andermann M">ML Andermann</name>
</author>
<author>
<name sortKey="Moore, Ci" uniqKey="Moore C">CI Moore</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Knutsen, Pm" uniqKey="Knutsen P">PM Knutsen</name>
</author>
<author>
<name sortKey="Derdikman, D" uniqKey="Derdikman D">D Derdikman</name>
</author>
<author>
<name sortKey="Ahissar, E" uniqKey="Ahissar E">E Ahissar</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gyory, G" uniqKey="Gyory G">G Gyory</name>
</author>
<author>
<name sortKey="Rankov, V" uniqKey="Rankov V">V Rankov</name>
</author>
<author>
<name sortKey="Gordon, G" uniqKey="Gordon G">G Gordon</name>
</author>
<author>
<name sortKey="Perkon, I" uniqKey="Perkon I">I Perkon</name>
</author>
<author>
<name sortKey="Mitchinson, B" uniqKey="Mitchinson B">B Mitchinson</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Perkon, I" uniqKey="Perkon I">I Perkon</name>
</author>
<author>
<name sortKey="Kosir, A" uniqKey="Kosir A">A Kosir</name>
</author>
<author>
<name sortKey="Itskov, Pm" uniqKey="Itskov P">PM Itskov</name>
</author>
<author>
<name sortKey="Tasic, J" uniqKey="Tasic J">J Tasic</name>
</author>
<author>
<name sortKey="Diamond, Me" uniqKey="Diamond M">ME Diamond</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Birdwell, Ja" uniqKey="Birdwell J">JA Birdwell</name>
</author>
<author>
<name sortKey="Solomon, Jh" uniqKey="Solomon J">JH Solomon</name>
</author>
<author>
<name sortKey="Thajchayapong, M" uniqKey="Thajchayapong M">M Thajchayapong</name>
</author>
<author>
<name sortKey="Taylor, Ma" uniqKey="Taylor M">MA Taylor</name>
</author>
<author>
<name sortKey="Cheely, M" uniqKey="Cheely M">M Cheely</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Huber, D" uniqKey="Huber D">D Huber</name>
</author>
<author>
<name sortKey="Gutnisky, Da" uniqKey="Gutnisky D">DA Gutnisky</name>
</author>
<author>
<name sortKey="Peron, S" uniqKey="Peron S">S Peron</name>
</author>
<author>
<name sortKey="O Connor, Dh" uniqKey="O Connor D">DH O'Connor</name>
</author>
<author>
<name sortKey="Wiegert, Js" uniqKey="Wiegert J">JS Wiegert</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mehta, Sb" uniqKey="Mehta S">SB Mehta</name>
</author>
<author>
<name sortKey="Whitmer, D" uniqKey="Whitmer D">D Whitmer</name>
</author>
<author>
<name sortKey="Figueroa, R" uniqKey="Figueroa R">R Figueroa</name>
</author>
<author>
<name sortKey="Williams, Ba" uniqKey="Williams B">BA Williams</name>
</author>
<author>
<name sortKey="Kleinfeld, D" uniqKey="Kleinfeld D">D Kleinfeld</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Mikolajczyk, K" uniqKey="Mikolajczyk K">K Mikolajczyk</name>
</author>
<author>
<name sortKey="Schmid, C" uniqKey="Schmid C">C Schmid</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Torre, V" uniqKey="Torre V">V Torre</name>
</author>
<author>
<name sortKey="Poggio, Ta" uniqKey="Poggio T">TA Poggio</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Rabiner, Lr" uniqKey="Rabiner L">LR Rabiner</name>
</author>
</analytic>
</biblStruct>
<biblStruct>
<analytic>
<author>
<name sortKey="Gordon, G" uniqKey="Gordon G">G Gordon</name>
</author>
<author>
<name sortKey="Mitcheson, B" uniqKey="Mitcheson B">B Mitcheson</name>
</author>
<author>
<name sortKey="Grant, Ra" uniqKey="Grant R">RA Grant</name>
</author>
<author>
<name sortKey="Diamond, M" uniqKey="Diamond M">M Diamond</name>
</author>
<author>
<name sortKey="Prescot, T" uniqKey="Prescot T">T Prescot</name>
</author>
</analytic>
</biblStruct>
</listBibl>
</div1>
</back>
</TEI>
<pmc article-type="research-article">
<pmc-dir>properties open_access</pmc-dir>
<front>
<journal-meta>
<journal-id journal-id-type="nlm-ta">PLoS Comput Biol</journal-id>
<journal-id journal-id-type="iso-abbrev">PLoS Comput. Biol</journal-id>
<journal-id journal-id-type="publisher-id">plos</journal-id>
<journal-id journal-id-type="pmc">ploscomp</journal-id>
<journal-title-group>
<journal-title>PLoS Computational Biology</journal-title>
</journal-title-group>
<issn pub-type="ppub">1553-734X</issn>
<issn pub-type="epub">1553-7358</issn>
<publisher>
<publisher-name>Public Library of Science</publisher-name>
<publisher-loc>San Francisco, USA</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="pmid">22792058</article-id>
<article-id pub-id-type="pmc">3390361</article-id>
<article-id pub-id-type="publisher-id">PCOMPBIOL-D-12-00134</article-id>
<article-id pub-id-type="doi">10.1371/journal.pcbi.1002591</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Research Article</subject>
</subj-group>
<subj-group subj-group-type="Discipline-v2">
<subject>Biology</subject>
<subj-group>
<subject>Neuroscience</subject>
<subj-group>
<subject>Sensory Perception</subject>
<subj-group>
<subject>Psychophysics</subject>
</subj-group>
</subj-group>
<subj-group>
<subject>Behavioral Neuroscience</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group subj-group-type="Discipline-v2">
<subject>Computer Science</subject>
<subj-group>
<subject>Algorithms</subject>
</subj-group>
</subj-group>
<subj-group subj-group-type="Discipline-v2">
<subject>Engineering</subject>
<subj-group>
<subject>Signal Processing</subject>
<subj-group>
<subject>Image Processing</subject>
<subject>Video Processing</subject>
</subj-group>
</subj-group>
</subj-group>
<subj-group subj-group-type="Discipline-v2">
<subject>Mathematics</subject>
<subj-group>
<subject>Probability Theory</subject>
<subj-group>
<subject>Markov Model</subject>
</subj-group>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Automated Tracking of Whiskers in Videos of Head Fixed Rodents</article-title>
<alt-title alt-title-type="running-head">Whisker Tracking</alt-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Clack</surname>
<given-names>Nathan G.</given-names>
</name>
<xref ref-type="aff" rid="aff1"></xref>
<xref ref-type="corresp" rid="cor1">
<sup>*</sup>
</xref>
</contrib>
<contrib contrib-type="author">
<name>
<surname>O'Connor</surname>
<given-names>Daniel H.</given-names>
</name>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Huber</surname>
<given-names>Daniel</given-names>
</name>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Petreanu</surname>
<given-names>Leopoldo</given-names>
</name>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Hires</surname>
<given-names>Andrew</given-names>
</name>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Peron</surname>
<given-names>Simon</given-names>
</name>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Svoboda</surname>
<given-names>Karel</given-names>
</name>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Myers</surname>
<given-names>Eugene W.</given-names>
</name>
</contrib>
</contrib-group>
<aff id="aff1">
<addr-line>Janelia Farm Research Campus, Howard Hughes Medical Institute, Ashburn, Virginia, United States of America</addr-line>
</aff>
<contrib-group>
<contrib contrib-type="editor">
<name>
<surname>Prlic</surname>
<given-names>Andreas</given-names>
</name>
<role>Editor</role>
<xref ref-type="aff" rid="edit1"></xref>
</contrib>
</contrib-group>
<aff id="edit1">University of California, San Diego, United States of America</aff>
<author-notes>
<corresp id="cor1">* E-mail:
<email>clackn@janelia.hhmi.org</email>
</corresp>
<fn fn-type="con">
<p>Conceived and designed the experiments: NGC DHO KS EWM. Performed the experiments: NGC DHO DH LP. Analyzed the data: NGC DHO DH LP AH SP KS. Contributed reagents/materials/analysis tools: NGC DHO DH LP. Wrote the paper: NGC KS EWM.</p>
</fn>
</author-notes>
<pub-date pub-type="collection">
<month>7</month>
<year>2012</year>
</pub-date>
<pmc-comment> Fake ppub added to accomodate plos workflow change from 03/2008 and 03/2009 </pmc-comment>
<pub-date pub-type="ppub">
<month>7</month>
<year>2012</year>
</pub-date>
<pub-date pub-type="epub">
<day>5</day>
<month>7</month>
<year>2012</year>
</pub-date>
<volume>8</volume>
<issue>7</issue>
<elocation-id>e1002591</elocation-id>
<history>
<date date-type="received">
<day>23</day>
<month>1</month>
<year>2012</year>
</date>
<date date-type="accepted">
<day>12</day>
<month>5</month>
<year>2012</year>
</date>
</history>
<permissions>
<copyright-statement>Clack et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.</copyright-statement>
<copyright-year>2012</copyright-year>
</permissions>
<abstract>
<p>We have developed software for fully automated tracking of vibrissae (whiskers) in high-speed videos (>500 Hz) of head-fixed, behaving rodents trimmed to a single row of whiskers. Performance was assessed against a manually curated dataset consisting of 1.32 million video frames comprising 4.5 million whisker traces. The current implementation detects whiskers with a recall of 99.998% and identifies individual whiskers with 99.997% accuracy. The average processing rate for these images was 8 Mpx/s/cpu (2.6 GHz Intel Core2, 2 GB RAM). This translates to 35 processed frames per second for a 640 px×352 px video of 4 whiskers. The speed and accuracy achieved enables quantitative behavioral studies where the analysis of millions of video frames is required. We used the software to analyze the evolving whisking strategies as mice learned a whisker-based detection task over the course of 6 days (8148 trials, 25 million frames) and measure the forces at the sensory follicle that most underlie haptic perception.</p>
</abstract>
<counts>
<page-count count="8"></page-count>
</counts>
</article-meta>
</front>
<body>
<disp-quote>
<p>“This is a
<italic>PLoS Computational Biology</italic>
Software article.”</p>
</disp-quote>
<sec id="s2">
<title>Introduction</title>
<p>Rats and mice move their large whiskers (vibrissae), typically in a rhythmic pattern, to locate object features
<xref ref-type="bibr" rid="pcbi.1002591-OConnor1">[1]</xref>
,
<xref ref-type="bibr" rid="pcbi.1002591-Knutsen1">[2]</xref>
, or to identify textures and objects
<xref ref-type="bibr" rid="pcbi.1002591-Carvell1">[3]</xref>
,
<xref ref-type="bibr" rid="pcbi.1002591-Heimendahlvon1">[4]</xref>
. Whisker movements in turn are influenced by touch
<xref ref-type="bibr" rid="pcbi.1002591-OConnor1">[1]</xref>
,
<xref ref-type="bibr" rid="pcbi.1002591-Mitchinson1">[5]</xref>
. The whisker system is a powerful model for studying the principles underlying sensorimotor integration and active somatosensation
<xref ref-type="bibr" rid="pcbi.1002591-Diamond1">[6]</xref>
. Critical to any mechanistic study at the level of neurons is the quantitative analysis of behavior. Whisker movements have been measured in a variety of ways. Electromyograms, recorded from the facial muscles, correlate with whisker movements
<xref ref-type="bibr" rid="pcbi.1002591-Hill1">[7]</xref>
. This invasive method does not report whisker position and shape per se and is complementary to whisker tracking. Imaging individual labeled whiskers, for example by gluing a high-contrast particle on the whisker
<xref ref-type="bibr" rid="pcbi.1002591-Venkatraman1">[8]</xref>
, provides the position of the marked whisker, but does not reveal whisker shape. In addition, the particle will change the whisker mass and stiffness and thereby perturb whisker dynamics. Monitoring a single point along the whisker using linescan imaging
<xref ref-type="bibr" rid="pcbi.1002591-Jadhav1">[9]</xref>
, or a linear light sheet
<xref ref-type="bibr" rid="pcbi.1002591-Harvey1">[10]</xref>
, can provide high-speed information about whisker position, but only at a single line of intersection.</p>
<p>High-speed (>500 Hz) videography is a non-invasive method for measuring whisker movements and forces acting on whiskers and yields nearly complete information about whiskers during behavior
<xref ref-type="bibr" rid="pcbi.1002591-OConnor1">[1]</xref>
,
<xref ref-type="bibr" rid="pcbi.1002591-Voigts1">[11]</xref>
<xref ref-type="bibr" rid="pcbi.1002591-Perkon1">[15]</xref>
. The position of the whisker base with respect to the face reveals the motor programs underlying behavior. The deformation of whisker shape by touch can be used to extract the forces felt by the mouse sensory follicles
<xref ref-type="bibr" rid="pcbi.1002591-Birdwell1">[16]</xref>
,
<xref ref-type="bibr" rid="pcbi.1002591-Huber1">[17]</xref>
. However, high-speed videography brings its own technical challenges. The large number of images required makes manual analysis impossible for more than a few seconds of video. Comprehensive studies require fully automated analysis. In addition, extraction of motions and touch forces demands accurate measurement of whisker shape, often with sub-pixel precision, and identification of rapidly moving whiskers across time. Finally, the large volume of video data potentially places severe demands even on advanced computational infrastructures, making efficient algorithms necessary.</p>
<p>Tracking whiskers is challenging. Whiskers can move at high speeds
<xref ref-type="bibr" rid="pcbi.1002591-Ritt1">[12]</xref>
and in complex patterns
<xref ref-type="bibr" rid="pcbi.1002591-OConnor1">[1]</xref>
,
<xref ref-type="bibr" rid="pcbi.1002591-Mehta1">[18]</xref>
. Adjacent whiskers can have distinct trajectories. Moreover, whiskers are thin hairs (e.g. mouse whiskers taper to a thickness of a few micrometers) and thus provide only limited contrast in imaging experiments.</p>
<p>To address these challenges, we have developed software for tracking a single row of whiskers in a fully automated fashion. Over a manually curated database of 400 video sequences of head-fixed mice (1.32×10
<sup>6</sup>
cumulative images, 4.5×10
<sup>6</sup>
traced whiskers, 8 mice), whiskers were correctly detected and identified with an accuracy of 99.997% (1 error per 3×10
<sup>4</sup>
 traced whiskers). In other whisker tracking systems, models constraining possible motions and whisker shapes have been used to aid tracking. In contrast, our approach uses statistics gleaned from the video itself to estimate the most likely assignment of identities to traced objects, subject to the constraint that whiskers maintain their expected order along the face. As a result, the shapes of highly strained whiskers (curvature >0.25/mm) can be traced with sub-pixel accuracy, and tracking is faithful despite occasional fast motion (deflections >10,000 degrees/s).</p>
<p>Our method consists of two steps performed in succession: tracing and linking. Tracing produces a set of piecewise linear curves that represent whisker-like objects for each individual image in a video. Each image is analyzed independently. The linking algorithm is then applied to determine the identity of each traced curve. The trajectory of a whisker is described by collecting curves with the same identity throughout the video. Importantly, once a faithful tracing is completed, the original voluminous image data set is no longer required for downstream processing.</p>
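In outline, the two steps compose as follows. This is a minimal Python sketch of the pipeline described above, not the released implementation (which is written in C); `trace` and `link` are hypothetical stand-ins for the tracing and linking procedures.

```python
def track(video_frames, trace, link):
    """Compose tracing and linking; `trace` and `link` are stand-ins."""
    # Step 1: tracing -- every frame is analyzed independently, so this
    # loop is trivially parallel across frames.
    traced = [trace(frame) for frame in video_frames]
    # Step 2: linking -- assign an identity to every traced curve,
    # using information pooled across the whole video.
    labels = link(traced)
    # A whisker's trajectory is the collection of same-identity curves
    # over time; the original images are no longer needed after this.
    trajectories = {}
    for t, (curves, frame_labels) in enumerate(zip(traced, labels)):
        for curve, ident in zip(curves, frame_labels):
            trajectories.setdefault(ident, []).append((t, curve))
    return trajectories
```

Note that because tracing is per-frame and linking consumes only the traced curves, the voluminous raw video can be discarded once tracing is verified, as the text observes.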
<p>Once tracing is complete, a small set of features is tabulated for each curve. Based on these features, a heuristic is used to make an initial guess as to the identity of curves in a set of images where it works well. These initial identifications are then used to get a statistical description of the shapes and motions of whiskers. These statistics are then used to compute a final optimal labeling for each traced curve subject to the constraint that whiskers are ordered along the face.</p>
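The order constraint lends itself to a small dynamic program. The sketch below illustrates the idea only, not the paper's exact model: curves and candidate whisker identities are both assumed sorted along the face, `scores[i][j]` stands for a log-likelihood that curve `i` is whisker `j` (derived, in the paper, from the learned shape and motion statistics), and a fixed `noise_score` covers curves that are facial hairs. The recurrence is a sequence alignment between the two ordered lists.

```python
def order_preserving_labels(scores, noise_score=-5.0):
    """Label curves with whisker identities so identities stay ordered.

    scores[i][j]: log-likelihood that curve i is whisker j (both lists
    ordered along the face). A curve may instead be labelled "other"
    (facial hair) at noise_score; an identity may be absent in a frame.
    """
    n = len(scores)
    m = len(scores[0]) if n else 0
    NEG = float("-inf")
    # dp[i][j]: best total score using the first i curves, first j identities.
    dp = [[NEG] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for j in range(1, m + 1):
        dp[0][j] = dp[0][j - 1]                 # identity j-1 not seen
    for i in range(1, n + 1):
        dp[i][0] = dp[i - 1][0] + noise_score   # all curves so far are hair
        for j in range(1, m + 1):
            dp[i][j] = max(
                dp[i - 1][j - 1] + scores[i - 1][j - 1],  # curve = identity
                dp[i - 1][j] + noise_score,               # curve is hair
                dp[i][j - 1],                             # identity absent
            )
    # Trace back to recover one optimal labelling.
    labels, i, j = [None] * n, n, m
    while i > 0:
        if j > 0 and dp[i][j] == dp[i - 1][j - 1] + scores[i - 1][j - 1]:
            labels[i - 1] = j - 1
            i, j = i - 1, j - 1
        elif dp[i][j] == dp[i - 1][j] + noise_score:
            labels[i - 1] = "other"
            i -= 1
        else:
            j -= 1
    return labels
```

In this toy form the program cannot, for example, assign whisker 1 to a curve posterior to the curve assigned whisker 0, which captures the ordering constraint described in the text.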
<p>We validated this approach using manually curated video acquired from a head-fixed, whisker-dependant object localization task
<xref ref-type="bibr" rid="pcbi.1002591-OConnor1">[1]</xref>
. In these experiments the field of view contains a single row of partially and wholly imaged whiskers as well as the stimulus object (a thin pole, normal to the imaging plane) (
<xref ref-type="fig" rid="pcbi-1002591-g001">Figure 1A</xref>
). The camera is positioned below the whiskers and illuminated from above (
<xref ref-type="fig" rid="pcbi-1002591-g001">Figure 1B</xref>
) producing silhouetted views of the whiskers (
<xref ref-type="fig" rid="pcbi-1002591-g001">Figure 1C–D</xref>
). Automated tracing is robust to occlusions introduced by the stimulus and to whisker crossings. Tracing yields curves for both whiskers and facial hairs (
<xref ref-type="fig" rid="pcbi-1002591-g001">Figure 1C–D</xref>
). Linking is used to uniquely identify each whisker so features such as angle of deflection (
<xref ref-type="fig" rid="pcbi-1002591-g001">Figure 1F</xref>
) and curvature (
<xref ref-type="fig" rid="pcbi-1002591-g001">Figure 1G</xref>
) can be extracted as time-series.</p>
<fig id="pcbi-1002591-g001" position="float">
<object-id pub-id-type="doi">10.1371/journal.pcbi.1002591.g001</object-id>
<label>Figure 1</label>
<caption>
<title>Tracking whiskers from high-speed (500 Hz) videos during an object detection task.</title>
<p>(A) A typical field of view. (B) Typical imaging configuration. (C–G) Automated results of tracing and linking. (C) Facial hairs and whiskers are traced in each video frame and then identified by a separate tracking step. (D) A whisker (blue) touches the pole. (E) Two whiskers (blue & green) are bent by the pole. The most posterior whisker is strongly retracted so that only a small segment is visible. (F) Tracking measures whisker orientation, such as the angle at base. (G) Tracking measures whisker shape, such as mean curvature, which can be observed over time. Changes in curvature allow the calculation of forces acting on the whisker follicle
<xref ref-type="bibr" rid="pcbi.1002591-Birdwell1">[16]</xref>
.</p>
</caption>
<graphic xlink:href="pcbi.1002591.g001"></graphic>
</fig>
</sec>
<sec id="s3">
<title>Design and Implementation</title>
<sec id="s3a">
<title>Tracing</title>
<p>The process of tracing the backbone of a whisker consists of three phases: initiation, extension, and termination. Tracing starts by using the angle and position estimated at the highest-scoring candidate initiation site. As a curve is extended, initiation sites that fall under the traced curve are removed from consideration. When none remain, potentially duplicate curves are resolved.</p>
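The seed-consumption loop just described can be sketched as below. This is a hypothetical illustration, not the released C implementation: `seeds` maps pixel coordinates to salience scores, and `trace_from` stands in for the extension procedure, returning the pixels its curve covers.

```python
def trace_all(seeds, trace_from, radius=1):
    """Trace curves from the best remaining seed until none are left."""
    curves = []
    remaining = dict(seeds)
    while remaining:
        # Initiate at the highest-scoring candidate site.
        best = max(remaining, key=remaining.get)
        curve = trace_from(best)
        curves.append(curve)
        # Remove every seed that falls under the traced curve
        # (within `radius` pixels of any traced point).
        covered = {
            (x + dx, y + dy)
            for (x, y) in curve
            for dx in range(-radius, radius + 1)
            for dy in range(-radius, radius + 1)
        }
        covered.add(best)  # ensure the seed itself is always consumed
        remaining = {s: v for s, v in remaining.items() if s not in covered}
    return curves
```

Because each traced whisker typically covers many of its own candidate sites, one pass over the seed list yields roughly one curve per whisker rather than one per seed.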
<p>Before any analysis, images were preprocessed to remove imaging artifacts specific to particular cameras (
<xref ref-type="supplementary-material" rid="pcbi.1002591.s003">Text S1</xref>
,
<xref ref-type="supplementary-material" rid="pcbi.1002591.s002">Figure S1</xref>
). Tracing begins by searching the image for candidate initiation sites. These are found by performing a pixel-level segmentation isolating locally line-like features in the image. For each segmented pixel, a score and an angle are computed. This is an optimization step; it is possible to initiate tracing at any pixel in an image. However, tracing is relatively expensive, and, by filtering out unproductive initiation sites, computation can be focused on the small number of relevant pixels. On the other hand, the filtering threshold should be set conservatively so that every whisker is sure to have at least one initiation site. Parameters were set by examining results on a small set of 10 images but worked well over the entire data set of over 1 million images. Typically, 50–100 candidate sites were found per whisker with 10–20 false positives per image.
<p>A variety of methods have been developed to detect interest points in natural images
<xref ref-type="bibr" rid="pcbi.1002591-Mikolajczyk1">[19]</xref>
, and the Hough transform is conventionally used to find linear features. However, given the high contrast and controlled imaging conditions used to acquire data (
<xref ref-type="supplementary-material" rid="pcbi.1002591.s003">Text S1</xref>
), we developed a linear time algorithm, summarized in
<xref ref-type="fig" rid="pcbi-1002591-g002">Figure 2</xref>
, that requires only local image information and yields an estimate of salience and line orientation in a single pass.</p>
<fig id="pcbi-1002591-g002" position="float">
<object-id pub-id-type="doi">10.1371/journal.pcbi.1002591.g002</object-id>
<label>Figure 2</label>
<caption>
<title>Detecting whiskers.</title>
<p>(A) Lines passing through a 7×7 pixel box are detected by finding the location of intensity minima (black dots) in regions from two partitions. The eccentricity of the best Gaussian fit (indicated by ellipses) to these points is used to score salience. (B) The score computed at each point in an image. (C) For high scoring seeds (eccentricity>0.95), the orientation of the line is indicated by the major axis of the ellipse. Whisker tracing is initiated at these sites using the measured angle.</p>
</caption>
<graphic xlink:href="pcbi.1002591.g002"></graphic>
</fig>
<p>First, a 7×7 square box is centered on a pixel of interest (
<xref ref-type="fig" rid="pcbi-1002591-g002">Figure 2A</xref>
). The box is divided into two partitions, and the position of the intensity minimum in each subset of the two partitions is recorded. The partitions are designed to ensure that when a whisker passes through the box, one of them will preferentially collect minima along the whisker backbone, so that the positions of those minima are linearly correlated. For each partition, principal components are computed from the covariance of the collected positions. The principal components describe the major and minor axes of an ellipse. The partition with the higher eccentricity is chosen, and that eccentricity is used to score how line-like the image is at the queried point (
<xref ref-type="fig" rid="pcbi-1002591-g002">Figure 2B</xref>
). The orientation of the major axis is used to estimate the direction of the line (
<xref ref-type="fig" rid="pcbi-1002591-g002">Figure 2C</xref>
,
<xref ref-type="supplementary-material" rid="pcbi.1002591.s004">Video S1</xref>
).</p>
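The seed-detection step can be sketched as follows. This is a simplified Python illustration, not the released implementation; in particular, the choice of column-wise and row-wise minima as the two partitions is an assumption made for brevity.

```python
import numpy as np

def seed_score(patch):
    """Score how line-like a 7x7 patch is (sketch of the salience
    measure; the two partitions here are taken to be the column-wise
    and row-wise minima, a simplifying assumption)."""
    assert patch.shape == (7, 7)
    # Positions (y, x) of intensity minima in each column / each row.
    cols = np.array([(np.argmin(patch[:, j]), j) for j in range(7)], float)
    rows = np.array([(i, np.argmin(patch[i, :])) for i in range(7)], float)
    best = None
    for pts in (cols, rows):
        # Principal axes of the collected minima positions.
        cov = np.cov(pts.T)
        evals, evecs = np.linalg.eigh(cov)          # ascending eigenvalues
        lam_min, lam_max = evals[0], evals[1]
        ecc = np.sqrt(1.0 - lam_min / lam_max) if lam_max > 0 else 0.0
        if best is None or ecc > best[0]:
            vy, vx = evecs[:, 1]                    # major axis = line direction
            best = (ecc, np.arctan2(vy, vx) % np.pi)
    return best  # (eccentricity, line angle in radians, modulo pi)

# A synthetic dark horizontal line through a bright patch:
patch = np.full((7, 7), 200.0)
patch[3, :] = 10.0
ecc, angle = seed_score(patch)
```

For a clean line the minima fall exactly on the backbone, so the eccentricity approaches 1 and the major axis recovers the line orientation.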
<p>One advantage of this approach is that the entire image need not be queried for line-like objects. For example, in the data analyzed here, whiskers span hundreds of pixels. A grid of lines spaced appropriately (50 px; this is an adjustable parameter) will cross each whisker at least once. Restricting the search for line-like features to this grid greatly reduces the amount of time required for this step. Additionally, since the algorithm relies on local minima, demands on background subtraction and other preprocessing are minimal. The structure of the partitions ensures that lines in any orientation are detected. Greater than 80% of the pixels that could be used to start tracing a whisker (typically 1000 per whisker over the full image) are detected using an eccentricity threshold of 0.95; only one is required to fully trace a whisker. This detector systematically avoids potential starting locations near the face, near occlusions, or where whiskers cross or nearly touch.</p>
<p>Curves are bidirectionally extended from candidate initiation sites by performing a small step (1 px) along the measured direction, and repeating until one of several criteria is met. The sub-pixel position and tangent angle are estimated by optimizing the correlation between an oriented line detector that is parameterized as a function of sub-pixel position, angle and width (
<xref ref-type="fig" rid="pcbi-1002591-g003">Figure 3</xref>
).</p>
<fig id="pcbi-1002591-g003" position="float">
<object-id pub-id-type="doi">10.1371/journal.pcbi.1002591.g003</object-id>
<label>Figure 3</label>
<caption>
<title>Parameterized line detector.</title>
<p>(A) Two parallel step edge detectors are separated by a distance,
<italic>w</italic>
, and oriented by an angle,
<italic>θ</italic>
about a center point. The center point (black dot) is determined by a sub-pixel displacement from a pixel anchor (red dot). (B) The line detector is computed for discrete values of
<italic>w</italic>
,
<italic>θ</italic>
, and
<italic>offset</italic>
.</p>
</caption>
<graphic xlink:href="pcbi.1002591.g003"></graphic>
</fig>
<p>The line detector is designed based on modeling the intensity profile of a whisker as a rectangular valley in the image with variable position, width and angle. The center of the whisker is estimated with sub-pixel precision by finding a position that minimizes the Laplacian of the correlation between the model and the image
<xref ref-type="bibr" rid="pcbi.1002591-Torre1">[20]</xref>
. The line detector approximates the Laplacian of the model. It consists of two rectangular parallel step-edge detectors (20×1 px) spaced by the detector width (
<xref ref-type="fig" rid="pcbi-1002591-g003">Figure 3A</xref>
). The length was chosen to match the distance over which highly curved whiskers remain approximately linear. The correlation at a given point is evaluated using a pixel representation of the detector that is computed by evaluating the area integral of the detector over the square domain of each pixel (
<xref ref-type="fig" rid="pcbi-1002591-g003">Figure 3B</xref>
). The value of the correlation at that position is then the dot product between pixels in the image and pixels in the evaluated detector. For efficiency, pixel representations of the detector are pre-tabulated for discrete values of the sub-pixel position (0.1 px), width (0.2 px), and angle (2.5 degree precision). Optimizing the width of the detector was important for reliable tracing of whiskers of different width and for reliable tracing of the very tips of whiskers.</p>
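A minimal sketch of such a parameterized detector follows. The analytic area integral over each pixel's square domain is approximated here by supersampling, and the valley/flank weights (−2 core, +1 flanks) are illustrative choices rather than the paper's exact kernel:

```python
import numpy as np

def detector_kernel(width, theta, offset, size=21, supersample=4):
    """Pixel representation of the two parallel step-edge detectors.
    The area integral over each pixel is approximated by averaging
    `supersample` x `supersample` point samples (an assumption; the
    paper evaluates the integral exactly)."""
    n = size * supersample
    c = (size - 1) / 2.0
    ys, xs = np.mgrid[0:n, 0:n] / supersample
    # Coordinates along (u) and across (v) the candidate line.
    u = (xs - c - offset) * np.cos(theta) + (ys - c) * np.sin(theta)
    v = -(xs - c - offset) * np.sin(theta) + (ys - c) * np.cos(theta)
    half = width / 2.0
    k = np.zeros_like(u)
    inside = np.abs(u) < size / 2.0                 # restrict detector length
    k[inside & (np.abs(v) < half)] = -2.0           # dark valley core
    k[inside & (np.abs(v) >= half) & (np.abs(v) < half + 1.0)] = 1.0  # flanks
    # Average supersamples back to pixel resolution (the area integral).
    k = k.reshape(size, supersample, size, supersample).mean(axis=(1, 3))
    return k - k.mean()  # zero mean: flat backgrounds score ~zero

def correlate(image, kernel, y, x):
    """Dot product between the evaluated detector and the image patch."""
    h = kernel.shape[0] // 2
    patch = image[y - h:y + h + 1, x - h:x + h + 1]
    return float((patch * kernel).sum())
```

In practice one would pre-tabulate `detector_kernel` over the discrete grid of sub-pixel offsets, widths, and angles, exactly as the text describes, and pick the parameter triple maximizing the correlation.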
<p>Tracing is stopped if any one of several criteria indicates that the optimization procedure cannot be trusted to yield accurate results. This is necessary to handle cases where whiskers were partially occluded, for example when whiskers crossed one another or contacted an obstructing object, such as the stimulus pole in an object localization experiment (
<xref ref-type="fig" rid="pcbi-1002591-g004">Figure 4</xref>
). The tests were for low correlation with the line detector, large left-right asymmetry in the image intensity about the detector, low mean intensity about the detector, and large angular change between steps. A corresponding user-adjustable threshold parameterizes each test. The tests are applied at each step as the curve is traced from the initiation point (
<xref ref-type="fig" rid="pcbi-1002591-g004">Figure 4A,B</xref>
). If one of these fails (
<xref ref-type="fig" rid="pcbi-1002591-g004">Figure 4C,D</xref>
), a number of single pixel steps are taken along the last trusted direction up to a user-settable maximum distance in an attempt to jump past the distrusted region (
<xref ref-type="fig" rid="pcbi-1002591-g004">Figure 4E,F</xref>
). Linear extrapolation is usually a good guess. If all tests are satisfied at one of the points, normal tracing is resumed (
<xref ref-type="fig" rid="pcbi-1002591-g004">Figure 4G,H</xref>
). Otherwise, the trace is terminated at the last trusted point (
<xref ref-type="fig" rid="pcbi-1002591-g004">Figure 4G</xref>
).</p>
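The stepping-and-jumping control flow can be sketched as below; `fit_step` and `tests_ok` are hypothetical callables standing in for the sub-pixel detector fit and the four trust tests:

```python
import numpy as np

def trace(start, angle, fit_step, tests_ok, max_jump=20, step=1.0):
    """Bidirectional tracing sketch. `fit_step(pt, ang)` returns the
    optimised sub-pixel point and tangent angle near `pt`; `tests_ok`
    bundles the trust tests (correlation, asymmetry, intensity, angle
    change). Both callables are stand-ins for the paper's detectors."""
    halves = []
    for direction in (1.0, -1.0):
        pt, ang = np.asarray(start, float), angle
        curve = [pt]
        while len(curve) < 10000:  # safety bound for the sketch
            unit = np.array([np.cos(ang), np.sin(ang)])
            nxt, nang = fit_step(pt + direction * step * unit, ang)
            if tests_ok(nxt, nang):
                pt, ang = np.asarray(nxt, float), nang
                curve.append(pt)
                continue
            # Distrusted: single-pixel steps along the last trusted
            # direction, up to max_jump, to leap crossings/occlusions.
            for k in range(1, max_jump + 1):
                probe, pang = fit_step(pt + direction * k * unit, ang)
                if tests_ok(probe, pang):
                    pt, ang = np.asarray(probe, float), pang
                    curve.append(pt)
                    break
            else:
                break  # terminate at the last trusted point
        halves.append(curve)
    # Join: reverse the backward half and append the forward half.
    return np.array(halves[1][::-1] + halves[0][1:])
```

With a trust test that fails inside a narrow band (mimicking an occlusion), the trace jumps the band and continues on the far side.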
<fig id="pcbi-1002591-g004" position="float">
<object-id pub-id-type="doi">10.1371/journal.pcbi.1002591.g004</object-id>
<label>Figure 4</label>
<caption>
<title>Tracing illustrated for cases with whisker crossing and pole contact.</title>
<p>(A,B) As a curve is extended from an initiation point, a local test region within the image is queried at each step to detect cases where the line detector may be unreliable. This happens near (C) crossing whiskers and (D) whisker-pole contacts. (E,F) When such a case is encountered, the curve is linearly extended from the last trusted point, up to a threshold distance. (G,H) If all tests are satisfied at one of the points, a line segment is used to jump the gap and normal tracing is resumed. Otherwise, the trace is terminated at the last trusted point.</p>
</caption>
<graphic xlink:href="pcbi.1002591.g004"></graphic>
</fig>
<p>Occasionally, overlapping curves are generated during the tracing step. These need to be resolved before tracking so that individual whiskers have exactly one associated curve per frame. Searching for curves that cross the same pixel identifies pairs of potentially redundant curves. If all the points within an interval of one of the curves lie within a small distance (typically 2 px) of the other curve, and that interval comprises over 50% of the curve's total length, then the curve is considered redundant. The shorter of the two curves is discarded; the shapes of shorter curves were not reliable enough to average. This procedure is repeated until there are no redundant curves left.</p>
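The redundancy test can be sketched as follows; for brevity this sketch approximates the interval criterion by the fraction of points lying near the longer curve, and uses a quadratic all-pairs distance rather than the pixel-hash lookup:

```python
import numpy as np

def remove_redundant(curves, dist=2.0, frac=0.5):
    """Discard shorter curves that substantially overlap a longer one
    (sketch: a curve is redundant when more than `frac` of its points
    lie within `dist` px of an already-kept, longer curve)."""
    curves = sorted(curves, key=len, reverse=True)  # prefer longer curves
    kept = []
    for c in curves:
        c = np.asarray(c, float)
        redundant = False
        for other in kept:
            # Distance from each point of c to its nearest point on `other`.
            d = np.sqrt(((c[:, None, :] - other[None, :, :]) ** 2).sum(-1)).min(1)
            if (d <= dist).mean() > frac:
                redundant = True
                break
        if not redundant:
            kept.append(c)
    return kept
```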
</sec>
<sec id="s3b">
<title>Linking</title>
<p>Identifying individual whiskers is challenging. First, the tracing algorithm generates curves (false positives) for facial hairs, trimmed whiskers, or other artifacts in the scene (
<xref ref-type="fig" rid="pcbi-1002591-g001">Figure 1C</xref>
). The linking algorithm must correctly identify true whiskers against this noise. Second, the motion of whiskers can be difficult to reliably model
<xref ref-type="bibr" rid="pcbi.1002591-Venkatraman1">[8]</xref>
,
<xref ref-type="bibr" rid="pcbi.1002591-Ritt1">[12]</xref>
. A typical trial may consist of slow, predictable motions interrupted by rapid transient motions that are occasionally too fast even for high speed (1000 fps) cameras to adequately sample
<xref ref-type="bibr" rid="pcbi.1002591-OConnor1">[1]</xref>
. Although difficult to track, these unpredictable motions may correspond to some of the most interesting data.</p>
<p>Our approach is based on the observation that whiskers are relatively long and maintain a consistent ordering along the face. We use a statistical model to describe the shapes and motions that are characteristic of whiskers. The model is then used to assign each traced whisker the most probable identity subject to the constraint that whiskers are ordered along the face.</p>
<p>For the majority of images, a simple length threshold can be used to remove false positives and leave exactly one traced curve per whisker. The linking algorithm is either supplied the number,
<italic>N</italic>
, of whiskers on the subject mouse, or estimates it as follows: For a given length threshold
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e001.jpg" mimetype="image"></inline-graphic>
</inline-formula>
, let
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e002.jpg" mimetype="image"></inline-graphic>
</inline-formula>
be the set of frames that have
<italic>n</italic>
curves longer than
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e003.jpg" mimetype="image"></inline-graphic>
</inline-formula>
. If
<italic>N</italic>
is not supplied, then we find
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e004.jpg" mimetype="image"></inline-graphic>
</inline-formula>
and
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e005.jpg" mimetype="image"></inline-graphic>
</inline-formula>
such that
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e006.jpg" mimetype="image"></inline-graphic>
</inline-formula>
is maximal over all possible choices. In most of the data sets we have tested,
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e007.jpg" mimetype="image"></inline-graphic>
</inline-formula>
, i.e. this heuristic recovers the correct whisker count. However, when one or more whiskers frequently leave the field of view it may fail. If
<italic>N</italic>
is supplied, then we let
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e008.jpg" mimetype="image"></inline-graphic>
</inline-formula>
be the value for which
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e009.jpg" mimetype="image"></inline-graphic>
</inline-formula>
is maximal because the set of long curves in each frame in
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e010.jpg" mimetype="image"></inline-graphic>
</inline-formula>
consists entirely of true whiskers. For the data sets tested here,
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e011.jpg" mimetype="image"></inline-graphic>
</inline-formula>
represented 99.8% of a video on average (min: 90.1%, max: 100%).</p>
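The threshold-selection heuristic can be sketched as follows. The frame representation (a list of curve lengths per frame) and the threshold grid are illustrative assumptions:

```python
from collections import Counter

def best_threshold(frames, N=None, thresholds=range(10, 400, 10)):
    """Pick the length threshold (and, if unknown, the whisker count)
    maximising the number of frames with exactly n above-threshold
    curves -- a sketch of the F(n, l) heuristic. `frames` is a list of
    lists of curve lengths, one inner list per video frame."""
    best = (-1, None, None)  # (|F|, threshold, n)
    for t in thresholds:
        counts = Counter(sum(1 for length in lengths if length > t)
                         for lengths in frames)
        if N is None:
            n, size = counts.most_common(1)[0]   # jointly maximise over n
        else:
            n, size = N, counts.get(N, 0)        # N supplied by the user
        if size > best[0]:
            best = (size, t, n)
    return best[1], best[2]  # threshold, estimated whisker count
```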
<p>The curves in the frames
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e012.jpg" mimetype="image"></inline-graphic>
</inline-formula>
and their heuristic classification into whisker and non-whisker based on
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e013.jpg" mimetype="image"></inline-graphic>
</inline-formula>
supplies a training data set for the specific video. From these, we can learn various characteristics to estimate the probability of a curve being one or the other. Specifically, we build normalized histograms,
<italic>n
<sub>f</sub>
</italic>
,
<italic>
<sub>W</sub>
</italic>
and
<italic>n
<sub>f</sub>
</italic>
,
<italic>
<sub>FP</sub>
</italic>
(
<xref ref-type="fig" rid="pcbi-1002591-g005">Figure 5C</xref>
), one for whiskers (
<italic>W</italic>
) and one for false positives (
<italic>FP</italic>
), for each of six features,
<italic>f</italic>
, using the training set. The features used are the angle near the face, average curvature, average tracing score, length, and endpoints of a curve. The angle and curvature are computed from a parametric third degree polynomial fit to each curve. These histograms are then used to estimate,
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e014.jpg" mimetype="image"></inline-graphic>
</inline-formula>
, the probability that a curve
<italic>c</italic>
is of kind
<italic>k</italic>
, either
<italic>W</italic>
or
<italic>FP</italic>
, as the product,
<disp-formula>
<graphic xlink:href="pcbi.1002591.e015"></graphic>
<label>(1)</label>
</disp-formula>
To ensure that no feature has zero likelihood, a count of one is added to each bin of a histogram before normalizing.</p>
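The per-feature histograms and the product in equation (1) can be sketched as below; the feature names, bin count, and ranges are illustrative choices, not the released implementation's:

```python
import numpy as np

class FeatureModel:
    """Naive-Bayes likelihood from per-feature histograms (eq. 1 sketch).
    One model is trained per class (whisker W or false positive FP)."""
    def __init__(self, samples, bins=16, ranges=None):
        # samples: dict feature -> list of training values for one class
        self.hists = {}
        for f, vals in samples.items():
            lo, hi = ranges[f] if ranges else (min(vals), max(vals))
            h, edges = np.histogram(vals, bins=bins, range=(lo, hi))
            h = h + 1                    # add-one smoothing, as in the text
            self.hists[f] = (h / h.sum(), edges)

    def likelihood(self, curve):
        # P(curve | class) = product over features f of n_f(curve[f])
        p = 1.0
        for f, (hist, edges) in self.hists.items():
            i = int(np.clip(np.searchsorted(edges, curve[f]) - 1,
                            0, len(hist) - 1))
            p *= hist[i]
        return p
```

Classification then amounts to comparing the whisker-model and false-positive-model likelihoods for each traced curve.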
<fig id="pcbi-1002591-g005" position="float">
<object-id pub-id-type="doi">10.1371/journal.pcbi.1002591.g005</object-id>
<label>Figure 5</label>
<caption>
<title>When linking
<italic>N</italic>
whiskers, each curve in a frame is assigned a label
<italic>W
<sub>1</sub>
</italic>
to
<italic>W
<sub>N</sub>
</italic>
, or
<italic>F
<sub>0</sub>
</italic>
to
<italic>F
<sub>N</sub>
</italic>
.</title>
<p>Rules constrain the labeling to enforce consistent anterior-posterior ordering of whiskers. The most proximal point of curves labeled W
<sub>i</sub>
or F
<sub>i</sub>
must be posterior to that of curves labeled W
<sub>j</sub>
or F
<sub>j</sub>
when
<italic>j</italic>&lt;<italic>i</italic>
, and at most one curve may be labeled
<italic>W
<sub>i</sub>
</italic>
for a given
<italic>i</italic>
. (A) A correct labeling is schematically illustrated. (B) These rules are encoded as transitions in a hidden Markov model. (C) Normalized feature histograms are used to compute the likelihood a curve is, or is not, a whisker.</p>
</caption>
<graphic xlink:href="pcbi.1002591.g005"></graphic>
</fig>
<p>In (1) above, the features are assumed to be conditionally independent in order to simplify estimating feature distributions, even though this is not strictly true in practice. Moreover, errors in the heuristic may introduce a systematic sampling bias. For example, at high deflection angles near the edge of the field of view, only a small segment of a whisker is imaged (
<xref ref-type="fig" rid="pcbi-1002591-g001">Figure 1E</xref>
, purple whisker). The traced curves from such segments might systematically fall below
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e016.jpg" mimetype="image"></inline-graphic>
</inline-formula>
and be excluded from whisker feature distributions. As a result, estimated probabilities will be biased towards longer whiskers at milder deflection angles. Despite these caveats, the use of these feature distributions leads to a highly accurate result.</p>
<p>Appearance alone is not sufficient to uniquely identify individual whiskers in some cases. To address this, we designed a naive Bayes' classifier to determine the most probable identity of each traced curve subject to ordering constraints. The traced curves are ordered into a sequence,
<italic>C</italic>
, according to their time and relative anterior-posterior position. Identifying a curve,
<italic>c</italic>
, involves assigning a particular label,
<italic>l</italic>
, from the following set of
<italic>2N+1</italic>
labels. There is a label for each whisker,
<italic>W
<sub>1</sub>
</italic>
to
<italic>W
<sub>N</sub>
</italic>
, and labels,
<italic>F
<sub>0</sub>
</italic>
to
<italic>F
<sub>N</sub>
</italic>
, where
<italic>F
<sub>i</sub>
</italic>
identifies all false positives between whisker
<italic>W
<sub>i</sub>
</italic>
and
<italic>W
<sub>i+1</sub>
</italic>
(
<xref ref-type="fig" rid="pcbi-1002591-g005">Figure 5A</xref>
). The kind of a label,
<italic>K(l)</italic>
, is
<italic>W</italic>
if
<italic>l</italic>
is a whisker label, and
<italic>FP</italic>
if
<italic>l</italic>
is a false positive label. A curve labeled
<italic>W
<sub>i</sub>
</italic>
or
<italic>F
<sub>i</sub>
</italic>
must be posterior to curves labeled
<italic>W
<sub>j</sub>
</italic>
or
<italic>F
<sub>j</sub>
</italic>
for all
<italic>j</italic>&lt;<italic>i</italic>
. For a given
<italic>i</italic>
, only one curve per frame may be labeled
<italic>W
<sub>i</sub>
</italic>
, but several curves may be labeled
<italic>F
<sub>i</sub>
</italic>
.</p>
<p>Applying Bayes' theorem, the most probable sequence of labels,
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e017.jpg" mimetype="image"></inline-graphic>
</inline-formula>
, is given by
<disp-formula>
<graphic xlink:href="pcbi.1002591.e018"></graphic>
<label>(2)</label>
</disp-formula>
where
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e019.jpg" mimetype="image"></inline-graphic>
</inline-formula>
is the
<italic>a priori</italic>
probability of a labeling sequence, and
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e020.jpg" mimetype="image"></inline-graphic>
</inline-formula>
is the likelihood that the curves in
<italic>C</italic>
are generated from the sequence of items labeled by
<italic>L</italic>
. By design, certain labeling sequences are ruled out
<italic>a priori</italic>
. Conceptually drawing a directed edge between one identity and the possible identities of the next curve yields a directed graph (
<xref ref-type="fig" rid="pcbi-1002591-g005">Figure 5B</xref>
). For a video with
<italic>T</italic>
frames,
<disp-formula>
<graphic xlink:href="pcbi.1002591.e021"></graphic>
<label>(3)</label>
</disp-formula>
where
<italic>N
<sup>t</sup>
</italic>
is the number of curves traced in frame
<italic>t</italic>
, and
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e022.jpg" mimetype="image"></inline-graphic>
</inline-formula>
is the label of the
<italic>i</italic>
'th curve found in frame
<italic>t</italic>
. Values for
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e023.jpg" mimetype="image"></inline-graphic>
</inline-formula>
and
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e024.jpg" mimetype="image"></inline-graphic>
</inline-formula>
are estimated by computing the frequency of observed label pairs that occur in the training set,
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e025.jpg" mimetype="image"></inline-graphic>
</inline-formula>
. This describes a hidden Markov model for labeling curves in a single video frame. The optimal labeling can be computed efficiently with well-known dynamic programming techniques
<xref ref-type="bibr" rid="pcbi.1002591-Rabiner1">[21]</xref>
.</p>
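The within-frame labeling can be decoded with a standard Viterbi pass over the constrained label graph. The sketch below hardwires uniform transition and initial probabilities for the allowed edges; in the text these are instead estimated from label-pair frequencies in the training set:

```python
import numpy as np

def viterbi(emissions, logT, log_init):
    """Standard Viterbi decode. emissions[t][s] is the log-likelihood
    of the t'th curve in the frame under state (label) s."""
    T, S = emissions.shape
    dp = log_init + emissions[0]
    back = np.zeros((T, S), int)
    for t in range(1, T):
        scores = dp[:, None] + logT           # (from-state, to-state)
        back[t] = scores.argmax(0)
        dp = scores.max(0) + emissions[t]
    path = [int(dp.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

def whisker_states(N):
    """Labels F0, W1, F1, ..., WN, FN with the ordering constraint:
    from F_i one may stay at F_i or advance to W_{i+1}; from W_i one
    may move to F_i or advance to W_{i+1}. Uniform log-probabilities
    on allowed edges are a simplifying assumption."""
    labels = ['F0']
    for i in range(1, N + 1):
        labels += [f'W{i}', f'F{i}']
    S = len(labels)
    T = np.full((S, S), -np.inf)
    for s, lab in enumerate(labels):
        i = int(lab[1:])
        for s2, lab2 in enumerate(labels):
            j = int(lab2[1:])
            if lab2[0] == 'F' and j == i:      # stay among false positives
                T[s, s2] = np.log(0.5)
            if lab2[0] == 'W' and j == i + 1:  # advance to the next whisker
                T[s, s2] = np.log(0.5)
    return labels, T
```

For a frame with a whisker-like curve, a noise curve, and another whisker-like curve (in anterior-posterior order), the decode assigns W1, F1, W2.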
<p>The likelihood,
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e026.jpg" mimetype="image"></inline-graphic>
</inline-formula>
, is computed under the simplifying assumption that the likelihood of a single curve depends only on the curves in the previous or following frame. Using this and applying Bayes' theorem,
<disp-formula>
<graphic xlink:href="pcbi.1002591.e027"></graphic>
<label>(4)</label>
</disp-formula>
where
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e028.jpg" mimetype="image"></inline-graphic>
</inline-formula>
is the label
<italic>L</italic>
assigns to the curve
<italic>c</italic>
,
<italic>C
<sup>t</sup>
</italic>
is the (possibly empty) set of curves found in frame
<italic>t</italic>
, and
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e029.jpg" mimetype="image"></inline-graphic>
</inline-formula>
are the labels assigned to the curves in
<italic>C
<sup>t</sup>
</italic>
. The first component,
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e030.jpg" mimetype="image"></inline-graphic>
</inline-formula>
, is the likelihood that
<italic>c</italic>
is an object of the kind denoted by label
<italic>l</italic>
, which we estimate with formula (1).</p>
<p>The second component of (4),
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e031.jpg" mimetype="image"></inline-graphic>
</inline-formula>
, is interpreted as the likelihood that a curve is part of the same trajectory as corresponding curves in the previous (or following) frame. Similar to the approach used in equation (1) for estimating
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e032.jpg" mimetype="image"></inline-graphic>
</inline-formula>
, we need normalized histograms
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e033.jpg" mimetype="image"></inline-graphic>
</inline-formula>
and
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e034.jpg" mimetype="image"></inline-graphic>
</inline-formula>
of the changes of whiskers and false positives between successive frames for each feature
<italic>f</italic>
over a “training” data set in which the correspondence of curves in successive frames is known. While we could use the implied assignment over
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e035.jpg" mimetype="image"></inline-graphic>
</inline-formula>
, we first estimate a more reliable assignment by restricting the model in (4) to use shape features alone. That is,
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e036.jpg" mimetype="image"></inline-graphic>
</inline-formula>
is treated as a constant and thus the assignment to labels in a frame can be computed independently of other frames. We then use this preliminary labeling over the frames of
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e037.jpg" mimetype="image"></inline-graphic>
</inline-formula>
as the training set over which to build
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e038.jpg" mimetype="image"></inline-graphic>
</inline-formula>
and
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e039.jpg" mimetype="image"></inline-graphic>
</inline-formula>
.</p>
<p>Given these change histograms, one can estimate the correspondence likelihood according to the formula,
<disp-formula>
<graphic xlink:href="pcbi.1002591.e040"></graphic>
<label>(5)</label>
</disp-formula>
Note that when evaluating this likelihood, a unique corresponding curve is not always present in the previous (or following) frame. There may be zero or many false positive curves in the previous frame with the same label. Similarly, a whisker may be missing because it exited the field of view.</p>
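Equation (5) can be sketched as a product of histogrammed frame-to-frame feature changes; the feature names and bin layout here are illustrative assumptions:

```python
import numpy as np

def change_likelihood(prev_curve, curve, delta_hists):
    """Correspondence likelihood (eq. 5 sketch): product over features
    of the histogrammed probability of the frame-to-frame change.
    `delta_hists[f] = (probs, edges)` is learned from a training set."""
    if prev_curve is None:
        return 1.0  # no counterpart (e.g. the whisker left the view)
    p = 1.0
    for f, (probs, edges) in delta_hists.items():
        d = curve[f] - prev_curve[f]
        i = int(np.clip(np.searchsorted(edges, d) - 1, 0, len(probs) - 1))
        p *= probs[i]
    return p
```

A change histogram peaked near zero makes small inter-frame angle changes far more likely than large ones, which is what ties a curve to its trajectory.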
<p>Directly solving for the most probable labeling (2) is complicated by the fact that the likelihood of a labeling in one frame depends on the labeling in neighboring frames. Our approach is to initially score each frame by computing
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e041.jpg" mimetype="image"></inline-graphic>
</inline-formula>
, where
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e042.jpg" mimetype="image"></inline-graphic>
</inline-formula>
is obtained using shape features alone. In decreasing order of this score, we ‘visit’ the next best frame, say
<italic>t</italic>
, and for each of the two adjacent frames that has not already been visited, update its label assignment to the one that maximizes the full version of (4). The new assignment replaces the current assignment and the frame's visitation order is updated according to the score of this new assignment (under the full model of (4)). In this way, we let the most confident frames influence their neighbors in a transitive fashion until every frame has been visited. This was critical for achieving high accuracy. Previous whisker tracking algorithms have relied on propagating the identity of a whisker frame-wise from the beginning of the video to the end, and as a result, an error in one frame is propagated throughout the rest of the video
<xref ref-type="bibr" rid="pcbi.1002591-Voigts1">[11]</xref>
,
<xref ref-type="bibr" rid="pcbi.1002591-Knutsen2">[13]</xref>
.</p>
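The confidence-ordered visiting scheme amounts to a best-first traversal with a priority queue. In the sketch below, `relabel(t, t2)` is a hypothetical callable that recomputes frame `t2`'s labels given its visited neighbour `t` and returns the new confidence score:

```python
import heapq

def propagate(n_frames, init_scores, relabel):
    """Best-frame-first label propagation (sketch of the visiting
    order): the most confident frames are visited first and push
    updated labelings onto their unvisited neighbours."""
    scores = dict(enumerate(init_scores))
    heap = [(-s, t) for t, s in scores.items()]
    heapq.heapify(heap)
    visited, order = set(), []
    while heap:
        neg, t = heapq.heappop(heap)
        if t in visited or -neg != scores[t]:
            continue                       # stale heap entry
        visited.add(t)
        order.append(t)
        for t2 in (t - 1, t + 1):          # the two adjacent frames
            if 0 <= t2 < n_frames and t2 not in visited:
                scores[t2] = relabel(t, t2)
                heapq.heappush(heap, (-scores[t2], t2))
    return order
```

Starting from the most confident frame, updates radiate outward in both directions, so a single bad frame cannot corrupt the rest of the video the way a strict first-to-last sweep can.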
</sec>
</sec>
<sec id="s4">
<title>Results/Discussion</title>
<p>Average processing time was 8 Mpx/s/cpu (35 fps for 640×352 pixel video) measured on a 2.6 GHz Intel Core2 Duo MacBook Pro with 2 GB of RAM, and was dominated by the CPU time required to trace curves; linking time was 0.5 ms per frame. This is faster than the published speeds of similar whisker analysis packages (typically 1–5 fps)
<xref ref-type="bibr" rid="pcbi.1002591-Voigts1">[11]</xref>
,
<xref ref-type="bibr" rid="pcbi.1002591-Knutsen2">[13]</xref>
,
<xref ref-type="bibr" rid="pcbi.1002591-Perkon1">[15]</xref>
. However, performance can be difficult to compare because implementations may improve over time. For example, our software is as fast as the current version of the only other publicly available whisker tracker
<xref ref-type="bibr" rid="pcbi.1002591-Knutsen2">[13]</xref>
. More importantly, our software can readily be run on inexpensive cluster nodes to process videos in parallel. This is not the case for whisker trackers that require supervision
<xref ref-type="bibr" rid="pcbi.1002591-Knutsen2">[13]</xref>
or that depend on software with expensive licensing requirements (such as Matlab)
<xref ref-type="bibr" rid="pcbi.1002591-Voigts1">[11]</xref>
,
<xref ref-type="bibr" rid="pcbi.1002591-Perkon1">[15]</xref>
.</p>
<p>Tracing was accurate to within 0.2 px as estimated by analyzing 39 hand-traced whiskers. Individual hand-tracings had an accuracy of 0.23±0.2 pixels when compared against consensus curves. Mouse whiskers (row C) were automatically traced in images with resolutions from 5 µm/px to 320 µm/px using default settings and in noisy, low-contrast images (
<xref ref-type="supplementary-material" rid="pcbi.1002591.s003">Text S1</xref>
). For the best results, it is important to minimize motion blur and uniformly illuminate the scene, although illumination inhomogeneities can be partially corrected with background subtraction. In contrast with methods that use Kalman filtering
<xref ref-type="bibr" rid="pcbi.1002591-Knutsen2">[13]</xref>
, traced curves are not biased by whisker dynamics. Additionally, tracing is faithful even when curves are not well approximated by low order polynomials, in contrast to published tracing methods
<xref ref-type="bibr" rid="pcbi.1002591-Knutsen2">[13]</xref>
,
<xref ref-type="bibr" rid="pcbi.1002591-Perkon1">[15]</xref>
. Whiskers could be detected and traced in videos of freely behaving rats (
<xref ref-type="supplementary-material" rid="pcbi.1002591.s005">Video S2</xref>
) and mice (
<xref ref-type="supplementary-material" rid="pcbi.1002591.s006">Video S3</xref>
) with all whiskers intact.</p>
<p>Linking accuracy was measured against a hand-annotated set of videos selected by choosing 100 random trials from behavioral sessions of 4 mice (1.32 million frames)
<xref ref-type="bibr" rid="pcbi.1002591-OConnor1">[1]</xref>
,
<xref ref-type="bibr" rid="pcbi.1002591-Huber1">[17]</xref>
. The curated videos captured a range of behavior including protracted bouts (>1 s) of whisking (
<xref ref-type="supplementary-material" rid="pcbi.1002591.s007">Video S4</xref>
), multiple whiskers simultaneously contacting an object (a thin metal pole) (
<xref ref-type="supplementary-material" rid="pcbi.1002591.s007">Video S4</xref>
), and extremely fast motion (>10,000°/second) (
<xref ref-type="supplementary-material" rid="pcbi.1002591.s008">Video S5</xref>
). Of the 4.5 million traced whiskers, 130 were incorrectly identified or not detected, less than one mistake per behavioral trial on average. Linking is robust to whiskers that occasionally leave the field of view (
<xref ref-type="supplementary-material" rid="pcbi.1002591.s009">Video S6</xref>
), and works well even at relatively low frame rates (100 fps; see
<xref ref-type="supplementary-material" rid="pcbi.1002591.s003">Text S1</xref>
). Lower image quality will ultimately degrade results.</p>
<p>There are two principal sources of error in the linking. First, covariance between different features is ignored. For example, when the field of view is small, whiskers at high deflection angles might not be fully contained within the image. Any curve tracing one of these whiskers would appear shorter than the full whisker length. Under these conditions there is a strong correlation between angle and whisker length. The current model penalizes the shorter length because it is rarely observed; accounting for the covariance should eliminate this problem. Second, the estimated feature distributions are affected by systematic biases introduced by the heuristic used to initially guess whisker identity. A heuristic relying on a length threshold, like the one used here, may systematically reject such short curves at high deflection angles. This will result in a bias against strongly deflected whiskers.</p>
<p>Fundamentally, our linking method is applicable only to single rows of whiskers. Whiskers in videos imaging multiple rows will still be traced, but because the imaged whiskers no longer maintain a strict anterior-posterior ordering, the linking will likely fail. Although this software was validated with images of one side of the mouse's whiskers, we have successfully tracked whiskers in images containing both whisker fields (
<xref ref-type="supplementary-material" rid="pcbi.1002591.s010">Video S7</xref>
). Also, the whiskers of freely moving rats and mice can be traced (
<xref ref-type="supplementary-material" rid="pcbi.1002591.s005">Videos S2</xref>
,
<xref ref-type="supplementary-material" rid="pcbi.1002591.s006">S3</xref>
). This suggests that it will be possible to use this software for two-dimensional tracking of single rows of whiskers in freely moving animals by incorporating head tracking
<xref ref-type="bibr" rid="pcbi.1002591-Knutsen1">[2]</xref>
,
<xref ref-type="bibr" rid="pcbi.1002591-Voigts1">[11]</xref>
,
<xref ref-type="bibr" rid="pcbi.1002591-Perkon1">[15]</xref>
, though we have not explicitly tested this.</p>
<p>We analyzed data acquired during learning of an object detection task (see
<xref ref-type="supplementary-material" rid="pcbi.1002591.s003">Text S1</xref>
)
<xref ref-type="bibr" rid="pcbi.1002591-Huber1">[17]</xref>
. We tracked whiskers from four mice during the first 6 days of training (8148 trials, 3122 frames each). This allowed us to observe the changes in whisking behavior that emerge during learning of the task. We looked specifically at correct rejection trials (2485 trials), which were not perturbed by contact between object and whisker, to analyze the distribution of whisking patterns during learning. We characterized whisking by computing normalized histograms of whisker angle (1° resolution) over time (100 ms bins) for a single representative whisker (C1). The angle and position of other whiskers were strongly correlated throughout (r
<sup>2</sup>
>0.95) in these trials. Two patterns emerged. After learning, two mice (JF25607 and JF25609) showed changes in whisking amplitude and mean angular position when the stimulus was present (
<xref ref-type="fig" rid="pcbi-1002591-g006">Figure 6A</xref>
). The other two mice (JF27332 and JF26706) did not show nearly as much stereotypy and appeared to rely on lower amplitude whisking (
<xref ref-type="fig" rid="pcbi-1002591-g006">Figure 6B</xref>
). All mice held their whiskers in a non-resting anterior position during most of the trial
<xref ref-type="bibr" rid="pcbi.1002591-OConnor1">[1]</xref>
. For JF25607 and JF25609, reduction in task error rate was correlated with an increase in mean deflection angle during stimulus presentation (r
<sup>2</sup>
 = 0.97,0.84 respectively) (
<xref ref-type="fig" rid="pcbi-1002591-g006">Figure 6C</xref>
). This correlation was smaller for the other two mice, which also learned the task but appeared to rely on a different whisking strategy.</p>
<fig id="pcbi-1002591-g006" position="float">
<object-id pub-id-type="doi">10.1371/journal.pcbi.1002591.g006</object-id>
<label>Figure 6</label>
<caption>
<title>Changes in whisker behavior as mice successfully learn the object detection task.</title>
<p>(A–B) Normalized histograms of whisker angle (1° resolution, whisker C1, log scale, 0° lies on the lateral axis) were computed over 100 ms time bins during the first 6 training sessions over correct rejection trials. Each histogram shows data from 21–150 trials. (A) Some mice, such as JF25607, increase the frequency of large deflections during stimulus presentation (trial counts: 99, 150, 135, 109, 102, and 90 respectively). (B) Others, such as JF27332, do not (trial counts: 21, 96, 90, 109, 107 and 86 respectively). (C) During stimulus presentation, an increase in mean deflection angle was correlated with a decrease in the false alarm (FA) rate, a measure of behavioral performance, for two mice (JF25609, JF25607). Two mice did not exhibit this correlation (R
<sup>2</sup>
: 0.14, 0.61, 0.97, 0.84 for JF27332, JF26706, JF25609, and JF25607 respectively).</p>
</caption>
<graphic xlink:href="pcbi.1002591.g006"></graphic>
</fig>
<p>The forces at the base of the whisker constitute the information available to an animal for somatosensory perception. These forces are proportional to changes in whisker curvature
<xref ref-type="bibr" rid="pcbi.1002591-Birdwell1">[16]</xref>
. The instantaneous moment acting at a point on the whisker is proportional to the curvature change at that point. In contrast to whisker tracing software that represents whiskers as line segments
<xref ref-type="bibr" rid="pcbi.1002591-Gyory1">[14]</xref>
,
<xref ref-type="bibr" rid="pcbi.1002591-Perkon1">[15]</xref>
or non-parametric cubic polynomials
<xref ref-type="bibr" rid="pcbi.1002591-Knutsen2">[13]</xref>
, the curves produced by our tracing algorithm provide a high-resolution representation of whisker shape, allowing measurement of varying curvature along the length of the whisker.</p>
<p>To illustrate this feature we measured curvature in the following manner: For each video frame (
<xref ref-type="fig" rid="pcbi-1002591-g007">Figure 7A</xref>
), the traced whisker midline (
<xref ref-type="fig" rid="pcbi-1002591-g007">Figure 7B</xref>
) was fit via least-squares to a parametric 5
<sup>th</sup>
degree polynomial of the form:
<disp-formula>
<graphic xlink:href="pcbi.1002591.e043"></graphic>
<label>(6)</label>
</disp-formula>
where [
<italic>x(t),y(t)</italic>
] are points that span the curve over the interval
<inline-formula>
<inline-graphic xlink:href="pcbi.1002591.e044.jpg" mimetype="image"></inline-graphic>
</inline-formula>
, and
<italic>a
<sub>i</sub>
,b
<sub>i</sub>
</italic>
are the fit parameters (
<xref ref-type="fig" rid="pcbi-1002591-g007">Figure 7C</xref>
). Curvature was measured at a specified distance along the curve from the follicle, chosen for the best signal-to-noise ratio. The follicle itself was not imaged, so the intersection between the curve and an arc circumscribing the face was used as a reference point instead. The curvature was then measured as the mean curvature inside a 1–2.5 mm long region about the point of interest, where it was approximately constant. To ensure the measurement was not biased by whisker shape outside this interval, a second parametric polynomial fit (degree 2) was performed over the region of interest rather than the entire curve (
<xref ref-type="fig" rid="pcbi-1002591-g007">Figure 7D</xref>
). The follicle position was estimated by linearly extrapolating a fixed distance into the face. The point of whisker-pole contact was determined as the point on the curve, or on a line extrapolated from its nearest end, closest to the center of the pole (
<xref ref-type="fig" rid="pcbi-1002591-g007">Figure 7E</xref>
).</p>
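The fit and curvature measurement described above can be sketched in Python with NumPy. This is a minimal illustration under our own conventions; the helper names and the circular-arc check are not part of the released software.

```python
import numpy as np

def fit_parametric_poly(xy, degree=5):
    """Least-squares fit of a 5th-degree parametric polynomial (Eq. 6)
    to a traced midline given as an (N, 2) array of pixel coordinates;
    the parameter t spans [0, 1]."""
    t = np.linspace(0.0, 1.0, len(xy))
    ax = np.polyfit(t, xy[:, 0], degree)  # coefficients, highest power first
    ay = np.polyfit(t, xy[:, 1], degree)
    return ax, ay

def curvature(ax, ay, t):
    """Signed curvature kappa = (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2)
    of the fitted parametric curve, evaluated at parameter values t."""
    dx = np.polyval(np.polyder(ax), t)
    dy = np.polyval(np.polyder(ay), t)
    ddx = np.polyval(np.polyder(ax, 2), t)
    ddy = np.polyval(np.polyder(ay, 2), t)
    return (dx * ddy - dy * ddx) / (dx ** 2 + dy ** 2) ** 1.5

# Sanity check: a circular arc of radius 50 px has curvature 1/50 everywhere.
theta = np.linspace(0.0, np.pi / 3, 200)
arc = np.column_stack((50.0 * np.cos(theta), 50.0 * np.sin(theta)))
ax, ay = fit_parametric_poly(arc)
kappa = curvature(ax, ay, np.linspace(0.1, 0.9, 9))
```

Curvature here is in units of px<sup>−1</sup>; physical windows such as the 1–2.5 mm region would be converted using the camera's mm-per-pixel scale.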
<fig id="pcbi-1002591-g007" position="float">
<object-id pub-id-type="doi">10.1371/journal.pcbi.1002591.g007</object-id>
<label>Figure 7</label>
<caption>
<title>Analysis of curvature change on contact.</title>
<p>(A–E) The sequence of steps used to extract detailed curvature measurements. (A) Whiskers in raw video frames are automatically traced and linked, yielding (B) an identified set of curves for each frame. (C) The raw curve is fit with a 5
<sup>th</sup>
-degree parametric polynomial. (D) A mask is specified to determine where the curve intersects the face. Within a small interval (1–2.5 mm path length) about an interest point chosen for high signal to noise, the raw curve is re-fit to ensure measurements are not biased by whisker shape outside the interval. This new fit is to a 2
<sup>nd</sup>
-degree polynomial. The curvature at the interest point is then measured as the curvature of this 2
<sup>nd</sup>
-degree fit. (E) Follicle position is estimated by extrapolating a fixed distance into the face from the mask. Similarly, curves are extrapolated, when necessary, to contact points on the pole. Trajectories for curvature (F) and the angle of the whisker at its base (G) are shown for the first contacting whisker in 10 trials, grouped by whether the first contact was during a retraction (top 5) or a protraction (bottom 5). Trajectories are aligned to first contact. The intervals when the whisker is in whisker-pole contact are highlighted in red. (H) Histograms of peak contact curvature change (from resting) for the first whisker-pole contact in each trial (green) and all whisker-pole contacts prior to an answer-lick (red).</p>
</caption>
<graphic xlink:href="pcbi.1002591.g007"></graphic>
</fig>
<p>Time series of curvature measurements (
<xref ref-type="fig" rid="pcbi-1002591-g007">Figure 7F</xref>
, measured 5 mm from the follicle) can be used to identify most contact events. For whisker-pole contacts first made during a protraction, the first touch tends to be followed by a series of more forceful contacts. Contacts are correlated with a relatively constant angular position of the whisker (
<xref ref-type="fig" rid="pcbi-1002591-g007">Figure 7G</xref>
). This indicates that the mouse may press the whisker against the pole by translating the whisker pad rather than by rotating the whisker. The distribution of peak curvature change during whisker-pole contacts indicates that the bulk of contacts were made during protraction and that these were more forceful (
<xref ref-type="fig" rid="pcbi-1002591-g007">Figure 7H</xref>
).</p>
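Identifying contact events from such a curvature time series can be sketched as thresholding the change from resting curvature and extracting contiguous runs. The threshold value and run-detection scheme below are illustrative assumptions, not the paper's exact criterion.

```python
import numpy as np

def contact_intervals(curv, baseline, threshold=0.002):
    """Return (start, stop) frame-index pairs (stop exclusive) of runs
    where the magnitude of curvature change from the resting baseline
    exceeds a threshold (the default value is illustrative)."""
    touching = np.abs(np.asarray(curv, dtype=float) - baseline) > threshold
    # Pad with False so every run has a well-defined rising and falling edge.
    padded = np.concatenate(([False], touching, [False]))
    edges = np.diff(padded.astype(np.int8))
    starts = np.flatnonzero(edges == 1)
    stops = np.flatnonzero(edges == -1)
    return list(zip(starts, stops))
```

For instance, a trace that sits at baseline everywhere except frames 10–19 yields the single interval (10, 20).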
<sec id="s4a">
<title>Availability</title>
<p>The software and documentation are freely available online (
<ext-link ext-link-type="uri" xlink:href="http://whiskertracking.janelia.org">http://whiskertracking.janelia.org</ext-link>
) as cross-platform (Windows, OS X, Linux) source code written in C with Python and Matlab interfaces included. Pre-built binaries are also available. A graphical user interface is provided to aid semi-automated tracing as well as the viewing and editing of results. The software is capable of analyzing 8-bit grayscale StreamPix (Norpix Sequence Format: SEQ), TIFF, MPEG, MOV, and AVI formatted video.</p>
</sec>
</sec>
<sec sec-type="supplementary-material" id="s5">
<title>Supporting Information</title>
<supplementary-material content-type="local-data" id="pcbi.1002591.s001">
<label>Dataset S1</label>
<caption>
<p>Sample data and tutorial.</p>
<p>(ZIP)</p>
</caption>
<media xlink:href="pcbi.1002591.s001.zip" mimetype="application" mime-subtype="x-zip-compressed">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pcbi.1002591.s002">
<label>Figure S1</label>
<caption>
<p>Line bias correction. (A) Raw data exhibits a fixed-pattern artifact in which odd-numbered lines are systematically darker than even-numbered lines. (B) This artifact is corrected by multiplying the odd-numbered lines by a factor estimated from the raw data itself. This removes the stripes without blurring or reducing the contrast of whiskers. Scale bar, 0.5 mm.</p>
<p>(TIF)</p>
</caption>
<media xlink:href="pcbi.1002591.s002.tif" mimetype="image" mime-subtype="tiff">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
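The line bias correction of Figure S1 can be illustrated as follows. Estimating the gain as a ratio of row means is our assumption; the paper states only that the factor is estimated from the raw data itself.

```python
import numpy as np

def correct_line_bias(img):
    """Rescale the odd-numbered rows of a raw frame so their mean
    intensity matches that of the even-numbered rows, removing the
    fixed-pattern stripe artifact without blurring the image."""
    out = np.asarray(img, dtype=np.float64).copy()
    gain = out[0::2].mean() / out[1::2].mean()  # even-row mean / odd-row mean
    out[1::2] *= gain
    return out
```

Because the correction is a per-row multiplicative gain, whisker contrast within each line is preserved.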
<supplementary-material content-type="local-data" id="pcbi.1002591.s003">
<label>Text S1</label>
<caption>
<p>Methodological details.</p>
<p>(DOC)</p>
</caption>
<media xlink:href="pcbi.1002591.s003.doc" mimetype="application" mime-subtype="msword">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pcbi.1002591.s004">
<label>Video S1</label>
<caption>
<p>Video overlaid with high-scoring seeds (eccentricity &gt; 0.99) for each frame. Pixels are colored according to the orientation indicated by the seed.</p>
<p>(MP4)</p>
</caption>
<media xlink:href="pcbi.1002591.s004.mp4" mimetype="video" mime-subtype="mp4">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pcbi.1002591.s005">
<label>Video S2</label>
<caption>
<p>Results of automatic tracing applied to a 500 Hz video (1.7 s duration) of a freely behaving rat exploring a corner with its full whisker field. The video was obtained with permission from the BIOTACT Whisker Tracking Benchmark
<xref ref-type="bibr" rid="pcbi.1002591-Gordon1">[22]</xref>
(Clip ID: behavingFullContact, courtesy of Igor Perkon and Goren Gordon). It has been cropped, and the frame rate has been changed for presentation.</p>
<p>(MP4)</p>
</caption>
<media xlink:href="pcbi.1002591.s005.mp4" mimetype="video" mime-subtype="mp4">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pcbi.1002591.s006">
<label>Video S3</label>
<caption>
<p>Results of automatic tracing applied to a video acquired at 100 Hz (8 s duration) of a freely behaving mouse with its full whisker field. Background subtraction was performed before tracing to reduce the effects of non-uniform illumination. Because no image of the scene without the animal present was available, the background was estimated for each pixel as the maximum intensity observed at that point throughout the video. The video was obtained with permission from the BIOTACT Whisker Tracking Benchmark
<xref ref-type="bibr" rid="pcbi.1002591-Gordon1">[22]</xref>
(Clip ID: behavingMouse100, courtesy of Ehud Fonio and Ehud Ahissar). It has been cropped, and the frame rate has been changed for presentation.</p>
<p>(MP4)</p>
</caption>
<media xlink:href="pcbi.1002591.s006.mp4" mimetype="video" mime-subtype="mp4">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
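The background estimate described for Video S3 can be sketched in a few lines of NumPy. Whiskers are dark on a bright field, so the per-pixel maximum over the video approximates the animal-free scene; subtracting each frame from that estimate (the subtraction direction is our assumption) leaves whiskers bright on a near-zero background.

```python
import numpy as np

def subtract_max_background(frames):
    """Estimate the background at each pixel as the maximum intensity
    seen there across the whole video, then subtract each frame from
    this estimate so dark whiskers become bright features."""
    stack = np.asarray(frames, dtype=np.float64)  # shape (n_frames, h, w)
    background = stack.max(axis=0)
    return background - stack
```

This works only when every pixel is unoccluded by the animal in at least one frame; pixels covered throughout the video retain a biased background estimate.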
<supplementary-material content-type="local-data" id="pcbi.1002591.s007">
<label>Video S4</label>
<caption>
<p>Results from a full trial. The results of automated tracing and linking applied to a video (500 Hz, 9.2 s duration) of a head fixed mouse trimmed to a single row of 4 whiskers interacting with a pole. Curves that were classified as whiskers are colored according to their identity, and otherwise they are not shown. Multiple whiskers simultaneously interact with the pole at 1.2–1.4 sec into the trial. Protracted bouts of whisking can be observed throughout the video.</p>
<p>(MP4)</p>
</caption>
<media xlink:href="pcbi.1002591.s007.mp4" mimetype="video" mime-subtype="mp4">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pcbi.1002591.s008">
<label>Video S5</label>
<caption>
<p>Tracing and linking is robust to rapid changes in whisker angle. The results of automated tracing and linking applied to a video (500 Hz, 150 ms) of a head fixed mouse trimmed to a single row of 3 whiskers interacting with a pole. Curves that were classified as whiskers are colored according to their identity, and otherwise they are not shown. One whisker (red) has been trimmed so it cannot contact the pole. The green whisker presses against the pole, and quickly flicks past it as it is removed from the field. This is the fastest angular motion (16°/ms) observed in the data set used to measure tracking accuracy.</p>
<p>(MP4)</p>
</caption>
<media xlink:href="pcbi.1002591.s008.mp4" mimetype="video" mime-subtype="mp4">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pcbi.1002591.s009">
<label>Video S6</label>
<caption>
<p>Linking is robust to whiskers that leave the field of view. The results of automated tracing and linking applied to a video (500 Hz, 190 ms) of a head fixed mouse trimmed to a single row of 3 whiskers interacting with a pole. Curves that were classified as whiskers are colored according to their identity, and otherwise they are not shown. Two whiskers (green and blue) are frequently occluded by the lick-port (black bar, lower right), but they are properly identified before and after such events.</p>
<p>(MP4)</p>
</caption>
<media xlink:href="pcbi.1002591.s009.mp4" mimetype="video" mime-subtype="mp4">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
<supplementary-material content-type="local-data" id="pcbi.1002591.s010">
<label>Video S7</label>
<caption>
<p>Tracing and linking of whiskers bilaterally. The results of automated tracing and linking applied to a video (500 Hz, 1 s duration) providing a bilateral view of a head fixed mouse trimmed to a single row of whiskers (2 on each side).</p>
<p>(MP4)</p>
</caption>
<media xlink:href="pcbi.1002591.s010.mp4" mimetype="video" mime-subtype="mp4">
<caption>
<p>Click here for additional data file.</p>
</caption>
</media>
</supplementary-material>
</sec>
</body>
<back>
<fn-group>
<fn fn-type="conflict">
<p>The authors have declared that no competing interests exist.</p>
</fn>
<fn fn-type="financial-disclosure">
<p>The Howard Hughes Medical Institute funded this work (
<ext-link ext-link-type="uri" xlink:href="http://www.hhmi.org">www.hhmi.org</ext-link>
). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.</p>
</fn>
</fn-group>
<ref-list>
<title>References</title>
<ref id="pcbi.1002591-OConnor1">
<label>1</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>O'Connor</surname>
<given-names>DH</given-names>
</name>
<name>
<surname>Clack</surname>
<given-names>NG</given-names>
</name>
<name>
<surname>Huber</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Komiyama</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Myers</surname>
<given-names>EW</given-names>
</name>
<etal></etal>
</person-group>
<year>2010</year>
<article-title>Vibrissa-based object localization in head-fixed mice.</article-title>
<source>J Neurosci</source>
<volume>30</volume>
<fpage>1947</fpage>
<lpage>1967</lpage>
<pub-id pub-id-type="pmid">20130203</pub-id>
</element-citation>
</ref>
<ref id="pcbi.1002591-Knutsen1">
<label>2</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Knutsen</surname>
<given-names>PM</given-names>
</name>
</person-group>
<year>2006</year>
<article-title>Haptic Object Localization in the Vibrissal System: Behavior and Performance.</article-title>
<source>J Neurosci</source>
<volume>26</volume>
<fpage>8451</fpage>
<lpage>8464</lpage>
<pub-id pub-id-type="pmid">16914670</pub-id>
</element-citation>
</ref>
<ref id="pcbi.1002591-Carvell1">
<label>3</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Carvell</surname>
<given-names>GE</given-names>
</name>
<name>
<surname>Simons</surname>
<given-names>DJ</given-names>
</name>
</person-group>
<year>1990</year>
<article-title>Biometric analyses of vibrissal tactile discrimination in the rat.</article-title>
<source>J Neurosci</source>
<volume>10</volume>
<fpage>2638</fpage>
<lpage>2648</lpage>
<pub-id pub-id-type="pmid">2388081</pub-id>
</element-citation>
</ref>
<ref id="pcbi.1002591-Heimendahlvon1">
<label>4</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Heimendahl von</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Itskov</surname>
<given-names>PM</given-names>
</name>
<name>
<surname>Arabzadeh</surname>
<given-names>E</given-names>
</name>
<name>
<surname>Diamond</surname>
<given-names>ME</given-names>
</name>
</person-group>
<year>2007</year>
<article-title>Neuronal activity in rat barrel cortex underlying texture discrimination.</article-title>
<source>Plos Biol</source>
<volume>5</volume>
<fpage>e305</fpage>
<pub-id pub-id-type="pmid">18001152</pub-id>
</element-citation>
</ref>
<ref id="pcbi.1002591-Mitchinson1">
<label>5</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Mitchinson</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Martin</surname>
<given-names>CJ</given-names>
</name>
<name>
<surname>Grant</surname>
<given-names>RA</given-names>
</name>
<name>
<surname>Prescott</surname>
<given-names>TJ</given-names>
</name>
</person-group>
<year>2007</year>
<article-title>Feedback control in active sensing: rat exploratory whisking is modulated by environmental contact.</article-title>
<source>Proc Biol Sci</source>
<volume>274</volume>
<fpage>1035</fpage>
<lpage>1041</lpage>
<pub-id pub-id-type="pmid">17331893</pub-id>
</element-citation>
</ref>
<ref id="pcbi.1002591-Diamond1">
<label>6</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Diamond</surname>
<given-names>ME</given-names>
</name>
<name>
<surname>Heimendahl von</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Knutsen</surname>
<given-names>PM</given-names>
</name>
<name>
<surname>Kleinfeld</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Ahissar</surname>
<given-names>E</given-names>
</name>
</person-group>
<year>2008</year>
<article-title>‘Where’ and ‘what’ in the whisker sensorimotor system.</article-title>
<source>Nat Rev Neurosci</source>
<volume>9</volume>
<fpage>601</fpage>
<lpage>612</lpage>
<pub-id pub-id-type="pmid">18641667</pub-id>
</element-citation>
</ref>
<ref id="pcbi.1002591-Hill1">
<label>7</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hill</surname>
<given-names>DN</given-names>
</name>
<name>
<surname>Bermejo</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Zeigler</surname>
<given-names>HP</given-names>
</name>
<name>
<surname>Kleinfeld</surname>
<given-names>D</given-names>
</name>
</person-group>
<year>2008</year>
<article-title>Biomechanics of the vibrissa motor plant in rat: rhythmic whisking consists of triphasic neuromuscular activity.</article-title>
<source>J Neurosci</source>
<volume>28</volume>
<fpage>3438</fpage>
<lpage>3455</lpage>
<pub-id pub-id-type="pmid">18367610</pub-id>
</element-citation>
</ref>
<ref id="pcbi.1002591-Venkatraman1">
<label>8</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Venkatraman</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Elkabany</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Long</surname>
<given-names>JD</given-names>
</name>
<name>
<surname>Yao</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Carmena</surname>
<given-names>JM</given-names>
</name>
</person-group>
<year>2009</year>
<article-title>A system for neural recording and closed-loop intracortical microstimulation in awake rodents.</article-title>
<source>IEEE Trans Biomed Eng</source>
<volume>56</volume>
<fpage>15</fpage>
<lpage>22</lpage>
<pub-id pub-id-type="pmid">19224714</pub-id>
</element-citation>
</ref>
<ref id="pcbi.1002591-Jadhav1">
<label>9</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Jadhav</surname>
<given-names>SP</given-names>
</name>
<name>
<surname>Wolfe</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Feldman</surname>
<given-names>DE</given-names>
</name>
</person-group>
<year>2009</year>
<article-title>Sparse temporal coding of elementary tactile features during active whisker sensation.</article-title>
<source>Nat Neurosci</source>
<volume>12</volume>
<fpage>792</fpage>
<lpage>2328</lpage>
<pub-id pub-id-type="pmid">19430473</pub-id>
</element-citation>
</ref>
<ref id="pcbi.1002591-Harvey1">
<label>10</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Harvey</surname>
<given-names>MA</given-names>
</name>
<name>
<surname>Bermejo</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Zeigler</surname>
<given-names>HP</given-names>
</name>
</person-group>
<year>2001</year>
<article-title>Discriminative whisking in the head-fixed rat: optoelectronic monitoring during tactile detection and discrimination tasks.</article-title>
<source>Somatosens Mot Res</source>
<volume>18</volume>
<fpage>211</fpage>
<lpage>222</lpage>
<pub-id pub-id-type="pmid">11562084</pub-id>
</element-citation>
</ref>
<ref id="pcbi.1002591-Voigts1">
<label>11</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Voigts</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Sakmann</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Celikel</surname>
<given-names>T</given-names>
</name>
</person-group>
<year>2008</year>
<article-title>Unsupervised whisker tracking in unrestrained behaving animals.</article-title>
<source>J Neurophys</source>
<volume>100</volume>
<fpage>504</fpage>
<lpage>515</lpage>
</element-citation>
</ref>
<ref id="pcbi.1002591-Ritt1">
<label>12</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ritt</surname>
<given-names>JT</given-names>
</name>
<name>
<surname>Andermann</surname>
<given-names>ML</given-names>
</name>
<name>
<surname>Moore</surname>
<given-names>CI</given-names>
</name>
</person-group>
<year>2008</year>
<article-title>Embodied information processing: Vibrissa mechanics and texture features shape micromotions in actively sensing rats.</article-title>
<source>Neuron</source>
<volume>57</volume>
<fpage>599</fpage>
<lpage>613</lpage>
<pub-id pub-id-type="pmid">18304488</pub-id>
</element-citation>
</ref>
<ref id="pcbi.1002591-Knutsen2">
<label>13</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Knutsen</surname>
<given-names>PM</given-names>
</name>
<name>
<surname>Derdikman</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Ahissar</surname>
<given-names>E</given-names>
</name>
</person-group>
<year>2005</year>
<article-title>Tracking whisker and head movements in unrestrained behaving rodents.</article-title>
<source>J Neurophys</source>
<volume>93</volume>
<fpage>2294</fpage>
<lpage>2301</lpage>
</element-citation>
</ref>
<ref id="pcbi.1002591-Gyory1">
<label>14</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gyory</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Rankov</surname>
<given-names>V</given-names>
</name>
<name>
<surname>Gordon</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Perkon</surname>
<given-names>I</given-names>
</name>
<name>
<surname>Mitchinson</surname>
<given-names>B</given-names>
</name>
<etal></etal>
</person-group>
<year>2010</year>
<article-title>An Algorithm for Automatic Tracking of Rat Whiskers.</article-title>
<source>Proc Int Workshop on Visual Observation and Analysis of Animal and Insect Behavior (VAIB), Istanbul, in conjunction with ICPR</source>
<volume>2010</volume>
<fpage>1</fpage>
<lpage>4</lpage>
</element-citation>
</ref>
<ref id="pcbi.1002591-Perkon1">
<label>15</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Perkon</surname>
<given-names>I</given-names>
</name>
<name>
<surname>Kosir</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Itskov</surname>
<given-names>PM</given-names>
</name>
<name>
<surname>Tasic</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Diamond</surname>
<given-names>ME</given-names>
</name>
</person-group>
<year>2011</year>
<article-title>Unsupervised quantification of whisking and head movement in freely moving rodents.</article-title>
<source>J Neurophys</source>
<volume>105</volume>
<fpage>1950</fpage>
<lpage>1962</lpage>
</element-citation>
</ref>
<ref id="pcbi.1002591-Birdwell1">
<label>16</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Birdwell</surname>
<given-names>JA</given-names>
</name>
<name>
<surname>Solomon</surname>
<given-names>JH</given-names>
</name>
<name>
<surname>Thajchayapong</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Taylor</surname>
<given-names>MA</given-names>
</name>
<name>
<surname>Cheely</surname>
<given-names>M</given-names>
</name>
<etal></etal>
</person-group>
<year>2007</year>
<article-title>Biomechanical models for radial distance determination by the rat vibrissal system.</article-title>
<source>J Neurophys</source>
<volume>98</volume>
<fpage>2439</fpage>
<lpage>2455</lpage>
</element-citation>
</ref>
<ref id="pcbi.1002591-Huber1">
<label>17</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Huber</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Gutnisky</surname>
<given-names>DA</given-names>
</name>
<name>
<surname>Peron</surname>
<given-names>S</given-names>
</name>
<name>
<surname>O'Connor</surname>
<given-names>DH</given-names>
</name>
<name>
<surname>Wiegert</surname>
<given-names>JS</given-names>
</name>
<etal></etal>
</person-group>
<year>2012</year>
<article-title>Multiple dynamic representations in the motor cortex during sensorimotor learning.</article-title>
<source>Nature</source>
<volume>484</volume>
<fpage>473</fpage>
<lpage>8</lpage>
<pub-id pub-id-type="pmid">22538608</pub-id>
</element-citation>
</ref>
<ref id="pcbi.1002591-Mehta1">
<label>18</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Mehta</surname>
<given-names>SB</given-names>
</name>
<name>
<surname>Whitmer</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Figueroa</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Williams</surname>
<given-names>BA</given-names>
</name>
<name>
<surname>Kleinfeld</surname>
<given-names>D</given-names>
</name>
</person-group>
<year>2007</year>
<article-title>Active spatial perception in the vibrissa scanning sensorimotor system.</article-title>
<source>Plos Biol</source>
<volume>5</volume>
<fpage>e15</fpage>
<pub-id pub-id-type="pmid">17227143</pub-id>
</element-citation>
</ref>
<ref id="pcbi.1002591-Mikolajczyk1">
<label>19</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Mikolajczyk</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Schmid</surname>
<given-names>C</given-names>
</name>
</person-group>
<year>2005</year>
<article-title>Performance evaluation of local descriptors.</article-title>
<source>IEEE Trans Pattern Anal Mach Intell</source>
<volume>27</volume>
<fpage>1615</fpage>
<lpage>1630</lpage>
<pub-id pub-id-type="pmid">16237996</pub-id>
</element-citation>
</ref>
<ref id="pcbi.1002591-Torre1">
<label>20</label>
<element-citation publication-type="journal">
<person-group person-group-type="author">
<name>
<surname>Torre</surname>
<given-names>V</given-names>
</name>
<name>
<surname>Poggio</surname>
<given-names>TA</given-names>
</name>
</person-group>
<year>1986</year>
<article-title>On Edge Detection.</article-title>
<source>IEEE Trans Pattern Anal Mach Intell</source>
<volume>PAMI-8</volume>
<fpage>147</fpage>
<lpage>163</lpage>
<pub-id pub-id-type="pmid">21869334</pub-id>
</element-citation>
</ref>
<ref id="pcbi.1002591-Rabiner1">
<label>21</label>
<element-citation publication-type="other">
<person-group person-group-type="author">
<name>
<surname>Rabiner</surname>
<given-names>LR</given-names>
</name>
</person-group>
<year>1989</year>
<article-title>A tutorial on hidden Markov models and selected applications in speech recognition.</article-title>
<fpage>257</fpage>
<lpage>286</lpage>
<comment>Proceedings of the IEEE. Vol. 77</comment>
</element-citation>
</ref>
<ref id="pcbi.1002591-Gordon1">
<label>22</label>
<element-citation publication-type="other">
<person-group person-group-type="author">
<name>
<surname>Gordon</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Mitchinson</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Grant</surname>
<given-names>RA</given-names>
</name>
<name>
<surname>Diamond</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Prescott</surname>
<given-names>T</given-names>
</name>
<etal></etal>
</person-group>
<year>2012</year>
<article-title>The BIOTACT Whisker Tracking Benchmark.</article-title>
<comment>Available:
<ext-link ext-link-type="uri" xlink:href="https://mushika.shef.ac.uk/benchmark">https://mushika.shef.ac.uk/benchmark</ext-link>
</comment>
</element-citation>
</ref>
</ref-list>
</back>
</pmc>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/HapticV1/Data/Pmc/Curation
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 002159 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd -nk 002159 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    HapticV1
   |flux=    Pmc
   |étape=   Curation
   |type=    RBID
   |clé=     PMC:3390361
   |texte=   Automated Tracking of Whiskers in Videos of Head Fixed Rodents
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Pmc/Curation/RBID.i   -Sk "pubmed:22792058" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Pmc/Curation/biblio.hfd   \
       | NlmPubMed2Wicri -a HapticV1 

Wicri

This area was generated with Dilib version V0.6.23.
Data generation: Mon Jun 13 01:09:46 2016. Site generation: Wed Mar 6 09:54:07 2024