Discussion:Proc. SPIE Int. Soc. Opt. Eng. (2011) Moqqaddem

From Wicri Maroc
<record>
  <inist h6="B">
    <pA>
      <fA01 i1="01" i2="1">
        <s0>0277-786X</s0>
      </fA01>
      <fA02 i1="01">
        <s0>PSISDG</s0>
      </fA02>
      <fA03 i2="1">
        <s0>Proc. SPIE Int. Soc. Opt. Eng.</s0>
      </fA03>
      <fA05>
        <s2>7878</s2>
      </fA05>
      <fA08 i1="01" i2="1" l="ENG">
        <s1>Linear stereo vision based objects detection and tracking using spectral clustering</s1>
      </fA08>
      <fA09 i1="01" i2="1" l="ENG">
        <s1>Intelligent robots and computer vision XXVIII : algorithms and techniques</s1>
      </fA09>
      <fA11 i1="01" i2="1">
        <s1>MOQQADDEM (Safaa)</s1>
      </fA11>
      <fA11 i1="02" i2="1">
        <s1>RUICHEK (Y.)</s1>
      </fA11>
      <fA11 i1="03" i2="1">
        <s1>TOUAHNI (R.)</s1>
      </fA11>
      <fA11 i1="04" i2="1">
        <s1>SBIHI (A.)</s1>
      </fA11>
      <fA12 i1="01" i2="1">
        <s1>RÖNING (Juha)</s1>
        <s9>ed.</s9>
      </fA12>
      <fA12 i1="02" i2="1">
        <s1>CASASENT (David Paul)</s1>
        <s9>ed.</s9>
      </fA12>
      <fA12 i1="03" i2="1">
        <s1>HALL (Ernest L.)</s1>
        <s9>ed.</s9>
      </fA12>
      <fA14 i1="01">
        <s1>Systems and Transportation Laboratory, University of Technology of Belfort-Montbéliard</s1>
        <s2>90010 Belfort</s2>
        <s3>FRA</s3>
        <sZ>1 aut.</sZ>
        <sZ>2 aut.</sZ>
      </fA14>
      <fA14 i1="02">
        <s1>LASTID Laboratory, Ibn Tofail University of Kénitra</s1>
        <s3>MAR</s3>
        <sZ>1 aut.</sZ>
        <sZ>3 aut.</sZ>
        <sZ>4 aut.</sZ>
      </fA14>
      <fA18 i1="01" i2="1">
        <s1>SPIE</s1>
        <s3>USA</s3>
        <s9>org-cong.</s9>
      </fA18>
      <fA20>
        <s2>787806.1-787806.9</s2>
      </fA20>
      <fA21>
        <s1>2011</s1>
      </fA21>
      <fA23 i1="01">
        <s0>ENG</s0>
      </fA23>
      <fA25 i1="01">
        <s1>SPIE</s1>
        <s2>Bellingham, Wash.</s2>
      </fA25>
      <fA25 i1="02">
        <s1>IS&amp;T</s1>
        <s2>Springfield, Va.</s2>
      </fA25>
      <fA26 i1="01">
        <s0>978-0-8194-8415-4</s0>
      </fA26>
      <fA43 i1="01">
        <s1>INIST</s1>
        <s2>21760</s2>
        <s5>354000174732410050</s5>
      </fA43>
      <fA44>
        <s0>0000</s0>
        <s1>© 2011 INIST-CNRS. All rights reserved.</s1>
      </fA44>
      <fA45>
        <s0>10 ref.</s0>
      </fA45>
      <fA47 i1="01" i2="1">
        <s0>11-0219039</s0>
      </fA47>
      <fA60>
        <s1>P</s1>
        <s2>C</s2>
      </fA60>
      <fA61>
        <s0>A</s0>
      </fA61>
      <fA64 i1="01" i2="1">
        <s0>Proceedings of SPIE, the International Society for Optical Engineering</s0>
      </fA64>
      <fA66 i1="01">
        <s0>USA</s0>
      </fA66>
      <fC01 i1="01" l="ENG">
        <s0>Object detection and tracking is a key function for many applications such as video surveillance, robotics and intelligent transportation systems. This problem is widely treated in the literature in terms of both sensors (video cameras, laser range finders, radar) and methodologies. This paper proposes a new approach for detecting and tracking objects using stereo vision with linear cameras. After a matching process applied to edge points extracted from the images, the reconstructed points in the scene are clustered using spectral analysis. The obtained clusters are then tracked through their centers of gravity using a Kalman filter and a Nearest Neighbour (NN) based data association algorithm. The approach is tested and evaluated on real data to demonstrate its effectiveness for obstacle detection and tracking in front of a vehicle. This work is part of a project that aims to develop advanced driving aid systems, supported by the CPER, STIC and Volubilis programs.</s0>
      </fC01>
      <fC02 i1="01" i2="X">
        <s0>001D02C03</s0>
      </fC02>
      <fC02 i1="02" i2="X">
        <s0>001D02B07B</s0>
      </fC02>
      <fC02 i1="03" i2="X">
        <s0>001D15C</s0>
      </fC02>
      <fC03 i1="01" i2="3" l="FRE">
        <s0>Traitement image stéréoscopique</s0>
        <s5>06</s5>
      </fC03>
      <fC03 i1="01" i2="3" l="ENG">
        <s0>Stereo image processing</s0>
        <s5>06</s5>
      </fC03>
      <fC03 i1="02" i2="X" l="FRE">
        <s0>Vision ordinateur</s0>
        <s5>07</s5>
      </fC03>
      <fC03 i1="02" i2="X" l="ENG">
        <s0>Computer vision</s0>
        <s5>07</s5>
      </fC03>
      <fC03 i1="02" i2="X" l="SPA">
        <s0>Visión ordenador</s0>
        <s5>07</s5>
      </fC03>
      <fC03 i1="03" i2="X" l="FRE">
        <s0>Vision stéréoscopique</s0>
        <s5>08</s5>
      </fC03>
      <fC03 i1="03" i2="X" l="ENG">
        <s0>Stereopsis</s0>
        <s5>08</s5>
      </fC03>
      <fC03 i1="03" i2="X" l="SPA">
        <s0>Visión estereoscópica</s0>
        <s5>08</s5>
      </fC03>
      <fC03 i1="04" i2="X" l="FRE">
        <s0>Asservissement visuel</s0>
        <s5>09</s5>
      </fC03>
      <fC03 i1="04" i2="X" l="ENG">
        <s0>Visual servoing</s0>
        <s5>09</s5>
      </fC03>
      <fC03 i1="04" i2="X" l="SPA">
        <s0>Servomando visual</s0>
        <s5>09</s5>
      </fC03>
      <fC03 i1="05" i2="X" l="FRE">
        <s0>Surveillance</s0>
        <s5>10</s5>
      </fC03>
      <fC03 i1="05" i2="X" l="ENG">
        <s0>Surveillance</s0>
        <s5>10</s5>
      </fC03>
      <fC03 i1="05" i2="X" l="SPA">
        <s0>Vigilancia</s0>
        <s5>10</s5>
      </fC03>
      <fC03 i1="06" i2="X" l="FRE">
        <s0>Robotique</s0>
        <s5>11</s5>
      </fC03>
      <fC03 i1="06" i2="X" l="ENG">
        <s0>Robotics</s0>
        <s5>11</s5>
      </fC03>
      <fC03 i1="06" i2="X" l="SPA">
        <s0>Robótica</s0>
        <s5>11</s5>
      </fC03>
      <fC03 i1="07" i2="X" l="FRE">
        <s0>Capteur mesure</s0>
        <s5>12</s5>
      </fC03>
      <fC03 i1="07" i2="X" l="ENG">
        <s0>Measurement sensor</s0>
        <s5>12</s5>
      </fC03>
      <fC03 i1="07" i2="X" l="SPA">
        <s0>Captador medida</s0>
        <s5>12</s5>
      </fC03>
      <fC03 i1="08" i2="X" l="FRE">
        <s0>Détection contour</s0>
        <s5>13</s5>
      </fC03>
      <fC03 i1="08" i2="X" l="ENG">
        <s0>Edge detection</s0>
        <s5>13</s5>
      </fC03>
      <fC03 i1="08" i2="X" l="SPA">
        <s0>Detección contorno</s0>
        <s5>13</s5>
      </fC03>
      <fC03 i1="09" i2="X" l="FRE">
        <s0>Reconstruction image</s0>
        <s5>14</s5>
      </fC03>
      <fC03 i1="09" i2="X" l="ENG">
        <s0>Image reconstruction</s0>
        <s5>14</s5>
      </fC03>
      <fC03 i1="09" i2="X" l="SPA">
        <s0>Reconstrucción imagen</s0>
        <s5>14</s5>
      </fC03>
      <fC03 i1="10" i2="X" l="FRE">
        <s0>Plus proche voisin</s0>
        <s5>15</s5>
      </fC03>
      <fC03 i1="10" i2="X" l="ENG">
        <s0>Nearest neighbour</s0>
        <s5>15</s5>
      </fC03>
      <fC03 i1="10" i2="X" l="SPA">
        <s0>Vecino más cercano</s0>
        <s5>15</s5>
      </fC03>
      <fC03 i1="11" i2="X" l="FRE">
        <s0>Fouille donnée</s0>
        <s5>16</s5>
      </fC03>
      <fC03 i1="11" i2="X" l="ENG">
        <s0>Data mining</s0>
        <s5>16</s5>
      </fC03>
      <fC03 i1="11" i2="X" l="SPA">
        <s0>Busca dato</s0>
        <s5>16</s5>
      </fC03>
      <fC03 i1="12" i2="X" l="FRE">
        <s0>Télémètre laser</s0>
        <s5>18</s5>
      </fC03>
      <fC03 i1="12" i2="X" l="ENG">
        <s0>Laser range finder</s0>
        <s5>18</s5>
      </fC03>
      <fC03 i1="12" i2="X" l="SPA">
        <s0>Telémetro láser</s0>
        <s5>18</s5>
      </fC03>
      <fC03 i1="13" i2="X" l="FRE">
        <s0>Esquive collision</s0>
        <s5>19</s5>
      </fC03>
      <fC03 i1="13" i2="X" l="ENG">
        <s0>Collision avoidance</s0>
        <s5>19</s5>
      </fC03>
      <fC03 i1="13" i2="X" l="SPA">
        <s0>Esquiva colisión</s0>
        <s5>19</s5>
      </fC03>
      <fC03 i1="14" i2="X" l="FRE">
        <s0>Analyse amas</s0>
        <s5>23</s5>
      </fC03>
      <fC03 i1="14" i2="X" l="ENG">
        <s0>Cluster analysis</s0>
        <s5>23</s5>
      </fC03>
      <fC03 i1="14" i2="X" l="SPA">
        <s0>Analisis cluster</s0>
        <s5>23</s5>
      </fC03>
      <fC03 i1="15" i2="X" l="FRE">
        <s0>Méthode spectrale</s0>
        <s5>24</s5>
      </fC03>
      <fC03 i1="15" i2="X" l="ENG">
        <s0>Spectral method</s0>
        <s5>24</s5>
      </fC03>
      <fC03 i1="15" i2="X" l="SPA">
        <s0>Método espectral</s0>
        <s5>24</s5>
      </fC03>
      <fC03 i1="16" i2="X" l="FRE">
        <s0>Analyse spectrale</s0>
        <s5>25</s5>
      </fC03>
      <fC03 i1="16" i2="X" l="ENG">
        <s0>Spectral analysis</s0>
        <s5>25</s5>
      </fC03>
      <fC03 i1="16" i2="X" l="SPA">
        <s0>Análisis espectral</s0>
        <s5>25</s5>
      </fC03>
      <fC03 i1="17" i2="X" l="FRE">
        <s0>Filtre Kalman</s0>
        <s5>26</s5>
      </fC03>
      <fC03 i1="17" i2="X" l="ENG">
        <s0>Kalman filter</s0>
        <s5>26</s5>
      </fC03>
      <fC03 i1="17" i2="X" l="SPA">
        <s0>Filtro Kalman</s0>
        <s5>26</s5>
      </fC03>
      <fC03 i1="18" i2="X" l="FRE">
        <s0>Poursuite cible</s0>
        <s4>CD</s4>
        <s5>96</s5>
      </fC03>
      <fC03 i1="18" i2="X" l="ENG">
        <s0>Target tracking</s0>
        <s4>CD</s4>
        <s5>96</s5>
      </fC03>
      <fC03 i1="18" i2="X" l="SPA">
        <s0>Seguimiento de blanco</s0>
        <s4>CD</s4>
        <s5>96</s5>
      </fC03>
      <fC03 i1="19" i2="X" l="FRE">
        <s0>Détection objet</s0>
        <s4>CD</s4>
        <s5>97</s5>
      </fC03>
      <fC03 i1="19" i2="X" l="ENG">
        <s0>Object detection</s0>
        <s4>CD</s4>
        <s5>97</s5>
      </fC03>
      <fC03 i1="19" i2="X" l="SPA">
        <s0>Detección de Objetos</s0>
        <s4>CD</s4>
        <s5>97</s5>
      </fC03>
      <fC03 i1="20" i2="X" l="FRE">
        <s0>Reconnaissance objet</s0>
        <s4>CD</s4>
        <s5>98</s5>
      </fC03>
      <fC03 i1="20" i2="X" l="ENG">
        <s0>Object recognition</s0>
        <s4>CD</s4>
        <s5>98</s5>
      </fC03>
      <fC03 i1="20" i2="X" l="SPA">
        <s0>Reconocimiento de objetos</s0>
        <s4>CD</s4>
        <s5>98</s5>
      </fC03>
      <fC03 i1="21" i2="X" l="FRE">
        <s0>Caméra vidéo</s0>
        <s4>CD</s4>
        <s5>99</s5>
      </fC03>
      <fC03 i1="21" i2="X" l="ENG">
        <s0>Video cameras</s0>
        <s4>CD</s4>
        <s5>99</s5>
      </fC03>
      <fC03 i1="21" i2="X" l="SPA">
        <s0>Cámara de vídeo</s0>
        <s4>CD</s4>
        <s5>99</s5>
      </fC03>
      <fN21>
        <s1>143</s1>
      </fN21>
      <fN44 i1="01">
        <s1>OTO</s1>
      </fN44>
      <fN82>
        <s1>OTO</s1>
      </fN82>
    </pA>
    <pR>
      <fA30 i1="01" i2="1" l="ENG">
        <s1>Electronic Imaging Science and Technology Symposium</s1>
        <s3>San Francisco CA USA</s3>
        <s4>2010</s4>
      </fA30>
    </pR>
  </inist>
  <server>
    <NO>PASCAL 11-0219039 INIST</NO>
    <ET>Linear stereo vision based objects detection and tracking using spectral clustering</ET>
    <AU>MOQQADDEM (Safaa); RUICHEK (Y.); TOUAHNI (R.); SBIHI (A.); RÖNING (Juha); CASASENT (David Paul); HALL (Ernest L.)</AU>
    <AF>Systems and Transportation Laboratory, University of Technology of Belfort-Montbéliard/90010 Belfort/France (1 aut., 2 aut.); LASTID Laboratory, Ibn Tofail University of Kénitra/Maroc (1 aut., 3 aut., 4 aut.)</AF>
    <DT>Publication en série; Congrès; Niveau analytique</DT>
    <SO>Proceedings of SPIE, the International Society for Optical Engineering; ISSN 0277-786X; Coden PSISDG; Etats-Unis; Da. 2011;  Vol. 7878; 787806.1-787806.9; Bibl. 10 ref.</SO>
    <LA>Anglais</LA>
    <EA>Object detection and tracking is a key function for many applications such as video surveillance, robotics and intelligent transportation systems. This problem is widely treated in the literature in terms of both sensors (video cameras, laser range finders, radar) and methodologies. This paper proposes a new approach for detecting and tracking objects using stereo vision with linear cameras. After a matching process applied to edge points extracted from the images, the reconstructed points in the scene are clustered using spectral analysis. The obtained clusters are then tracked through their centers of gravity using a Kalman filter and a Nearest Neighbour (NN) based data association algorithm. The approach is tested and evaluated on real data to demonstrate its effectiveness for obstacle detection and tracking in front of a vehicle. This work is part of a project that aims to develop advanced driving aid systems, supported by the CPER, STIC and Volubilis programs.</EA>
    <CC>001D02C03; 001D02B07B; 001D15C</CC>
    <FD>Traitement image stéréoscopique; Vision ordinateur; Vision stéréoscopique; Asservissement visuel; Surveillance; Robotique; Capteur mesure; Détection contour; Reconstruction image; Plus proche voisin; Fouille donnée; Télémètre laser; Esquive collision; Analyse amas; Méthode spectrale; Analyse spectrale; Filtre Kalman; Poursuite cible; Détection objet; Reconnaissance objet; Caméra vidéo</FD>
    <ED>Stereo image processing; Computer vision; Stereopsis; Visual servoing; Surveillance; Robotics; Measurement sensor; Edge detection; Image reconstruction; Nearest neighbour; Data mining; Laser range finder; Collision avoidance; Cluster analysis; Spectral method; Spectral analysis; Kalman filter; Target tracking; Object detection; Object recognition; Video cameras</ED>
    <SD>Visión ordenador; Visión estereoscópica; Servomando visual; Vigilancia; Robótica; Captador medida; Detección contorno; Reconstrucción imagen; Vecino más cercano; Busca dato; Telémetro láser; Esquiva colisión; Analisis cluster; Método espectral; Análisis espectral; Filtro Kalman; Seguimiento de blanco; Detección de Objetos; Reconocimiento de objetos; Cámara de vídeo</SD>
    <LO>INIST-21760.354000174732410050</LO>
    <ID>11-0219039</ID>
  </server>
</record>
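
The abstract above describes a pipeline of stereo matching of edge points from the linear cameras, reconstruction of scene points, spectral clustering of those points, and tracking of each cluster's center of gravity with a Kalman filter and nearest-neighbour data association. The sketch below is a minimal illustration of the last two stages only, not the authors' implementation: it uses scikit-learn's SpectralClustering and a hand-written constant-velocity Kalman filter; the function names, noise parameters, gating threshold, and the assumption that tracking happens in a 2D ground plane are all illustrative choices, not details taken from the paper.

```python
# Illustrative sketch (assumed pipeline, not the paper's code): group reconstructed
# points by spectral clustering, then track cluster centroids with a
# constant-velocity Kalman filter and greedy nearest-neighbour data association.
import numpy as np
from sklearn.cluster import SpectralClustering  # off-the-shelf spectral clustering


def cluster_points(points, n_clusters):
    """Cluster reconstructed points (N x d array) and return the cluster centroids."""
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity="nearest_neighbors",
                                n_neighbors=10).fit_predict(points)
    return np.array([points[labels == k].mean(axis=0) for k in range(n_clusters)])


class CentroidTrack:
    """Constant-velocity Kalman filter over a planar centroid (x, z) -- assumed state model."""

    def __init__(self, xz, dt=0.1):
        self.x = np.array([xz[0], xz[1], 0.0, 0.0])      # state: position and velocity
        self.P = np.eye(4)                                # state covariance
        self.F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
                           [0, 0, 1, 0], [0, 0, 0, 1]])   # constant-velocity motion model
        self.H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]])   # only the position is measured
        self.Q = 0.01 * np.eye(4)                         # process noise (assumed value)
        self.R = 0.10 * np.eye(2)                         # measurement noise (assumed value)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P


def associate_nn(tracks, detections, gate=2.0):
    """Greedy nearest-neighbour association of predicted tracks to detected centroids."""
    pairs, used = [], set()
    for i, trk in enumerate(tracks):
        pred = trk.predict()
        dist = [np.linalg.norm(pred - det) if j not in used else np.inf
                for j, det in enumerate(detections)]
        j = int(np.argmin(dist)) if dist else -1
        if j >= 0 and dist[j] < gate:
            pairs.append((i, j))
            used.add(j)
    return pairs
```

In a full tracker, detections left unmatched by the gate would typically spawn new tracks and tracks that miss several consecutive updates would be dropped; those bookkeeping rules, like the gate value itself, are assumptions here and are not specified in the record above.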