Exploration server on computer science research in Lorraine

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

How to speed up the training mechanism in a connectionist model

Internal identifier: 004159 (Crin/Curation); previous: 004158; next: 004160

How to speed up the training mechanism in a connectionist model

Authors: Szilard Vajda; Abdel Belaïd

Source:

RBID : CRIN:vajda05b

English descriptors

Abstract

In this paper a fast data-driven learning-corpus building algorithm (FDDLCB) is proposed. This generic technique dynamically builds a representative and compact learning corpus for a connectionist model. The constructed dataset contains only a reduced number of patterns, yet these are sufficiently descriptive to characterize the different classes to be separated. The method is based on a double least mean squares (LMS) error minimization mechanism that seeks the optimal boundaries of the different pattern classes. In the classical learning process, LMS serves to minimize the error during training; this process is complemented by a second one, as the selection of new samples is likewise driven by minimizing the recognition error. Reinforcing the class boundaries where recognition fails allows rapid and good generalization without any loss of accuracy. A modified version of the algorithm is also presented. The experiments were performed on the MNIST (Modified NIST) separated digit dataset. The encouraging result (98.51%) obtained using just 1.85% of the patterns available in the original training dataset is comparable even with state-of-the-art techniques.
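The paper itself is not reproduced here, but the loop the abstract describes (train a classifier by LMS, then grow the corpus with the pool samples the current model misclassifies) can be sketched roughly as follows. This is a hypothetical illustration, not the authors' FDDLCB implementation: the function name, the one-seed-per-class initialization, the per-round sample budget, and the use of a plain linear classifier are all assumptions made for the sketch.

```python
import numpy as np

def fddlcb_sketch(X_pool, y_pool, n_classes, rounds=5, lr=0.05, epochs=100):
    """Hypothetical sketch of the corpus-building loop described in the
    abstract: start from a tiny seed set, fit a linear classifier by LMS
    gradient descent, then repeatedly add pool samples that the current
    model misclassifies (boundary-reinforcing selection)."""
    rng = np.random.default_rng(0)
    T_pool = np.eye(n_classes)[y_pool]            # one-hot targets
    # seed corpus: the first pool sample of each class
    selected = [int(np.flatnonzero(y_pool == c)[0]) for c in range(n_classes)]
    W = rng.normal(scale=0.01, size=(X_pool.shape[1], n_classes))
    for _ in range(rounds):
        Xs, Ts = X_pool[selected], T_pool[selected]
        for _ in range(epochs):                   # first LMS minimization: training
            err = Xs @ W - Ts
            W -= lr * Xs.T @ err / len(Xs)
        preds = np.argmax(X_pool @ W, axis=1)
        sel_set = set(selected)
        wrong = [i for i in np.flatnonzero(preds != y_pool) if i not in sel_set]
        if not wrong:
            break
        # second, selection-side criterion: add the misclassified samples on
        # which the model's squared LMS error is largest
        errs = np.sum((X_pool[wrong] @ W - T_pool[wrong]) ** 2, axis=1)
        hardest = np.argsort(errs)[::-1][:n_classes]
        selected.extend(int(np.asarray(wrong)[i]) for i in hardest)
    return W, selected
```

On easy synthetic data this keeps the learning corpus a small fraction of the pool while still separating the classes, which mirrors the abstract's point that a compact, boundary-focused corpus can suffice.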

Links to previous steps (curation, corpus...)


Links to Exploration step

CRIN:vajda05b

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" wicri:score="466">How to speed up the training mechanism in a connectionist model</title>
</titleStmt>
<publicationStmt>
<idno type="RBID">CRIN:vajda05b</idno>
<date when="2005" year="2005">2005</date>
<idno type="wicri:Area/Crin/Corpus">004159</idno>
<idno type="wicri:Area/Crin/Curation">004159</idno>
<idno type="wicri:explorRef" wicri:stream="Crin" wicri:step="Curation">004159</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">How to speed up the training mechanism in a connectionist model</title>
<author>
<name sortKey="Vajda, Szilard" sort="Vajda, Szilard" uniqKey="Vajda S" first="Szilard" last="Vajda">Szilard Vajda</name>
</author>
<author>
<name sortKey="Belaid, Abdel" sort="Belaid, Abdel" uniqKey="Belaid A" first="Abdel" last="Belaïd">Abdel Belaïd</name>
</author>
</analytic>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>multi-layer perceptron</term>
<term>neural networks</term>
<term>pattern selection</term>
<term>separated digit recognition</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en" wicri:score="2984">In this paper a fast data-driven learning-corpus building algorithm (FDDLCB) is proposed. This generic technique dynamically builds a representative and compact learning corpus for a connectionist model. The constructed dataset contains only a reduced number of patterns, yet these are sufficiently descriptive to characterize the different classes to be separated. The method is based on a double least mean squares (LMS) error minimization mechanism that seeks the optimal boundaries of the different pattern classes. In the classical learning process, LMS serves to minimize the error during training; this process is complemented by a second one, as the selection of new samples is likewise driven by minimizing the recognition error. Reinforcing the class boundaries where recognition fails allows rapid and good generalization without any loss of accuracy. A modified version of the algorithm is also presented. The experiments were performed on the MNIST (Modified NIST) separated digit dataset. The encouraging result (98.51%) obtained using just 1.85% of the patterns available in the original training dataset is comparable even with state-of-the-art techniques.</div>
</front>
</TEI>
<BibTex type="inproceedings">
<ref>vajda05b</ref>
<crinnumber>A05-R-145</crinnumber>
<category>3</category>
<equipe>READ</equipe>
<author>
<e>Vajda, Szilard</e>
<e>Belaïd, Abdel</e>
</author>
<title>How to speed up the training mechanism in a connectionist model</title>
<booktitle>{IAPR - TC3 International Workshop on Neural Networks and Learning in Document Recognition - NNLDAR 2005, Seoul, Korea}</booktitle>
<year>2005</year>
<editor>Simone Marinai, Hiromichi Fujisawa</editor>
<pages>13--17</pages>
<month>Aug</month>
<note>A satellite workshop of ICDAR 2005</note>
<keywords>
<e>pattern selection</e>
<e>neural networks</e>
<e>multi-layer perceptron</e>
<e>separated digit recognition</e>
</keywords>
<abstract>In this paper a fast data-driven learning-corpus building algorithm (FDDLCB) is proposed. This generic technique dynamically builds a representative and compact learning corpus for a connectionist model. The constructed dataset contains only a reduced number of patterns, yet these are sufficiently descriptive to characterize the different classes to be separated. The method is based on a double least mean squares (LMS) error minimization mechanism that seeks the optimal boundaries of the different pattern classes. In the classical learning process, LMS serves to minimize the error during training; this process is complemented by a second one, as the selection of new samples is likewise driven by minimizing the recognition error. Reinforcing the class boundaries where recognition fails allows rapid and good generalization without any loss of accuracy. A modified version of the algorithm is also presented. The experiments were performed on the MNIST (Modified NIST) separated digit dataset. The encouraging result (98.51%) obtained using just 1.85% of the patterns available in the original training dataset is comparable even with state-of-the-art techniques.</abstract>
</BibTex>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Wicri/Lorraine/explor/InforLorV4/Data/Crin/Curation
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 004159 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Crin/Curation/biblio.hfd -nk 004159 | SxmlIndent | more

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Wicri/Lorraine
   |area=    InforLorV4
   |flux=    Crin
   |étape=   Curation
   |type=    RBID
   |clé=     CRIN:vajda05b
   |texte=   How to speed up the training mechanism in a connectionist model
}}

Wicri

This area was generated with Dilib version V0.6.33.
Data generation: Mon Jun 10 21:56:28 2019. Site generation: Fri Feb 25 15:29:27 2022