OCR exploration server

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

An efficient constrained training algorithm for feedforward networks

Internal identifier: 000A35 (PascalFrancis/Checkpoint); previous: 000A34; next: 000A36


Authors: D. A. Karras [Greece]; S. J. Perantonis

Source:

RBID : Pascal:96-0015272

French descriptors

English descriptors

Abstract

A novel algorithm is presented which supplements the training phase in feedforward networks with various forms of information about desired learning properties. This information is represented by conditions which must be satisfied in addition to the demand for minimization of the usual mean square error cost function. The purpose of these conditions is to improve convergence, learning speed, and generalization properties through prompt activation of the hidden units, optimal alignment of successive weight vector offsets, elimination of excessive hidden nodes, and regulation of the magnitude of search steps in the weight space. The algorithm is applied to several small- and large-scale binary benchmark training tasks, to test its convergence ability and learning speed, as well as to a large-scale OCR problem, to test its generalization capability. Its performance in terms of percentage of local minima, learning speed, and generalization ability is evaluated and found superior to the performance of the backpropagation algorithm and variants thereof taking especially into account the statistical significance of the results.
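The abstract describes a cost function in which extra conditions are imposed alongside the usual mean square error minimization. As a minimal illustration of that general idea (not the paper's actual constraint-handling scheme), one such condition, a cap on weight magnitude standing in for "regulation of the magnitude of search steps", can be folded into the cost as a penalty term. All numbers below (data, penalty weight, cap, learning rate) are invented for the example:

```python
# Illustrative sketch only: the paper imposes its conditions during
# training in a more principled way. Here one extra condition (a cap on
# the weight magnitude) is added to the MSE cost as a quadratic penalty.

def loss(w, data, lam=0.1, cap=2.0):
    """MSE cost plus a quadratic penalty when |w| exceeds the cap."""
    mse = sum((w * x - y) ** 2 for x, y in data) / len(data)
    excess = max(0.0, abs(w) - cap)   # stand-in for an extra condition
    return mse + lam * excess ** 2

def num_grad(f, w, eps=1e-6):
    """Central-difference numerical gradient of a scalar function."""
    return (f(w + eps) - f(w - eps)) / (2 * eps)

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]   # samples of y = 2x
w = 0.0
for _ in range(200):                           # plain gradient descent
    w -= 0.05 * num_grad(lambda v: loss(v, data), w)
# w converges near 2.0 while the penalty keeps it within the cap
```

In the single-weight case the penalty never activates here (the optimum lies inside the cap), but the same construction generalizes: each desired learning property contributes a term to the augmented cost, and gradient descent on the sum trades off error against constraint violation.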


Affiliations:


Links toward previous steps (curation, corpus...)


Links to Exploration step

Pascal:96-0015272

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">An efficient constrained training algorithm for feedforward networks</title>
<author>
<name sortKey="Karras, D A" sort="Karras, D A" uniqKey="Karras D" first="D. A." last="Karras">D. A. Karras</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>National res. cent. "Demokritos", inst. informatics telecommunications</s1>
<s2>Athens</s2>
<s3>GRC</s3>
</inist:fA14>
<country>Grèce</country>
<wicri:noRegion>Athens</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Perantonis, S J" sort="Perantonis, S J" uniqKey="Perantonis S" first="S. J." last="Perantonis">S. J. Perantonis</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">96-0015272</idno>
<date when="1995">1995</date>
<idno type="stanalyst">PASCAL 96-0015272 INIST</idno>
<idno type="RBID">Pascal:96-0015272</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000A35</idno>
<idno type="wicri:Area/PascalFrancis/Curation">000964</idno>
<idno type="wicri:Area/PascalFrancis/Checkpoint">000A35</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">An efficient constrained training algorithm for feedforward networks</title>
<author>
<name sortKey="Karras, D A" sort="Karras, D A" uniqKey="Karras D" first="D. A." last="Karras">D. A. Karras</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>National res. cent. "Demokritos", inst. informatics telecommunications</s1>
<s2>Athens</s2>
<s3>GRC</s3>
</inist:fA14>
<country>Grèce</country>
<wicri:noRegion>Athens</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Perantonis, S J" sort="Perantonis, S J" uniqKey="Perantonis S" first="S. J." last="Perantonis">S. J. Perantonis</name>
</author>
</analytic>
<series>
<title level="j" type="main">IEEE transactions on neural networks</title>
<title level="j" type="abbreviated">IEEE trans. neural netw.</title>
<idno type="ISSN">1045-9227</idno>
<imprint>
<date when="1995">1995</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt>
<title level="j" type="main">IEEE transactions on neural networks</title>
<title level="j" type="abbreviated">IEEE trans. neural netw.</title>
<idno type="ISSN">1045-9227</idno>
</seriesStmt>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Algorithm</term>
<term>Learning</term>
<term>Neural network</term>
<term>Optimization</term>
<term>Performance</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Réseau neuronal</term>
<term>Apprentissage</term>
<term>Optimisation</term>
<term>Algorithme</term>
<term>Performance</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">A novel algorithm is presented which supplements the training phase in feedforward networks with various forms of information about desired learning properties. This information is represented by conditions which must be satisfied in addition to the demand for minimization of the usual mean square error cost function. The purpose of these conditions is to improve convergence, learning speed, and generalization properties through prompt activation of the hidden units, optimal alignment of successive weight vector offsets, elimination of excessive hidden nodes, and regulation of the magnitude of search steps in the weight space. The algorithm is applied to several small- and large-scale binary benchmark training tasks, to test its convergence ability and learning speed, as well as to a large-scale OCR problem, to test its generalization capability. Its performance in terms of percentage of local minima, learning speed, and generalization ability is evaluated and found superior to the performance of the backpropagation algorithm and variants thereof taking especially into account the statistical significance of the results.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA01 i1="01" i2="1">
<s0>1045-9227</s0>
</fA01>
<fA03 i2="1">
<s0>IEEE trans. neural netw.</s0>
</fA03>
<fA05>
<s2>6</s2>
</fA05>
<fA06>
<s2>6</s2>
</fA06>
<fA08 i1="01" i2="1" l="ENG">
<s1>An efficient constrained training algorithm for feedforward networks</s1>
</fA08>
<fA11 i1="01" i2="1">
<s1>KARRAS (D. A.)</s1>
</fA11>
<fA11 i1="02" i2="1">
<s1>PERANTONIS (S. J.)</s1>
</fA11>
<fA14 i1="01">
<s1>National res. cent. "Demokritos", inst. informatics telecommunications</s1>
<s2>Athens</s2>
<s3>GRC</s3>
</fA14>
<fA20>
<s1>1420-1434</s1>
</fA20>
<fA21>
<s1>1995</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA43 i1="01">
<s1>INIST</s1>
<s2>22204</s2>
<s5>354000058828900110</s5>
</fA43>
<fA44>
<s0>0000</s0>
</fA44>
<fA45>
<s0>65 ref.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>96-0015272</s0>
</fA47>
<fA60>
<s1>P</s1>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA64 i1="01" i2="1">
<s0>IEEE transactions on neural networks</s0>
</fA64>
<fA66 i1="01">
<s0>USA</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>A novel algorithm is presented which supplements the training phase in feedforward networks with various forms of information about desired learning properties. This information is represented by conditions which must be satisfied in addition to the demand for minimization of the usual mean square error cost function. The purpose of these conditions is to improve convergence, learning speed, and generalization properties through prompt activation of the hidden units, optimal alignment of successive weight vector offsets, elimination of excessive hidden nodes, and regulation of the magnitude of search steps in the weight space. The algorithm is applied to several small- and large-scale binary benchmark training tasks, to test its convergence ability and learning speed, as well as to a large-scale OCR problem, to test its generalization capability. Its performance in terms of percentage of local minima, learning speed, and generalization ability is evaluated and found superior to the performance of the backpropagation algorithm and variants thereof taking especially into account the statistical significance of the results.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>001D02C06</s0>
</fC02>
<fC03 i1="01" i2="X" l="FRE">
<s0>Réseau neuronal</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG">
<s0>Neural network</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA">
<s0>Red neuronal</s0>
<s5>01</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE">
<s0>Apprentissage</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG">
<s0>Learning</s0>
<s5>02</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA">
<s0>Aprendizaje</s0>
<s5>02</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE">
<s0>Optimisation</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG">
<s0>Optimization</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="GER">
<s0>Optimierung</s0>
<s5>03</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA">
<s0>Optimización</s0>
<s5>03</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE">
<s0>Algorithme</s0>
<s5>04</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG">
<s0>Algorithm</s0>
<s5>04</s5>
</fC03>
<fC03 i1="04" i2="X" l="GER">
<s0>Algorithmus</s0>
<s5>04</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA">
<s0>Algoritmo</s0>
<s5>04</s5>
</fC03>
<fC03 i1="05" i2="X" l="FRE">
<s0>Performance</s0>
<s5>05</s5>
</fC03>
<fC03 i1="05" i2="X" l="ENG">
<s0>Performance</s0>
<s5>05</s5>
</fC03>
<fC03 i1="05" i2="X" l="SPA">
<s0>Rendimiento</s0>
<s5>05</s5>
</fC03>
<fN21>
<s1>359</s1>
</fN21>
</pA>
</standard>
</inist>
<affiliations>
<list>
<country>
<li>Grèce</li>
</country>
</list>
<tree>
<noCountry>
<name sortKey="Perantonis, S J" sort="Perantonis, S J" uniqKey="Perantonis S" first="S. J." last="Perantonis">S. J. Perantonis</name>
</noCountry>
<country name="Grèce">
<noRegion>
<name sortKey="Karras, D A" sort="Karras, D A" uniqKey="Karras D" first="D. A." last="Karras">D. A. Karras</name>
</noRegion>
</country>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/OcrV1/Data/PascalFrancis/Checkpoint
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000A35 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Checkpoint/biblio.hfd -nk 000A35 | SxmlIndent | more

To put a link to this page in the Wicri network

{{Explor lien
   |wiki=    Ticri/CIDE
   |area=    OcrV1
   |flux=    PascalFrancis
   |étape=   Checkpoint
   |type=    RBID
   |clé=     Pascal:96-0015272
   |texte=   An efficient constrained training algorithm for feedforward networks
}}

Wicri

This area was generated with Dilib version V0.6.32.
Data generation: Sat Nov 11 16:53:45 2017. Site generation: Mon Mar 11 23:15:16 2024