Exploration server for computer science research in Lorraine

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information has therefore not been validated.

Neural network topology optimization

Internal identifier: 000551 (PascalFrancis/Curation); previous: 000550; next: 000552

Neural network topology optimization

Authors: Mohammed Attik [France]; Laurent Bougrain [France]; Frédéric Alexandre [France]

Source:

RBID : Pascal:06-0067795

French descriptors

English descriptors

Abstract

The determination of the optimal architecture of a supervised neural network is an important and a difficult task. The classical neural network topology optimization methods select weight(s) or unit(s) from the architecture in order to give a high performance of a learning algorithm. However, all existing topology optimization methods do not guarantee to obtain the optimal solution. In this work, we propose a hybrid approach which combines variable selection method and classical optimization method in order to improve optimization topology solution. The proposed approach suggests to identify the relevant subset of variables which gives a good classification performance in the first step and then to apply a classical topology optimization method to eliminate unnecessary hidden units or weights. A comparison of our approach to classical techniques for architecture optimization is given.
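The abstract describes a two-step hybrid method: first select the relevant subset of input variables, then apply a classical topology optimization method to remove unnecessary hidden units or weights. The sketch below is only an illustration of that general scheme, not the authors' implementation; the relevance scores, threshold, and magnitude-based pruning rule are all hypothetical stand-ins for whichever variable selection and pruning methods the paper actually uses.

```python
def select_variables(scores, threshold):
    """Step 1 (illustrative): keep the indices of input variables whose
    relevance score (e.g. a correlation with the class label) reaches
    the threshold."""
    return [i for i, s in enumerate(scores) if abs(s) >= threshold]


def prune_weights(weights, fraction):
    """Step 2 (illustrative): classical magnitude-based pruning - zero
    out the given fraction of smallest-magnitude weights, standing in
    for a classical topology optimization method."""
    ranked = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    n_prune = int(len(weights) * fraction)
    pruned = list(weights)
    for i in ranked[:n_prune]:
        pruned[i] = 0.0
    return pruned


# Hypothetical relevance scores for four input variables:
kept = select_variables([0.9, 0.05, 0.6, 0.01], threshold=0.1)  # -> [0, 2]
# Then prune the smallest 40% of the remaining network's weights:
pruned = prune_weights([0.8, -0.02, 0.5, 0.01, -0.7], fraction=0.4)
```

The point of the two-step ordering is that discarding irrelevant input variables first shrinks the search space before the (more expensive) unit or weight pruning is applied.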
pA  
A01 01  1    @0 0302-9743
A05       @2 3697
A08 01  1  ENG  @1 Neural network topology optimization
A09 01  1  ENG  @1 Artificial neural networks. Part II : formal models and their applications : ICANN 2005 : 15th International Conference, Warsaw, Poland, September 11-15, 2005 : proceedings
A11 01  1    @1 ATTIK (Mohammed)
A11 02  1    @1 BOUGRAIN (Laurent)
A11 03  1    @1 ALEXANDRE (Frédéric)
A14 01      @1 LORIA/INRIA-Lorraine, Campus Scientifique - BP 239 @2 54506, Vandœuvre-lès-Nancy @3 FRA @Z 1 aut. @Z 2 aut. @Z 3 aut.
A14 02      @1 BRGM, 3 av Claude Guillemin - BP 6009 @2 45060 Orléans @3 FRA @Z 1 aut.
A20       @1 53-58
A21       @1 2005
A23 01      @0 ENG
A26 01      @0 3-540-28755-8
A43 01      @1 INIST @2 16343 @5 354000138682610090
A44       @0 0000 @1 © 2006 INIST-CNRS. All rights reserved.
A45       @0 12 ref.
A47 01  1    @0 06-0067795
A60       @1 P @2 C
A61       @0 A
A64 01  1    @0 Lecture notes in computer science
A66 01      @0 DEU
C01 01    ENG  @0 The determination of the optimal architecture of a supervised neural network is an important and a difficult task. The classical neural network topology optimization methods select weight(s) or unit(s) from the architecture in order to give a high performance of a learning algorithm. However, all existing topology optimization methods do not guarantee to obtain the optimal solution. In this work, we propose a hybrid approach which combines variable selection method and classical optimization method in order to improve optimization topology solution. The proposed approach suggests to identify the relevant subset of variables which gives a good classification performance in the first step and then to apply a classical topology optimization method to eliminate unnecessary hidden units or weights. A comparison of our approach to classical techniques for architecture optimization is given.
C02 01  X    @0 001D02C
C03 01  X  FRE  @0 Méthode formelle @5 01
C03 01  X  ENG  @0 Formal method @5 01
C03 01  X  SPA  @0 Método formal @5 01
C03 02  X  FRE  @0 Haute performance @5 06
C03 02  X  ENG  @0 High performance @5 06
C03 02  X  SPA  @0 Alto rendimiento @5 06
C03 03  X  FRE  @0 Intelligence artificielle @5 07
C03 03  X  ENG  @0 Artificial intelligence @5 07
C03 03  X  SPA  @0 Inteligencia artificial @5 07
C03 04  X  FRE  @0 Solution optimale @5 08
C03 04  X  ENG  @0 Optimal solution @5 08
C03 04  X  SPA  @0 Solución óptima @5 08
C03 05  X  FRE  @0 Classification @5 09
C03 05  X  ENG  @0 Classification @5 09
C03 05  X  SPA  @0 Clasificación @5 09
C03 06  3  FRE  @0 Topologie circuit @5 18
C03 06  3  ENG  @0 Network topology @5 18
C03 07  X  FRE  @0 Réseau neuronal @5 23
C03 07  X  ENG  @0 Neural network @5 23
C03 07  X  SPA  @0 Red neuronal @5 23
C03 08  X  FRE  @0 Optimisation @5 24
C03 08  X  ENG  @0 Optimization @5 24
C03 08  X  SPA  @0 Optimización @5 24
C03 09  X  FRE  @0 Méthode optimisation @5 25
C03 09  X  ENG  @0 Optimization method @5 25
C03 09  X  SPA  @0 Método optimización @5 25
C03 10  X  FRE  @0 Algorithme apprentissage @5 26
C03 10  X  ENG  @0 Learning algorithm @5 26
C03 10  X  SPA  @0 Algoritmo aprendizaje @5 26
C03 11  X  FRE  @0 . @4 INC @5 82
N21       @1 037
N44 01      @1 OTO
N82       @1 OTO
pR  
A30 01  1  ENG  @1 International Conference on Artificial Neural Networks @2 15 @3 Warsaw POL @4 2005-09-11

Links to previous steps (curation, corpus...)


Links to Exploration step

Pascal:06-0067795

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en" level="a">Neural network topology optimization</title>
<author>
<name sortKey="Attik, Mohammed" sort="Attik, Mohammed" uniqKey="Attik M" first="Mohammed" last="Attik">Mohammed Attik</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>LORIA/INRIA-Lorraine, Campus Scientifique - BP 239</s1>
<s2>54506, Vandœuvre-lès-Nancy</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
<country>France</country>
</affiliation>
<affiliation wicri:level="1">
<inist:fA14 i1="02">
<s1>BRGM, 3 av Claude Guillemin - BP 6009</s1>
<s2>45060 Orléans</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
</inist:fA14>
<country>France</country>
</affiliation>
</author>
<author>
<name sortKey="Bougrain, Laurent" sort="Bougrain, Laurent" uniqKey="Bougrain L" first="Laurent" last="Bougrain">Laurent Bougrain</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>LORIA/INRIA-Lorraine, Campus Scientifique - BP 239</s1>
<s2>54506, Vandœuvre-lès-Nancy</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
<country>France</country>
</affiliation>
</author>
<author>
<name sortKey="Alexandre, Frederic" sort="Alexandre, Frederic" uniqKey="Alexandre F" first="Frédéric" last="Alexandre">Frédéric Alexandre</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>LORIA/INRIA-Lorraine, Campus Scientifique - BP 239</s1>
<s2>54506, Vandœuvre-lès-Nancy</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
<country>France</country>
</affiliation>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">INIST</idno>
<idno type="inist">06-0067795</idno>
<date when="2005">2005</date>
<idno type="stanalyst">PASCAL 06-0067795 INIST</idno>
<idno type="RBID">Pascal:06-0067795</idno>
<idno type="wicri:Area/PascalFrancis/Corpus">000482</idno>
<idno type="wicri:Area/PascalFrancis/Curation">000551</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en" level="a">Neural network topology optimization</title>
<author>
<name sortKey="Attik, Mohammed" sort="Attik, Mohammed" uniqKey="Attik M" first="Mohammed" last="Attik">Mohammed Attik</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>LORIA/INRIA-Lorraine, Campus Scientifique - BP 239</s1>
<s2>54506, Vandœuvre-lès-Nancy</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
<country>France</country>
</affiliation>
<affiliation wicri:level="1">
<inist:fA14 i1="02">
<s1>BRGM, 3 av Claude Guillemin - BP 6009</s1>
<s2>45060 Orléans</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
</inist:fA14>
<country>France</country>
</affiliation>
</author>
<author>
<name sortKey="Bougrain, Laurent" sort="Bougrain, Laurent" uniqKey="Bougrain L" first="Laurent" last="Bougrain">Laurent Bougrain</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>LORIA/INRIA-Lorraine, Campus Scientifique - BP 239</s1>
<s2>54506, Vandœuvre-lès-Nancy</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
<country>France</country>
</affiliation>
</author>
<author>
<name sortKey="Alexandre, Frederic" sort="Alexandre, Frederic" uniqKey="Alexandre F" first="Frédéric" last="Alexandre">Frédéric Alexandre</name>
<affiliation wicri:level="1">
<inist:fA14 i1="01">
<s1>LORIA/INRIA-Lorraine, Campus Scientifique - BP 239</s1>
<s2>54506, Vandœuvre-lès-Nancy</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</inist:fA14>
<country>France</country>
</affiliation>
</author>
</analytic>
<series>
<title level="j" type="main">Lecture notes in computer science</title>
<idno type="ISSN">0302-9743</idno>
<imprint>
<date when="2005">2005</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
<seriesStmt>
<title level="j" type="main">Lecture notes in computer science</title>
<idno type="ISSN">0302-9743</idno>
</seriesStmt>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Artificial intelligence</term>
<term>Classification</term>
<term>Formal method</term>
<term>High performance</term>
<term>Learning algorithm</term>
<term>Network topology</term>
<term>Neural network</term>
<term>Optimal solution</term>
<term>Optimization</term>
<term>Optimization method</term>
</keywords>
<keywords scheme="Pascal" xml:lang="fr">
<term>Méthode formelle</term>
<term>Haute performance</term>
<term>Intelligence artificielle</term>
<term>Solution optimale</term>
<term>Classification</term>
<term>Topologie circuit</term>
<term>Réseau neuronal</term>
<term>Optimisation</term>
<term>Méthode optimisation</term>
<term>Algorithme apprentissage</term>
<term>.</term>
</keywords>
<keywords scheme="Wicri" type="topic" xml:lang="fr">
<term>Intelligence artificielle</term>
<term>Classification</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">The determination of the optimal architecture of a supervised neural network is an important and a difficult task. The classical neural network topology optimization methods select weight(s) or unit(s) from the architecture in order to give a high performance of a learning algorithm. However, all existing topology optimization methods do not guarantee to obtain the optimal solution. In this work, we propose a hybrid approach which combines variable selection method and classical optimization method in order to improve optimization topology solution. The proposed approach suggests to identify the relevant subset of variables which gives a good classification performance in the first step and then to apply a classical topology optimization method to eliminate unnecessary hidden units or weights. A comparison of our approach to classical techniques for architecture optimization is given.</div>
</front>
</TEI>
<inist>
<standard h6="B">
<pA>
<fA01 i1="01" i2="1">
<s0>0302-9743</s0>
</fA01>
<fA05>
<s2>3697</s2>
</fA05>
<fA08 i1="01" i2="1" l="ENG">
<s1>Neural network topology optimization</s1>
</fA08>
<fA09 i1="01" i2="1" l="ENG">
<s1>Artificial neural networks. Part II : formal models and their applications : ICANN 2005 : 15th International Conference, Warsaw, Poland, September 11-15, 2005 : proceedings</s1>
</fA09>
<fA11 i1="01" i2="1">
<s1>ATTIK (Mohammed)</s1>
</fA11>
<fA11 i1="02" i2="1">
<s1>BOUGRAIN (Laurent)</s1>
</fA11>
<fA11 i1="03" i2="1">
<s1>ALEXANDRE (Frédéric)</s1>
</fA11>
<fA14 i1="01">
<s1>LORIA/INRIA-Lorraine, Campus Scientifique - BP 239</s1>
<s2>54506, Vandœuvre-lès-Nancy</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
<sZ>2 aut.</sZ>
<sZ>3 aut.</sZ>
</fA14>
<fA14 i1="02">
<s1>BRGM, 3 av Claude Guillemin - BP 6009</s1>
<s2>45060 Orléans</s2>
<s3>FRA</s3>
<sZ>1 aut.</sZ>
</fA14>
<fA20>
<s1>53-58</s1>
</fA20>
<fA21>
<s1>2005</s1>
</fA21>
<fA23 i1="01">
<s0>ENG</s0>
</fA23>
<fA26 i1="01">
<s0>3-540-28755-8</s0>
</fA26>
<fA43 i1="01">
<s1>INIST</s1>
<s2>16343</s2>
<s5>354000138682610090</s5>
</fA43>
<fA44>
<s0>0000</s0>
<s1>© 2006 INIST-CNRS. All rights reserved.</s1>
</fA44>
<fA45>
<s0>12 ref.</s0>
</fA45>
<fA47 i1="01" i2="1">
<s0>06-0067795</s0>
</fA47>
<fA60>
<s1>P</s1>
<s2>C</s2>
</fA60>
<fA61>
<s0>A</s0>
</fA61>
<fA64 i1="01" i2="1">
<s0>Lecture notes in computer science</s0>
</fA64>
<fA66 i1="01">
<s0>DEU</s0>
</fA66>
<fC01 i1="01" l="ENG">
<s0>The determination of the optimal architecture of a supervised neural network is an important and a difficult task. The classical neural network topology optimization methods select weight(s) or unit(s) from the architecture in order to give a high performance of a learning algorithm. However, all existing topology optimization methods do not guarantee to obtain the optimal solution. In this work, we propose a hybrid approach which combines variable selection method and classical optimization method in order to improve optimization topology solution. The proposed approach suggests to identify the relevant subset of variables which gives a good classification performance in the first step and then to apply a classical topology optimization method to eliminate unnecessary hidden units or weights. A comparison of our approach to classical techniques for architecture optimization is given.</s0>
</fC01>
<fC02 i1="01" i2="X">
<s0>001D02C</s0>
</fC02>
<fC03 i1="01" i2="X" l="FRE">
<s0>Méthode formelle</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="ENG">
<s0>Formal method</s0>
<s5>01</s5>
</fC03>
<fC03 i1="01" i2="X" l="SPA">
<s0>Método formal</s0>
<s5>01</s5>
</fC03>
<fC03 i1="02" i2="X" l="FRE">
<s0>Haute performance</s0>
<s5>06</s5>
</fC03>
<fC03 i1="02" i2="X" l="ENG">
<s0>High performance</s0>
<s5>06</s5>
</fC03>
<fC03 i1="02" i2="X" l="SPA">
<s0>Alto rendimiento</s0>
<s5>06</s5>
</fC03>
<fC03 i1="03" i2="X" l="FRE">
<s0>Intelligence artificielle</s0>
<s5>07</s5>
</fC03>
<fC03 i1="03" i2="X" l="ENG">
<s0>Artificial intelligence</s0>
<s5>07</s5>
</fC03>
<fC03 i1="03" i2="X" l="SPA">
<s0>Inteligencia artificial</s0>
<s5>07</s5>
</fC03>
<fC03 i1="04" i2="X" l="FRE">
<s0>Solution optimale</s0>
<s5>08</s5>
</fC03>
<fC03 i1="04" i2="X" l="ENG">
<s0>Optimal solution</s0>
<s5>08</s5>
</fC03>
<fC03 i1="04" i2="X" l="SPA">
<s0>Solución óptima</s0>
<s5>08</s5>
</fC03>
<fC03 i1="05" i2="X" l="FRE">
<s0>Classification</s0>
<s5>09</s5>
</fC03>
<fC03 i1="05" i2="X" l="ENG">
<s0>Classification</s0>
<s5>09</s5>
</fC03>
<fC03 i1="05" i2="X" l="SPA">
<s0>Clasificación</s0>
<s5>09</s5>
</fC03>
<fC03 i1="06" i2="3" l="FRE">
<s0>Topologie circuit</s0>
<s5>18</s5>
</fC03>
<fC03 i1="06" i2="3" l="ENG">
<s0>Network topology</s0>
<s5>18</s5>
</fC03>
<fC03 i1="07" i2="X" l="FRE">
<s0>Réseau neuronal</s0>
<s5>23</s5>
</fC03>
<fC03 i1="07" i2="X" l="ENG">
<s0>Neural network</s0>
<s5>23</s5>
</fC03>
<fC03 i1="07" i2="X" l="SPA">
<s0>Red neuronal</s0>
<s5>23</s5>
</fC03>
<fC03 i1="08" i2="X" l="FRE">
<s0>Optimisation</s0>
<s5>24</s5>
</fC03>
<fC03 i1="08" i2="X" l="ENG">
<s0>Optimization</s0>
<s5>24</s5>
</fC03>
<fC03 i1="08" i2="X" l="SPA">
<s0>Optimización</s0>
<s5>24</s5>
</fC03>
<fC03 i1="09" i2="X" l="FRE">
<s0>Méthode optimisation</s0>
<s5>25</s5>
</fC03>
<fC03 i1="09" i2="X" l="ENG">
<s0>Optimization method</s0>
<s5>25</s5>
</fC03>
<fC03 i1="09" i2="X" l="SPA">
<s0>Método optimización</s0>
<s5>25</s5>
</fC03>
<fC03 i1="10" i2="X" l="FRE">
<s0>Algorithme apprentissage</s0>
<s5>26</s5>
</fC03>
<fC03 i1="10" i2="X" l="ENG">
<s0>Learning algorithm</s0>
<s5>26</s5>
</fC03>
<fC03 i1="10" i2="X" l="SPA">
<s0>Algoritmo aprendizaje</s0>
<s5>26</s5>
</fC03>
<fC03 i1="11" i2="X" l="FRE">
<s0>.</s0>
<s4>INC</s4>
<s5>82</s5>
</fC03>
<fN21>
<s1>037</s1>
</fN21>
<fN44 i1="01">
<s1>OTO</s1>
</fN44>
<fN82>
<s1>OTO</s1>
</fN82>
</pA>
<pR>
<fA30 i1="01" i2="1" l="ENG">
<s1>International Conference on Artificial Neural Networks</s1>
<s2>15</s2>
<s3>Warsaw POL</s3>
<s4>2005-09-11</s4>
</fA30>
</pR>
</standard>
</inist>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Wicri/Lorraine/explor/InforLorV4/Data/PascalFrancis/Curation
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000551 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/PascalFrancis/Curation/biblio.hfd -nk 000551 | SxmlIndent | more

To add a link to this page within the Wicri network

{{Explor lien
   |wiki=    Wicri/Lorraine
   |area=    InforLorV4
   |flux=    PascalFrancis
   |étape=   Curation
   |type=    RBID
   |clé=     Pascal:06-0067795
   |texte=   Neural network topology optimization
}}

Wicri

This area was generated with Dilib version V0.6.33.
Data generation: Mon Jun 10 21:56:28 2019. Site generation: Fri Feb 25 15:29:27 2022