Exploration server on opera

Warning: this site is under development!
Warning: this site is generated automatically from raw corpora.
The information is therefore not validated.

Some comparisons of complexity in dictionary-based and linear computational models.

Internal identifier: 000824 (Ncbi/Merge); previous: 000823; next: 000825

Authors: Giorgio Gnecco [Italy]; Věra Kůrková; Marcello Sanguineti

Source:

RBID : pubmed:21094023

English descriptors

Abstract

Neural networks provide a more flexible approximation of functions than traditional linear regression. In the latter, one can only adjust the coefficients in linear combinations of fixed sets of functions, such as orthogonal polynomials or Hermite functions, while for neural networks, one may also adjust the parameters of the functions which are being combined. However, some useful properties of linear approximators (such as uniqueness, homogeneity, and continuity of best approximation operators) are not satisfied by neural networks. Moreover, optimization of parameters in neural networks becomes more difficult than in linear regression. Experimental results suggest that these drawbacks of neural networks are offset by substantially lower model complexity, allowing accuracy of approximation even in high-dimensional cases. We give some theoretical results comparing requirements on model complexity for two types of approximators, the traditional linear ones and so called variable-basis types, which include neural networks, radial, and kernel models. We compare upper bounds on worst-case errors in variable-basis approximation with lower bounds on such errors for any linear approximator. Using methods from nonlinear approximation and integral representations tailored to computational units, we describe some cases where neural networks outperform any linear approximator.
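
The abstract contrasts fixed-basis (linear) approximation, where only the outer coefficients are tuned, with variable-basis models such as neural networks, whose inner parameters also move. The sketch below (Python with NumPy) is only illustrative and not taken from the paper: the target function sin(4x), the six-term bases, and the crude random search are arbitrary assumptions used to make the structural difference concrete.

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
y = np.sin(4.0 * x)                              # arbitrary target function

# Fixed linear basis: monomials 1, x, ..., x^5; only the coefficients are fit.
A = np.vander(x, 6, increasing=True)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
err_fixed = np.max(np.abs(A @ coef - y))

# Variable basis: 6 tanh units whose inner weights and biases are also searched
# (crude random search here; a real network would use gradient-based training).
best_err = np.inf
for _ in range(2000):
    w = rng.normal(scale=4.0, size=6)
    b = rng.normal(scale=2.0, size=6)
    H = np.tanh(np.outer(x, w) + b)              # the basis functions themselves move
    c, *_ = np.linalg.lstsq(H, y, rcond=None)    # outer coefficients remain linear
    best_err = min(best_err, np.max(np.abs(H @ c - y)))

print(f"fixed basis, 6 monomials:   sup error ~ {err_fixed:.3f}")
print(f"variable basis, 6 tanh units: sup error ~ {best_err:.3f}")

The point is purely structural: in the second block the functions being combined depend on adjustable parameters, which is the sense in which the paper's variable-basis (dictionary-based) models generalize linear ones.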

DOI: 10.1016/j.neunet.2010.10.002
PubMed: 21094023

Links to previous steps (curation, corpus...)


Links to Exploration step

pubmed:21094023

The document in XML format

<record>
<TEI>
<teiHeader>
<fileDesc>
<titleStmt>
<title xml:lang="en">Some comparisons of complexity in dictionary-based and linear computational models.</title>
<author>
<name sortKey="Gnecco, Giorgio" sort="Gnecco, Giorgio" uniqKey="Gnecco G" first="Giorgio" last="Gnecco">Giorgio Gnecco</name>
<affiliation wicri:level="1">
<nlm:affiliation>Department of Communications, Computer, and System Sciences (DIST), University of Genoa, Via Opera Pia 13, 16145 Genova, Italy. giorgio.gnecco@dist.unige.it</nlm:affiliation>
<country xml:lang="fr">Italie</country>
<wicri:regionArea>Department of Communications, Computer, and System Sciences (DIST), University of Genoa, Via Opera Pia 13, 16145 Genova</wicri:regionArea>
<wicri:noRegion>16145 Genova</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Kurkova, V Ra" sort="Kurkova, V Ra" uniqKey="Kurkova V" first="V Ra" last="Kůrková">V Ra Kůrková</name>
</author>
<author>
<name sortKey="Sanguineti, Marcello" sort="Sanguineti, Marcello" uniqKey="Sanguineti M" first="Marcello" last="Sanguineti">Marcello Sanguineti</name>
</author>
</titleStmt>
<publicationStmt>
<idno type="wicri:source">PubMed</idno>
<date when="2011">2011</date>
<idno type="doi">10.1016/j.neunet.2010.10.002</idno>
<idno type="RBID">pubmed:21094023</idno>
<idno type="pmid">21094023</idno>
<idno type="wicri:Area/PubMed/Corpus">000340</idno>
<idno type="wicri:Area/PubMed/Curation">000340</idno>
<idno type="wicri:Area/PubMed/Checkpoint">000300</idno>
<idno type="wicri:Area/Ncbi/Merge">000824</idno>
</publicationStmt>
<sourceDesc>
<biblStruct>
<analytic>
<title xml:lang="en">Some comparisons of complexity in dictionary-based and linear computational models.</title>
<author>
<name sortKey="Gnecco, Giorgio" sort="Gnecco, Giorgio" uniqKey="Gnecco G" first="Giorgio" last="Gnecco">Giorgio Gnecco</name>
<affiliation wicri:level="1">
<nlm:affiliation>Department of Communications, Computer, and System Sciences (DIST), University of Genoa, Via Opera Pia 13, 16145 Genova, Italy. giorgio.gnecco@dist.unige.it</nlm:affiliation>
<country xml:lang="fr">Italie</country>
<wicri:regionArea>Department of Communications, Computer, and System Sciences (DIST), University of Genoa, Via Opera Pia 13, 16145 Genova</wicri:regionArea>
<wicri:noRegion>16145 Genova</wicri:noRegion>
</affiliation>
</author>
<author>
<name sortKey="Kurkova, V Ra" sort="Kurkova, V Ra" uniqKey="Kurkova V" first="V Ra" last="Kůrková">V Ra Kůrková</name>
</author>
<author>
<name sortKey="Sanguineti, Marcello" sort="Sanguineti, Marcello" uniqKey="Sanguineti M" first="Marcello" last="Sanguineti">Marcello Sanguineti</name>
</author>
</analytic>
<series>
<title level="j">Neural networks : the official journal of the International Neural Network Society</title>
<idno type="e-ISSN">1879-2782</idno>
<imprint>
<date when="2011" type="published">2011</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc>
<textClass>
<keywords scheme="KwdEn" xml:lang="en">
<term>Computational Biology</term>
<term>Dictionaries as Topic</term>
<term>Linear Models</term>
<term>Models, Neurological</term>
<term>Neural Networks (Computer)</term>
<term>Statistics, Nonparametric</term>
</keywords>
<keywords scheme="MESH" xml:lang="en">
<term>Computational Biology</term>
<term>Dictionaries as Topic</term>
<term>Linear Models</term>
<term>Models, Neurological</term>
<term>Neural Networks (Computer)</term>
<term>Statistics, Nonparametric</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front>
<div type="abstract" xml:lang="en">Neural networks provide a more flexible approximation of functions than traditional linear regression. In the latter, one can only adjust the coefficients in linear combinations of fixed sets of functions, such as orthogonal polynomials or Hermite functions, while for neural networks, one may also adjust the parameters of the functions which are being combined. However, some useful properties of linear approximators (such as uniqueness, homogeneity, and continuity of best approximation operators) are not satisfied by neural networks. Moreover, optimization of parameters in neural networks becomes more difficult than in linear regression. Experimental results suggest that these drawbacks of neural networks are offset by substantially lower model complexity, allowing accuracy of approximation even in high-dimensional cases. We give some theoretical results comparing requirements on model complexity for two types of approximators, the traditional linear ones and so called variable-basis types, which include neural networks, radial, and kernel models. We compare upper bounds on worst-case errors in variable-basis approximation with lower bounds on such errors for any linear approximator. Using methods from nonlinear approximation and integral representations tailored to computational units, we describe some cases where neural networks outperform any linear approximator.</div>
</front>
</TEI>
<pubmed>
<MedlineCitation Owner="NLM" Status="MEDLINE">
<PMID Version="1">21094023</PMID>
<DateCreated>
<Year>2011</Year>
<Month>02</Month>
<Day>02</Day>
</DateCreated>
<DateCompleted>
<Year>2011</Year>
<Month>12</Month>
<Day>12</Day>
</DateCompleted>
<Article PubModel="Print-Electronic">
<Journal>
<ISSN IssnType="Electronic">1879-2782</ISSN>
<JournalIssue CitedMedium="Internet">
<Volume>24</Volume>
<Issue>2</Issue>
<PubDate>
<Year>2011</Year>
<Month>Mar</Month>
</PubDate>
</JournalIssue>
<Title>Neural networks : the official journal of the International Neural Network Society</Title>
<ISOAbbreviation>Neural Netw</ISOAbbreviation>
</Journal>
<ArticleTitle>Some comparisons of complexity in dictionary-based and linear computational models.</ArticleTitle>
<Pagination>
<MedlinePgn>171-82</MedlinePgn>
</Pagination>
<ELocationID EIdType="doi" ValidYN="Y">10.1016/j.neunet.2010.10.002</ELocationID>
<Abstract>
<AbstractText>Neural networks provide a more flexible approximation of functions than traditional linear regression. In the latter, one can only adjust the coefficients in linear combinations of fixed sets of functions, such as orthogonal polynomials or Hermite functions, while for neural networks, one may also adjust the parameters of the functions which are being combined. However, some useful properties of linear approximators (such as uniqueness, homogeneity, and continuity of best approximation operators) are not satisfied by neural networks. Moreover, optimization of parameters in neural networks becomes more difficult than in linear regression. Experimental results suggest that these drawbacks of neural networks are offset by substantially lower model complexity, allowing accuracy of approximation even in high-dimensional cases. We give some theoretical results comparing requirements on model complexity for two types of approximators, the traditional linear ones and so called variable-basis types, which include neural networks, radial, and kernel models. We compare upper bounds on worst-case errors in variable-basis approximation with lower bounds on such errors for any linear approximator. Using methods from nonlinear approximation and integral representations tailored to computational units, we describe some cases where neural networks outperform any linear approximator.</AbstractText>
<CopyrightInformation>Copyright © 2010 Elsevier Ltd. All rights reserved.</CopyrightInformation>
</Abstract>
<AuthorList CompleteYN="Y">
<Author ValidYN="Y">
<LastName>Gnecco</LastName>
<ForeName>Giorgio</ForeName>
<Initials>G</Initials>
<AffiliationInfo>
<Affiliation>Department of Communications, Computer, and System Sciences (DIST), University of Genoa, Via Opera Pia 13, 16145 Genova, Italy. giorgio.gnecco@dist.unige.it</Affiliation>
</AffiliationInfo>
</Author>
<Author ValidYN="Y">
<LastName>Kůrková</LastName>
<ForeName>Věra</ForeName>
<Initials>V</Initials>
</Author>
<Author ValidYN="Y">
<LastName>Sanguineti</LastName>
<ForeName>Marcello</ForeName>
<Initials>M</Initials>
</Author>
</AuthorList>
<Language>eng</Language>
<PublicationTypeList>
<PublicationType UI="D003160">Comparative Study</PublicationType>
<PublicationType UI="D016428">Journal Article</PublicationType>
<PublicationType UI="D013485">Research Support, Non-U.S. Gov't</PublicationType>
</PublicationTypeList>
<ArticleDate DateType="Electronic">
<Year>2010</Year>
<Month>11</Month>
<Day>19</Day>
</ArticleDate>
</Article>
<MedlineJournalInfo>
<Country>United States</Country>
<MedlineTA>Neural Netw</MedlineTA>
<NlmUniqueID>8805018</NlmUniqueID>
<ISSNLinking>0893-6080</ISSNLinking>
</MedlineJournalInfo>
<CitationSubset>IM</CitationSubset>
<MeshHeadingList>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D019295">Computational Biology</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D004014">Dictionaries as Topic</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D016014">Linear Models</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D008959">Models, Neurological</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="Y" UI="D016571">Neural Networks (Computer)</DescriptorName>
</MeshHeading>
<MeshHeading>
<DescriptorName MajorTopicYN="N" UI="D018709">Statistics, Nonparametric</DescriptorName>
</MeshHeading>
</MeshHeadingList>
</MedlineCitation>
<PubmedData>
<History>
<PubMedPubDate PubStatus="received">
<Year>2010</Year>
<Month>5</Month>
<Day>23</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="revised">
<Year>2010</Year>
<Month>10</Month>
<Day>5</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="accepted">
<Year>2010</Year>
<Month>10</Month>
<Day>9</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="aheadofprint">
<Year>2010</Year>
<Month>11</Month>
<Day>19</Day>
</PubMedPubDate>
<PubMedPubDate PubStatus="entrez">
<Year>2010</Year>
<Month>11</Month>
<Day>25</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="pubmed">
<Year>2010</Year>
<Month>11</Month>
<Day>26</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
<PubMedPubDate PubStatus="medline">
<Year>2011</Year>
<Month>12</Month>
<Day>14</Day>
<Hour>6</Hour>
<Minute>0</Minute>
</PubMedPubDate>
</History>
<PublicationStatus>ppublish</PublicationStatus>
<ArticleIdList>
<ArticleId IdType="pii">S0893-6080(10)00188-7</ArticleId>
<ArticleId IdType="doi">10.1016/j.neunet.2010.10.002</ArticleId>
<ArticleId IdType="pubmed">21094023</ArticleId>
</ArticleIdList>
</PubmedData>
</pubmed>
<affiliations>
<list>
<country>
<li>Italie</li>
</country>
</list>
<tree>
<noCountry>
<name sortKey="Kurkova, V Ra" sort="Kurkova, V Ra" uniqKey="Kurkova V" first="V Ra" last="Kůrková">V Ra Kůrková</name>
<name sortKey="Sanguineti, Marcello" sort="Sanguineti, Marcello" uniqKey="Sanguineti M" first="Marcello" last="Sanguineti">Marcello Sanguineti</name>
</noCountry>
<country name="Italie">
<noRegion>
<name sortKey="Gnecco, Giorgio" sort="Gnecco, Giorgio" uniqKey="Gnecco G" first="Giorgio" last="Gnecco">Giorgio Gnecco</name>
</noRegion>
</country>
</tree>
</affiliations>
</record>

To manipulate this document under Unix (Dilib)

EXPLOR_STEP=$WICRI_ROOT/Wicri/Musique/explor/OperaV1/Data/Ncbi/Merge
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 000824 | SxmlIndent | more

Or

HfdSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd -nk 000824 | SxmlIndent | more
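
Outside the Dilib toolchain, the same record can be inspected with any XML library. A minimal sketch in Python, assuming only the <pubmed> subtree shown above has been saved to a file (here hypothetically named record.xml); the TEI part of the record uses undeclared wicri:/nlm: prefixes and would need extra namespace handling.

import xml.etree.ElementTree as ET

root = ET.parse("record.xml").getroot()          # the <pubmed> element
citation = root.find("MedlineCitation")
article = citation.find("Article")

pmid = citation.findtext("PMID")
title = article.findtext("ArticleTitle")
doi = article.findtext("ELocationID[@EIdType='doi']")
authors = [a.findtext("ForeName") + " " + a.findtext("LastName")
           for a in article.findall("AuthorList/Author")]

print(pmid, doi)
print(title)
print("; ".join(authors))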

To add a link to this page in the Wicri network

{{Explor lien
   |wiki=    Wicri/Musique
   |area=    OperaV1
   |flux=    Ncbi
   |étape=   Merge
   |type=    RBID
   |clé=     pubmed:21094023
   |texte=   Some comparisons of complexity in dictionary-based and linear computational models.
}}

To generate wiki pages

HfdIndexSelect -h $EXPLOR_AREA/Data/Ncbi/Merge/RBID.i   -Sk "pubmed:21094023" \
       | HfdSelect -Kh $EXPLOR_AREA/Data/Ncbi/Merge/biblio.hfd   \
       | NlmPubMed2Wicri -a OperaV1 

Wicri

This area was generated with Dilib version V0.6.21.
Data generation: Thu Apr 14 14:59:05 2016. Site generation: Thu Jan 4 23:09:23 2024