Bounds on the Risk for M-SVMs
Internal identifier: 003992 (Crin/Corpus); previous: 003991; next: 003993
Authors: Yann Guermeur; André Elisseeff; Dominique Zelus
Source:
- Applied Stochastic Models in Business and Industry; 2003.
English descriptors
- KwdEn: extended vc dimensions; generalization error bounds; large margin classifiers; multi-class support vector machines (m-svms)
Abstract
Vapnik's statistical learning theory has mainly been developed for two types of problems: pattern recognition (computation of dichotomies) and regression (estimation of real-valued functions). Only in recent years has multi-class discriminant analysis been studied independently. Extending several standard results, among which a famous theorem by Bartlett, we have derived distribution-free uniform strong laws of large numbers devoted to multi-class large margin discriminant models. The capacity measure appearing in the confidence interval, a covering number, has been bounded from above in terms of a new generalized VC dimension. In this paper, the aforementioned theorems are applied to the architecture shared by all the multi-class SVMs proposed so far, which provides us with a simple theoretical framework to study them, compare their performance and design new machines.
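The bound sketched in the abstract follows the usual large-margin template. As an illustration only (the symbols, constants, and the exact capacity term below are generic, not the paper's own theorem), a Bartlett-style covering-number bound takes the following shape:

```latex
% Illustrative margin-based risk bound (generic form; the paper's theorems
% use a multi-class extension with their own constants).
% With probability at least 1 - \delta over an i.i.d. m-sample, for every
% f in the class \mathcal{F} and every margin parameter \gamma > 0:
\[
  R(f) \;\le\; R_{\gamma,m}(f)
  \;+\; \sqrt{\frac{2}{m}\left(
      \ln \mathcal{N}\!\left(\tfrac{\gamma}{2}, \mathcal{F}, 2m\right)
      + \ln \frac{2}{\delta} \right)}
\]
% R(f):           risk (probability of misclassification);
% R_{\gamma,m}(f): empirical margin risk, i.e. the fraction of the m
%                  training examples classified with margin below \gamma;
% \mathcal{N}(\epsilon, \mathcal{F}, 2m): covering number of the class,
%                  which the paper in turn bounds from above in terms of
%                  a generalized VC dimension.
```

The covering number plays the role of the capacity measure in the confidence interval; bounding it by a combinatorial dimension is what makes the bound distribution-free.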
Links to Exploration step
CRIN:guermeur03c
The document in XML format
<record><TEI><teiHeader><fileDesc><titleStmt><title xml:lang="en" wicri:score="11">Bounds on the Risk for M-SVMs</title>
</titleStmt>
<publicationStmt><idno type="RBID">CRIN:guermeur03c</idno>
<date when="2003" year="2003">2003</date>
<idno type="wicri:Area/Crin/Corpus">003992</idno>
</publicationStmt>
<sourceDesc><biblStruct><analytic><title xml:lang="en">Bounds on the Risk for M-SVMs</title>
<author><name sortKey="Guermeur, Yann" sort="Guermeur, Yann" uniqKey="Guermeur Y" first="Yann" last="Guermeur">Yann Guermeur</name>
</author>
<author><name sortKey="Elisseeff, Andre" sort="Elisseeff, Andre" uniqKey="Elisseeff A" first="André" last="Elisseeff">André Elisseeff</name>
</author>
<author><name sortKey="Zelus, Dominique" sort="Zelus, Dominique" uniqKey="Zelus D" first="Dominique" last="Zelus">Dominique Zelus</name>
</author>
</analytic>
<series><title level="j">Applied Stochastic Models in Business and Industry</title>
<imprint><date when="2003" type="published">2003</date>
</imprint>
</series>
</biblStruct>
</sourceDesc>
</fileDesc>
<profileDesc><textClass><keywords scheme="KwdEn" xml:lang="en"><term>extended vc dimensions</term>
<term>generalization error bounds</term>
<term>large margin classifiers</term>
<term>multi-class support vector machines (m-svms)</term>
</keywords>
</textClass>
</profileDesc>
</teiHeader>
<front><div type="abstract" xml:lang="en" wicri:score="2516">Vapnik's statistical learning theory has mainly been developed for two types of problems: pattern recognition (computation of dichotomies) and regression (estimation of real-valued functions). Only in recent years has multi-class discriminant analysis been studied independently. Extending several standard results, among which a famous theorem by Bartlett, we have derived distribution-free uniform strong laws of large numbers devoted to multi-class large margin discriminant models. The capacity measure appearing in the confidence interval, a covering number, has been bounded from above in terms of a new generalized VC dimension. In this paper, the aforementioned theorems are applied to the architecture shared by all the multi-class SVMs proposed so far, which provides us with a simple theoretical framework to study them, compare their performance and design new machines.</div>
</front>
</TEI>
<BibTex type="article"><ref>guermeur03c</ref>
<crinnumber>A03-R-263</crinnumber>
<category>1</category>
<equipe>MODBIO</equipe>
<author><e>Guermeur, Yann</e>
<e>Elisseeff, André</e>
<e>Zelus, Dominique</e>
</author>
<title>Bounds on the Risk for M-SVMs</title>
<journal>Applied Stochastic Models in Business and Industry</journal>
<year>2003</year>
<keywords><e>multi-class support vector machines (m-svms)</e>
<e>generalization error bounds</e>
<e>large margin classifiers</e>
<e>extended vc dimensions</e>
</keywords>
<abstract>Vapnik's statistical learning theory has mainly been developed for two types of problems: pattern recognition (computation of dichotomies) and regression (estimation of real-valued functions). Only in recent years has multi-class discriminant analysis been studied independently. Extending several standard results, among which a famous theorem by Bartlett, we have derived distribution-free uniform strong laws of large numbers devoted to multi-class large margin discriminant models. The capacity measure appearing in the confidence interval, a covering number, has been bounded from above in terms of a new generalized VC dimension. In this paper, the aforementioned theorems are applied to the architecture shared by all the multi-class SVMs proposed so far, which provides us with a simple theoretical framework to study them, compare their performance and design new machines.</abstract>
</BibTex>
</record>
To manipulate this document under Unix (Dilib)
EXPLOR_STEP=$WICRI_ROOT/Wicri/Lorraine/explor/InforLorV4/Data/Crin/Corpus
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 003992 | SxmlIndent | more
Or
HfdSelect -h $EXPLOR_AREA/Data/Crin/Corpus/biblio.hfd -nk 003992 | SxmlIndent | more
To add a link to this page in the Wicri network
{{Explor lien |wiki= Wicri/Lorraine |area= InforLorV4 |flux= Crin |étape= Corpus |type= RBID |clé= CRIN:guermeur03c |texte= Bounds on the Risk for M-SVMs }}
This area was generated with Dilib version V0.6.33.