Kernel Methods for Nonlinear Discriminative Data Analysis
Internal identifier: 001321 (Main/Merge); previous: 001320; next: 001322
Authors: Xiuwen Liu [United States]; Washington Mio [United States]
Source:
- Lecture Notes in Computer Science [0302-9743]; 2005.
Abstract
Optimal Component Analysis (OCA) is a linear subspace technique for dimensionality reduction designed to optimize object classification and recognition performance. The linear nature of OCA often limits recognition performance when the underlying data structure is nonlinear or the cluster structures are complex. To address these problems, we investigate a kernel analogue of OCA, which consists of applying OCA techniques to the data after it has been mapped nonlinearly into a new feature space, typically a high (possibly infinite) dimensional Hilbert space. In this paper, we study both the theoretical and algorithmic aspects of the problem and report results obtained in several object recognition experiments.
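The OCA algorithm itself is not reproduced in this record, but the feature-space mapping the abstract describes is the standard kernel trick: map the data nonlinearly into a (possibly infinite-dimensional) Hilbert space and run a linear subspace method there, using only kernel evaluations. A minimal sketch of that idea, using kernel PCA as the linear-subspace stand-in (the toy ring data and the RBF bandwidth `gamma` are illustrative choices, not from the paper):

```python
import numpy as np

# Toy data: two concentric rings, a cluster structure that no linear
# subspace method can separate in the original plane.
rng = np.random.default_rng(0)
n = 50
angles = rng.uniform(0.0, 2.0 * np.pi, 2 * n)
radii = np.concatenate([np.full(n, 1.0), np.full(n, 3.0)])
X = np.column_stack([radii * np.cos(angles), radii * np.sin(angles)])
X += 0.05 * rng.standard_normal(X.shape)

def rbf_kernel(A, B, gamma=0.5):
    # k(a, b) = exp(-gamma * ||a - b||^2): the inner product of the two
    # points after an implicit map into an infinite-dimensional space.
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

K = rbf_kernel(X, X)

# Double-center the kernel matrix; this is equivalent to centering the
# (never explicitly computed) mapped data in feature space.
m = K.shape[0]
one = np.full((m, m), 1.0 / m)
Kc = K - one @ K - K @ one + one @ K @ one

# Eigendecomposition of the centered kernel matrix yields the linear
# principal directions in feature space.
vals, vecs = np.linalg.eigh(Kc)
idx = np.argsort(vals)[::-1][:2]
vals, vecs = vals[idx], vecs[:, idx]

# Projections of all samples onto the top two kernel components.
Z = vecs * np.sqrt(np.maximum(vals, 0.0))
```

The kernel analogue of OCA studied in the paper follows the same pattern, but optimizes the subspace for recognition performance rather than for variance as PCA does.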
DOI: 10.1007/11585978_38
Links toward previous steps (curation, corpus...)
- to stream Istex, to step Corpus: 001789
- to stream Istex, to step Curation: 001691
- to stream Istex, to step Checkpoint: 000B94
Links to Exploration step
ISTEX:0DBA3A1211F38D7A24362213C22F4BFB7EA014E7
The document in XML format
<record><TEI wicri:istexFullTextTei="biblStruct"><teiHeader><fileDesc><titleStmt><title xml:lang="en">Kernel Methods for Nonlinear Discriminative Data Analysis</title>
<author><name sortKey="Liu, Xiuwen" sort="Liu, Xiuwen" uniqKey="Liu X" first="Xiuwen" last="Liu">Xiuwen Liu</name>
</author>
<author><name sortKey="Mio, Washington" sort="Mio, Washington" uniqKey="Mio W" first="Washington" last="Mio">Washington Mio</name>
</author>
</titleStmt>
<publicationStmt><idno type="wicri:source">ISTEX</idno>
<idno type="RBID">ISTEX:0DBA3A1211F38D7A24362213C22F4BFB7EA014E7</idno>
<date when="2005" year="2005">2005</date>
<idno type="doi">10.1007/11585978_38</idno>
<idno type="url">https://api.istex.fr/document/0DBA3A1211F38D7A24362213C22F4BFB7EA014E7/fulltext/pdf</idno>
<idno type="wicri:Area/Istex/Corpus">001789</idno>
<idno type="wicri:Area/Istex/Curation">001691</idno>
<idno type="wicri:Area/Istex/Checkpoint">000B94</idno>
<idno type="wicri:doubleKey">0302-9743:2005:Liu X:kernel:methods:for</idno>
<idno type="wicri:Area/Main/Merge">001321</idno>
</publicationStmt>
<sourceDesc><biblStruct><analytic><title level="a" type="main" xml:lang="en">Kernel Methods for Nonlinear Discriminative Data Analysis</title>
<author><name sortKey="Liu, Xiuwen" sort="Liu, Xiuwen" uniqKey="Liu X" first="Xiuwen" last="Liu">Xiuwen Liu</name>
<affiliation><wicri:noCountry code="subField">Tallahassee</wicri:noCountry>
</affiliation>
<affiliation wicri:level="1"><country wicri:rule="url">États-Unis</country>
</affiliation>
</author>
<author><name sortKey="Mio, Washington" sort="Mio, Washington" uniqKey="Mio W" first="Washington" last="Mio">Washington Mio</name>
<affiliation wicri:level="2"><country xml:lang="fr">États-Unis</country>
<wicri:regionArea>Department of Mathematics, Florida State University, 32306, Tallahassee, FL</wicri:regionArea>
<placeName><region type="state">Floride</region>
</placeName>
</affiliation>
</author>
</analytic>
<monogr></monogr>
<series><title level="s">Lecture Notes in Computer Science</title>
<imprint><date>2005</date>
</imprint>
<idno type="ISSN">0302-9743</idno>
<idno type="eISSN">1611-3349</idno>
</series>
<idno type="istex">0DBA3A1211F38D7A24362213C22F4BFB7EA014E7</idno>
<idno type="DOI">10.1007/11585978_38</idno>
<idno type="ChapterID">38</idno>
<idno type="ChapterID">Chap38</idno>
</biblStruct>
</sourceDesc>
<seriesStmt><idno type="ISSN">0302-9743</idno>
</seriesStmt>
</fileDesc>
<profileDesc><textClass></textClass>
<langUsage><language ident="en">en</language>
</langUsage>
</profileDesc>
</teiHeader>
<front><div type="abstract" xml:lang="en">Abstract: Optimal Component Analysis (OCA) is a linear subspace technique for dimensionality reduction designed to optimize object classification and recognition performance. The linear nature of OCA often limits recognition performance, if the underlying data structure is nonlinear or cluster structures are complex. To address these problems, we investigate a kernel analogue of OCA, which consists of applying OCA techniques to the data after it has been mapped nonlinearly into a new feature space, typically a high (possibly infinite) dimensional Hilbert space. In this paper, we study both the theoretical and algorithmic aspects of the problem and report results obtained in several object recognition experiments.</div>
</front>
</TEI>
</record>
To manipulate this document under Unix (Dilib)
EXPLOR_STEP=$WICRI_ROOT/Ticri/CIDE/explor/OcrV1/Data/Main/Merge
HfdSelect -h $EXPLOR_STEP/biblio.hfd -nk 001321 | SxmlIndent | more
Or
HfdSelect -h $EXPLOR_AREA/Data/Main/Merge/biblio.hfd -nk 001321 | SxmlIndent | more
To link to this page from the Wicri network
{{Explor lien |wiki= Ticri/CIDE |area= OcrV1 |flux= Main |étape= Merge |type= RBID |clé= ISTEX:0DBA3A1211F38D7A24362213C22F4BFB7EA014E7 |texte= Kernel Methods for Nonlinear Discriminative Data Analysis }}
This area was generated with Dilib version V0.6.32.