Tampere University of Technology

TUTCRIS Research Portal

Kernel reference discriminant analysis

Research output: Contribution to journal › Article › Scientific › peer-review

Details

Original language: English
Pages (from-to): 85-91
Number of pages: 7
Journal: Pattern Recognition Letters
Volume: 49
DOIs
Publication status: Published - 1 Nov 2014
Publication type: A1 Journal article-refereed

Abstract

Linear Discriminant Analysis (LDA) and its nonlinear version, Kernel Discriminant Analysis (KDA), are well-known and widely used techniques for supervised feature extraction and dimensionality reduction. They determine an optimal discriminant space for (non)linear data projection based on certain assumptions, e.g. that each class follows a normal distribution (in the input or the kernel space) and that each class is represented by its mean vector. However, other vectors may be used for class representation in order to increase class discrimination in the resulting feature space. In this paper, we propose an optimization scheme that determines the optimal class representation, in terms of Fisher ratio maximization, for nonlinear data projection. Compared to the standard approach, the proposed optimization scheme increases class discrimination in the reduced-dimensionality feature space and achieves higher classification rates on publicly available data sets.
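As a point of reference for the Fisher ratio mentioned above, the following minimal NumPy sketch computes that ratio for standard two-class LDA, where each class is represented by its mean vector. It illustrates only the baseline criterion, not the paper's optimized class-representation scheme or its kernel formulation; all names and the toy data are illustrative.

```python
import numpy as np

def fisher_ratio(X, y, w):
    """Fisher ratio J(w) = (w^T S_b w) / (w^T S_w w) for a projection direction w,
    with classes represented by their mean vectors (the standard LDA assumption)."""
    d = X.shape[1]
    mean_all = X.mean(axis=0)
    S_b = np.zeros((d, d))  # between-class scatter
    S_w = np.zeros((d, d))  # within-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)            # class mean vector
        diff = (mc - mean_all)[:, None]
        S_b += len(Xc) * diff @ diff.T
        S_w += (Xc - mc).T @ (Xc - mc)
    return float(w @ S_b @ w) / float(w @ S_w @ w)

# Toy two-class data (illustrative only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)),
               rng.normal(3.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Classical two-class LDA direction w = S_w^{-1} (m1 - m0),
# which maximizes the Fisher ratio under the mean-vector representation.
m0, m1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
S_w = sum((X[y == c] - X[y == c].mean(axis=0)).T
          @ (X[y == c] - X[y == c].mean(axis=0)) for c in (0, 1))
w_lda = np.linalg.solve(S_w, m1 - m0)
```

The paper's contribution, by contrast, is to replace the fixed class mean vectors above with representative vectors optimized so that this ratio (in the kernel-induced space) is maximized.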

Keywords

  • Kernel Discriminant Analysis, Kernel Spectral Regression, Optimized class representation