Class-Specific Kernel Discriminant Analysis Revisited: Further Analysis and Extensions
|Journal|IEEE Transactions on Cybernetics|
|Early online date|13 Oct 2016|
|Publication status|Published - 1 Dec 2017|
|Publication type|A1 Journal article, refereed|
In this paper, we revisit the class-specific kernel discriminant analysis (KDA) formulation, which has been applied to various problems, such as human face verification and human action recognition. We show that the original optimization problem solved to determine the class-specific discriminant projections is equivalent to a low-rank kernel regression (LRKR) problem using training-data-independent target vectors. In addition, we show that the regularized version of class-specific KDA is equivalent to a regularized LRKR problem exploiting the same targets. This analysis allows us to devise a novel fast solution. Furthermore, we derive novel incremental, approximate, and deep (hierarchical) variants. The proposed methods are tested on human facial image and action video verification problems, where their effectiveness and efficiency are demonstrated.
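The abstract's central equivalence, that class-specific discriminant projections can be obtained by solving a regularized kernel regression against fixed, training-data-independent targets, can be sketched in a few lines. The snippet below is an illustrative assumption, not the authors' exact formulation: the RBF kernel, the ±1 target scheme, and all function names are hypothetical choices made for demonstration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF (Gaussian) kernel matrix between rows of X and rows of Y.
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def class_specific_targets(labels, pos_class):
    # One illustrative data-independent target scheme: +1 for samples of
    # the class of interest, -1 for all other (negative) samples.
    return np.where(labels == pos_class, 1.0, -1.0)[:, None]

def kernel_regression_fit(X, targets, gamma=1.0, reg=1e-3):
    # Regularized kernel regression in dual form: A = (K + reg*I)^{-1} T.
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + reg * np.eye(K.shape[0]), targets)

def kernel_regression_predict(X_train, A, X_test, gamma=1.0):
    # Project test samples using the learned dual coefficients.
    return rbf_kernel(X_test, X_train, gamma) @ A

# Toy verification setup: a small positive class vs. many negatives.
rng = np.random.default_rng(0)
X_pos = rng.normal(0.0, 1.0, (20, 5))   # class of interest
X_neg = rng.normal(3.0, 1.0, (40, 5))   # everything else
X = np.vstack([X_pos, X_neg])
y = np.array([1] * 20 + [0] * 40)

A = kernel_regression_fit(X, class_specific_targets(y, pos_class=1))
scores = kernel_regression_predict(X, A, X)
```

With a small regularizer the fit nearly interpolates the targets, so projected scores of positive-class samples land near +1 and negatives near -1, which is what makes the fixed-target regression usable for verification-style decisions.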
Keywords: Data models, Kernel, Optimization, Principal component analysis, Standards, Training, Training data, Approximation, class-specific kernel discriminant analysis (CSKDA), incremental learning, low-rank kernel regression (LRKR), regularization