Tampere University of Technology

TUTCRIS Research Portal

Nyström-based approximate kernel subspace learning

Research output: Contribution to journal › Article › Scientific › peer-review

Standard

Nyström-based approximate kernel subspace learning. / Iosifidis, Alexandros; Gabbouj, Moncef.

In: Pattern Recognition, 09.2016, p. 190-197.



Author

Iosifidis, Alexandros; Gabbouj, Moncef. / Nyström-based approximate kernel subspace learning. In: Pattern Recognition. 2016; pp. 190-197.

BibTeX

@article{471b2555295a49dba5aaca74b98a40f5,
title = "Nystr{\"o}m-based approximate kernel subspace learning",
abstract = "In this paper, we describe a method for the determination of a subspace of the feature space in kernel methods, which is suited to large-scale learning problems. Linear model learning in the obtained space corresponds to a nonlinear model learning process in the input space. Since the obtained feature space is determined only by exploiting properties of the training data, this approach can be used for generic nonlinear pattern recognition. That is, nonlinear data mapping can be considered to be a pre-processing step exploiting nonlinear relationships between the training data. Linear techniques can be subsequently applied in the new feature space and, thus, they can model nonlinear properties of the problem at hand. In order to appropriately address the inherent problem of kernel learning methods related to their time and memory complexities, we follow an approximate learning approach. We show that the method can lead to considerable operation speed gains and achieve very good performance. Experimental results verify our analysis.",
keywords = "Kernel methods, Nonlinear pattern recognition, Nonlinear projection trick, Nystr{\"o}m approximation",
author = "Alexandros Iosifidis and Moncef Gabbouj",
year = "2016",
month = "9",
doi = "10.1016/j.patcog.2016.03.018",
language = "English",
pages = "190--197",
journal = "Pattern Recognition",
issn = "0031-3203",
publisher = "ELSEVIER SCI LTD",
}

RIS (suitable for import to EndNote)

TY - JOUR

T1 - Nyström-based approximate kernel subspace learning

AU - Iosifidis, Alexandros

AU - Gabbouj, Moncef

PY - 2016/9

Y1 - 2016/9

N2 - In this paper, we describe a method for the determination of a subspace of the feature space in kernel methods, which is suited to large-scale learning problems. Linear model learning in the obtained space corresponds to a nonlinear model learning process in the input space. Since the obtained feature space is determined only by exploiting properties of the training data, this approach can be used for generic nonlinear pattern recognition. That is, nonlinear data mapping can be considered to be a pre-processing step exploiting nonlinear relationships between the training data. Linear techniques can be subsequently applied in the new feature space and, thus, they can model nonlinear properties of the problem at hand. In order to appropriately address the inherent problem of kernel learning methods related to their time and memory complexities, we follow an approximate learning approach. We show that the method can lead to considerable operation speed gains and achieve very good performance. Experimental results verify our analysis.

AB - In this paper, we describe a method for the determination of a subspace of the feature space in kernel methods, which is suited to large-scale learning problems. Linear model learning in the obtained space corresponds to a nonlinear model learning process in the input space. Since the obtained feature space is determined only by exploiting properties of the training data, this approach can be used for generic nonlinear pattern recognition. That is, nonlinear data mapping can be considered to be a pre-processing step exploiting nonlinear relationships between the training data. Linear techniques can be subsequently applied in the new feature space and, thus, they can model nonlinear properties of the problem at hand. In order to appropriately address the inherent problem of kernel learning methods related to their time and memory complexities, we follow an approximate learning approach. We show that the method can lead to considerable operation speed gains and achieve very good performance. Experimental results verify our analysis.

KW - Kernel methods

KW - Nonlinear pattern recognition

KW - Nonlinear projection trick

KW - Nyström approximation

U2 - 10.1016/j.patcog.2016.03.018

DO - 10.1016/j.patcog.2016.03.018

M3 - Article

SP - 190

EP - 197

JO - Pattern Recognition

JF - Pattern Recognition

SN - 0031-3203

ER -
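The abstract above describes computing an explicit feature subspace from the training data so that linear models in that space act as approximate kernel models, using the Nyström approximation to keep time and memory costs manageable. The paper's exact construction is not reproduced here; the following is a minimal NumPy sketch of the general Nyström feature-map idea (sample landmark points, eigendecompose the small landmark kernel matrix W, and map all data through W^{-1/2}). The function names and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel values k(x, y) = exp(-gamma * ||x - y||^2)
    # between the rows of X and the rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystroem_features(X, n_landmarks=50, gamma=1.0, seed=0):
    # Nyström feature map: sample m landmark points, form the m x m
    # landmark kernel W, and embed every sample via the cross-kernel
    # C and W^{-1/2}, so that F @ F.T approximates the full kernel
    # matrix K as C W^+ C^T.
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=n_landmarks, replace=False)
    Z = X[idx]
    W = rbf_kernel(Z, Z, gamma)              # m x m landmark kernel
    vals, vecs = np.linalg.eigh(W)
    keep = vals > 1e-10                      # drop numerically null directions
    M = vecs[:, keep] / np.sqrt(vals[keep])  # W^{-1/2} on the retained span
    C = rbf_kernel(X, Z, gamma)              # n x m cross-kernel
    return C @ M                             # explicit approximate features

X = np.random.default_rng(1).normal(size=(500, 5))
F = nystroem_features(X, n_landmarks=50)
# Linear techniques (e.g., PCA or a linear classifier) applied to F
# then model nonlinear structure of X, at the cost of an m x m
# eigendecomposition instead of an n x n one.
```

When the number of landmarks equals the number of samples, the approximation is exact (F F^T recovers K); with m ≪ n it trades accuracy for the large speed and memory gains the abstract refers to.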