TUTCRIS - Tampereen teknillinen yliopisto


Focused multi-task learning in a Gaussian process framework

Research output: peer-reviewed

Standard

Focused multi-task learning in a Gaussian process framework. / Leen, Gayle; Peltonen, Jaakko; Kaski, Samuel.

In: Machine Learning, Vol. 89, No. 1-2, 10.2012, pp. 157-182.


Harvard

Leen, G, Peltonen, J & Kaski, S 2012, 'Focused multi-task learning in a Gaussian process framework', Machine Learning, vol. 89, no. 1-2, pp. 157-182. https://doi.org/10.1007/s10994-012-5302-y

APA

Leen, G., Peltonen, J., & Kaski, S. (2012). Focused multi-task learning in a Gaussian process framework. Machine Learning, 89(1-2), 157-182. https://doi.org/10.1007/s10994-012-5302-y

Vancouver

Leen G, Peltonen J, Kaski S. Focused multi-task learning in a Gaussian process framework. Machine Learning. 2012 Oct;89(1-2):157-182. https://doi.org/10.1007/s10994-012-5302-y

Author

Leen, Gayle ; Peltonen, Jaakko ; Kaski, Samuel. / Focused multi-task learning in a Gaussian process framework. In: Machine Learning. 2012 ; Vol. 89, No. 1-2. pp. 157-182.

BibTeX - Download

@article{1729741fc37e43af97193e7c015f5693,
title = "Focused multi-task learning in a Gaussian process framework",
abstract = "Multi-task learning, learning of a set of tasks together, can improve performance in the individual learning tasks. Gaussian process models have been applied to learning a set of tasks on different data sets, by constructing joint priors for functions underlying the tasks. In these previous Gaussian process models, the setting has been symmetric in the sense that all the tasks have been assumed to be equally important, whereas in settings such as transfer learning the goal is asymmetric, to enhance performance in a target task given the other tasks. We propose a focused Gaussian process model which introduces an {"}explaining away{"} model for each of the additional tasks to model their non-related variation, in order to focus the transfer to the task-of-interest. This focusing helps reduce the key problem of negative transfer, which may cause performance to even decrease if the tasks are not related closely enough. In experiments, our model improves performance compared to single-task learning, symmetric multi-task learning using hierarchical Dirichlet processes, transfer learning based on predictive structure learning, and symmetric multi-task learning with Gaussian processes.",
keywords = "Gaussian processes, Multi-task learning, Negative transfer, Transfer learning",
author = "Gayle Leen and Jaakko Peltonen and Samuel Kaski",
year = "2012",
month = "10",
doi = "10.1007/s10994-012-5302-y",
language = "English",
volume = "89",
pages = "157--182",
journal = "Machine Learning",
issn = "0885-6125",
publisher = "Springer Verlag",
number = "1-2",
}

RIS (suitable for import to EndNote) - Download

TY - JOUR

T1 - Focused multi-task learning in a Gaussian process framework

AU - Leen, Gayle

AU - Peltonen, Jaakko

AU - Kaski, Samuel

PY - 2012/10

Y1 - 2012/10

N2 - Multi-task learning, learning of a set of tasks together, can improve performance in the individual learning tasks. Gaussian process models have been applied to learning a set of tasks on different data sets, by constructing joint priors for functions underlying the tasks. In these previous Gaussian process models, the setting has been symmetric in the sense that all the tasks have been assumed to be equally important, whereas in settings such as transfer learning the goal is asymmetric, to enhance performance in a target task given the other tasks. We propose a focused Gaussian process model which introduces an "explaining away" model for each of the additional tasks to model their non-related variation, in order to focus the transfer to the task-of-interest. This focusing helps reduce the key problem of negative transfer, which may cause performance to even decrease if the tasks are not related closely enough. In experiments, our model improves performance compared to single-task learning, symmetric multi-task learning using hierarchical Dirichlet processes, transfer learning based on predictive structure learning, and symmetric multi-task learning with Gaussian processes.

AB - Multi-task learning, learning of a set of tasks together, can improve performance in the individual learning tasks. Gaussian process models have been applied to learning a set of tasks on different data sets, by constructing joint priors for functions underlying the tasks. In these previous Gaussian process models, the setting has been symmetric in the sense that all the tasks have been assumed to be equally important, whereas in settings such as transfer learning the goal is asymmetric, to enhance performance in a target task given the other tasks. We propose a focused Gaussian process model which introduces an "explaining away" model for each of the additional tasks to model their non-related variation, in order to focus the transfer to the task-of-interest. This focusing helps reduce the key problem of negative transfer, which may cause performance to even decrease if the tasks are not related closely enough. In experiments, our model improves performance compared to single-task learning, symmetric multi-task learning using hierarchical Dirichlet processes, transfer learning based on predictive structure learning, and symmetric multi-task learning with Gaussian processes.

KW - Gaussian processes

KW - Multi-task learning

KW - Negative transfer

KW - Transfer learning

UR - http://www.scopus.com/inward/record.url?scp=84865229433&partnerID=8YFLogxK

U2 - 10.1007/s10994-012-5302-y

DO - 10.1007/s10994-012-5302-y

M3 - Article

VL - 89

SP - 157

EP - 182

JO - Machine Learning

JF - Machine Learning

SN - 0885-6125

IS - 1-2

ER -
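
The model outlined in the abstract can be pictured as a joint Gaussian process covariance: the task-of-interest is driven by a shared function f, while each supporting task m observes a scaled copy rho_m * f plus its own independent "explaining away" GP g_m that absorbs variation unrelated to the target, so the shared component is not forced to fit it. Below is a minimal NumPy sketch of that covariance structure, reconstructed from the abstract alone; the squared-exponential kernels, the relatedness weights rho, and the function names (rbf, focused_mtl_cov) are illustrative assumptions, not the authors' implementation.

import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between two point sets.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def focused_mtl_cov(X_list, rho, ls_shared=1.0, ls_specific=0.5, noise=1e-2):
    # Joint covariance over [target task, supporting tasks 1..M].
    # Task 0 is the task-of-interest, modelled by a shared GP f.
    # Supporting task m sees rho[m] * f plus its own "explaining away"
    # GP, so its unrelated variation does not leak into f.
    n = [len(X) for X in X_list]
    starts = np.cumsum([0] + n)
    w = np.concatenate([[1.0], np.asarray(rho)])  # shared-f weight per task
    K = np.zeros((starts[-1], starts[-1]))
    for i, Xi in enumerate(X_list):
        for j, Xj in enumerate(X_list):
            block = w[i] * w[j] * rbf(Xi, Xj, ls_shared)
            if i == j and i > 0:  # task-specific explaining-away component
                block = block + rbf(Xi, Xj, ls_specific)
            K[starts[i]:starts[i + 1], starts[j]:starts[j + 1]] = block
    return K + noise * np.eye(starts[-1])

# Usage: GP regression on the target task, borrowing from two supporting
# tasks with assumed relatedness weights 0.8 and 0.3.
rng = np.random.default_rng(0)
X_list = [rng.uniform(-3, 3, (20, 1)) for _ in range(3)]
y = np.concatenate([np.sin(X).ravel() + 0.1 * rng.standard_normal(len(X))
                    for X in X_list])
rho = [0.8, 0.3]
K = focused_mtl_cov(X_list, rho)
alpha = np.linalg.solve(K, y)
X_new = np.linspace(-3, 3, 5)[:, None]
# Test points live on the target task (shared-f weight 1.0), so their
# cross-covariance with task j's training data is w[j] * k_shared(X_new, X_j).
k_new = np.hstack([w_j * rbf(X_new, Xj)
                   for w_j, Xj in zip([1.0] + rho, X_list)])
print(k_new @ alpha)  # posterior mean on the target task

In this sketch, driving a supporting task's rho toward zero removes its influence on the shared function, which is the intuition behind how the explaining-away components reduce negative transfer; the paper itself develops and learns this structure within a full Gaussian process framework rather than this toy construction.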