Tampere University of Technology

TUTCRIS Research Portal

MEG Decoding with Hierarchical Combination of Logistic Regression and Random Forests

Research output: Other contribution › Scientific

Standard

MEG Decoding with Hierarchical Combination of Logistic Regression and Random Forests. / Huttunen, Heikki; Gencoglu, Oguzhan; Lehmusvaara, Johannes; Vartiainen, Teemu.

10 p. 2014, Technical report of our 2nd place submission to the DecMeg 2014 competition at Kaggle.com.

Research output: Other contribution › Scientific


Author

Huttunen, Heikki; Gencoglu, Oguzhan; Lehmusvaara, Johannes; Vartiainen, Teemu. / MEG Decoding with Hierarchical Combination of Logistic Regression and Random Forests. 2014. 10 p.

BibTeX - Download

@misc{8596c3f31b2040129ea4cbadb6a8d6b4,
title = "MEG Decoding with Hierarchical Combination of Logistic Regression and Random Forests",
abstract = "This document describes the solution of the second-place team in the DecMeg2014 brain decoding competition hosted at Kaggle.com. The model is a hierarchical combination of logistic regression and random forest. The first layer consists of a collection of 337 logistic regression classifiers, each using data either from a single sensor (31 features) or from a single time point (306 features). The resulting probability estimates are fed to a 1000-tree random forest, which makes the final decision. In order to adjust the model to an unlabeled subject, the classifier is trained iteratively: after initial training, the model is retrained with the unlabeled samples in the test set using their predicted labels from the first iteration.",
keywords = "Machine learning",
author = "Heikki Huttunen and Oguzhan Gencoglu and Johannes Lehmusvaara and Teemu Vartiainen",
year = "2014",
language = "English",
type = "Other",

}

RIS (suitable for import to EndNote) - Download

TY - GEN

T1 - MEG Decoding with Hierarchical Combination of Logistic Regression and Random Forests

AU - Huttunen, Heikki

AU - Gencoglu, Oguzhan

AU - Lehmusvaara, Johannes

AU - Vartiainen, Teemu

PY - 2014

Y1 - 2014

N2 - This document describes the solution of the second-place team in the DecMeg2014 brain decoding competition hosted at Kaggle.com. The model is a hierarchical combination of logistic regression and random forest. The first layer consists of a collection of 337 logistic regression classifiers, each using data either from a single sensor (31 features) or from a single time point (306 features). The resulting probability estimates are fed to a 1000-tree random forest, which makes the final decision. In order to adjust the model to an unlabeled subject, the classifier is trained iteratively: after initial training, the model is retrained with the unlabeled samples in the test set using their predicted labels from the first iteration.

AB - This document describes the solution of the second-place team in the DecMeg2014 brain decoding competition hosted at Kaggle.com. The model is a hierarchical combination of logistic regression and random forest. The first layer consists of a collection of 337 logistic regression classifiers, each using data either from a single sensor (31 features) or from a single time point (306 features). The resulting probability estimates are fed to a 1000-tree random forest, which makes the final decision. In order to adjust the model to an unlabeled subject, the classifier is trained iteratively: after initial training, the model is retrained with the unlabeled samples in the test set using their predicted labels from the first iteration.

KW - Machine learning

M3 - Other contribution

ER -
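
For illustration only, below is a minimal sketch of the hierarchical classifier described in the abstract, written against scikit-learn and NumPy. The array layout (trials × 306 sensors × 31 time points), the function names, and hyperparameters such as max_iter are assumptions made for this sketch and are not taken from the authors' implementation; only the overall structure (337 first-layer logistic regressions, a 1000-tree random forest on their probability outputs, and one pseudo-labelling retraining pass) follows the abstract.

# Illustrative sketch only; shapes, names and hyperparameters are assumptions,
# not the authors' code. Requires numpy and scikit-learn.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

N_SENSORS, N_TIMES = 306, 31  # 306 + 31 = 337 first-layer classifiers


def first_layer_probs(models, X):
    """Stack positive-class probabilities from the per-sensor and
    per-time-point logistic regressions into one feature matrix."""
    cols = []
    for (axis, idx), clf in models:
        feats = X[:, idx, :] if axis == "sensor" else X[:, :, idx]
        cols.append(clf.predict_proba(feats)[:, 1])
    return np.column_stack(cols)


def fit_hierarchy(X, y):
    """X: (n_trials, 306 sensors, 31 time points); y: binary labels."""
    models = []
    for s in range(N_SENSORS):  # one LR per sensor, 31 time-point features each
        models.append((("sensor", s),
                       LogisticRegression(max_iter=1000).fit(X[:, s, :], y)))
    for t in range(N_TIMES):    # one LR per time point, 306 sensor features each
        models.append((("time", t),
                       LogisticRegression(max_iter=1000).fit(X[:, :, t], y)))
    forest = RandomForestClassifier(n_estimators=1000)  # decides on the 337 probabilities
    forest.fit(first_layer_probs(models, X), y)
    return models, forest


def predict(models, forest, X):
    return forest.predict(first_layer_probs(models, X))


def self_train(X_train, y_train, X_test):
    """One round of the iterative adjustment: retrain with the test-set
    trials pseudo-labelled by the initially trained model."""
    models, forest = fit_hierarchy(X_train, y_train)
    pseudo = predict(models, forest, X_test)
    X_all = np.concatenate([X_train, X_test])
    y_all = np.concatenate([y_train, pseudo])
    return fit_hierarchy(X_all, y_all)

In this sketch, self_train(X_train, y_train, X_test) returns the retrained model pair after a single pseudo-labelling pass; how many iterations the authors ran and how the pseudo-labelled trials were weighted is not specified in the abstract.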