Tampere University of Technology

TUTCRIS Research Portal

Enhanced gaze interaction using simple head gestures

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-reviewed

Standard

Enhanced gaze interaction using simple head gestures. / Špakov, Oleg; Majaranta, Päivi.

UbiComp'12 - Proceedings of the 2012 ACM Conference on Ubiquitous Computing. 2012. p. 705-710.


Harvard

Špakov, O & Majaranta, P 2012, Enhanced gaze interaction using simple head gestures. in UbiComp'12 - Proceedings of the 2012 ACM Conference on Ubiquitous Computing. pp. 705-710, 14th International Conference on Ubiquitous Computing, UbiComp 2012, Pittsburgh, PA, United States, 5/09/12.

APA

Špakov, O., & Majaranta, P. (2012). Enhanced gaze interaction using simple head gestures. In UbiComp'12 - Proceedings of the 2012 ACM Conference on Ubiquitous Computing (pp. 705-710).

Vancouver

Špakov O, Majaranta P. Enhanced gaze interaction using simple head gestures. In UbiComp'12 - Proceedings of the 2012 ACM Conference on Ubiquitous Computing. 2012. p. 705-710.

Author

Špakov, Oleg ; Majaranta, Päivi. / Enhanced gaze interaction using simple head gestures. UbiComp'12 - Proceedings of the 2012 ACM Conference on Ubiquitous Computing. 2012. pp. 705-710

Bibtex - Download

@inproceedings{97a4639800da43d69b3bf79520e974e5,
title = "Enhanced gaze interaction using simple head gestures",
abstract = "We propose a combination of gaze pointing and head gestures for enhanced hands-free interaction. Instead of the traditional dwell-time selection method, we experimented with five simple head gestures: nodding, turning left/right, and tilting left/right. The gestures were detected from the eye-tracking data by a range-based algorithm, which was found accurate enough in recognizing nodding and left-directed gestures. The gaze estimation accuracy did not noticeably suffer from the quick head motions. Participants pointed to nodding as the best gesture for occasional selection tasks and rated the other gestures as promising methods for navigation (turning) and functional mode switching (tilting). In general, dwell time works well for repeated tasks such as eye typing. However, considering multimodal games or transient interactions in pervasive and mobile environments, we believe a combination of gaze and head interaction could potentially provide a natural and more accurate interaction method.",
keywords = "Dwell time, Eye tracking, Head gestures, Selection",
author = "Oleg Špakov and P{\"a}ivi Majaranta",
year = "2012",
language = "English",
isbn = "9781450312240",
pages = "705--710",
booktitle = "UbiComp'12 - Proceedings of the 2012 ACM Conference on Ubiquitous Computing",

}
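The abstract mentions that head gestures were detected from the eye-tracking data by a range-based algorithm. As a rough illustration only, the idea behind such a detector might be sketched as follows; this is a hypothetical reconstruction, not the authors' implementation, and the window size, thresholds, and function names are all assumptions:

```python
# Hypothetical sketch of a range-based nod detector, inferred from the
# abstract's description ("range-based algorithm" over eye-tracking data).
# NOT the authors' implementation; thresholds and names are assumptions.
# Input: a window of (x, y) head/eye-position samples in normalized units.

def detect_nod(samples, range_threshold=0.15, return_tolerance=0.05):
    """Classify a sample window as a nod if the vertical displacement
    range exceeds `range_threshold` and the position returns close to
    where it started (a dip-and-return motion)."""
    ys = [y for _, y in samples]
    vertical_range = max(ys) - min(ys)
    returned_to_start = abs(ys[-1] - ys[0]) < return_tolerance
    return vertical_range > range_threshold and returned_to_start

# A quick downward dip and return resembles a nod; a steady trace does not.
nod_trace = [(0.0, 0.50), (0.0, 0.42), (0.0, 0.30), (0.0, 0.44), (0.0, 0.51)]
still_trace = [(0.0, 0.50), (0.0, 0.49), (0.0, 0.50), (0.0, 0.51), (0.0, 0.50)]
print(detect_nod(nod_trace))    # True
print(detect_nod(still_trace))  # False
```

Turning and tilting gestures could, under the same assumption, be handled analogously by applying the range test to horizontal displacement or to the roll angle of the head.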

RIS (suitable for import to EndNote) - Download

TY - GEN

T1 - Enhanced gaze interaction using simple head gestures

AU - Špakov, Oleg

AU - Majaranta, Päivi

PY - 2012

Y1 - 2012

N2 - We propose a combination of gaze pointing and head gestures for enhanced hands-free interaction. Instead of the traditional dwell-time selection method, we experimented with five simple head gestures: nodding, turning left/right, and tilting left/right. The gestures were detected from the eye-tracking data by a range-based algorithm, which was found accurate enough in recognizing nodding and left-directed gestures. The gaze estimation accuracy did not noticeably suffer from the quick head motions. Participants pointed to nodding as the best gesture for occasional selection tasks and rated the other gestures as promising methods for navigation (turning) and functional mode switching (tilting). In general, dwell time works well for repeated tasks such as eye typing. However, considering multimodal games or transient interactions in pervasive and mobile environments, we believe a combination of gaze and head interaction could potentially provide a natural and more accurate interaction method.

AB - We propose a combination of gaze pointing and head gestures for enhanced hands-free interaction. Instead of the traditional dwell-time selection method, we experimented with five simple head gestures: nodding, turning left/right, and tilting left/right. The gestures were detected from the eye-tracking data by a range-based algorithm, which was found accurate enough in recognizing nodding and left-directed gestures. The gaze estimation accuracy did not noticeably suffer from the quick head motions. Participants pointed to nodding as the best gesture for occasional selection tasks and rated the other gestures as promising methods for navigation (turning) and functional mode switching (tilting). In general, dwell time works well for repeated tasks such as eye typing. However, considering multimodal games or transient interactions in pervasive and mobile environments, we believe a combination of gaze and head interaction could potentially provide a natural and more accurate interaction method.

KW - Dwell time

KW - Eye tracking

KW - Head gestures

KW - Selection

UR - http://www.scopus.com/inward/record.url?scp=84879496342&partnerID=8YFLogxK

M3 - Conference contribution

SN - 9781450312240

SP - 705

EP - 710

BT - UbiComp'12 - Proceedings of the 2012 ACM Conference on Ubiquitous Computing

ER -