Enhanced gaze interaction using simple head gestures
Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review
Enhanced gaze interaction using simple head gestures. / Špakov, Oleg; Majaranta, Päivi.
UbiComp'12 - Proceedings of the 2012 ACM Conference on Ubiquitous Computing. 2012. p. 705-710.
RIS (suitable for import to EndNote)
TY - GEN
T1 - Enhanced gaze interaction using simple head gestures
AU - Špakov, Oleg
AU - Majaranta, Päivi
PY - 2012
Y1 - 2012
N2 - We propose a combination of gaze pointing and head gestures for enhanced hands-free interaction. Instead of the traditional dwell-time selection method, we experimented with five simple head gestures: nodding, turning left/right, and tilting left/right. The gestures were detected from the eye-tracking data by a range-based algorithm, which was found accurate enough in recognizing nodding and left-directed gestures. The gaze estimation accuracy did not noticeably suffer from the quick head motions. Participants identified nodding as the best gesture for occasional selection tasks and rated the other gestures as promising methods for navigation (turning) and functional mode switching (tilting). In general, dwell time works well for repeated tasks such as eye typing. However, for multimodal games or transient interactions in pervasive and mobile environments, we believe a combination of gaze and head interaction could provide a natural and more accurate interaction method.
AB - We propose a combination of gaze pointing and head gestures for enhanced hands-free interaction. Instead of the traditional dwell-time selection method, we experimented with five simple head gestures: nodding, turning left/right, and tilting left/right. The gestures were detected from the eye-tracking data by a range-based algorithm, which was found accurate enough in recognizing nodding and left-directed gestures. The gaze estimation accuracy did not noticeably suffer from the quick head motions. Participants identified nodding as the best gesture for occasional selection tasks and rated the other gestures as promising methods for navigation (turning) and functional mode switching (tilting). In general, dwell time works well for repeated tasks such as eye typing. However, for multimodal games or transient interactions in pervasive and mobile environments, we believe a combination of gaze and head interaction could provide a natural and more accurate interaction method.
KW - Dwell time
KW - Eye tracking
KW - Head gestures
KW - Selection
UR - http://www.scopus.com/inward/record.url?scp=84879496342&partnerID=8YFLogxK
M3 - Conference contribution
SN - 9781450312240
SP - 705
EP - 710
BT - UbiComp'12 - Proceedings of the 2012 ACM Conference on Ubiquitous Computing
ER -
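The abstract describes detecting head gestures from eye-tracking data with a range-based algorithm. As a rough illustration only, here is a minimal Python sketch of that idea; the window contents, threshold, units, axis-to-gesture mapping, and sign conventions (left = negative displacement) are all assumptions for the example, not details taken from the paper:

```python
def classify_head_gesture(x, y, roll, threshold=1.0):
    """Classify one windowed head-motion trace into one of five gestures.

    x, y   -- horizontal/vertical head-position samples (arbitrary units)
    roll   -- head roll-angle samples (arbitrary units)
    A gesture fires when a signal's range (max - min) within the window
    exceeds `threshold`; the signal with the largest range determines the
    gesture class, and direction comes from the larger deviation relative
    to the first sample. Returns None when motion stays in the dead zone.
    """
    ranges = {
        "nod": max(y) - min(y),         # vertical motion -> nodding
        "turn": max(x) - min(x),        # horizontal motion -> turning
        "tilt": max(roll) - min(roll),  # roll motion -> tilting
    }
    axis, extent = max(ranges.items(), key=lambda kv: kv[1])
    if extent < threshold:
        return None                     # no gesture detected
    if axis == "nod":
        return "nod"                    # nodding is direction-free here
    sig = x if axis == "turn" else roll
    # Direction: did the trace deviate further below or above its start?
    left_dev = sig[0] - min(sig)
    right_dev = max(sig) - sig[0]
    direction = "left" if left_dev >= right_dev else "right"
    return f"{axis}_{direction}"
```

For example, a trace whose vertical component swings widely while the other axes stay still would be classified as a nod, while a strong negative horizontal excursion would be reported as a left turn.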