Tampere University of Technology

TUTCRIS Research Portal

Haptic feedback of gaze gestures with glasses: Localization accuracy and effectiveness

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Details

Original language: English
Title of host publication: UbiComp and ISWC 2015 - Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and the Proceedings of the 2015 ACM International Symposium on Wearable Computers
Publisher: Association for Computing Machinery, Inc
Pages: 855-862
Number of pages: 8
ISBN (Electronic): 9781450335751
DOIs
Publication status: Published - 7 Sep 2015
Publication type: A4 Article in a conference publication
Event: ACM International Joint Conference on Pervasive and Ubiquitous Computing and the 2015 ACM International Symposium on Wearable Computers, UbiComp and ISWC 2015 - Osaka, Japan
Duration: 7 Sep 2015 - 11 Sep 2015

Conference

Conference: ACM International Joint Conference on Pervasive and Ubiquitous Computing and the 2015 ACM International Symposium on Wearable Computers, UbiComp and ISWC 2015
Country: Japan
City: Osaka
Period: 7/09/15 - 11/09/15

Abstract

Wearable devices, including smart eyewear, require new interaction methods between the device and the user. In this paper, we describe our work on the combined use of eye tracking for input and haptic (touch) stimulation for output with eyewear. Input with the eyes can be achieved by utilizing gaze gestures, which are predefined patterns of gaze movements identified as commands. The frame of the eyeglasses offers three natural contact points with the wearer's skin for haptic stimulation. The results of the two user studies reported in this paper showed that stimulation moving between the contact points was easy for users to localize, and that the stimulation has the potential to make the use of gaze gestures more efficient.

Keywords

  • Gaze gestures, Gaze tracking, Haptic stimulation, Haptics, Pervasive computing, Smart eyewear, Smart glasses, Wearable computing