A wearable, wireless gaze tracker with integrated selection command source for human-computer interaction
Research output: Contribution to journal › Article › Scientific › peer-review
Journal: IEEE Transactions on Information Technology in Biomedicine
Publication status: Published - 2011
Publication type: A1 Journal article-refereed
A light-weight, wearable, wireless gaze tracker with an integrated selection command source for human–computer interaction is introduced. The prototype combines head-mounted, video-based gaze tracking with capacitive detection of facial movements, enabling multimodal interaction: pointing with gaze and making selections with facial gestures. The system is targeted mainly at people with disabilities that limit the use of their hands. The hardware was made wireless to remove the need to take the device off when moving away from the computer, and to allow future use in more mobile contexts. The algorithms that determine eye and head orientation in order to map gaze direction onto on-screen coordinates are presented, together with the algorithm that detects facial movements from the measured capacitance signal. Point-and-click experiments were conducted to assess the performance of the multimodal system, and the results show decent performance in both laboratory and office conditions. The overall point-and-click accuracy in the multimodal experiments is comparable to the errors reported in previous research on head-mounted, single-modality gaze tracking that does not compensate for changes in head orientation.
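The paper does not give implementation details here, but the selection step it describes (detecting a facial gesture from a measured capacitance signal) can be sketched as a simple threshold detector against a slowly adapting baseline. This is an illustrative assumption, not the authors' algorithm; the function name, thresholds, and window sizes are all hypothetical.

```python
# Hedged sketch: detect facial-gesture "selection" events in a capacitance
# signal by thresholding its deviation from a drifting baseline estimate.
# All parameters are illustrative, not taken from the paper.

def detect_selections(samples, threshold=5.0, baseline_alpha=0.01, refractory=10):
    """Return sample indices where a selection gesture is detected.

    samples        -- raw capacitance readings (arbitrary units)
    threshold      -- deviation from baseline that counts as a gesture
    baseline_alpha -- smoothing factor for the slowly drifting baseline
    refractory     -- minimum samples between two detections (debounce)
    """
    baseline = samples[0]
    events = []
    last_event = -refractory
    for i, x in enumerate(samples):
        if abs(x - baseline) > threshold and i - last_event >= refractory:
            events.append(i)
            last_event = i
        else:
            # Update the baseline only outside detections, so gestures
            # do not pull the baseline toward themselves.
            baseline += baseline_alpha * (x - baseline)
    return events

if __name__ == "__main__":
    # Flat signal with one brief capacitance spike starting at index 50.
    signal = [100.0] * 50 + [110.0] * 5 + [100.0] * 45
    print(detect_selections(signal))  # -> [50]
```

In a real system this detector would run on the wireless sensor stream and the detected event would be paired with the current gaze coordinate to issue a click; the baseline adaptation compensates for the slow capacitance drift that body movement and environment changes would otherwise introduce.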