Robustifying Eye Interaction

Dan Witzner Hansen, John Paulin Hansen

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Abstract

This paper presents a gaze typing system based on consumer hardware. Eye tracking with consumer hardware is subject to several unknown factors. We propose methods based on robust statistical principles to accommodate uncertainties in the image data as well as in the gaze estimates, thereby improving accuracy. We have succeeded in tracking the gaze of people with a standard consumer camera, obtaining on-screen accuracies of about 160 pixels. Proper design of the typing interface, however, reduces the need for high accuracy. We have observed typing speeds in the range of 3–5 words per minute for untrained subjects using large on-screen buttons and a new noise-tolerant dwell-time principle.
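The abstract does not spell out the noise-tolerant dwell-time principle. Below is a minimal sketch of one plausible interpretation, not the authors' actual method: gaze must rest on a button for a total dwell time to activate it, but brief gaze dropouts shorter than a grace period (caused by tracker noise) do not reset the accumulated dwell. All class, parameter, and variable names are illustrative.

```python
class NoiseTolerantDwell:
    """Dwell-time button activation that tolerates brief gaze dropouts.

    Hypothetical sketch: parameter names and values are illustrative,
    not taken from the paper.
    """

    def __init__(self, dwell_time=1.0, grace_period=0.25):
        self.dwell_time = dwell_time      # seconds of accumulated gaze needed to activate
        self.grace_period = grace_period  # departures shorter than this do not reset the dwell
        self.accumulated = 0.0            # gaze time collected on the button so far
        self.time_outside = 0.0           # duration of the current gaze departure

    def update(self, on_button, dt):
        """Feed one gaze sample (on_button flag, sample interval dt in seconds).

        Returns True on the sample at which the button activates.
        """
        if on_button:
            self.accumulated += dt
            self.time_outside = 0.0
        else:
            self.time_outside += dt
            if self.time_outside > self.grace_period:
                # Sustained departure: treat it as a real gaze shift and reset.
                self.accumulated = 0.0
        if self.accumulated >= self.dwell_time:
            self.accumulated = 0.0
            return True
        return False
```

With a 0.25 s grace period, a two-sample tracking glitch at 8 Hz does not reset the timer, whereas a genuine gaze shift of three or more samples does; this is the trade-off that lets large on-screen buttons stay usable despite coarse gaze estimates.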
Original language: English
Title of host publication: Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW’06)
Number of pages: 8
Publisher: IEEE
Publication date: 2006
ISBN (Print): 0-7695-2646-2
DOIs: 10.1109/CVPRW.2006.181
Publication status: Published - 2006
Externally published: Yes
Event: 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW’06) - New York, NY, United States
Duration: 17 Jun 2006 – 22 Jun 2006

Conference

Conference: 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW’06)
Country: United States
City: New York, NY
Period: 17/06/2006 – 22/06/2006

Cite this

Hansen, D. W., & Hansen, J. P. (2006). Robustifying Eye Interaction. In Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW’06). IEEE. https://doi.org/10.1109/CVPRW.2006.181