Abstract
Mobile gaze interaction is challenged by inherent motor
noise. We examined the gaze tracking accuracy and
precision of twelve subjects wearing a gaze tracker on
their wrist while standing and walking. Results suggest
that it will be possible to detect whether people are
glancing at the watch, but not where on the screen they
are looking. To counter the motor noise, we present a
word-by-word textual UI that shows temporary
command options to be executed by gaze-strokes.
Twenty-seven participants conducted a simulated
smartwatch task and were able to reliably perform
commands that would adjust the speed of word
presentation or make regressions. We discuss future
design and usage options for a textual smartwatch gaze
interface.
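
The paper itself does not include code; as a purely illustrative sketch of the interaction described above, the following assumes an RSVP-style (word-by-word) presenter and hypothetical gaze-stroke commands named "faster", "slower", and "back" — none of these names, timing values, or limits are taken from the paper.

```python
# Illustrative sketch only: a minimal word-by-word (RSVP-style) presenter with
# commands for adjusting presentation speed and making regressions.
# Command names and parameters are hypothetical, not the authors' design.

import time


class RSVPPresenter:
    def __init__(self, text, wpm=250):
        self.words = text.split()
        self.index = 0
        self.wpm = wpm  # words per minute

    def word_duration(self):
        # Seconds each word stays on screen at the current rate.
        return 60.0 / self.wpm

    def apply_command(self, command):
        # Map a recognized gaze-stroke to a presentation command.
        if command == "faster":
            self.wpm = min(self.wpm + 50, 600)
        elif command == "slower":
            self.wpm = max(self.wpm - 50, 100)
        elif command == "back":
            # Regression: jump back a few words and re-present them.
            self.index = max(self.index - 5, 0)

    def run(self, get_command=lambda: None):
        # Present one word at a time; poll for a gaze-stroke between words.
        while self.index < len(self.words):
            print(self.words[self.index])
            time.sleep(self.word_duration())
            cmd = get_command()
            if cmd == "back":
                self.apply_command(cmd)  # re-present earlier words
            else:
                if cmd:
                    self.apply_command(cmd)
                self.index += 1


if __name__ == "__main__":
    RSVPPresenter("Mobile gaze interaction is challenged by inherent motor noise").run()
```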
Original language | English |
---|---|
Title of host publication | Adjunct Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers |
Publisher | Association for Computing Machinery |
Publication date | 2015 |
Pages | 839-847 |
ISBN (Electronic) | 978-1-4503-3575-1 |
DOIs | |
Publication status | Published - 2015 |
Event | ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2015), Osaka, Japan. Duration: 7 Sep 2015 → 11 Sep 2015. http://ubicomp.org/ubicomp2015/
Conference
Conference | ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2015) |
---|---|
Country/Territory | Japan |
City | Osaka |
Period | 07/09/2015 → 11/09/2015 |
Other | Co-located with the 19th International Symposium on Wearable Computers (ISWC 2015)
Internet address | http://ubicomp.org/ubicomp2015/