Gaze-controlled Driving

Martin Tall, Alexandre Alapetite, Javier San Agustin, Dan Witzner Hansen, Henrik Hegner Tomra Skovsgaard, Emilie Møllenbach, John Paulin Hansen

    Research output: Chapter in Book/Report/Conference proceeding › Conference abstract in proceedings › Research › peer-review

    Abstract

    We investigate whether the gaze (point of regard) can control a remote vehicle driving on a racing track. Five different input devices (on-screen buttons, mouse pointing, a low-cost webcam eye tracker, and two commercial eye tracking systems) provide heading and speed control on the scene view transmitted from the moving robot. Gaze control was found to be similar to mouse control. This suggests that robots and wheelchairs may be controlled “hands-free” through gaze. Low-precision gaze tracking and image transmission delays had a noticeable effect on performance.
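    The mapping sketched in the abstract — a gaze point on the transmitted scene view translated into heading and speed commands — could look roughly like the following. This is an illustrative assumption, not the authors' implementation: the function name, the linear mapping, and the dead zone (to absorb fixation jitter from low-precision tracking) are all hypothetical.

    ```python
    # Hypothetical sketch: map a gaze point on the scene view to
    # (steering, speed) commands for a remote vehicle. The linear
    # mapping and dead zone are illustrative, not the paper's design.

    def gaze_to_drive_command(gaze_x, gaze_y, view_width, view_height,
                              dead_zone=0.1):
        """Convert a gaze point (pixels) into (steering, speed) in [-1, 1].

        Horizontal offset from the view centre steers the vehicle;
        vertical offset sets forward/backward speed. A central dead
        zone keeps small fixation jitter from moving the vehicle.
        """
        # Normalise to [-1, 1] relative to the view centre.
        nx = (gaze_x - view_width / 2) / (view_width / 2)
        ny = (view_height / 2 - gaze_y) / (view_height / 2)  # up = forward

        def apply_dead_zone(v):
            if abs(v) < dead_zone:
                return 0.0
            # Rescale so output is continuous at the dead-zone edge.
            return (abs(v) - dead_zone) / (1 - dead_zone) * (1 if v > 0 else -1)

        return apply_dead_zone(nx), apply_dead_zone(ny)


    # Gaze at the exact centre of a 640x480 view: no movement.
    print(gaze_to_drive_command(320, 240, 640, 480))  # (0.0, 0.0)
    ```

    A dead zone of this kind is one common way to cope with the low tracking precision the abstract reports as affecting performance.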
    Original language: English
    Title of host publication: Conference on Human Factors in Computing Systems: Proceedings of the 27th international conference extended abstracts on Human factors in computing systems
    Volume: SESSION: Spotlight on work in progress session 2
    Place of publication: New York, NY, USA
    Publisher: ACM Conference on Computer-Human Interaction
    Publication date: 2009
    Pages: 4387-4392
    ISBN (Print): 978-1-60558-247-4
    DOIs
    Publication status: Published - 2009
    Event: ACM SIGCHI Conference on Human Factors in Computing Systems: Digital Life New World - Boston, United States
    Duration: 4 Apr 2009 – 9 Apr 2009
    Conference number: 2009

    Conference

    Conference: ACM SIGCHI Conference on Human Factors in Computing Systems: Digital Life New World
    Number: 2009
    Country/Territory: United States
    City: Boston
    Period: 04/04/2009 – 09/04/2009

    Keywords

    • control
    • input
    • wheelchair
    • Gaze
    • robot
    • mobile
