Gaze-controlled Driving

Martin Tall, Alexandre Alapetite, Javier San Agustin, Dan Witzner Hansen, Henrik Hegner Tomra Skovsgaard, Emilie Møllenbach, John Paulin Hansen

    Research output: Chapter in Book/Report/Conference proceeding › Conference abstract in proceedings › Research › peer-review


    We investigate whether gaze (the point of regard) can control a remote vehicle driving on a racing track. Five different input methods (on-screen buttons, mouse pointing, a low-cost webcam eye tracker, and two commercial eye tracking systems) provide heading and speed control based on the scene view transmitted from the moving robot. Gaze control was found to perform similarly to mouse control. This suggests that robots and wheelchairs may be controlled “hands-free” through gaze. Low-precision gaze tracking and image transmission delays had a noticeable effect on performance.
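    The abstract describes deriving heading and speed commands from where the user looks on the transmitted scene view. A minimal sketch of such a mapping is given below; the coordinate conventions, gains, and function name are illustrative assumptions, not taken from the paper:

    ```python
    # Hypothetical sketch (not the paper's implementation): map a gaze point
    # on the transmitted camera view to heading and speed commands.
    # Coordinate conventions and gains are illustrative assumptions.

    def gaze_to_drive_command(gaze_x, gaze_y, view_width, view_height,
                              max_speed=1.0, max_turn=1.0):
        """Convert a gaze point (pixels) into (speed, heading), each in [-1, 1]."""
        # Normalize pixel coordinates to [-1, 1], with (0, 0) at the view centre.
        nx = 2.0 * gaze_x / view_width - 1.0
        ny = 2.0 * gaze_y / view_height - 1.0
        # Horizontal offset steers; vertical offset sets speed
        # (looking above centre drives forward, below centre reverses).
        heading = max_turn * nx
        speed = max_speed * -ny
        return speed, heading

    # Looking at the upper-right quadrant: drive forward while turning right.
    speed, heading = gaze_to_drive_command(480, 120, 640, 480)
    ```

    Looking at the exact centre of the view yields (0.0, 0.0), i.e. the vehicle stops, which gives a natural resting state for a gaze-only interface.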
    Original language: English
    Title of host publication: Conference on Human Factors in Computing Systems: Proceedings of the 27th International Conference Extended Abstracts on Human Factors in Computing Systems
    Volume: SESSION: Spotlight on work in progress session 2
    Place of publication: New York, NY, USA
    Publisher: ACM
    Publication date: 2009
    ISBN (Print): 978-1-60558-247-4
    Publication status: Published - 2009
    Event: ACM Conference on Human Factors in Computing Systems: Digital Life New World - Boston, MA
    Duration: 1 Jan 2009 → …
    Conference number: 2009




    Keywords:
    • control
    • input
    • wheelchair
    • gaze
    • robot
    • mobile

