Tracking Eyes using Shape and Appearance

Dan Witzner Hansen, Mads Nielsen, John Paulin Hansen, Anders Sewerin Johansen, Mikkel Bille Stegmann

    Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


    We propose a non-intrusive eye tracking system intended for everyday gaze typing using web cameras. We argue that high precision in gaze tracking is not needed for on-screen typing, owing to the redundancy of natural language. This permits the use of low-cost video components for advanced multi-modal interactions based on video tracking systems. Because web-camera image quality is poor, robust methods are needed to track the eyes. We propose a real-time tracking scheme combining a mean-shift color tracker with an Active Appearance Model of the eye. From this model it is possible to infer the state of the eye, such as the eye corners and the pupil location, under scale and rotational changes. We use a Gaussian Process interpolation method for gaze determination, which provides stability feedback from the system. The use of a learning method for gaze estimation gives more flexibility in the choice of camera and its position.
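    The paper does not spell out its Gaussian Process formulation here; the following is a minimal sketch of how GP interpolation can map eye features to screen coordinates, with the posterior variance serving as the kind of stability feedback the abstract mentions. The feature choice (normalised pupil-to-corner offsets), kernel, length scale, and screen size are all illustrative assumptions, not details from the paper.

    ```python
    import numpy as np

    def rbf_kernel(A, B, length_scale=0.2, variance=1.0):
        """Squared-exponential kernel between row vectors of A and B."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return variance * np.exp(-0.5 * d2 / length_scale**2)

    def gp_fit(X, Y, noise=1e-3):
        """Precompute Cholesky factor and regression weights for GP interpolation."""
        K = rbf_kernel(X, X) + noise * np.eye(len(X))
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))
        return X, L, alpha

    def gp_predict(model, Xs):
        """Posterior mean (gaze estimate) and variance (stability feedback)."""
        X, L, alpha = model
        Ks = rbf_kernel(Xs, X)
        mean = Ks @ alpha
        v = np.linalg.solve(L, Ks.T)
        var = rbf_kernel(Xs, Xs).diagonal() - (v**2).sum(0)
        return mean, var

    # Hypothetical calibration set: normalised pupil offsets -> screen pixels.
    X_cal = np.array([[0.1, 0.1], [0.9, 0.1], [0.5, 0.5], [0.1, 0.9], [0.9, 0.9]])
    Y_cal = X_cal * [1280, 1024]  # assumed 1280x1024 screen for this sketch
    model = gp_fit(X_cal, Y_cal)
    mean, var = gp_predict(model, np.array([[0.5, 0.5]]))
    ```

    A low posterior variance indicates the current eye feature lies close to the calibration data, so the gaze estimate can be trusted; a high variance can be surfaced to the typing interface as a signal that tracking has degraded.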
    Original language: English
    Title of host publication: IAPR Workshop on Machine Vision Applications - MVA
    Publication date: 2002
    Publication status: Published - 2002


