Eye Typing using Markov and Active Appearance Models

Dan Witzner Hansen, John Paulin Hansen, Mads Nielsen, Anders Sewerin Johansen, Mikkel Bille Stegmann

    Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


    We propose a non-intrusive eye tracking system intended for everyday gaze typing using web cameras. We argue that high precision in gaze tracking is not needed for on-screen typing due to natural language redundancy. This facilitates the use of low-cost video components for advanced multi-modal interactions based on video tracking systems. Robust methods are needed to track the eyes using web cameras because of the poor image quality. A real-time tracking scheme using a mean-shift color tracker and an Active Appearance Model of the eye is proposed. From this model it is possible to infer the state of the eye, such as the eye corners and the pupil location, under changes in scale and rotation.
    Original language: English
    Title of host publication: IEEE Workshop on Applications of Computer Vision - WACV
    Publication date: 2002
    Publication status: Published - 2002
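The mean-shift color tracker mentioned in the abstract locates a target by repeatedly moving a search window to the centroid of a per-pixel color-probability map (e.g. a back-projection of an eye/skin color histogram) until the shift becomes negligible. The sketch below is a minimal NumPy-only illustration of that iteration under assumed parameter choices (window size, iteration cap, convergence threshold); it is not the authors' implementation.

```python
import numpy as np

def mean_shift(prob, window, max_iter=20, eps=1.0):
    """Shift a tracking window toward the mode of a probability map.

    prob   : 2-D array of per-pixel target probabilities
             (e.g. a color-histogram back-projection).
    window : (x, y, w, h) initial search window.
    Returns the converged (x, y, w, h) window.
    """
    x, y, w, h = window
    for _ in range(max_iter):
        roi = prob[y:y + h, x:x + w]
        total = roi.sum()
        if total == 0:
            break  # no evidence inside the window; stay put
        ys, xs = np.mgrid[0:h, 0:w]
        # centroid of the probability mass inside the window
        cx = (roi * xs).sum() / total
        cy = (roi * ys).sum() / total
        # shift so the window center moves onto the centroid
        dx = cx - (w - 1) / 2.0
        dy = cy - (h - 1) / 2.0
        x = int(round(x + dx))
        y = int(round(y + dy))
        # clamp the window to the image bounds
        x = max(0, min(x, prob.shape[1] - w))
        y = max(0, min(y, prob.shape[0] - h))
        if dx * dx + dy * dy < eps:
            break  # converged: shift smaller than threshold
    return x, y, w, h
```

Run on a synthetic probability map with a single Gaussian blob, the window climbs onto the blob in a handful of iterations; in the paper's setting the map would instead come from the color model of the eye region in each video frame.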
