We propose a non-intrusive eye-tracking system for everyday gaze typing using web cameras. We argue that high gaze-tracking precision is not needed for on-screen typing, because the redundancy of natural language tolerates coarse pointing. This makes low-cost video components viable for advanced multi-modal, video-based interaction. Because web-camera images are of poor quality, robust tracking methods are required. We propose a real-time tracking scheme that combines a mean-shift color tracker with an Active Appearance Model of the eye. From this model it is possible to infer the state of the eye, such as the eye corners and the pupil location, under changes in scale and rotation. For gaze determination we use a Gaussian Process interpolation method, which also provides stability feedback from the system. Using a learning method for gaze estimation gives greater flexibility in the choice of camera and its placement.
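The abstract's Gaussian Process interpolation idea can be illustrated with a minimal sketch: GP regression maps extracted eye features (e.g. pupil position relative to the eye corners) to screen coordinates, and the predictive variance serves as the stability feedback mentioned above. This is not the authors' implementation; the kernel choice, length scale, and feature encoding below are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=0.3, variance=1.0):
    """Squared-exponential kernel between row-vector feature sets A and B.
    Hyperparameters here are illustrative, not from the paper."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gp_gaze(X_train, Y_train, X_query, noise=1e-4):
    """Standard GP regression: X_train are calibration eye features,
    Y_train the corresponding screen coordinates. Returns the predictive
    mean (estimated gaze point) and per-query variance (stability signal,
    in kernel units: near 0 close to calibration data, near 1 far away)."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    L = np.linalg.cholesky(K)                       # stable inversion via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y_train))
    K_s = rbf_kernel(X_query, X_train)
    mean = K_s @ alpha                              # predictive mean
    v = np.linalg.solve(L, K_s.T)
    var = np.diag(rbf_kernel(X_query, X_query)) - np.sum(v**2, axis=0)
    return mean, var

# Toy calibration: a 3x3 grid of normalized eye features mapped to a
# hypothetical 1920x1080 screen by a linear relation.
g = np.linspace(0.0, 1.0, 3)
X_cal = np.array([[x, y] for x in g for y in g])
Y_cal = X_cal * np.array([1920.0, 1080.0])

mean, var = gp_gaze(X_cal, Y_cal, np.array([[0.5, 0.5], [3.0, 3.0]]))
```

In this sketch the first query sits on a calibration point, so its variance is near zero, while the second lies far outside the calibrated region and its variance approaches the kernel variance; a gaze-typing interface could use that signal to suppress unreliable fixations.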
Title of host publication: IAPR Workshop on Machine Vision Applications - MVA
Publication status: Published - 2002