Attentive interfaces for users with disabilities: eye gaze for intention and uncertainty estimation
Attentive user interfaces (AUIs) capitalize on the rich information that can be obtained from users' gaze behavior in order to infer relevant aspects of their cognitive state. Eye gaze is an excellent cue not only to states of interest and intention, but also to preference and confidence in comprehension. AUIs are built with the aim of adapting the interface to the user's current information need, and thus reducing the workload of interaction. Given these characteristics, it is believed that AUIs can have particular benefits for users with severe disabilities, for whom operating a physical device (such as a mouse) might be very strenuous or infeasible. This paper presents three studies that attempt to gauge uncertainty and intention on the part of the user from gaze data, and compares the success of each approach. The paper discusses how applying the approaches adopted in each study to user interfaces can support users with severe disabilities.
Citation: Prendinger, H., Hyrskykari, A., Nakayama, M., Istance, H., Bee, N. and Takahasi, Y. (2009) Attentive interfaces for users with disabilities: eye gaze for intention and uncertainty estimation. Universal Access in the Information Society, 8(4), pp. 339-354.
ISSN: 1615-5289
Research Group: Centre for Computational Intelligence