Show simple item record

dc.contributor.author: Prendinger, Helmut
dc.contributor.author: Hyrskykari, Aulikki
dc.contributor.author: Nakayama, Minoru
dc.contributor.author: Istance, Howell
dc.contributor.author: Bee, Nikolaus
dc.contributor.author: Takahasi, Yosiyuki
dc.identifier.citation: Prendinger, H., Hyrskykari, A., Nakayama, M., Istance, H., Bee, N. and Takahasi, Y. (2009) Attentive interfaces for users with disabilities: eye gaze for intention and uncertainty estimation. Universal Access in the Information Society, 8(4), pp. 339-354
dc.description.abstract: Attentive user interfaces (AUIs) capitalize on the rich information that can be obtained from users' gaze behavior in order to infer relevant aspects of their cognitive state. Not only is eye gaze an excellent clue to states of interest and intention, but also to preference and confidence in comprehension. AUIs are built with the aim of adapting the interface to the user's current information need, and thus reducing the workload of interaction. Given these characteristics, it is believed that AUIs can have particular benefits for users with severe disabilities, for whom operating a physical device (such as a mouse pointer) might be very strenuous or infeasible. This paper presents three studies that attempt to gauge uncertainty and intention on the part of the user from gaze data, and compares the success of each approach. The paper discusses how applying the approaches adopted in each study to user interfaces can support users with severe disabilities.
dc.subject: attentive interfaces
dc.subject: eye gaze
dc.title: Attentive interfaces for users with disabilities: eye gaze for intention and uncertainty estimation
dc.researchgroup: Centre for Computational Intelligence

There are no files associated with this item.
