
EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays

Mohamed Khamis, Axel Hoesl, Alexander Klimczak, Martin Reiss, Florian Alt, Andreas Bulling

Proc. ACM Symposium on User Interface Software and Technology (UIST), pp. 155-166, 2017.


Abstract

While gaze holds a lot of promise for hands-free interaction with public displays, remote eye trackers with their confined tracking box restrict users to a single stationary position in front of the display. We present EyeScout, an active eye tracking system that combines an eye tracker mounted on a rail system with a computational method to automatically detect and align the tracker with the user's lateral movement. EyeScout addresses key limitations of current gaze-enabled large public displays by offering two novel gaze-interaction modes for a single user: in "Walk then Interact" the user can walk up to an arbitrary position in front of the display and interact, while in "Walk and Interact" the user can interact even while on the move. We report on a user study showing that EyeScout is well perceived by users, extends a public display's sweet spot into a sweet line, and reduces gaze-interaction kick-off time to 3.5 seconds, a 62% improvement over state-of-the-art solutions. We discuss sample applications that demonstrate how EyeScout can enable position- and movement-independent gaze interaction with large public displays.

BibTeX

@inproceedings{khamis17_uist,
  title     = {EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays},
  author    = {Khamis, Mohamed and Hoesl, Axel and Klimczak, Alexander and Reiss, Martin and Alt, Florian and Bulling, Andreas},
  year      = {2017},
  pages     = {155--166},
  doi       = {10.1145/3126594.3126630},
  booktitle = {Proc. ACM Symposium on User Interface Software and Technology (UIST)},
  video     = {https://www.youtube.com/watch?v=J7_OiRqsmdM}
}