
Our goal is to develop next-generation human–machine interfaces that offer human-like interactive capabilities. To this end, we research fundamental computational methods as well as ambient and on-body systems to sense, model, and analyze everyday non-verbal human behavior and cognition.
Our research towards this goal spans five main areas: Cognition-aware Computing, Social Signal Processing, Multimodal Interaction, Attentive User Interfaces, and Usable Security and Privacy.
Selected key publications for each area are listed below.
Cognition-aware Computing
- Moment-to-Moment Detection of Internal Thought during Video Viewing from Eye Vergence Behavior. Proc. ACM Multimedia (MM), pp. 1–9, 2019.
- Eye movements during everyday behavior predict personality traits. Frontiers in Human Neuroscience, 12, pp. 1–8, 2018.
- Predicting the Category and Attributes of Visual Search Targets Using Deep Gaze Pooling. Proc. IEEE International Conference on Computer Vision Workshops (ICCVW), pp. 2740–2748, 2017.
- Prediction of Search Targets From Fixations in Open-world Settings. Proc. IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 981–990, 2015.
- I know what you are reading – Recognition of document types using mobile eye tracking. Proc. IEEE International Symposium on Wearable Computers (ISWC), pp. 113–116, 2013.
- Recognition of Visual Memory Recall Processes Using Eye Movement Analysis. Proc. ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp), pp. 455–464, 2011.
- Recognition of Hearing Needs From Body and Eye Movements to Improve Hearing Instruments. Proc. International Conference on Pervasive Computing (Pervasive), pp. 314–331, 2011.
Social Signal Processing
- Emergent Leadership Detection Across Datasets. Proc. ACM International Conference on Multimodal Interaction (ICMI), pp. 274–278, 2019.
- Detecting Low Rapport During Natural Interactions in Small Groups from Non-Verbal Behavior. Proc. ACM International Conference on Intelligent User Interfaces (IUI), pp. 153–164, 2018.
- Robust Eye Contact Detection in Natural Multi-Person Interactions Using Gaze and Speaking Behaviour. Proc. ACM International Symposium on Eye Tracking Research and Applications (ETRA), pp. 1–10, 2018.
- Discovery of Everyday Human Activities From Long-term Visual Behaviour Using Topic Models. Proc. ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp), pp. 75–85, 2015.
- Emotion recognition from embedded bodily expressions and speech during dyadic interactions. Proc. International Conference on Affective Computing and Intelligent Interaction (ACII), pp. 663–669, 2015.
- The Royal Corgi: Exploring Social Gaze Interaction for Immersive Gameplay. Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 115–124, 2015.
- A Tutorial on Human Activity Recognition Using Body-worn Inertial Sensors. ACM Computing Surveys, 46(3), pp. 1–33, 2014.
- EyeContext: Recognition of High-level Contextual Cues from Human Visual Behaviour. Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 305–308, 2013.
- AutoBAP: Automatic Coding of Body Action and Posture Units from Wearable Sensors. Proc. Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII), pp. 135–140, 2013.
- MotionMA: Motion Modelling and Analysis by Demonstration. Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 1309–1318, 2013.
- Multimodal Recognition of Reading Activity in Transit Using Body-Worn Sensors. ACM Transactions on Applied Perception (TAP), 9(1), pp. 1–21, 2012.
- Eye Movement Analysis for Activity Recognition Using Electrooculography. IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 33(4), pp. 741–753, 2011.
Multimodal Interaction
- Towards High-Frequency SSVEP-Based Target Discrimination with an Extended Alphanumeric Keyboard. Proc. IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 1–6, 2019.
- A Design Space for Gaze Interaction on Head-mounted Displays. Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 1–12, 2019.
- Which one is me? Identifying Oneself on Public Displays. Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 1–12, 2018.
- The Past, Present, and Future of Gaze-enabled Handheld Mobile Devices: Survey and Lessons Learned. Proc. ACM International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI), pp. 1–17, 2018.
- Look together: using gaze for assisting co-located collaborative search. Personal and Ubiquitous Computing, 21(1), pp. 173–186, 2017.
- EyePACT: Eye-Based Parallax Correction on Touch-Enabled Interactive Displays. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), 1(4), pp. 1–18, 2017.
- InvisibleEye: Mobile Eye Tracking Using Multiple Low-Resolution Cameras and Learning-Based Gaze Estimation. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), 1(3), pp. 1–21, 2017.
- EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays. Proc. ACM Symposium on User Interface Software and Technology (UIST), pp. 155–166, 2017.
- Concept for Using Eye Tracking in a Head-mounted Display to Adapt Rendering to the User’s Current Visual Field. Proc. ACM Conference on Virtual Reality Software and Technology (VRST), pp. 323–324, 2016.
- Combining Eye Tracking with Optimizations for Lens Astigmatism in Modern Wide-Angle HMDs. Proc. IEEE Virtual Reality (VR), pp. 269–270, 2016.
- TextPursuits: Using Text for Pursuits-Based Interaction and Calibration on Public Displays. Proc. ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp), pp. 274–285, 2016.
- Eye tracking for public displays in the wild. Personal and Ubiquitous Computing, 19(5), pp. 967–981, 2015.
- Orbits: Enabling Gaze Interaction in Smart Watches using Moving Targets. Proc. ACM Symposium on User Interface Software and Technology (UIST), pp. 457–466, 2015.
- GravitySpot: Guiding Users in Front of Public Displays Using On-Screen Visual Cues. Proc. ACM Symposium on User Interface Software and Technology (UIST), pp. 47–56, 2015.
- Gaze+RST: Integrating Gaze and Multitouch for Remote Rotate-Scale-Translate Tasks. Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 4179–4188, 2015.
- Pupil: an open source platform for pervasive eye tracking and mobile gaze-based interaction. Adj. Proc. ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp), pp. 1151–1160, 2014.
- GazeHorizon: Enabling Passers-by to Interact with Public Displays by Gaze. Proc. ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp), pp. 559–563, 2014.
- Pursuits: Spontaneous Interaction with Displays based on Smooth Pursuit Eye Movement and Moving Targets. Proc. ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp), pp. 439–448, 2013.
- Pursuit Calibration: Making Gaze Calibration Less Tedious and More Flexible. Proc. ACM Symposium on User Interface Software and Technology (UIST), pp. 261–270, 2013.
- SideWays: A Gaze Interface for Spontaneous Interaction with Situated Displays. Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 851–860, 2013.
- Wearable EOG goggles: Seamless sensing and context-awareness in everyday environments. Journal of Ambient Intelligence and Smart Environments, 1(2), pp. 157–171, 2009.
Attentive User Interfaces
- MPIIGaze: Real-World Dataset and Deep Appearance-Based Gaze Estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 41(1), pp. 162–175, 2019.
- Evaluation of Appearance-Based Methods and Implications for Gaze-Based Applications. Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 1–13, 2019.
- GazeDirector: Fully Articulated Eye Gaze Redirection in Video. Computer Graphics Forum (CGF), 37(2), pp. 217–225, 2018.
- Training Person-Specific Gaze Estimators from Interactions with Multiple Devices. Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 1–12, 2018.
- Revisiting Data Normalization for Appearance-Based Gaze Estimation. Proc. ACM International Symposium on Eye Tracking Research and Applications (ETRA), pp. 1–9, 2018.
- Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors. Proc. ACM International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI), pp. 1–13, 2018.
- Error-Aware Gaze-Based Interfaces for Robust Mobile Gaze Interaction. Proc. ACM International Symposium on Eye Tracking Research and Applications (ETRA), pp. 1–10, 2018.
- Understanding Face and Eye Visibility in Front-Facing Cameras of Smartphones used in the Wild. Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 1–12, 2018.
- It’s Written All Over Your Face: Full-Face Appearance-Based Gaze Estimation. Proc. IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 2299–2308, 2017.
- Everyday Eye Contact Detection Using Unsupervised Gaze Target Discovery. Proc. ACM Symposium on User Interface Software and Technology (UIST), pp. 193–203, 2017.
- Learning an appearance-based gaze estimator from one million synthesised images. Proc. ACM International Symposium on Eye Tracking Research and Applications (ETRA), pp. 131–138, 2016.
- A 3D Morphable Eye Region Model for Gaze Estimation. Proc. European Conference on Computer Vision (ECCV), pp. 297–313, 2016.
- Spatio-Temporal Modeling and Prediction of Visual Attention in Graphical User Interfaces. Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 3299–3310, 2016.
- AggreGaze: Collective Estimation of Audience Attention on Public Displays. Proc. ACM Symposium on User Interface Software and Technology (UIST), pp. 821–831, 2016.
- Appearance-based Gaze Estimation in the Wild. Proc. IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 4511–4520, 2015.
- GazeProjector: Accurate Gaze Estimation and Seamless Gaze Interaction Across Multiple Displays. Proc. ACM Symposium on User Interface Software and Technology (UIST), pp. 395–404, 2015.
- Self-Calibrating Head-Mounted Eye Trackers Using Egocentric Visual Saliency. Proc. ACM Symposium on User Interface Software and Technology (UIST), pp. 363–372, 2015.
- Rendering of Eyes for Eye-Shape Registration and Gaze Estimation. Proc. IEEE International Conference on Computer Vision (ICCV), pp. 3756–3764, 2015.
- EyeTab: Model-based gaze estimation on unmodified tablet computers. Proc. ACM International Symposium on Eye Tracking Research and Applications (ETRA), pp. 207–210, 2014.
- Eye Tracking and Eye-Based Human-Computer Interaction. In: Stephen H. Fairclough, Kiel Gilleade (Eds.): Advances in Physiological Computing, Springer, London, pp. 39–65, 2014.
- Robust, real-time pupil tracking in highly off-axis images. Proc. ACM International Symposium on Eye Tracking Research and Applications (ETRA), pp. 173–176, 2012.
Usable Security and Privacy
- PrivacEye: Privacy-Preserving Head-Mounted Eye Tracking Using Egocentric Scene Image and Eye Movement Features. Proc. ACM International Symposium on Eye Tracking Research and Applications (ETRA), pp. 1–10, 2019.
- Privacy-Aware Eye Tracking Using Differential Privacy. Proc. ACM International Symposium on Eye Tracking Research and Applications (ETRA), pp. 1–9, 2019.
- CueAuth: Comparing Touch, Mid-Air Gestures, and Gaze for Cue-based Authentication on Situated Displays. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), 2(7), pp. 1–22, 2018.
- GazeTouchPIN: Protecting Sensitive Data on Mobile Devices using Secure Multimodal Authentication. Proc. ACM International Conference on Multimodal Interaction (ICMI), pp. 446–450, 2017.
- They are all after you: Investigating the Viability of a Threat Model that involves Multiple Shoulder Surfers. Proc. International Conference on Mobile and Ubiquitous Multimedia (MUM), pp. 1–5, 2017.
- SkullConduct: Biometric User Identification on Eyewear Computers Using Bone Conduction Through the Skull. Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 1379–1384, 2016.
- GazeTouchPass: Multimodal Authentication Using Gaze and Touch on Mobile Devices. Ext. Abstr. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI), pp. 2156–2164, 2016.
- Graphical Passwords in the Wild – Understanding How Users Choose Pictures and Passwords in Image-based Authentication Schemes. Proc. ACM International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI), pp. 316–322, 2015.