MPIIMobileAttention: Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors
Visual attention is highly fragmented during mobile interactions, but the erratic nature of attention shifts currently limits attentive user interfaces to adapting after the fact, i.e. after shifts have already happened. We instead study attention forecasting – the challenging task of predicting users’ gaze behaviour (overt visual attention) in the near future. We present a novel long-term dataset of everyday mobile phone interactions, continuously recorded from 20 participants engaged in common activities on a university campus over 4.5 hours each (more than 90 hours in total). We propose a proof-of-concept method that uses device-integrated sensors and body-worn cameras to encode rich information on device usage and users’ visual scene. We demonstrate that our method can forecast bidirectional attention shifts and predict whether the primary attentional focus is on the handheld mobile device. We study the impact of different feature sets on performance and discuss the significant potential but also remaining challenges of forecasting user attention during mobile interactions.
The dataset is distributed as a .zip file containing three files per participant, one for each of the three recording blocks (RB). Each recording block is stored as a .pkl file that can be read in Python using pandas. The schema of the 213 columns is documented in the accompanying README.txt file.
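Loading a recording block can be sketched as follows. The file name below is hypothetical (the actual per-participant/recording-block naming is given in the README.txt shipped with the dataset), and a tiny stand-in DataFrame is written first purely so the snippet runs without the real download:

```python
import pandas as pd

# Hypothetical path; real files follow the per-participant /
# recording-block layout described in README.txt.
path = "P01_RB1.pkl"

# Stand-in frame so the example is self-contained; the real
# recording blocks load the same way but have 213 columns.
pd.DataFrame({"timestamp": [0.0, 0.033],
              "gaze_on_device": [1, 0]}).to_pickle(path)

df = pd.read_pickle(path)
print(df.shape)
print(list(df.columns))
```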
Download: Please download the full dataset here (2.4 GB).
Contact: Andreas Bulling
The data is only to be used for non-commercial scientific purposes. If you use this dataset in a scientific publication, please cite the following paper:
Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors
Proc. ACM International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI),