
PrivacEye: Privacy-Preserving Head-Mounted Eye Tracking Using Egocentric Scene Image and Eye Movement Features

Julian Steil, Marion Koelle, Wilko Heuten, Susanne Boll, Andreas Bulling

Proc. International Symposium on Eye Tracking Research and Applications (ETRA), 2019.

Best Video Award



Our method uses a mechanical camera shutter (top) to preserve users’ and bystanders’ privacy with head-mounted eye trackers. Privacy-sensitive situations are detected by combining deep scene image and eye movement features (middle) while changes in eye movement behaviour alone trigger the reopening of the camera shutter (bottom).

Abstract

Eyewear devices, such as augmented reality displays, increasingly integrate eye tracking but the first-person camera required to map a user’s gaze to the visual scene can pose a significant threat to user and bystander privacy. We present PrivacEye, a method to detect privacy-sensitive everyday situations and automatically enable and disable the eye tracker’s first-person camera using a mechanical shutter. To close the shutter in privacy-sensitive situations, the method uses a deep representation of the first-person video combined with rich features that encode users’ eye movements. To open the shutter without visual input, PrivacEye detects changes in users’ eye movements alone to gauge changes in the "privacy level" of the current situation. We evaluate our method on a first-person video dataset recorded in daily life situations of 17 participants, annotated by themselves for privacy sensitivity, and show that our method is effective in preserving privacy in this challenging setting.
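The abstract describes an asymmetric control loop: closing the shutter can use both scene image and eye movement features, while reopening must rely on eye movement changes alone, since no visual input is available. The following is a minimal sketch of that two-state logic; the class, score inputs, and thresholds are illustrative assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ShutterController:
    """Hypothetical sketch of the shutter control loop described in the
    abstract: close on privacy-sensitive scenes (scene + eye features),
    reopen on eye movement changes alone (no visual input available)."""
    shutter_open: bool = True

    def step(self,
             scene_privacy_score: Optional[float],
             eye_change_score: float,
             close_thresh: float = 0.5,
             open_thresh: float = 0.5) -> bool:
        if self.shutter_open:
            # Shutter open: the scene camera is active, so a combined
            # scene-image + eye-movement classifier score is available.
            if scene_privacy_score is not None and scene_privacy_score > close_thresh:
                self.shutter_open = False  # privacy-sensitive -> close shutter
        else:
            # Shutter closed: no first-person video; only changes in eye
            # movement behaviour can signal that the situation changed.
            if eye_change_score > open_thresh:
                self.shutter_open = True  # likely non-sensitive -> reopen
        return self.shutter_open
```

In use, the controller would be stepped once per analysis window, e.g. `ShutterController().step(scene_privacy_score=0.9, eye_change_score=0.0)` closes the shutter, after which a later window with a high `eye_change_score` reopens it.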

BibTeX

@inproceedings{steil19_etra,
  title     = {PrivacEye: Privacy-Preserving Head-Mounted Eye Tracking Using Egocentric Scene Image and Eye Movement Features},
  author    = {Steil, Julian and Koelle, Marion and Heuten, Wilko and Boll, Susanne and Bulling, Andreas},
  year      = {2019},
  booktitle = {Proc. International Symposium on Eye Tracking Research and Applications (ETRA)},
  doi       = {10.1145/3314111.3319913}
}

Acknowledgements

This work was funded, in part, by a JST CREST research grant (JPMJCR14E1), Japan.