
KnuckleTouch: Enabling Knuckle Gestures on Capacitive Touchscreens using Deep Learning

Robin Schweigert, Jan Leusmann, Simon Hagenmayer, Maximilian Weiß, Huy Viet Le, Sven Mayer, Andreas Bulling

Proc. Mensch und Computer, pp. 387-397, 2019.


Abstract

While mobile devices have become essential for social communication and have paved the way for work on the go, their interactive capabilities are still limited to simple touch input. A promising enhancement for touch interaction is knuckle input, but recognizing knuckle gestures robustly and accurately remains challenging. We present a method to differentiate between 17 finger and knuckle gestures based on a long short-term memory (LSTM) machine learning model. Furthermore, we introduce an open-source approach that is ready to deploy on commodity touch-based devices. The model was trained on a new dataset that we collected in a mobile interaction study with 18 participants. We show that our method achieves an accuracy of 86.8% in recognizing one of the 17 gestures and an accuracy of 94.6% in differentiating between finger and knuckle. In our evaluation study, we validated our models and found that the LSTM gesture recognition achieved an accuracy of 88.6%. We show that KnuckleTouch can be used to improve input expressiveness and to provide shortcuts to frequently used functions.
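The core idea of sequence-based gesture classification can be sketched as follows: an LSTM consumes a sequence of per-frame touch features and a softmax head maps the final hidden state to one of the 17 gesture classes. This is a minimal numpy illustration of that pipeline; the feature set (x, y, pressure, touch area), hidden size, and weight initialization are assumptions for demonstration, not the paper's actual architecture or trained parameters.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Minimal LSTM cell. Dimensions and initialization are
    illustrative assumptions, not the paper's trained model."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        # Stacked weights for the input, forget, cell, and output gates.
        self.W = rng.standard_normal((4 * hidden_size, input_size + hidden_size)) * 0.1
        self.b = np.zeros(4 * hidden_size)
        self.hidden_size = hidden_size

    def step(self, x, h, c):
        # One time step: gate activations from current input and previous state.
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)   # update cell state
        h = o * np.tanh(c)           # new hidden state
        return h, c

def classify_gesture(touch_sequence, cell, W_out, b_out):
    """Run the LSTM over a sequence of per-frame touch features
    (e.g. x, y, pressure, area) and return a probability
    distribution over the 17 gesture classes."""
    h = np.zeros(cell.hidden_size)
    c = np.zeros(cell.hidden_size)
    for x in touch_sequence:
        h, c = cell.step(x, h, c)
    logits = W_out @ h + b_out
    e = np.exp(logits - logits.max())  # numerically stable softmax
    return e / e.sum()
```

In practice such a model would be trained end-to-end on labeled touch sequences; here the random weights merely show the data flow from raw touch frames to a 17-way class distribution.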

BibTeX

@inproceedings{schweigert19_muc,
  title     = {KnuckleTouch: Enabling Knuckle Gestures on Capacitive Touchscreens using Deep Learning},
  author    = {Schweigert, Robin and Leusmann, Jan and Hagenmayer, Simon and Weiß, Maximilian and Le, Huy Viet and Mayer, Sven and Bulling, Andreas},
  year      = {2019},
  booktitle = {Proc. Mensch und Computer},
  doi       = {10.1145/3340764.3340767},
  pages     = {387-397},
  code      = {https://git.hcics.simtech.uni-stuttgart.de/public-projects/knuckletouch},
  video     = {https://www.youtube.com/watch?v=akL3Ejx3bv8}
}