ConAn: A Usable Tool for Multimodal Conversation Analysis
Anna Penzkofer,
Philipp Müller,
Felix Bühler,
Sven Mayer,
Andreas Bulling
Proc. ACM International Conference on Multimodal Interaction (ICMI),
pp. 341-351,
2021.
Abstract
Multimodal analysis of group behavior is a key task in human-computer interaction as well as in the social and behavioral sciences, but it is often limited to easily controllable laboratory settings or requires elaborate multi-sensor setups and time-consuming manual data annotation. We present ConAn, a usable tool to explore and automatically analyze the non-verbal behavior of multiple persons during natural group conversations. In contrast to traditional multi-sensor setups, our tool requires only a single 360° camera and uses state-of-the-art computer vision methods to automatically extract behavioral indicators such as gaze direction, facial expressions, and speaking activity. Our tool thus allows for easy and fast deployment, supporting researchers not only in understanding individual behavior and group interaction dynamics but also in quantifying user-object interactions. We illustrate the benefits of our tool on three sample use cases: general conversation analysis, assessment of collaboration quality, and the impact of technology on audience behavior. Taken together, ConAn represents an important step towards democratizing automatic conversation analysis in HCI and beyond.
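To give a concrete sense of the single-camera pipeline the abstract describes, below is a minimal Python sketch of a per-frame analysis loop over a 360° recording. It assumes OpenCV for video decoding; the estimate_gaze, classify_expression, and detect_speaking helpers (and the file name) are hypothetical placeholders for the kind of computer vision models such a tool would plug in, not ConAn's actual API.

# Minimal sketch of a single-camera conversation-analysis loop.
# NOTE: the three extractor functions below are hypothetical placeholders
# standing in for real computer vision models; this is not ConAn's API.
import cv2

def estimate_gaze(frame):
    """Placeholder: a real tool would run a gaze-estimation model here."""
    return None

def classify_expression(frame):
    """Placeholder: a real tool would run a facial-expression model here."""
    return None

def detect_speaking(frame):
    """Placeholder: a real tool would detect speaking activity, e.g. lip motion."""
    return None

def analyze(video_path):
    """Decode the 360° recording and collect per-frame behavioral indicators."""
    cap = cv2.VideoCapture(video_path)
    indicators = []
    while True:
        ok, frame = cap.read()
        if not ok:  # end of video or read error
            break
        indicators.append({
            "gaze": estimate_gaze(frame),
            "expression": classify_expression(frame),
            "speaking": detect_speaking(frame),
        })
    cap.release()
    return indicators

if __name__ == "__main__":
    results = analyze("group_recording_360.mp4")  # hypothetical file name
    print(f"Extracted indicators for {len(results)} frames")

Running all extractors over the same frame stream is what lets a single 360° recording stand in for the per-person sensors of traditional multi-sensor setups.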
BibTeX
@inproceedings{penzkofer21_icmi,
author = {Penzkofer, Anna and Müller, Philipp and Bühler, Felix and Mayer, Sven and Bulling, Andreas},
title = {ConAn: A Usable Tool for Multimodal Conversation Analysis},
booktitle = {Proc. ACM International Conference on Multimodal Interaction (ICMI)},
year = {2021},
doi = {10.1145/3462244.3479886},
pages = {341--351},
video = {https://www.youtube.com/watch?v=H2KfZNgx6CQ}
}