Explain Yourself to Reveal Your Mind
Description: When we explain our behaviour to other people, we reveal mental states such as beliefs, desires, and emotions. Humans generate such explanations not only for their own behaviour but also for the behaviour of others. This ability is called theory of mind (ToM). Generating ToM explanations artificially in human-computer interaction remains an open challenge. As a first step, it is important to understand how users themselves explain their behaviour, and to what extent they use theory of mind in doing so.
The goal of this project is to collect and classify theory of mind explanations during human-computer interaction. This requires designing a task interface that prompts users to explain their behaviour in terms of mental states. The task could be a game such as Mastermind, where players must form beliefs about an unknown goal. The second step is to conduct a user study and collect explanations. Finally, the explanations should be analysed with a combination of automatic natural language processing tools and qualitative analysis; the proportion of the two can be varied according to interest.
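As a rough illustration of the automatic part of the analysis, collected explanations could be tagged for mental-state language with a simple lexicon lookup before any deeper NLP or qualitative coding. This is only a minimal sketch under assumed names: the `MENTAL_STATE_TERMS` lexicon and its categories are hypothetical placeholders, and a real coding scheme would be grounded in the ToM literature.

```python
import re

# Hypothetical lexicon of mental-state terms, grouped by category.
# A real coding scheme would be derived from the theory-of-mind literature.
MENTAL_STATE_TERMS = {
    "belief": {"think", "believe", "assume", "guess", "know"},
    "desire": {"want", "hope", "wish", "intend"},
    "emotion": {"happy", "frustrated", "surprised", "confident"},
}

def tag_mental_states(explanation: str) -> set[str]:
    """Return the mental-state categories whose terms occur in the text."""
    words = set(re.findall(r"[a-z]+", explanation.lower()))
    return {category for category, terms in MENTAL_STATE_TERMS.items()
            if words & terms}

# Example: a player's explanation during a Mastermind round.
print(tag_mental_states("I think blue is not in the code, and I want to rule out red"))
# → {'belief', 'desire'} (set order may vary)
```

Such keyword tagging would only provide a first-pass signal; ambiguous or implicit mental-state talk is exactly where the qualitative analysis mentioned above comes in.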
The outcome of the project is two-fold: a novel data set and its classification.
Supervisor: Susanne Hindennach
Distribution: 20% Literature, 50% Implementation, 30% Data Analysis
Requirements: Strong programming skills, experience with machine learning and data analysis