Online Crowdsourcing for Saliency Dataset on Visualizations

Description: Eye-tracking equipment is expensive to use in terms of both time and money, and the largest existing attention dataset for information visualizations contains only a few hundred images. In contrast, previous studies have shown that mouse clicks correlate strongly with human visual attention. In this project, we plan to use mouse clicks to collect a large-scale (~3,000 images) saliency dataset on information visualizations.
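The project description does not fix how clicks are turned into saliency maps; one common approach (also used in the BubbleView work) is to accumulate click positions per image and blur them with a Gaussian. The sketch below assumes `clicks` is a list of (x, y) pixel coordinates and treats the blur width `sigma` as a free parameter:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def clicks_to_saliency(clicks, height, width, sigma=25):
    """Accumulate (x, y) click coordinates into a blurred saliency map.

    `sigma` roughly mimics the bubble radius / foveal extent; its value
    here is an assumption, not something fixed by the project description.
    """
    heat = np.zeros((height, width), dtype=np.float32)
    for x, y in clicks:
        if 0 <= y < height and 0 <= x < width:
            heat[int(y), int(x)] += 1.0
    heat = gaussian_filter(heat, sigma=sigma)   # spread clicks into a smooth map
    if heat.max() > 0:
        heat /= heat.max()                      # normalise to [0, 1]
    return heat
```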

Goal:
* Collect a saliency dataset with BubbleView on Amazon Mechanical Turk
* Analyse the data: how closely does the BubbleView data match real eye-tracking data, and how does each task bias attention behaviour? (a sketch of one possible comparison metric is given below)
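Which evaluation metrics the project will use is open; the Pearson correlation coefficient (CC) is one standard choice for comparing saliency maps. A minimal sketch, assuming both maps are NumPy arrays of equal shape:

```python
import numpy as np

def correlation_coefficient(map_a, map_b):
    """Pearson correlation (CC) between two saliency maps of equal shape.

    CC is only one of several standard saliency metrics (others include
    KL divergence and NSS); the final metric set is an open design choice.
    """
    a = (map_a - map_a.mean()) / (map_a.std() + 1e-8)  # z-score map A
    b = (map_b - map_b.mean()) / (map_b.std() + 1e-8)  # z-score map B
    return float((a * b).mean())
```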

Supervisor: Yao Wang

Distribution: 50% Implementation (web frontend and backend), 30% Data collection & preprocessing, 20% Evaluation & analysis

Requirements: Strong web programming skills (JavaScript, jQuery, Flask, etc.). Basic knowledge of databases (MySQL) and data analysis.
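To illustrate the kind of backend work involved, here is a minimal sketch of a Flask endpoint that receives a batch of BubbleView clicks as JSON and stores it in MySQL. The route name, JSON fields, connection parameters, and the `clicks` table schema are all hypothetical placeholders, not part of the project specification:

```python
from flask import Flask, jsonify, request
import mysql.connector

app = Flask(__name__)

def get_db():
    # Placeholder connection parameters for illustration only.
    return mysql.connector.connect(
        host="localhost", user="turk", password="secret", database="bubbleview"
    )

@app.route("/clicks", methods=["POST"])
def record_clicks():
    """Store one batch of clicks sent by the BubbleView frontend as JSON."""
    payload = request.get_json()
    db = get_db()
    cur = db.cursor()
    for click in payload["clicks"]:
        # Hypothetical table: clicks(worker_id, image_id, x, y, t)
        cur.execute(
            "INSERT INTO clicks (worker_id, image_id, x, y, t) "
            "VALUES (%s, %s, %s, %s, %s)",
            (payload["worker_id"], payload["image_id"],
             click["x"], click["y"], click["t"]),
        )
    db.commit()
    cur.close()
    db.close()
    return jsonify({"status": "ok"})
```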

Literature:

Kim, N. W., Bylinskii, Z., Borkin, M. A., Gajos, K. Z., Oliva, A., Durand, F., and Pfister, H. 2017. BubbleView: an interface for crowdsourcing image importance maps and tracking visual attention. ACM Transactions on Computer-Human Interaction (TOCHI) 24(5), pp. 1-40.