Perceptual User Interfaces, University of Stuttgart

Human Eye Gaze and Pose Data Collection and Analysis in Daily Activities


Description: Human eye gaze and pose information is important for many applications, including human-computer interaction, autonomous driving, assistive devices, and virtual and augmented reality. However, very few datasets contain both eye gaze and human pose data, making it difficult to analyse the coordination between the two. In this project, we aim to collect a novel dataset of human eye gaze and pose during daily activities. Based on the collected data, we will analyse eye-body coordination and provide insights for related applications.

Goal: collect human eye gaze and pose data during daily activities, then process and analyse the collected dataset.
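As a minimal sketch of the kind of analysis this project targets, eye-body coordination could, for example, be quantified as the per-frame angular difference between the gaze direction and the head-forward direction. The code below is purely illustrative: the data are synthetic stand-ins, and all names are hypothetical, not part of any planned pipeline.

```python
import numpy as np

def angular_diff_deg(a, b):
    """Per-row angle in degrees between corresponding 3D direction vectors."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    cos = np.clip(np.sum(a * b, axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(cos))

# Synthetic stand-in for recorded data: one gaze and one head-forward
# direction per frame (a real dataset would come from an eye tracker
# and a motion-capture system).
rng = np.random.default_rng(0)
n_frames = 1000
head = rng.normal(size=(n_frames, 3))
head /= np.linalg.norm(head, axis=1, keepdims=True)
# Gaze roughly follows head orientation, plus noise for eye-in-head rotation.
gaze = head + 0.2 * rng.normal(size=(n_frames, 3))

diff = angular_diff_deg(gaze, head)
print(f"mean gaze-head angular difference: {diff.mean():.1f} deg")
```

Summary statistics of this angular difference over time (mean, spread, lag between gaze and head movements) are one simple way to characterise how tightly gaze and body pose are coupled in different activities.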

Supervisor: Zhiming Hu

Distribution: 70% Implementation, 10% Literature, 20% Analysis

Requirements: strong programming skills in Python, strong interest in dataset collection

Preferable: knowledge of eye gaze and human pose, experience in motion capture and eye tracking

Literature: [1] Kratzer, P., et al. (2020). "MoGaze: A Dataset of Full-Body Motions that Includes Workspace Geometry and Eye-Gaze." IEEE Robotics and Automation Letters 6(2): 367-373.
[2] Zheng, Y., et al. (2022). "GIMO: Gaze-Informed Human Motion Prediction in Context." ECCV 2022.
[3] Zhang, S., et al. (2021). "EgoBody: Human Body Shape, Motion and Social Interactions from Head-Mounted Devices."