Zihan Yan

College of Computer Science


ARGaze: A Dataset of Eye Gaze Images for Calibration-Free Eye Tracking with Augmented Reality Headset

Zihan Yan, Yue Wu, Yifei Shan, Wenqian Chen, Xiangdong Li


Abstract

Eye tracking is a widespread method in human-computer interaction, but it is often criticised for requiring troublesome calibration for each new user and scene. Despite progress in machine learning-based eye tracking, preparing a qualified dataset remains challenging. We present ARGaze, a dataset of eye gaze images for calibration-free eye tracking with AR headsets. The dataset was collected from 25 participants who performed eye gaze tasks in augmented reality and real-world scenes for approximately 30 minutes each. It comprises 1,321,968 pairs of eye images and corresponding world views across 50 videos. To validate the dataset, we implemented a SIFTNet- and ALSTM-FCN-hybrid model and compared the results with those of state-of-the-art research. The results show that the dataset is highly compatible with different machine learning models and contains sufficient eye gaze-related features, enabling record-low eye gaze estimation errors of 3.70° on average and 1.56° for a specific participant, without any pre-study calibration across participants. Guidance for dataset reuse and related implications for eye tracking design and evaluation are described.
