Title
Event Based, Near Eye Gaze Tracking Beyond 10,000Hz
Authors
Abstract
The cameras in modern gaze-tracking systems suffer from fundamental bandwidth and power limitations, realistically constraining data acquisition speed to 300 Hz. This obstructs the use of mobile eye trackers to perform, e.g., low-latency predictive rendering, or to study quick and subtle eye motions like microsaccades using head-mounted devices in the wild. Here, we propose a hybrid frame- and event-based near-eye gaze tracking system offering update rates beyond 10,000 Hz with an accuracy that matches that of high-end desktop-mounted commercial trackers when evaluated in the same conditions. Our system builds on emerging event cameras that simultaneously acquire regularly sampled frames and adaptively sampled events. We develop an online 2D pupil-fitting method that updates a parametric model every one or few events. Moreover, we propose a polynomial regressor for estimating the point of gaze from the parametric pupil model in real time. Using the first event-based gaze dataset, available at https://github.com/aangelopoulos/event_based_gaze_tracking, we demonstrate that our system achieves accuracies of 0.45–1.75 degrees for fields of view from 45 to 98 degrees. With this technology, we hope to enable a new generation of ultra-low-latency gaze-contingent rendering and display techniques for virtual and augmented reality.
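To make the two-stage pipeline described above concrete, the following is a minimal Python sketch, assuming a recursive least-squares conic (ellipse) fit refreshed per event and a least-squares polynomial regressor from the pupil center to the point of gaze. All class, function, and variable names, the specific update rule, and the calibration data are illustrative assumptions for exposition; they are not the paper's implementation.

```python
import numpy as np

class OnlineEllipseFit:
    """Event-driven pupil model: each event (x, y) contributes one conic
    constraint [x^2, xy, y^2, x, y] . theta = 1, folded in via recursive
    least squares so the ellipse parameters update in O(1) per event."""
    def __init__(self, lam=0.999):
        self.lam = lam               # forgetting factor for stale events
        self.P = np.eye(5) * 1e3     # inverse Gram matrix of the features
        self.theta = np.zeros(5)     # conic coefficients [a, b, c, d, e]

    def update(self, x, y):
        phi = np.array([x * x, x * y, y * y, x, y])
        Pphi = self.P @ phi
        gain = Pphi / (self.lam + phi @ Pphi)
        self.theta += gain * (1.0 - phi @ self.theta)
        self.P = (self.P - np.outer(gain, Pphi)) / self.lam

    def center(self):
        """Ellipse center from the conic coefficients (valid when 4ac > b^2)."""
        a, b, c, d, e = self.theta
        det = 4.0 * a * c - b * b
        return np.array([(b * e - 2.0 * c * d) / det,
                         (b * d - 2.0 * a * e) / det])

def poly_features(xy, degree=2):
    """Expand 2D pupil centers into polynomial terms up to `degree`."""
    x, y = xy[:, 0], xy[:, 1]
    cols = [np.ones_like(x)]
    for deg in range(1, degree + 1):
        for i in range(deg + 1):
            cols.append(x ** (deg - i) * y ** i)
    return np.stack(cols, axis=1)

# Calibration: regress known fixation-target positions (in degrees) onto
# pupil centers recorded while the user looks at them (placeholder data).
pupil_centers = np.random.rand(9, 2)
gaze_targets = np.random.rand(9, 2)
W, *_ = np.linalg.lstsq(poly_features(pupil_centers), gaze_targets, rcond=None)

# Runtime: every event nudges the ellipse; a single matrix product then
# maps the current pupil center to a point-of-gaze estimate.
fit = OnlineEllipseFit()
for ex, ey in [(10.0, 12.0), (11.0, 13.5), (9.5, 14.0), (12.0, 11.0),
               (8.5, 12.5), (10.5, 15.0), (11.5, 14.5), (9.0, 11.5)]:
    fit.update(ex, ey)                     # synthetic event coordinates
gaze = poly_features(fit.center()[None, :]) @ W
```

Updating only five conic parameters per event, rather than re-fitting the pupil from a full frame, is what makes per-event (beyond 10,000 Hz) gaze updates plausible; the regression stage then costs a single small matrix product per estimate.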