CIS Seminar 12/06/23: Gabriel Diaz; The current and evolving state of head-mounted eye tracking


CIS Weekly Seminar

The current and evolving state of head-mounted eye tracking using frame and event-based sensors.

Dr. Gabriel Diaz

Associate Professor, Center for Imaging Science, Rochester Institute of Technology


Abstract

Algorithms for the estimation of gaze direction from mobile and video-based eye trackers typically begin by tracking a feature of the eye that is visible in an image taken from a near-eye infrared eye camera, such as the center of the pupil. However, feature tracking with traditional computer vision techniques is especially prone to error when the eye tracker is taken outdoors and subjected to infrared interference and environmental reflections. Over the past few years, we have addressed this challenge by training image segmentation networks using pools of hand-labelled training images and synthetic training images generated using a custom graphical pipeline in Blender. Despite great progress in this area, we now face a new limitation: these networks require too much power to be incorporated into the next generation of battery-powered virtual and augmented reality displays. This realization has prompted us to move away from traditional, power-inefficient frame-based sensors and towards the emerging “event sensor.” Unlike traditional sensors that capture entire frames at regular intervals, each pixel in the event sensor independently detects changes in light intensity and transmits a pixel-level event only if this change exceeds a certain threshold. Consequently, the event stream is both spatially sparse and temporally dense, and we speculate that it will enable eye tracking at a fraction of the power of traditional sensors. The talk will conclude with a presentation of our early work with event sensors, and our vision of their future in the context of eye tracking.
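The per-pixel thresholding described above can be illustrated with a minimal sketch that converts a stack of intensity frames into an event stream. This is a generic simulation of the event-sensor principle, not the speaker's method; the function name, the log-intensity formulation, and the threshold value are all illustrative assumptions.

```python
import numpy as np

def frames_to_events(frames, threshold=0.2, eps=1e-6):
    """Convert a sequence of intensity frames into DVS-style events.

    Each pixel fires an event only when its log-intensity has changed by
    more than `threshold` since that pixel's last event, so the output is
    spatially sparse and, for a fast sensor, temporally dense.
    (Illustrative sketch; names and threshold are assumptions.)
    """
    ref = np.log(np.asarray(frames[0], dtype=float) + eps)  # per-pixel reference
    events = []  # tuples of (frame_index, row, col, polarity)
    for t, frame in enumerate(frames[1:], start=1):
        log_i = np.log(np.asarray(frame, dtype=float) + eps)
        diff = log_i - ref
        fired = np.abs(diff) >= threshold          # only changed pixels emit
        rows, cols = np.nonzero(fired)
        for r, c in zip(rows, cols):
            events.append((t, int(r), int(c), 1 if diff[r, c] > 0 else -1))
        ref[fired] = log_i[fired]                  # reset reference where fired
    return events
```

For example, a sequence in which a single pixel brightens once yields exactly one positive-polarity event, while static frames yield none; this sparsity is what motivates the expected power savings.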
