Event-based Visual-Inertial Velometer

RSS2024: Event-based Visual-Inertial Velometer
Neuromorphic event-based cameras are bio-inspired visual sensors with asynchronous pixels and extremely high temporal resolution. These favorable properties make them an excellent choice for state estimation under aggressive ego motion. However, state-of-the-art event-based visual odometry systems frequently suffer camera pose tracking failures when the local map cannot be updated in time. One of the biggest roadblocks in this field is the absence of efficient and robust methods for data association that do not impose assumptions on the environment. This problem, however, is unlikely to be solved as in standard vision because of the motion-dependent observability of event data. Therefore, we propose a map-free design for event-based visual-inertial state estimation. Instead of estimating the position of the event camera, we find that recovering the instantaneous linear velocity is more consistent with the differential working principle of event cameras. The proposed event-based visual-inertial velometer leverages a continuous-time formulation that incrementally fuses the heterogeneous measurements from a stereo event camera and an inertial measurement unit. Experiments on both synthetic and real data demonstrate that the proposed method can recover instantaneous linear velocity at metric scale with low latency.
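The continuous-time fusion idea above can be illustrated with a toy example. The following is a minimal, hypothetical sketch (not the authors' implementation): it fits a piecewise-linear continuous-time velocity trajectory to two heterogeneous residual types, sparse visual linear-velocity observations and dense accelerometer samples, in one linear least-squares problem. It is 1-D for clarity; the paper operates in 3-D with a stereo event camera and full IMU kinematics.

```python
import numpy as np

# Knot times of the continuous-time (piecewise-linear) velocity trajectory v(t).
knots = np.linspace(0.0, 1.0, 11)
dt = knots[1] - knots[0]

def interp_row(t):
    """Design-matrix row that linearly interpolates v(t) from knot values."""
    i = min(int(t / dt), len(knots) - 2)
    w = (t - knots[i]) / dt
    row = np.zeros(len(knots))
    row[i], row[i + 1] = 1.0 - w, w
    return row

def deriv_row(t):
    """Design-matrix row approximating dv/dt at t via the active segment's slope."""
    i = min(int(t / dt), len(knots) - 2)
    row = np.zeros(len(knots))
    row[i], row[i + 1] = -1.0 / dt, 1.0 / dt
    return row

# Simulated ground truth: v(t) = sin(2*pi*t), hence a(t) = 2*pi*cos(2*pi*t).
rng = np.random.default_rng(0)
t_vis = rng.uniform(0, 1, 8)           # sparse "visual" velocity observations
t_imu = np.linspace(0, 1, 100)         # dense accelerometer samples
v_vis = np.sin(2 * np.pi * t_vis) + 0.01 * rng.standard_normal(8)
a_imu = 2 * np.pi * np.cos(2 * np.pi * t_imu) + 0.05 * rng.standard_normal(100)

# Stack both residual types into a single linear system A x = b and solve.
A = np.vstack([interp_row(t) for t in t_vis] + [deriv_row(t) for t in t_imu])
b = np.concatenate([v_vis, a_imu])
v_knots, *_ = np.linalg.lstsq(A, b, rcond=None)

err = np.max(np.abs(v_knots - np.sin(2 * np.pi * knots)))
print(f"max knot velocity error: {err:.3f}")
```

The accelerometer rows constrain the trajectory's slope while the sparse velocity observations anchor its absolute value, which is the essence of fusing differential and direct measurements in one continuous-time estimator.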

PDF: https://arxiv.org/pdf/2311.18189

Authors:
Xiuyuan Lu 1*, Yi Zhou 2*, Junkai Niu 2, Sheng Zhong 2, and Shaojie Shen 1

Affiliations:
1 CKS Robotic Institute, Hong Kong University of Science and Technology, Hong Kong, China
2 School of Robotics, Hunan University, Changsha, China

Contacts:
Email: [email protected]; [email protected] (*equal contribution)
