Ultimate SLAM? Combining Events, Images, and IMU for Visual SLAM in HDR and High-Speed Scenarios

Event cameras are bio-inspired vision sensors that output pixel-level brightness changes instead of standard intensity frames. They do not suffer from motion blur and have a very high dynamic range, which enables them to provide reliable visual information during high-speed motion or in scenes with high dynamic range. However, event cameras produce little information when motion is limited, for instance when the sensor is nearly stationary. Conversely, standard cameras provide rich, instantaneous information about the environment most of the time (at low speed and in good lighting), but they fail severely under fast motion or difficult lighting, such as high-dynamic-range or low-light scenes. In this paper, we present the first state estimation pipeline that leverages the complementary advantages of these two sensors by fusing events, standard frames, and inertial measurements in a tightly coupled manner. On the publicly available Event Camera Dataset, our hybrid pipeline improves accuracy by 130% over event-only pipelines and by 85% over standard frame-only visual-inertial systems, while remaining computationally tractable. Furthermore, we use our pipeline to demonstrate, to the best of our knowledge, the first autonomous quadrotor flight using an event camera for state estimation, unlocking flight scenarios that were not reachable with traditional visual-inertial odometry, such as low-light environments and high-dynamic-range scenes.
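To give a rough idea of how event data can be made compatible with a frame-based visual-inertial front end, here is a minimal, illustrative sketch (not the authors' implementation): it accumulates a window of DAVIS-style events (timestamp, x, y, polarity) into an "event frame" on which a standard feature detector could then be run alongside the regular intensity frames. The field names, window size, and normalization are assumptions made for this example only.

```python
# Illustrative sketch (assumption, not the Ultimate SLAM implementation):
# accumulate a fixed-size window of events into a signed "event frame".
import numpy as np

def events_to_frame(events, height, width, num_events=20000):
    """Accumulate the most recent `num_events` events into a polarity image.

    `events` is assumed to be an array of (t, x, y, polarity) rows, with
    polarity in {-1, +1}, as produced by a DAVIS-style sensor.
    """
    frame = np.zeros((height, width), dtype=np.float32)
    for t, x, y, p in events[-num_events:]:
        frame[int(y), int(x)] += p  # signed accumulation of brightness changes
    # Normalize to [0, 1] so a standard feature detector (e.g. FAST) could be applied.
    if np.abs(frame).max() > 0:
        frame = 0.5 + 0.5 * frame / np.abs(frame).max()
    return frame

# Example with synthetic events on a 180x240 (DAVIS-sized) sensor.
rng = np.random.default_rng(0)
events = np.column_stack([
    np.sort(rng.uniform(0.0, 0.1, 50000)),  # timestamps (s)
    rng.integers(0, 240, 50000),             # x coordinates
    rng.integers(0, 180, 50000),             # y coordinates
    rng.choice([-1.0, 1.0], 50000),          # polarity
])
event_frame = events_to_frame(events, height=180, width=240)
print(event_frame.shape, event_frame.min(), event_frame.max())
```

In the actual pipeline described in the paper, features extracted from both event frames and standard frames are fused with inertial measurements in a tightly coupled fashion; the sketch above only illustrates the event-to-frame conversion step in the simplest possible form.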

Reference:
Antoni Rosinol Vidal, Henri Rebecq, Timo Horstschaefer, Davide Scaramuzza
Ultimate SLAM? Combining Events, Images, and IMU for Robust Visual SLAM in HDR and High Speed Scenarios
IEEE Robotics and Automation Letters (RA-L), 2018.
DOI: 10.1109/LRA.2018.2793357
PDF: http://rpg.ifi.uzh.ch/docs/RAL18_Vida...

Project Webpage:
http://rpg.ifi.uzh.ch/ultimateslam.html

Our research page on event-based vision:
http://rpg.ifi.uzh.ch/research_dvs.html

Our research page on vision-based navigation for MAVs:
http://rpg.ifi.uzh.ch/research_mav.html

For event camera datasets and an event camera simulator, see: http://rpg.ifi.uzh.ch/davis_data.html

Other resources on event cameras (publications, software, drivers, where to buy, etc.):
https://github.com/uzh-rpg/event-base...

Affiliations: A.R. Vidal, H. Rebecq, T. Horstschaefer and D. Scaramuzza are with the Robotics and Perception Group, Dep. of Informatics, University of Zurich, and Dep. of Neuroinformatics, University of Zurich and ETH Zurich, Switzerland http://rpg.ifi.uzh.ch/
