RA-L 2022: Real-time Hetero-Stereo Matching for Event and Frame Camera with Aligned Events Using Maximum Shift Distance


Abstract

Event cameras can outperform frame cameras in challenging scenarios such as fast-moving environments or high-dynamic-range scenes. However, event cameras still struggle to replace frame cameras in ordinary, non-challenging scenarios. To leverage the advantages of both sensors, we study a heterogeneous stereo camera system that employs both an event camera and a frame camera. The proposed system estimates semi-dense disparity in real time by stereo-matching the heterogeneous data of the event and frame cameras. We propose an accurate, intuitive, and efficient way to align events under 6-DOF camera motion via the proposed maximum shift distance method. The aligned event image shows high similarity to the edge image of the frame camera. The proposed method can estimate the pose of the event camera and the depth of events within a few frames, which speeds up initialization of the event camera system. We verified our algorithm on the DSEC dataset, where the proposed hetero-stereo matching outperformed other methods. For real-time operation, we implemented our code with CUDA-based parallel computation and released it to the public.
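To illustrate the event-alignment idea, here is a minimal motion-compensation sketch: events are warped to a common reference time and accumulated into an aligned event image. This is a simplified stand-in, not the paper's maximum shift distance method; it assumes a known constant per-event optical flow (`flow`) instead of full 6-DOF motion and depth, and the function and variable names are illustrative only.

```python
import numpy as np

def accumulate_aligned_events(events, flow, t_ref, shape):
    """Warp events (x, y, t, polarity) to reference time t_ref using a
    constant optical flow (px/s) and accumulate them into an event image.

    A simplified stand-in for 6-DOF motion compensation: with correct
    motion, events from the same scene edge collapse onto the same pixels,
    producing a sharp, edge-like aligned event image.
    """
    img = np.zeros(shape, dtype=np.float32)
    for x, y, t, p in events:
        dt = t_ref - t                      # time to the reference frame
        xw = int(round(x + flow[0] * dt))   # warped pixel coordinates
        yw = int(round(y + flow[1] * dt))
        if 0 <= xw < shape[1] and 0 <= yw < shape[0]:
            img[yw, xw] += 1.0              # count events (polarity ignored)
    return img

# Events from a vertical edge moving at 10 px/s in x: after warping to
# t_ref = 0, all of them land on the same column of the aligned image.
events = [(5.0 + 10.0 * t, 3, t, 1) for t in (0.0, 0.1, 0.2, 0.3)]
aligned = accumulate_aligned_events(events, flow=(10.0, 0.0),
                                    t_ref=0.0, shape=(8, 12))
```

With the correct motion estimate, the aligned image concentrates all four events in one pixel, which is the sharpness cue that motion-compensation methods exploit.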

Video

Source code

https://github.com/Haram-kim/Hetero_Stereo_Matching
