Optical Flow for Event Detection Camera

Date of Award


Degree Name

Ph.D. in Engineering


Department of Electrical and Computer Engineering


Advisor: Keigo Hirakawa


Optical flow (OF), which refers to the task of determining the apparent motion of objects in a scene, has been a topic of core interest in computer vision for the past three decades. Optical flow methods for conventional cameras struggle in the presence of large motion and occlusion due to slow frame rates. Optical flow for the dynamic vision sensor (DVS) has gained attention recently as a way to overcome these shortcomings. The DVS, also known as an event detection camera, emerged recently as an alternative to the conventional camera by replacing the fixed analog-to-digital (A/D) converter with a floating asynchronous circuit. Rather than reporting pixel intensity at a fixed time interval, event detection cameras report only significant changes to pixel intensity (i.e., changes above a threshold), called "events," together with the times at which such events occur. Such a circuit significantly reduces the communication bandwidth of the camera, enabling operation at the equivalent of roughly 80,000 frames per second. In addition, the floating A/D converter can adapt to extremely high dynamic range, making the sensor suitable for automotive and scientific-instrument applications. However, the sparsity of the output data renders existing image processing and computer vision methods ineffective. For example, the "brightness constancy constraint" at the heart of optical flow does not apply to the edge-like features that event detection cameras output, and the very notion of "frames" is absent from the asynchronous output. In this work, we consider a new sensor called DAViS that combines conventional active pixel sensor (APS) and DVS circuitries, yielding conventional intensity image frames as well as events. We propose three novel optical flow methods. First, we propose a novel optical flow method designed specifically for the DAViS camera that leverages the high spatial fidelity of the intensity image frames and the high temporal resolution of the events generated by the DVS.
Hence, the proposed DAViS-OF method yields reliable motion vector estimates while overcoming the fast-motion and occlusion problems. Second, we develop a novel DVS optical flow method using the 2D distance transform---computed from the detected events---as a proxy for object texture. Treating multiple 2D distance transforms collectively as a ``distance surface'' improves optical flow significantly over working directly with the sparse events generated by the DVS camera. Finally, we introduce a new DVS-based method that extends the 2D distance transform to a 3D distance transform by incorporating temporal information, resulting in reliable optical flow estimates. Real-sensor experiments verify the accuracy and robustness of the proposed methods in reliably recovering the true two-dimensional pixel motion, not limited to the ``normal flow.''
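To make the brightness constancy constraint concrete: for intensity frames it states that I_x·u + I_y·v + I_t = 0 at each pixel, which is exactly what classical frame-based methods solve and what sparse event output lacks. The sketch below is textbook Lucas-Kanade at a single pixel, not the dissertation's DAViS-OF method; the function name and window size are illustrative assumptions.

```python
import numpy as np

def lucas_kanade_point(I0, I1, x, y, win=7):
    """Illustrative sketch (not the dissertation's method): estimate flow
    (u, v) at pixel (x, y) from two intensity frames by solving the
    brightness constancy constraint I_x*u + I_y*v + I_t = 0 in a least
    squares sense over a small window."""
    h = win // 2
    # spatial gradients via central differences, temporal gradient via frame difference
    Ix = np.gradient(I0.astype(float), axis=1)
    Iy = np.gradient(I0.astype(float), axis=0)
    It = I1.astype(float) - I0.astype(float)
    sl = (slice(y - h, y + h + 1), slice(x - h, x + h + 1))
    # stack per-pixel constraints A [u v]^T = b and solve by least squares
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v
```

Because every row of the system needs valid spatial and temporal gradients, this formulation has nothing to operate on when the input is a sparse set of asynchronous events rather than dense intensity frames.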
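The distance-surface idea can be illustrated with a minimal sketch: accumulate events from a short time slice into a binary mask, then compute each pixel's Euclidean distance to the nearest event, producing a dense surface with texture-like variation that conventional matching techniques can use. The helper name `distance_surface` and the use of SciPy's `distance_transform_edt` are assumptions for illustration, not the dissertation's implementation.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def distance_surface(events, shape):
    """Illustrative sketch: build a dense 2D distance transform from sparse
    DVS events. `events` is an (N, 2) array of (row, col) event coordinates
    accumulated over a short time slice; the result assigns every pixel its
    Euclidean distance to the nearest event."""
    mask = np.ones(shape, dtype=bool)   # True = no event at this pixel
    mask[events[:, 0], events[:, 1]] = False  # zeros at event locations
    # distance_transform_edt returns the distance to the nearest zero pixel
    return distance_transform_edt(mask)
```

Stacking such surfaces from successive time slices gives the dense, smoothly varying signal over which flow can be estimated, in contrast to the sparse event rasters themselves.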


Electrical Engineering, Engineering, Optical Flow, Dynamic Vision Sensor, DVS, Event Camera, Motion Estimation

Rights Statement

Copyright © 2019, author