Detection and Tracking with Event-Based Sensors
Date of Award
5-5-2024
Degree Name
M.S. in Computer Engineering
Department
Department of Electrical and Computer Engineering
Advisor/Chair
Tarek Taha
Abstract
This thesis addresses the detection and tracking of a moving object using a moving Event-Based Sensor (EBS) camera. Prior work has solved this problem with power-hungry Convolutional Neural Networks (CNNs), which negate the low Size, Weight, and Power (SWaP) and high-speed benefits of an EBS camera. This work attempts to solve the detection and tracking problem while preserving the low-SWaP benefits of the EBS camera. It begins with lightweight stationary EBS tracking algorithms and applies neuromorphic and hyperdimensional computing approaches to optimize the storage and runtime of the software. Ultimately, the original approach proved more time-efficient and was therefore used as the starting point for the Moving Sensor Moving Object (MSMO) detection and tracking algorithm. The MSMO algorithm uses the velocity of each event to compute a scene average and filter out dissimilar events. This work presents a study of the event velocity values and explains why an average-based velocity filter is ultimately insufficient for lightweight MSMO detection and tracking of objects using an EBS camera.
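The average-based velocity filter described in the abstract could be sketched roughly as follows. This is an illustrative assumption, not the thesis implementation: the event tuple layout, the per-event velocity estimates (e.g., from optical flow), and the deviation threshold are all hypothetical.

```python
# Hedged sketch of an average-based velocity filter for MSMO detection.
# Assumption: each event is (x, y, vx, vy), where vx, vy are per-event
# velocity estimates. The threshold value is illustrative.

def filter_events(events, threshold=1.5):
    """Keep events whose velocity deviates from the scene-average velocity
    by more than `threshold` times the per-axis mean absolute deviation.

    Events moving with the camera-induced background flow are discarded;
    dissimilar events are kept as moving-object candidates.
    """
    if not events:
        return []
    n = len(events)
    # Scene-average velocity over all events.
    mean_vx = sum(e[2] for e in events) / n
    mean_vy = sum(e[3] for e in events) / n
    # Mean absolute deviation per axis (guard against zero spread).
    mad_vx = sum(abs(e[2] - mean_vx) for e in events) / n or 1e-9
    mad_vy = sum(abs(e[3] - mean_vy) for e in events) / n or 1e-9
    # Keep only events dissimilar to the average scene motion.
    return [e for e in events
            if abs(e[2] - mean_vx) / mad_vx > threshold
            or abs(e[3] - mean_vy) / mad_vy > threshold]


# Example: nine background events sharing the camera-induced flow and
# one faster object event; the filter keeps only the dissimilar event.
background = [(i, i, 1.0, 0.0) for i in range(9)]
obj = [(100, 100, 5.0, 0.0)]
print(filter_events(background + obj))  # -> [(100, 100, 5.0, 0.0)]
```

A filter of this form is cheap enough to preserve the low-SWaP goal, but, as the abstract notes, the thesis finds a scene-average velocity alone insufficient for robust MSMO detection and tracking.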
Keywords
Computer Engineering
Rights Statement
Copyright 2024, author
Recommended Citation
Molskow, Greg, "Detection and Tracking with Event-Based Sensors" (2024). Graduate Theses and Dissertations. 7596.
https://ecommons.udayton.edu/graduate_theses/7596
