Leveraging Hierarchical Methods for Multi-Sensor Fusion

Date of Award

5-1-2025

Degree Name

M.S. in Computer Engineering

Department

Department of Electrical and Computer Engineering

Advisor/Chair

Tarek Taha

Abstract

Object classification is challenging under diverse operating conditions. In electro-optical (EO) data, the position of the sun and the sensor angle can significantly affect the appearance of objects; in synthetic aperture radar (SAR) data, the pose of the object can degrade performance. Combining multiple sensors can reduce the performance drop that occurs when the operating conditions of the training and testing sets diverge significantly. Traditional multi-sensor fusion methods have largely treated fusion as a flat problem: flat classification and fusion do not consider the relationships between classes. These relationships can be exploited to extract additional information and to provide partial decisions (e.g., declaring an object a pick-up truck rather than a Ford F-150). In this thesis, several traditional decision-level and feature-level multi-sensor fusion methods are extended to work with hierarchical classification methods. The fusion methods are evaluated on two multi-sensor datasets: 1) a visible EO (EO-vis) plus SAR dataset and 2) an EO-vis plus near-infrared (EO-NIR) dataset. Classification performance is evaluated with both traditional and hierarchical methods.
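The partial-decision idea described above can be illustrated with a minimal sketch (not the thesis implementation): when no leaf class in the hierarchy is confident enough, the classifier backs off to the most probable parent class. The class names, hierarchy, and confidence threshold below are illustrative assumptions.

```python
# Illustrative two-level class hierarchy: leaf class -> parent class.
# These classes and the 0.6 threshold are assumptions for the sketch,
# not the hierarchy used in the thesis.
PARENT = {
    "Ford F-150": "pick-up truck",
    "Ram 1500": "pick-up truck",
    "Toyota Camry": "sedan",
    "Honda Civic": "sedan",
}

def hierarchical_decision(leaf_probs, threshold=0.6):
    """Return a leaf label if confident, otherwise a partial (parent) label.

    leaf_probs: dict mapping leaf class name -> fused probability.
    """
    best_leaf = max(leaf_probs, key=leaf_probs.get)
    if leaf_probs[best_leaf] >= threshold:
        return best_leaf
    # No confident leaf: sum leaf probabilities under each parent
    # and report the most probable parent instead.
    parent_probs = {}
    for leaf, p in leaf_probs.items():
        parent_probs[PARENT[leaf]] = parent_probs.get(PARENT[leaf], 0.0) + p
    return max(parent_probs, key=parent_probs.get)

# Two confusable truck leaves: the decision backs off to "pick-up truck".
print(hierarchical_decision({"Ford F-150": 0.45, "Ram 1500": 0.40,
                             "Toyota Camry": 0.10, "Honda Civic": 0.05}))
```

Here the best leaf (0.45) falls below the threshold, but the two truck leaves jointly carry 0.85 of the probability mass, so the classifier emits the partial decision "pick-up truck" rather than guessing between the two confusable models.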

Keywords

Artificial Intelligence, Computer Engineering, Computer Science

Rights Statement

Copyright 2025, author.
