LandNET: A Multi-Modal Fusion Network for Classification
Presenter(s)
Jonathan Paul Schierl
Description
There is a need to classify land coverage by usage. Because land-use classes are somewhat abstract, classifying them is challenging and benefits from as much information as possible. We propose an architecture capable of classifying such scenes using both 2D aerial imagery and 3D point clouds. The learned feature spaces of the two modalities are fused and then classified with fully connected layers. This method achieves high accuracy for each modality individually and learns the complementary strengths of each data type for more accurate classification.
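Since the abstract does not include an implementation, the following is a minimal PyTorch sketch of the feature-level fusion it describes. The backbone choices (a small 2D CNN for imagery, a PointNet-style encoder for point clouds), the feature dimensions, and the number of classes are illustrative assumptions, not the authors' actual LandNET architecture.

import torch
import torch.nn as nn


class ImageEncoder(nn.Module):
    # Small CNN that maps a 2D aerial image tile to a feature vector.
    def __init__(self, feat_dim=256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(128, feat_dim)

    def forward(self, x):  # x: (B, 3, H, W)
        return self.fc(self.conv(x).flatten(1))


class PointCloudEncoder(nn.Module):
    # PointNet-style encoder: shared per-point MLP followed by max pooling.
    def __init__(self, feat_dim=256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, feat_dim), nn.ReLU(),
        )

    def forward(self, pts):  # pts: (B, N, 3)
        return self.mlp(pts).max(dim=1).values  # (B, feat_dim)


class FusionClassifier(nn.Module):
    # Concatenates the learned features of both modalities and classifies
    # the fused vector with fully connected layers, as the abstract describes.
    def __init__(self, num_classes=10, feat_dim=256):
        super().__init__()
        self.image_enc = ImageEncoder(feat_dim)
        self.point_enc = PointCloudEncoder(feat_dim)
        self.head = nn.Sequential(
            nn.Linear(2 * feat_dim, 256), nn.ReLU(),
            nn.Linear(256, num_classes),
        )

    def forward(self, image, points):
        fused = torch.cat([self.image_enc(image), self.point_enc(points)], dim=1)
        return self.head(fused)  # class logits


if __name__ == "__main__":
    model = FusionClassifier(num_classes=10)
    image = torch.randn(2, 3, 224, 224)   # batch of aerial image tiles
    points = torch.randn(2, 1024, 3)      # batch of 1024-point clouds
    print(model(image, points).shape)     # torch.Size([2, 10])

Each encoder can be trained (or pre-trained) on its own modality; the fully connected head then learns which modality to rely on for each class, which is the benefit the abstract attributes to fusion.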
Publication Date
4-22-2021
Project Designation
Graduate Research
Primary Advisor
Theus H. Aspiras
Primary Advisor's Department
Electrical and Computer Engineering
Keywords
Stander Symposium project, School of Engineering
United Nations Sustainable Development Goals
Industry, Innovation, and Infrastructure; Decent Work and Economic Growth
Recommended Citation
"LandNET: A Multi-Modal Fusion Network for Classification" (2021). Stander Symposium Projects. 2367.
https://ecommons.udayton.edu/stander_posters/2367