Deep Learning Approach for Vision Navigation in Flight
M.S. in Electrical Engineering
Department of Electrical and Computer Engineering
Advisor: Eric Balster
Recent advancements in the field of Deep Learning have fostered solutions to many complex image-based problems such as image classification, object detection, and image captioning. The goal of this work is to apply Deep Learning techniques to the problem of image-based navigation in a flight environment. In situations where GPS is not available, it is important to have alternate navigation systems, and an image-based navigation system is a potentially cost-effective alternative during a GPS outage. The current state-of-the-art results are obtained using a perspective-n-point (PnP) approach. The downsides of the PnP approach include carrying a large database of features for matching and the sparse availability of distinct features across all scenes. A deep learning approach allows for a lightweight solution and provides a position estimate for any scene. A variety of published networks are modified for regression and trained to estimate a virtual drone's North and East position as a function of a single input image. The best network tested produces an average Euclidean distance error of 5.643 meters in a 2.5 x 2.5 km virtual environment.
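The reported figure is an average Euclidean distance error over predicted versus true (North, East) positions. A minimal sketch of that metric is below; the function name, array layout, and sample values are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def mean_euclidean_error(pred, true):
    """Average Euclidean distance between predicted and ground-truth
    (North, East) positions, in the same units as the inputs (e.g. meters).
    Each input is an (N, 2) array of position pairs."""
    pred = np.asarray(pred, dtype=float)
    true = np.asarray(true, dtype=float)
    # Per-sample Euclidean distance, then the mean over all samples.
    return float(np.mean(np.linalg.norm(pred - true, axis=1)))

# Illustrative values only -- not results from the thesis.
predicted = [[100.0, 200.0], [50.0, 75.0]]
actual    = [[103.0, 204.0], [50.0, 75.0]]
print(mean_euclidean_error(predicted, actual))  # -> 2.5
```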
Engineering, Deep Learning, Alternative Navigation, Vision Navigation, Image Based Navigation, Flight Navigation, Convolutional Neural Networks
Copyright 2018, author
McNally, Branden Timothy, "Deep Learning Approach for Vision Navigation in Flight" (2018). Graduate Theses and Dissertations. 6740.