Presenter(s)
Ruixu Liu
Files
Download Project (2.2 MB)
Description
3D reconstructed maps can be used in many applications such as robot navigation, augmented reality, and virtual reality. 3D maps of the environment have been developed using RGB-D sensor data, which provides both color and depth information. RGB-D camera noise, fast camera movement, and rotation introduce drift into the reconstructed 3D maps. As the scale of the 3D model grows, the drift error accumulates and degrades the quality of the final 3D model. An effective way to reduce drift is loop closure detection, which is based on visual place recognition. This is an extremely challenging problem to solve in the general sense. First, a place recognition system must maintain an internal representation of a map of the environment to compare against the incoming visual data. Second, it must report whether the current visual information corresponds to a place already included in the map, and if so, which one. When a loop closure is detected successfully, the loop closure pose can be used to correct the current camera pose, improving camera tracking accuracy and the quality of the 3D model.
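The pose-graph correction step described above can be illustrated with a minimal sketch. The example below is not the project's implementation: it assumes 2D (SE(2)) poses, synthetic odometry measurements, and SciPy's least_squares solver purely for illustration, and shows how a single loop closure constraint pulls accumulated drift out of a trajectory.

```python
# Minimal pose-graph optimization sketch with one loop closure constraint.
# Poses, edge measurements, and noise levels are illustrative assumptions,
# not values from the project.
import numpy as np
from scipy.optimize import least_squares

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def relative_pose(pi, pj):
    """Relative pose of pj expressed in the frame of pi: (dx, dy, dtheta)."""
    xi, yi, ti = pi
    xj, yj, tj = pj
    c, s = np.cos(ti), np.sin(ti)
    dx = c * (xj - xi) + s * (yj - yi)
    dy = -s * (xj - xi) + c * (yj - yi)
    return np.array([dx, dy, wrap(tj - ti)])

def residuals(x, edges):
    """Stack the error between measured and predicted relative poses."""
    poses = x.reshape(-1, 3)
    res = [poses[0]]  # anchor the first pose at the origin (fixes gauge freedom)
    for i, j, meas in edges:
        res.append(relative_pose(poses[i], poses[j]) - meas)
    return np.concatenate(res)

# Simulated square trajectory: odometry edges drift slightly, while the
# loop closure edge (last pose back to the first) is accurate.
rng = np.random.default_rng(0)
true_motion = np.array([1.0, 0.0, np.pi / 2])      # move 1 m, then turn 90 deg
edges = []
poses = [np.zeros(3)]
for k in range(4):
    noisy = true_motion + rng.normal(0, 0.05, 3)   # drifting odometry measurement
    edges.append((k, k + 1, noisy))
    # integrate the noisy odometry to obtain the initial (drifted) estimate
    x, y, t = poses[-1]
    c, s = np.cos(t), np.sin(t)
    poses.append(np.array([x + c * noisy[0] - s * noisy[1],
                           y + s * noisy[0] + c * noisy[1],
                           wrap(t + noisy[2])]))
edges.append((4, 0, np.zeros(3)))                  # loop closure: back at the start

x0 = np.concatenate(poses)
sol = least_squares(residuals, x0, args=(edges,))
print(sol.x.reshape(-1, 3))                        # drift-corrected poses
```

After optimization, the loop closure constraint redistributes the accumulated odometry error across the whole trajectory, which is the same principle the project applies in 3D to correct the camera pose and improve the reconstructed model.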
Publication Date
4-18-2018
Project Designation
Graduate Research
Primary Advisor
Vijayan K. Asari
Primary Advisor's Department
Electrical and Computer Engineering
Keywords
Stander Symposium project
Recommended Citation
"Real-time camera tracking and 3D scene reconstruction based on pose graph" (2018). Stander Symposium Projects. 1156.
https://ecommons.udayton.edu/stander_posters/1156