In the past decade, novel sensor systems that provide both color and dense depth images have become readily available. This technology is expected to enable a wave of new applications in 3D scene reconstruction and change detection in unstructured environments and under real-world conditions. Change detection itself is not new; however, 3D change detection is a challenging problem that has emerged only in recent years. Just as a high-resolution 2D image requires more pixels, a high-resolution 3D model requires more voxels. We acquire a point-cloud model from video captured by a Microsoft Kinect, which provides the required RGB and depth information. Instead of using the ICP (Iterative Closest Point) algorithm to align each target frame with a reference frame, we adopt a frame-to-model registration scheme, which is more resistant to noise and camera distortion and is efficient enough for real-time applications. 3D change detection is then performed on the resulting point-cloud models. Two kinds of voxel models are used: a point model and a color model. The point model labels each voxel as surface or free space; a voxel is labeled surface when the number of points it contains, estimated from its nearest neighboring voxels, exceeds a threshold. The color model assigns each voxel a color given by the hue component of the HSV values of all points falling within it. A TSDF (Truncated Signed Distance Function) is then used to detect object surfaces, identifying which voxels in the staggered voxel model belong to a surface. By combining the point and color voxel models, with greater weight given to the surface voxels, the differences between two scenes are presented as the points in the voxels labeled as scene changes.
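The voxel-labeling and comparison steps described above can be sketched as follows. This is a minimal illustration, not the project's implementation: the voxel size, the point-count threshold for the surface label, and the hue-difference threshold are all assumed values, and the hue comparison ignores hue's circular wrap-around for simplicity.

```python
import numpy as np

# Hypothetical parameters -- the abstract does not give concrete values.
VOXEL_SIZE = 0.05        # voxel edge length in metres (assumed)
SURFACE_THRESHOLD = 10   # min. points for a voxel to count as surface (assumed)
HUE_THRESHOLD = 30.0     # hue difference (degrees) that flags a change (assumed)

def voxelize(points, hues):
    """Bucket 3D points into voxels; record point count and mean hue.

    points: (N, 3) array of XYZ coordinates from the point cloud
    hues:   (N,) array of hue values (degrees) from each point's HSV color
    Returns {voxel_index: (point_count, mean_hue)}.
    """
    grid = {}
    idx = np.floor(points / VOXEL_SIZE).astype(int)
    for key, hue in zip(map(tuple, idx), hues):
        count, hue_sum = grid.get(key, (0, 0.0))
        grid[key] = (count + 1, hue_sum + hue)
    return {k: (c, s / c) for k, (c, s) in grid.items()}

def detect_changes(grid_a, grid_b):
    """Flag voxels whose surface label or color differs between two scenes."""
    changed = set()
    for key in grid_a.keys() | grid_b.keys():
        count_a, hue_a = grid_a.get(key, (0, 0.0))
        count_b, hue_b = grid_b.get(key, (0, 0.0))
        surface_a = count_a >= SURFACE_THRESHOLD
        surface_b = count_b >= SURFACE_THRESHOLD
        if surface_a != surface_b:
            changed.add(key)          # point model: occupancy changed
        elif surface_a and abs(hue_a - hue_b) > HUE_THRESHOLD:
            changed.add(key)          # color model: hue changed on a surface voxel
    return changed
```

Giving surface voxels greater weight is reflected here by only comparing hue where the voxel is a surface in both scenes; free-space voxels contribute no color evidence.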
Vijayan K. Asari
Primary Advisor's Department
Electrical and Computer Engineering
Stander Symposium project
"3D scene reconstruction and change detection using RGB-D sensor data" (2016). Stander Symposium Projects. 812.