ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., I-3, 81-86, 2012
http://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/I-3/81/2012/
doi:10.5194/isprsannals-I-3-81-2012
© Author(s) 2012. This work is distributed
under the Creative Commons Attribution 3.0 License.
 
20 Jul 2012
AUTOMATIC FUSION OF PARTIAL RECONSTRUCTIONS
A. Wendel, C. Hoppe, H. Bischof, and F. Leberl
Institute for Computer Graphics and Vision, Graz University of Technology, Austria
Keywords: Fusion, Reconstruction, Registration, Close Range, Aerial, Robotics, Vision

Abstract. Novel image acquisition tools such as micro aerial vehicles (MAVs) in the form of quad- or octo-rotor helicopters support the creation of 3D reconstructions with ground sampling distances below 1 cm. The limitation of aerial photogrammetry to nadir and oblique views at heights of several hundred meters is bypassed, allowing close-up photos of facades and ground features. However, the new acquisition modality also introduces challenges: First, flight space may be restricted in urban areas, which leads to missing views for accurate 3D reconstruction and fractures large models; vegetation or simply a change of illumination during image acquisition can have the same effect. Second, accurate geo-referencing of the reconstructions is difficult because GPS signals are shadowed in urban areas, so alignment based on GPS information is often not possible.

In this paper, we address the automatic fusion of such partial reconstructions. Our approach is largely based on the work of Wendel et al. (2011a), but does not require an overhead digital surface model for fusion. Instead, we exploit the fact that patch-based semi-dense reconstruction of the fractured model typically yields several point clouds covering overlapping areas, even when sparse feature correspondences cannot be established. We approximate orthographic depth maps for the individual parts and iteratively align them in a global coordinate system. As a result, we generate point clouds which are visually more appealing and serve as an ideal basis for further processing. Mismatches between parts of the fused models depend only on the individual point density, which allows us to achieve a fusion accuracy in the range of ±1 cm on our evaluation dataset.
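The core idea of the abstract, rasterising each partial point cloud into an orthographic depth map and then aligning the maps, can be illustrated with a minimal sketch. This is not the authors' implementation: the grid resolution, the max-z (nadir) rasterisation, and the exhaustive integer-shift search below are simplifying assumptions standing in for the paper's iterative alignment, and all function names are hypothetical.

```python
import numpy as np

def ortho_depth_map(points, cell=0.05, bounds=None):
    """Approximate an orthographic (nadir) depth map: bin a 3D point
    cloud onto an XY grid and keep the highest z value per cell.
    `points` is an (N, 3) array; `cell` is the grid resolution in
    the same units as the points (assumed values, not from the paper)."""
    if bounds is None:
        bounds = (points[:, 0].min(), points[:, 1].min(),
                  points[:, 0].max(), points[:, 1].max())
    x0, y0, x1, y1 = bounds
    w = int(np.ceil((x1 - x0) / cell)) + 1
    h = int(np.ceil((y1 - y0) / cell)) + 1
    depth = np.full((h, w), np.nan)          # NaN marks empty cells
    ix = ((points[:, 0] - x0) / cell).astype(int)
    iy = ((points[:, 1] - y0) / cell).astype(int)
    ok = (ix >= 0) & (ix < w) & (iy >= 0) & (iy < h)
    for x, y, z in zip(ix[ok], iy[ok], points[ok, 2]):
        if np.isnan(depth[y, x]) or z > depth[y, x]:
            depth[y, x] = z
    return depth

def align_shift(ref, mov, max_shift=5):
    """Toy stand-in for depth-map alignment: exhaustively search integer
    grid shifts (dy, dx) of `mov` that minimise the mean absolute depth
    difference over cells where both maps are defined."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(mov, dy, axis=0), dx, axis=1)
            both = ~np.isnan(ref) & ~np.isnan(shifted)
            if both.sum() < 10:              # require a minimal overlap
                continue
            err = np.abs(ref[both] - shifted[both]).mean()
            if err < best_err:
                best, best_err = (dy, dx), err
    return best, best_err
```

A usage pattern would be to rasterise every partial reconstruction over a shared bounding box, pick one map as reference, and fold the recovered shifts back into per-part transformations; the paper's actual method refines the alignment iteratively in a global coordinate system rather than by brute-force search.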



Citation: Wendel, A., Hoppe, C., Bischof, H., and Leberl, F.: AUTOMATIC FUSION OF PARTIAL RECONSTRUCTIONS, ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., I-3, 81-86, doi:10.5194/isprsannals-I-3-81-2012, 2012.
