ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume IV-2
https://doi.org/10.5194/isprs-annals-IV-2-49-2018
28 May 2018

LARGE SCALE TEXTURED MESH RECONSTRUCTION FROM MOBILE MAPPING IMAGES AND LIDAR SCANS

M. Boussaha, B. Vallet, and P. Rives

Keywords: Urban scene, Mobile mapping, LiDAR, Oriented images, Surface reconstruction, Texturing

Abstract. The representation of 3D geometric and photometric information of the real world is one of the most challenging and extensively studied research topics in the photogrammetry and robotics communities. In this paper, we present a fully automatic framework for high quality, large scale 3D urban texture mapping using oriented images and LiDAR scans acquired by a terrestrial Mobile Mapping System (MMS). First, the acquired points and images are sliced into temporal chunks, ensuring a reasonable size and time consistency between geometry (points) and photometry (images). Then, a simple, fast and scalable 3D surface reconstruction relying on the sensor space topology is performed on each chunk, after an isotropic sampling of the point cloud obtained from the raw LiDAR scans. Finally, the algorithm proposed in (Waechter et al., 2014) is adapted to texture the reconstructed surface with the simultaneously acquired images, yielding a high quality, seam-free texture with global color adjustment. We evaluate the full pipeline on a 17 km acquisition in Rouen, France, comprising nearly 2 billion points and 40,000 full HD images. We are able to reconstruct and texture the whole acquisition in less than 30 computing hours; the entire process is highly parallel, as each chunk can be processed independently on a separate thread or machine.
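To make the chunking step concrete, the following is a minimal illustrative sketch (not the authors' code) of how an acquisition could be split into fixed-duration temporal chunks that pair LiDAR points with the images taken at the same time; the chunk duration, data structures and function names are assumptions for illustration only.

```python
# Hypothetical sketch of the temporal chunking step: group LiDAR points and
# images by timestamp into fixed-duration windows so that each chunk holds
# time-consistent geometry (points) and photometry (images). The 30 s window
# and all names here are illustrative assumptions, not the paper's API.

from dataclasses import dataclass
from collections import defaultdict
from typing import Dict, List, Tuple


@dataclass
class LidarPoint:
    t: float                           # acquisition time (s)
    xyz: Tuple[float, float, float]    # 3D position


@dataclass
class Image:
    t: float                           # exposure time (s)
    path: str                          # image file path


def chunk_by_time(points: List[LidarPoint],
                  images: List[Image],
                  chunk_seconds: float = 30.0
                  ) -> Dict[int, Tuple[List[LidarPoint], List[Image]]]:
    """Split an acquisition into temporal chunks of fixed duration.

    Chunk index i collects the points and images whose timestamps fall in
    [i * chunk_seconds, (i + 1) * chunk_seconds). Each chunk can then be
    reconstructed and textured independently, e.g. one chunk per thread
    or machine, which is what makes the pipeline highly parallel.
    """
    chunks: Dict[int, Tuple[List[LidarPoint], List[Image]]] = defaultdict(
        lambda: ([], []))
    for p in points:
        chunks[int(p.t // chunk_seconds)][0].append(p)
    for im in images:
        chunks[int(im.t // chunk_seconds)][1].append(im)
    return dict(chunks)
```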