ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Articles | Volume V-2-2020
ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., V-2-2020, 235–242, 2020
https://doi.org/10.5194/isprs-annals-V-2-2020-235-2020

03 Aug 2020

RIDF: A ROBUST ROTATION-INVARIANT DESCRIPTOR FOR 3D POINT CLOUD REGISTRATION IN THE FREQUENCY DOMAIN

R. Huang1, W. Yao2, Z. Ye1,3, Y. Xu1, and U. Stilla1
  • 1Photogrammetry and Remote Sensing, Technical University of Munich (TUM), Munich, Germany
  • 2Department of Land Surveying and Geo-Informatics, The Hong Kong Polytechnic University, Hung Hom, Hong Kong
  • 3College of Surveying and Geo-Informatics, Tongji University, 1239 Siping Road, Shanghai 200092, China

Keywords: 3D descriptor, Rotation-invariance, Fourier analysis, Point cloud registration

Abstract. Registration of point clouds is a fundamental problem in photogrammetry and 3D computer vision. Generally, point cloud registration consists of two steps: the search for correspondences and the estimation of transformation parameters. Finding correspondences between point clouds, however, requires robust and discriminative features. In this paper, we address the problem of extracting robust rotation-invariant features for fast coarse registration of point clouds, under the assumption that the pairwise point clouds are related by a rigid transformation. With a Fourier-based descriptor, point clouds represented as volumetric images are mapped from image space to feature space. This is achieved by treating the gradient histogram as a continuous angular signal, which can be well represented by spherical harmonics. Rotation invariance is established through Fourier analysis, in which high-frequency signals can be filtered out, making the extracted features robust to noise and outliers. With the extracted features, pairwise correspondences are then found by a fast search. Finally, the transformation parameters are estimated by fitting a rigid transformation model to the corresponding points with the RANSAC algorithm. Experiments demonstrate the effectiveness of the proposed method for point cloud registration. On two TLS benchmark point cloud datasets, which feature limited overlaps and uneven point densities and cover different urban scenes, the proposed method achieves fast coarse registration with rotation errors of less than 1 degree and translation errors of less than 1 m.
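The abstract's key idea is that Fourier analysis of an angular signal yields rotation invariance. The paper's descriptor operates on spherical harmonics over 3D gradient histograms; as a minimal 1-D analogy (an illustrative sketch, not the authors' implementation), a circular shift of a gradient orientation histogram — i.e., an in-plane rotation — only changes the phases of its Fourier coefficients, so the coefficient magnitudes form a rotation-invariant feature, and truncating to low frequencies filters out high-frequency noise:

```python
import numpy as np

def rotation_invariant_feature(hist, keep=8):
    """Map a circular gradient histogram to a rotation-invariant feature.

    A rotation of the underlying geometry circularly shifts the angular
    histogram; a circular shift multiplies each Fourier coefficient by a
    unit-magnitude phase factor, so the coefficient magnitudes are
    shift-invariant. Keeping only the lowest `keep` frequencies discards
    high-frequency components, adding robustness to noise and outliers.
    """
    coeffs = np.fft.rfft(np.asarray(hist, dtype=float))
    return np.abs(coeffs[:keep])

# A circular shift (an in-plane rotation of the geometry) leaves the
# feature unchanged:
h = np.random.rand(36)        # hypothetical 36-bin gradient histogram (10° bins)
shifted = np.roll(h, 5)       # histogram after a 50° rotation
assert np.allclose(rotation_invariant_feature(h),
                   rotation_invariant_feature(shifted))
```

The same shift-to-phase argument carries over to the sphere: rotating a function on S² mixes spherical-harmonic coefficients only within each degree, so per-degree energies are rotation-invariant, which is the principle the descriptor builds on.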